Tim Sweeney Talks Unreal Engine 3

An anonymous reader writes "Following the recent unveiling of Epic's Unreal Engine 3, Beyond3D has interviewed Tim Sweeney of Epic about the next-gen videogame engine. The discussion is mainly about the 3D requirements, but they also touch on other technologies that are used or required: 'Off-the-shelf 32-bit Windows can only tractably access 2GB of user RAM per process. UT2003, which shipped in 2002, installed more than 2GB of data for the game, though at that time it was never all loaded into memory at once. It doesn't exactly take a leap of faith to see scenarios in 2005-2006 where a single game level or visible scene will require >2GB RAM at full detail.'"
Comments Filter:
  • Memory and Windows (Score:3, Informative)

    by ADRA ( 37398 ) on Tuesday June 22, 2004 @01:38PM (#9497735)
    I thought I read that memory addressing in 64-bit Windows was also set to 32-bit addresses. Would this mean that current 64-bit Windows binaries are also limited this way? It'd be pretty cheesy to have to use multiple processes with IPC to fully load games, or anything else, into memory.
    • "It'd be pretty cheasy to have multiple processes with IPC's to fully load the games, or anything into memory."

      Not for us Dual Processor owners. ;)
    • by Foolhardy ( 664051 ) <csmith32&gmail,com> on Tuesday June 22, 2004 @02:58PM (#9498738)
      Ummm, no. All pointers in 64-bit programs are 64-bit. The current amount of that address space devoted to user-process memory is 512TB. See this [microsoft.com]; it's about Win64 on Itanium, but I'm sure AMD64 is the same.
      Perhaps you are thinking of PAE [microsoft.com] on 32-bit systems?
      Windows is fully capable of providing real 64-bit addressing. It even causes driver problems; you can't use 32-bit drivers in 64-bit Windows.
      Current versions of OS X, OTOH, can't. They use memory windowing similar to PAE.
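
      As a quick illustration of the ceiling (just a sketch in plain C; the 3 GB figure is only a convenient test size, not anything from the interview): a single large allocation fails in a 32-bit process because there isn't enough user address space, while the same code built 64-bit normally succeeds.

      /* Sketch: ask for 3 GB in one block. A 32-bit process has at most
         ~2 GB of user address space (3 GB with the /3GB switch), so malloc
         returns NULL; a 64-bit build of the same code normally succeeds. */
      #include <stdio.h>
      #include <stdlib.h>

      int main(void)
      {
          size_t three_gb = (size_t)3 * 1024 * 1024 * 1024;

          printf("pointer size: %u bits\n", (unsigned)(sizeof(void *) * 8));

          char *block = malloc(three_gb);
          if (block == NULL)
              printf("3 GB allocation failed: not enough contiguous address space\n");
          else
              printf("3 GB allocation succeeded\n");
          free(block);   /* free(NULL) is harmless */
          return 0;
      }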
  • Well.... (Score:2, Funny)

    by El_Ge_Ex ( 218107 )
    It doesn't exactly take a leap of faith to see scenarios in 2005-2006 where a single game level or visible scene will require >2GB RAM at full detail.

    That might help explain DNF [3drealms.com]

    (ducks...)

  • by shadowcabbit ( 466253 ) <cx@thefurry o n e . net> on Tuesday June 22, 2004 @01:56PM (#9497996) Journal
    UT2003, which shipped in 2002, installed more than 2GB of data for the game, though at that time it was never all loaded into memory at once.

    This makes sense. I was able to run UT2K3 without a problem, but after installing UT2K4 I've been playing less, solely because the game is a bit jerkier, takes forever to load initially, and is less reliable (I get "hardware failures"). I have a suspicion that this is very much related to RAM usage. I'd love to see an accurate depiction of how detail settings affect RAM usage -- i.e., on such-and-such a detail level, you use X amount of RAM. How about a patch for the UI to optionally show this? I know it would be useful for about, oh, a thousand users tops, but knowing how much leeway I have in my detail settings would be a damn nifty thing to have.
    • by Vaevictis666 ( 680137 ) on Tuesday June 22, 2004 @02:02PM (#9498072)
      How about using the console?

      From Planet Unreal [planetunreal.com],

      MEMSTAT - Displays Windows memory usage
      STAT ALL - Shows all stats
      STAT AUDIO - Shows audio stats
      STAT FPS - Displays your frames per second
      STAT GAME - Displays game stats
      STAT HARDWARE - Shows hardware stats
      STAT NET - Shows network game play stats
      STAT NONE - Turns off all stats
      STAT RENDER - Displays rendering statistics

      You probably want to use memstat. While in the game, hit the backquote key (often called the tilde key, ~) to bring down the console. Type in the command, hit enter.

      I tend to avoid stat all because it just crowds the screen, but stat fps is also useful for seeing how display settings affect performance.

    • by netfool ( 623800 ) * on Tuesday June 22, 2004 @02:07PM (#9498121) Homepage
      If you have some time to waste and extra RAM, try installing UT on a Ram Disk [arsoft-online.de].

      I did this with Quake 3 a couple years ago and it worked great. I didn't exactly see a huge gameplay performance increase, but the levels loaded instantly.

  • by MiceHead ( 723398 ) on Tuesday June 22, 2004 @02:08PM (#9498136) Homepage
    Tim Sweeney will, in my mind, forever be linked to the wonderful ZZT [autofish.net]. This interview [tripod.com], given some time before Unreal 2, is an interesting contrast to the one posted above [beyond3d.com]. In particular, he talks about ease of creation:

    Hercules: You moved on to other, bigger projects long ago. It must be good to know that the first thing you ever created is still used/played a lot. Does ZZT still cross your mind, sometimes?

    Tim Sweeney: Yes, one of the interesting things to do is contrast ZZT and Unreal, and look at how incredibly far we've come in graphics quality in that time. But also to see how little the industry has progressed -- or maybe even gone backwards in some respects... So, how will game development be 10 years from now? If levels take six months to build, and compiles take 5 hours each, and it costs $20 million to develop a game, then developing games won't be fun or even possible anymore.


    I'm a fan of creation tools that are accessible to anyone who can play the game. (Casual players who may not be technically inclined.) As a developer, I'm hoping that we [dejobaan.com] will be among the first to offer something that lets even the most casual user plink around. As a player, I'm hoping that Sweeney has retained this philosophy, and that future Epic offerings let us build -- at least a little bit -- with the same ease that ZZT did.
  • It doesn't exactly take a leap of faith to see scenarios in 2005-2006 where a single game level or visible scene will require >2GB RAM at full detail.

    <Insert obligatory Microsoft Longhorn joke here>

    Kidding aside, this really isn't surprising in the slightest. Game requirements and memory capacities have both been growing at a 'Moore's Law'-like pace. Heck, most high-end rigs nowadays could easily install and play an entire game from a few years ago entirely in memory.


    --LordPixie
    • One of the interesting things about Counter-Strike: Condition Zero is that you can play the game straight from the CD; it installs and plays in memory, and does so without a hitch.

      Given that the HL engine is highly outdated, it was still very funny for me to see.

    • Didn't they say Longhorn will require over 2GB of RAM just to start up?

      Let's face it. By 2008 PC games will need 10GB of RAM, quad Itanium 5s, Blu-Ray DVD drives, 3D monitors, Sidewinder joysticks, a headset, and a credit card for each and every play. The gameplay will be just as shallow as it is now, but the graphics will be 'unreal', heh, and we will all flock to buy them.

      While all this is going on, the 14-year-old Super Mario World is still stellar, Quake is still being played, and Micro Machines 2 is STILL
  • by KDR_11k ( 778916 ) on Tuesday June 22, 2004 @02:14PM (#9498199)
    That's all well and good, but who actually makes the content that fills up those 2GB? You'd need a pretty large team and several months or years to make that much stuff. If you need it per room, it wouldn't surprise me if future games were only as long as movies, or had a level design like Halo or Metroid 1 (that is, you get room 1 ten times, then room 2 ten times, then a few more room 1's, and maybe a room 3 with a really big monster for a little variety).
    This is going to hurt gaming. We're already seeing shorter games and copied-and-pasted rooms simply because the effort to make those rooms is too high.
    I have a feeling that despite having lower sales, making a 2D game with a tiny team in a few months might actually have larger profit margins than top-end development.
    Also, as always, higher costs mean the games have a greater need to actually sell, which means publishers won't allow as many risky games to be made, since taking a risk on one could sink the entire company.
    • by NanoGator ( 522640 ) on Tuesday June 22, 2004 @02:32PM (#9498411) Homepage Journal
      "That's all nice and well, but who actually makes the content that fills up those 2GB? You'd need a pretty large team and several months or years to make that much stuff..."

      2D and 3D artists make the content that fills that space. The thing to remember is that the relationship between how much artist time is needed and how much RAM is taken up isn't necessarily linear. Using 2x the texture size, for example, doesn't take twice as long to generate. A lot of the time spent making 3D art goes into shrinking things down to meet the requirements.

      Check out this image I made here [reflectionsoldiers.com]. (Note: that's not a game model.) *All* of the textures were originally generated at 3072^2 resolution. They were too big for my tiny gigabyte of RAM, so I had to knock them down to 2048^2. If I had started at 2048, it wouldn't have been much faster to generate them. The source imagery was big enough for either resolution, so short of the extra processing time it would have taken, it would have been pretty much the same.

      The real time will be spent making something more ambitious. Twice as long? I doubt it. Maybe one day, when game machines have specs that exceed the artists' abilities, but we are generations away from that. The tools we have today are pretty darned cool, and they're only going to get better as each generation goes by.

      In short, these companies already have the talent *today* to put 2 gigs worth of content on the screen.
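
      To put rough numbers on the texture sizes mentioned above (a back-of-the-envelope sketch: it assumes uncompressed RGBA texels and a full mipmap chain, and ignores DXT-style compression, which shrinks real figures considerably):

      /* Back-of-the-envelope texture memory calculator.
         Assumes uncompressed RGBA (4 bytes/texel); a full mipmap chain
         adds roughly one third on top of the base level. */
      #include <stdio.h>

      static double texture_mb(unsigned size)
      {
          double base_bytes = (double)size * size * 4;   /* RGBA, 1 byte per channel */
          double with_mips  = base_bytes * 4.0 / 3.0;    /* + mipmap chain */
          return with_mips / (1024.0 * 1024.0);
      }

      int main(void)
      {
          unsigned sizes[] = { 1024, 2048, 3072, 4096 };
          for (int i = 0; i < 4; i++)
              printf("%4ux%-4u : %7.1f MB\n", sizes[i], sizes[i], texture_mb(sizes[i]));
          return 0;
      }

      Going from 2048^2 to 3072^2 is roughly a 2.25x jump in memory for the same source imagery, which is exactly the gap described in the post above.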
    • This is going to hurt gaming. We're already seeing shorter games and copied-and-pasted rooms simply because the effort to make those rooms is too high.

      I don't think it'll be quite that bad. Nowadays you already have game designers making large levels that have to be subdivided into smaller ones with load screens in between, and things like textures and models start off highly detailed but are reduced significantly for the final game.

    • Give an artist a number and he'll easily double it with content. I've never seen an artist have problems filling up space. That sort of problem will be solved; there's always a company out there willing to make a content-creation tool to help out. If you're seeing copied-and-pasted rooms, that's more due to a poor developer than to a space limitation.

      You do raise some better points at the end of your post. A good 2D team could have a larger profit margin, if they're lucky to make any money at a
      • If you're seeing copied-and-pasted rooms, that's more due to a poor developer than to a space limitation.

        I'm a single-player mapper for Half-Life in my spare time. Stuff I've done has been fairly well received [telefragged.com]. And I can tell people this: map design for single-player games is difficult. I can spend a week perfecting something that'll last the player ten seconds. A simple room can take days to build, and this is on the original Half-Life, where a suitably textured cuboid can be just about anything.

        In
    • "I have a feeling that despite having lower sales, making a 2d game with a tiny team in a few months might actually have larger profit margins than top-end development."

      Even more so, if you write a solid, expandable, and powerful (while easy to build upon, of course) engine for a 2D game, then have multiple in-house, independent teams write games off of it, and lease the engine to other companies... You could have a team of 7-10 people, using an engine your company wrote, acting as their own independent gam
  • Will the AI still be as dumb as it is today? Scripted and all that boring stuff? Do all those MiBs go to graphics?
  • I've been wondering about this: is there a project to create a competitive OSS game engine? Or any commercial games produced with such a thing?

    Is it because it's freakin' hard?
    • Re:OSS Engines? (Score:4, Informative)

      by shadwwulf ( 145057 ) on Tuesday June 22, 2004 @03:20PM (#9499019) Homepage
      Crystal Space [sf.net] fits the bill in my opinion.

      It is being used for a couple of commercial-level games, from what I understand.

    • Re:OSS Engines? (Score:4, Informative)

      by Moonshadow ( 84117 ) on Tuesday June 22, 2004 @06:14PM (#9501045)
      Another good one is Ogre [ogre3d.org]. It's purely a rendering engine, which lets you choose your collision/sound/networking/whatever else libraries, but there are a few engine frameworks springing up around it. It's fast, very clean, and capable of a lot of current generation effects (well, it has full shader support, so I guess it supports most anything you can code a shader for). If C# is your flavor, Ogre has a cousin called Axiom [sourceforge.net] that is just as functional. Axiom is intended to be a game engine, but is very much in its infancy, so there isn't too much besides (rock solid) rendering in place there yet. Still, though, both are very clean and excellently designed, and are both well worth a look.
  • Hey Tim! (Score:5, Funny)

    by superultra ( 670002 ) on Tuesday June 22, 2004 @03:06PM (#9498835) Homepage
    -cough-96k [slashdot.org]-cough-
  • Intense (Score:2, Funny)

    by zaphodchak ( 644557 )
    Holy crap. Actually, I have to clean out my case. My MX440 SE (PCI) just wet itself. I've heard talk that U3 will want 1 gig of VRAM for full detail, which, needless to say, doesn't really exist (for mortal consumers) yet.
  • by oman_ ( 147713 )
    2 gigs of data loaded into RAM to run a game?!?

    I think that while game developers are whining about ridiculous resource limitations, the creative developers will be doing sensible things like creating algorithmic game assets using iterative fractals or other more advanced techniques. In the end you'll have products that are smaller, faster, and cheaper to produce.

    I guess the lack of creativity isn't surprising considering that Epic is still making the same old FPS games.
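
    To make the "algorithmic assets" point concrete (a minimal, purely illustrative sketch; the seed and constants are made up, not from any shipping engine): a midpoint-displacement loop can grow an arbitrarily large terrain profile from a seed and a couple of parameters, so the shipped asset is a handful of bytes rather than stored geometry.

    /* Minimal 1D midpoint-displacement sketch: fills a terrain profile
       procedurally from a seed. The 2D diamond-square variant works the
       same way; this is just the smallest illustration. */
    #include <stdio.h>
    #include <stdlib.h>

    #define N 257   /* 2^8 + 1 samples */

    static double rand_offset(double amplitude)
    {
        return ((double)rand() / RAND_MAX - 0.5) * amplitude;
    }

    int main(void)
    {
        double height[N];
        srand(12345);                      /* the whole "asset" is this seed */

        height[0] = 0.0;
        height[N - 1] = 0.0;

        double amplitude = 64.0;
        for (int step = N - 1; step > 1; step /= 2, amplitude *= 0.5) {
            for (int i = 0; i + step < N; i += step) {
                int mid = i + step / 2;
                height[mid] = (height[i] + height[i + step]) / 2.0
                            + rand_offset(amplitude);
            }
        }

        for (int i = 0; i < N; i += 32)    /* print a coarse sample of the profile */
            printf("x=%3d  h=%7.2f\n", i, height[i]);
        return 0;
    }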
  • by Anonymous Coward
    And they think I'm going to have 2 gigs of RAM in just two years? Not at these prices! Hell, I'm still running a Radeon 7500! I've been waiting to upgrade until Half-Life 2 and Doom 3 come out. No sense in buying a new card before then.

    They're nuts. I develop games. I don't see any need to use that much RAM for textures. Look at what games like Metal Gear on the PS2 look like, and they've only got a fraction of the RAM that PC game developers have.

    Before they switch to high-res textures, maybe they

"Everything should be made as simple as possible, but not simpler." -- Albert Einstein

Working...