
Carmack's QuakeCon Keynote Detailed

TheRaindog writes "In addition to announcing the Quake III source code's impending release, John Carmack's QuakeCon 2005 keynote also covered the programmer's thoughts on Microsoft and Sony's next-gen consoles, physics acceleration in games, and what he'd like to see from new graphics hardware."
This discussion has been archived. No new comments can be posted.

  • Kenote? (Score:2, Funny)

    by rasjani ( 97395 )
    mmmmmm.. Kenote! Delicious! [saliva drool] mmmmmm.
  • Procedural textures (Score:5, Interesting)

    by mnemonic_ ( 164550 ) <jamec@u m i ch.edu> on Tuesday August 16, 2005 @04:54AM (#13328824) Homepage Journal
    I was a bit taken aback by Carmack's opposition to procedural textures. No, they can't do everything, but they can be real timesavers when you need to add some overall realistic-looking details. Things like dirt, "roughness" and stains can be done effectively using Brownian noise and the like, and you get the infinite-resolution, low-memory benefits of procedurally generated data. It's efficient and looks good, especially when I used it to create realistic terrain [umich.edu].

    Of course procedural textures can never replace hand-painted detail, but layering some infinite-resolution noise detail onto a finite-sized bitmap texture really brings materials to life.
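To make the "Brownian noise" idea concrete, here's a minimal pure-Python sketch of fractional Brownian motion built from value noise (the function names, octave count, and persistence value are illustrative, not from the post):

```python
import random

def value_noise(size, cell, rnd):
    """One octave: bilinear interpolation of a random lattice."""
    n = size // cell + 2
    lattice = [[rnd.random() for _ in range(n)] for _ in range(n)]
    img = [[0.0] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            gx, gy = x / cell, y / cell
            ix, iy = int(gx), int(gy)
            fx, fy = gx - ix, gy - iy
            top = lattice[iy][ix] * (1 - fx) + lattice[iy][ix + 1] * fx
            bot = lattice[iy + 1][ix] * (1 - fx) + lattice[iy + 1][ix + 1] * fx
            img[y][x] = top * (1 - fy) + bot * fy
    return img

def fbm_texture(size=64, octaves=4, persistence=0.5, seed=1):
    """Sum octaves of value noise: each octave doubles the frequency
    and scales the amplitude by `persistence` (classic fBm)."""
    rnd = random.Random(seed)
    out = [[0.0] * size for _ in range(size)]
    amp, total, cell = 1.0, 0.0, size // 2
    for _ in range(octaves):
        layer = value_noise(size, max(cell, 1), rnd)
        for y in range(size):
            for x in range(size):
                out[y][x] += amp * layer[y][x]
        total += amp
        amp *= persistence
        cell //= 2
    # Normalise to [0, 1] so the result can be layered onto a bitmap texture
    return [[v / total for v in row] for row in out]
```

Each octave doubles the spatial frequency and halves the amplitude, which is what gives the "dirt and roughness" look when the result is multiplied into a base bitmap texture.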
    • by MaestroSartori ( 146297 ) on Tuesday August 16, 2005 @05:10AM (#13328863) Homepage
      The argument, as far as I know, is generally that it's overkill for the current generation of hardware. Rather than procedural noise generated in realtime, a few pregenerated detail noise textures can do the job with a fraction of the GPU time. It's pretty hard to tell the difference with a decent artist doing the noise maps, really.

      Maybe during the next-gen consoles' lifespan we'll start seeing more procedural stuff. It'll become more important as we start pushing more polys and going down the High Definition route, I think.

      (I'm more interested in offline procedural content generation, personally - automatically generated cities, it's the way of the future! :D)
      • by m50d ( 797211 )
        automatically generated cities, it's the way of the future!

        If you think that, play Daggerfall. Play it anyway actually, it's a great game - but it still shows that generated cities are a really bad idea.

        • by MaestroSartori ( 146297 ) on Tuesday August 16, 2005 @05:54AM (#13328979) Homepage
          Oh, that game :D

          I should probably explain further. My approach would be to generate the basic street layouts, buildings, and maybe even internal floor & room layouts procedurally, say in a Maya/Max plugin. This would act as the basis for artists/designers to then tweak and adjust to produce something good, hopefully in a fraction of the time.

          Using control maps (for population density, affluence, terrain, etc.) it should be possible to have fairly fine control over how the city is generated. Add to that a decent set of rules to govern the generation, and a big stock library of textures/shaders to give nice-looking generic output, and that should give a decent starting point.

          I know some of the guys who worked on GTA3/VC/SA, and one of their big problems was generating the sheer amount of content to make these large play areas. Starting with a pre-populated one and using it as a base might let them concentrate on making it good...
        • We used to call that game 'Lunchfall'; there was something screwed up with the raycasting engine that caused vertigo after extended sessions.

        • If you think that, play Daggerfall. Play it anyway actually, it's a great game - but it still shows that generated cities are a really bad idea.

          Not really - it shows that one particular implementation didn't work well. There are certainly bad ways and good ways to do something like this - saying that because it failed once it isn't a good idea doesn't make any sense to me.

          Your comment is about the same as saying "Look at the Apple Lisa - it shows that GUIs are a really bad idea."
    • by Andy_R ( 114137 ) on Tuesday August 16, 2005 @05:30AM (#13328925) Homepage Journal
      Procedural textures can go a lot further than you think. Take a look at how far you can go with Artmatic Pro [uisoftware.com], a 'procedural graphics synthesiser' for the Mac, written by the original author of the Bryce landscape generator, and its landscape-generating cousin Artmatic Voyager [uisoftware.com]. This can generate entire procedural planets, with no detail loss if you zoom in to look at a single inch-wide rock [uisoftware.com]. The entire planet is described in a few K!
    • I think you guys misunderstood. He is talking about procedural bitmap texture generation vs. tiled textures, not procedural displacement, which is a whole different monster. Procedural bitmap generation has issues in realtime; procedural displacement is great.
    • I think Carmack's opposition to procedural textures is for practical, not technical reasons. Developing good-looking shaders requires math and programming skills that most artists do not have. You'd have to tie up a software developer to write the shaders (and possibly an artist too, if the developer doesn't have a good artistic "eye"). So from a manpower perspective, it makes more sense just to have a bunch of artists cranking out texture maps in Photoshop.
    • Well, for disc-delivered content, I can see his objection - why waste image quality? However, for downloaded content, I want the textures crammed in as small space as possible. I'm sick of connecting to UT2k4 maps and having to wait through the first 2 points of a 5 point CTF game because the mapper decided that the base textures just weren't good enough and loaded in a bunch've new texture data that takes a dog's age to download.

      Player-editable content needs to be small for others to use it. Mods need to
      • "because the mapper decided that the base textures just weren't good enough and loaded in a bunch've new texture data that takes a dog's age to download."

        Even with big a** textures, the server admin should know enough about what he's doing to serve the downloads over HTTP, which is a hell of a lot faster than the in-game bandwidth cap of, say, 10 kB/s.
  • by Anonymous Coward on Tuesday August 16, 2005 @05:08AM (#13328854)
    Let's talk about Jon Carmack. Jon is the legendary programmer of such classic PC games as Wolfenstein, Doom, Duke nukem 3d, Quake 1, 2, and 3, unreal, and the upcoming doom3. Jon has single handedly created the genre known as the first-person-shooter. He has also popularized the OpenGL 3d format over Microsoft's competing Direct3d format, as well as caused public interest in 3d cards when he first released accelerated quake for the s3 virge chipset. Jon carmack has redefined gaming on PC's.

    Now stop for a moment and think, What would have happened if Albert Einstein had worked creating amazing pinball games instead of creating the theory of relativity? Humanity would suffer! Jon carmack is unfortunately doing JUST THIS, using his gifts at computer coding to create games instead of furthering the knowledge of humanity. Carmack could have been working for NASA or the US military, but instead he simply sits around coding violent computer games.

    Is this a waste of a special and rare talent? Sadly, the answer is yes.

    Unfortunately, it doesn't stop there. Not only is Jon carmack not contributing to society, he is causing its downfall. What was the main reason for the mass murder of dozens of people in columbine? Doom. It's always the same story: Troubled youth plays doom or quake, he arms himself to the teeth, he kills his classmates. This has happened hundreds of times in the US alone. Carmack is not only wasting his talents and intelligence; he is single-handedly causing the deaths of many young men and women. How does he sleep at night?

    Carmack is a classic example of a very talented and intelligent human being that is bent on total world destruction. Incredibly, he has made millions of dollars getting people hooked on psychotic games where they compete on the internet to see who can dismember the most people. I believe there is something morally wrong when millions of people have computerized murder fantasies, and we have Jon Carmack to thank. Carmack has used his superior intellect to create mayhem in society. Many people play games such as quake so much that their minds are permanently warped. A cousin of mine has been in therapy for 6 months after he lost a 'death match' and became catatonic.

    It is unfortunate that most people do not realize how much this man has damaged all the things we have worked hard for in America. Jon has wasted his intelligence, caused the deaths of innocent children, and warped this country forever. To top it off, he got rich in the process and is revered by millions of computer users worldwide. Perhaps one day the US government will see the light and confine Jon Carmack somewhere with no computers so he can no longer use his intelligence to wreak havoc on society.
    • Interesting troll there; I'd mod you funny if I had points.
      However, you're missing out on a few things: what is the point of having a perfect world if you can't have fun? And the advances in real-time 3D graphics rendering continue to help in fields beyond the gaming world, medi
    • Not to mention that he is quite openly proclaiming what his intentions are -- what else is the 'Id' in id Games but a reference to the Freudian Id [wikipedia.org]?
    • by CrazedWalrus ( 901897 ) on Tuesday August 16, 2005 @05:51AM (#13328966) Journal
      All kidding aside, it's amazing how "recreational activities" end up pushing the limits and levels of technology to a point that it can eventually be used for more "serious" purposes. Examples?

      Pr0n had a lot to do with pushing the massive webserver throughput / broadband increases we've seen in the past several years.

      Gaming is directly responsible for the graphics technology that can later be used in training simulations for going to Mars.

      Of course, if NASA uses the Quake engine for training for trips to Mars, they may also need to equip the astronauts with railguns...
    • Is this a waste of a special and rare talent? Sadly, the answer is yes.

      No, there are many as talented and more talented people working for NASA and the DoD. Most of these people don't opt for video games and in other fields of programming you don't get "rockstar" press. You are merely looking at one of the bigger fish in a small pond.

    • by dolmen.fr ( 583400 ) on Tuesday August 16, 2005 @08:11AM (#13329480) Homepage
      Carmack could have been working for NASA or the US military, but instead he simply sits around coding violent computer games.

      Don't worry so much. Carmack's talents are not wasted. He is already in the space business with his hobby: he's leading Armadillo Aerospace [armadilloaerospace.com] to work "on computer-controlled hydrogen peroxide rocket vehicles, with an eye towards manned suborbital vehicle development in the coming years".
    • "Strange how much human accomplishment and progress comes from contemplation of the irrelevant."
      - Scott Kim

    • Jon is the legendary programmer of such classic PC games as Wolfenstein, Doom, Duke nukem 3d,

      Ken Silverman created the Build engine for Duke Nukem 3D, not Carmack. In fact, Carmack has never worked for 3D Realms.

      Just a minor detail, everything else I can't speak about.
    • Was this generated from a form letter?
  • by canozmen ( 898239 ) on Tuesday August 16, 2005 @05:09AM (#13328860) Journal
    Although Mr. Carmack says physics in game engines isn't easily scalable for level of detail, there is ongoing research in this area producing good results. I remember a video from last year's SIGGRAPH that had hundreds of plastic chairs falling from the sky and bouncing realistically. The important part was that it employed a level-of-detail hierarchy for interacting parts (i.e. an object doesn't have much physical detail if you don't touch it), but it will be some time before we see such techniques in real-time games.
    • That kind of trick is actually pretty old. Almost all existing physics engines use similar tricks to speed up calculations.
    • Funniest. Subject line. Ever.
    • I think 'they' should concentrate most of their resources on just making natural animal movements realistic. We have walls that look like walls. We have shadows that look like shadows. We have toppling barrels that look like toppling barrels. We don't have animals that move like natural animals.
    • by aarku ( 151823 ) on Tuesday August 16, 2005 @06:39AM (#13329102) Journal
      As a game developer, I'll say it'll come sooner than you think. Engines such as Unity [otee.dk] will support Ageia's PPU when it comes out, as Unity already uses the Novodex engine. From there it would take about 15 minutes to set up, tops. Expect some awesome things to come from little indie developers.
    • what he meant (Score:5, Insightful)

      by Anonymous Coward on Tuesday August 16, 2005 @07:22AM (#13329250)
      Although Mr. Carmack says physics in game engines isn't easily scalable for level of detail, there is ongoing research in this area producing good results. I remember a video from last year's SIGGRAPH that had hundreds of plastic chairs falling from the sky and bouncing realistically. The important part was that it employed a level-of-detail hierarchy for interacting parts (i.e. an object doesn't have much physical detail if you don't touch it), but it will be some time before we see such techniques in real-time games.

      I think you're misunderstanding his point. When he talks about level-of-detail, this has more to do with game design than with algorithms. What he's claiming is that detailed physics has much more of an effect on actual gameplay than detailed rendering does, and that it's harder to write a game that gracefully downgrades the player's physical interaction with the world. But graceful downgrading is necessary for people who don't have a fancy physics-accelerating card.

      For example, you can take an older game and change its appearance by giving it higher resolution textures, more detailed meshes for the AI models etc., without having to redesign the actual gameplay. (e.g., the SHTUP [att.net] and Rebirth [wanadoo.fr] mods for System Shock 2).

      These steps are independent of each other and independent of the rest of the game. They can simply be dropped in, or not. The point is that if it's that straightforward to take a game forward in technology, it's even easier to go in reverse. So the player can choose low texture detail, etc., and the game may look worse, but it will still play the same.

      The game physics, on the other hand, has historically been more closely connected to the way the player interacts with the world, so it has a big effect on level design. If Half-Life 2 had a 'simple physics' option that would somehow revert the game physics to something equivalent to the physics in the original Half-Life (setting aside the difficulty of implementing such an option), then some areas would have to be substantially redesigned to remain playable for people using the simple physics.

      This is of course what he means by peripheral elements "such as flowing water" being accelerated. But I have two criticisms of this.

      1) Yes, physics acceleration may affect mainly peripheral elements of the game. But in some ways, the same could be said about improved textures, filtering, etc. If it's done well, it can significantly improve the overall experience. If it's done poorly, the player will hardly notice.

      2) As long as it's an upgrade of the basic design, it will probably be okay to let it affect critical elements as well. E.g.: due to the engine upgrade in the port of Half-Life to the Source engine, movable crates and such have a more realistic response than in the original implementation. It's not a big improvement, since the levels weren't really designed with that in mind. But it doesn't hurt.

      For me, the real question is whether improved physics would really make a game more enjoyable. I think this depends more on graphics than on anything else. As objects are made to look more realistic, it becomes more satisfying for them to have real-seeming interactions.

      If graphics get much better, accelerated physics will be important. But if for some reason graphics tend to stabilize (due to the end of Moore's Law, long load times caused by slow disk access, or whatever), then the usefulness of improving game physics is more questionable.
  • by Aphax ( 727653 ) on Tuesday August 16, 2005 @05:13AM (#13328872) Homepage
    I found his views on dual-core processors fascinating. Until now I had always believed they could give major boosts in performance in games as soon as developers made their games multi-threaded. Maybe I should put off buying that dual-core CPU for a bit longer.
    • by fistynuts ( 457323 ) on Tuesday August 16, 2005 @05:27AM (#13328916)
      > ...as soon as the developers made their games multi-threaded

      This is considerably more difficult than one would think. Games typically have to perform tasks in a particular order, for example (extremely simplified): get inputs, move player, move AI players, move other objects, check for collisions, update parameters, display the next frame, loop.

      Quite where we add this second thread is difficult. Everything must happen in the same order for things like collision detection to function correctly. If we start a second thread to, say, calculate AI decisions and move the AI characters according to those decisions, we have to wait for that thread to complete before we can display the next frame. So we end up with no real advantage from that second thread.

      Now, I'm sure there are game developers on here who know how to utilise threads in games in a successful way. It'd be cool if one of them could inform the rest of us what the heck we're supposed to be doing with them :)
      • by el_womble ( 779715 ) on Tuesday August 16, 2005 @06:01AM (#13329002) Homepage
        Or you could stop thinking of it like that and start thinking of it as:
        Thread 1: Wait for input
        1. Add to unprocessed queue
        2. Grep for coded expressions
        3. Add symbol to character action queue
        Thread 2: Character thread
        1. Read action queue
        2. Publish action
        3. Receive reaction
        4. Update state
        Threads 3-100: AI threads
        1. Read viewable universe state
        2. Process against goals
        3. Publish action
        4. Receive reaction
        5. Update state
        Please don't read this too literally, it's only a Slashdot post, but this is a meta-outline of how I'd start thinking about the game universe in a multi-CPU system. Of course it would run like shit on a single CPU (all those context switches (ugh)), but it would really utilize a multicore system.
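Sketching that outline as actual code (a toy, with invented names; a real engine would use lock-free queues in C/C++ rather than Python's `queue.Queue`):

```python
import queue
import threading

action_queue = queue.Queue()      # input thread -> character thread
state_lock = threading.Lock()
world_state = {"player_pos": 0}

def input_thread(commands):
    """Thread 1: translate raw inputs into actions on the queue."""
    for cmd in commands:
        if cmd in ("left", "right"):   # 'grep for coded expressions'
            action_queue.put(cmd)
    action_queue.put(None)             # sentinel: no more input

def character_thread():
    """Thread 2: consume actions and publish state updates."""
    while True:
        action = action_queue.get()
        if action is None:
            break
        with state_lock:
            world_state["player_pos"] += 1 if action == "right" else -1

t_in = threading.Thread(target=input_thread, args=(["right", "right", "left"],))
t_char = threading.Thread(target=character_thread)
t_in.start(); t_char.start()
t_in.join(); t_char.join()
print(world_state["player_pos"])  # net movement after the queued actions: 1
```

The queue is the only coupling between the two threads, which is the point of the outline: each stage only reads its inbox and publishes to the next.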
        • by Anonymous Coward
          Thread 3-100: AI Threads
          1. Read viewable universe state


          I think that would introduce all of the issues multiplayer games have with network lag right into the game engine. If the AI characters aren't all working from the same data set (because it's changing while they're "thinking,") you're bound to have some pretty weird and difficult-to-debug timing issues. Even simple single-threaded code has a lot of wacky and unpredictable timing behavior on a PC, compared to actual real-time syst
            You are right of course... I'm not as bright as Carmack, and I'm sure he's spent a lot longer thinking about this than me.

            The most obvious problem with my solution is to think of it like a Games Workshop game. There has to be some sort of turn basis, otherwise the winner is the one who can roll the dice fastest.

            But taking that metaphor further, there are no idle cycles in a Games Workshop-style game. Even though only one player can update the universe at a time, everybody else is crunching numbers, trying t
        • Yes, but can you imagine coding that in C? It would be a freakin' nightmare. I think that's John's real problem with threads and multi-core systems. It's probably why he's also down on the PS3: for that you basically have to write small independent modules that work well in parallel; it's kind of an origami mindset... a Japanese developer might develop a beautiful, self-contained, self-directed NPC for the PS3, whereas John's western mindset would make a page of AI code that gets everything sorta-OK.

          What's needed f
          • Definitely. One of the first things you realise when you start concurrent programming is that 100% CPU usage is not possible. There are too many other bottlenecks. This really hurts the C/assembler programmer's mindset, but opens up spare cycles for a runtime-based language like Java or .NET - although these are terrible languages for game programming.

            Could this be the age of LISP or Haskell? Every time I sit down and think about a language that could easily be handled by multicores, I start designing LISP. (T
      • by robnauta ( 716284 ) on Tuesday August 16, 2005 @06:08AM (#13329021)
        This is considerably more difficult than one would think. Games typically have to perform tasks in a particular order, for example (extremely simplified): get inputs, move player, move AI players, move other objects, check for collisions, update parameters, display the next frame, loop.

        Quite where we add this 2nd thread is difficult. Everything must happen in the same order in order for things like collision detection to function correctly.

        Not necessarily. One big problem with games is that the typical order (beginscene/render/endscene/present) is implemented with a busy-wait loop in the present part. This is the part where all data has been sent to the graphics card and the driver waits in a loop until it gets a 'scene completed' message from the card. This is why games always run at 100% CPU.

        Games that don't use threading well (threading only network/input/sound) put everything in the loop you describe. Draw a scene, the driver waits for an 'OK', then you update the player, update the AI characters, do collision, calculate all new positions and start drawing. Because drawing takes e.g. 10 ms per frame at 100 FPS, developers limit the AI/collision part to something like 1 ms or the frame rate starts dropping. So the real AI would be limited to, say, 10% of the CPU time.

        For example the 'move AI' part could be a bunch of threads, calculating new positions based on direction, collision etc.

        Right now games like Doom 3 typically display only a few NPCs at the same time because of this timing problem. If the move-AI thread can just keep running on the second CPU while the first CPU waits inside the driver, a game could support a few hundred enemies on-screen.

        Strategy games with complicated pathfinding with hundreds of units on-screen like Warcraft 3 or Age of Mythology would profit enormously, if programmed for multithreading.
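A minimal sketch of that idea: run the AI update for the next frame on a worker thread while the main thread would otherwise be busy-waiting on the driver (the `ai_step` logic is a stand-in, not anything from an actual engine):

```python
import threading

def ai_step(positions):
    """Pretend pathfinding: each unit drifts one step toward the origin."""
    return [p - 1 if p > 0 else p + 1 if p < 0 else p for p in positions]

front = [5, -3, 0, 7]        # the buffer the renderer reads this frame
for _frame in range(5):
    result = {}
    # AI for the *next* frame runs on another core while this frame renders
    worker = threading.Thread(target=lambda: result.update(back=ai_step(front)))
    worker.start()
    # ... main thread would submit draw calls / busy-wait on the GPU here ...
    worker.join()            # sync point at end of frame
    front = result["back"]   # swap buffers for the next frame
print(front)
```

The double-buffer swap keeps the renderer reading a consistent snapshot while the AI writes the next one, which is the essence of letting the second CPU do useful work during the driver wait.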

        • "Right now games like DOOM3 typically only display a few NPC's at the same time because of the timing problem."

          Um, I don't believe that is why Doom 3 has so few enemies at a time... The AI isn't that spectacular, and companies like Croteam have games with hundreds of enemies on screen with just-as-good AI implemented...

          "Strategy games with complicated pathfinding with hundreds of units on-screen like Warcraft 3 or Age of Mythology would profit enormously, if programmed for multithreading."

          This I could believe...
      • Multithreading could do away with loading screens. If the player approaches the end of the level, the other thread could start loading the data for the next level. Imagine huge worlds that are full of detail, uninterrupted by pesky loading screens.
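As a toy sketch (the loader and level names are invented), the streaming idea is just a background thread plus a join at the transition point:

```python
import threading
import time

def load_level(name, out):
    """Stands in for slow disk I/O: 'loads' level data into out."""
    time.sleep(0.05)                      # pretend disk latency
    out[name] = f"geometry for {name}"

loaded = {}
# Player nears the end of level 1: start streaming level 2 in the background
loader = threading.Thread(target=load_level, args=("level2", loaded))
loader.start()
# ... game loop keeps rendering level 1 here, uninterrupted ...
loader.join()                             # by transition time, data is ready
print(loaded["level2"])
```

In a real engine the join would be replaced by a readiness check so the game never blocks, but the division of labour is the same.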
      • I'm not a game programmer, but I do a lot of multi-threaded development. In multi-threading, when you need to guarantee that things happen in a particular order, you use a specific technique such as a semaphore. Without these techniques, thread 1 can get more CPU time than thread 2. So multi-threading adds logic complexity and overhead.

        In some multi-threaded systems, each thread can act independently of all other threads. It doesn't matter whether thread-1 gets ahead of thread-2. In
    • I think he has just spent too much time on single-CPU systems. A lot of things in a game can be done in parallel, for example:
      • Each monster / NPC can run their AI in parallel and interact with the game system via the same API as the player.
      • Collision detection can be performed in boxes smaller than the entire scene, all in parallel.
      These two are just off the top of my head, but I'm sure there are more things that can be added.
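The "collision in boxes" idea is essentially a uniform spatial grid with each cell tested independently; here is a toy sketch (it ignores objects straddling cell borders, which a real implementation would have to handle):

```python
from concurrent.futures import ThreadPoolExecutor

def collisions_in_cell(points):
    """Brute-force pair test inside one grid cell (distance < 1.0)."""
    hits = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            (x1, y1), (x2, y2) = points[i], points[j]
            if (x1 - x2) ** 2 + (y1 - y2) ** 2 < 1.0:
                hits.append((points[i], points[j]))
    return hits

def parallel_collisions(objects, cell_size=10.0):
    """Bucket objects into grid cells, then test each cell in parallel."""
    cells = {}
    for p in objects:
        key = (int(p[0] // cell_size), int(p[1] // cell_size))
        cells.setdefault(key, []).append(p)
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(collisions_in_cell, cells.values()))
    return [hit for cell_hits in results for hit in cell_hits]

pairs = parallel_collisions([(0.0, 0.0), (0.5, 0.5), (20.0, 20.0), (20.2, 20.0)])
print(len(pairs))  # two colliding pairs, found independently per cell
```

Because no cell shares data with any other, the per-cell tests need no locking at all, which is what makes this kind of decomposition attractive for multicore.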
    • He's not the only major player who thinks that.

      Check out Gabe Newell's comments [bit-tech.net] (One of the key developers of Half-Life 2). He also thinks multi-core/cpu machines aren't going to be bringing a lot extra to the table for game machines for some time.

      When Newell and Carmack, the lead developers of the two hottest game engines out there, agree on this point, you realize we might not be taking that leap forward in gaming that we all thought we were going to.

  • by erwincoumans ( 865613 ) on Tuesday August 16, 2005 @05:15AM (#13328874)
    His love for graphics is nice, but it's a pity he lacks physics programming skills :) That's why Jan Paul van Waveren takes care of it, in Doom 3 etc. Physics middleware will be of big importance for next-gen consoles, and it will rock the world :) http://www.continuousphysics.com/Bullet/phpBB2/index.php [continuousphysics.com]
  • John Carmack's QuakeCon 2005 keynote also covered the programmer's thoughts

    "Ladies and Gentlemen,

    ...I want more funky graphics things...

    ...I wonder if I left the gas on...

    ...My leg itches...

    ...That guy looks tired..."

  • Interesting (Score:2, Interesting)

    by ribblem ( 886342 )
    Carmack's other wish-list item was that some attention be paid to the problems with handling small batches of data on today's GPUs. He said the graphics companies' typical answers to these problems, large batches and instancing, don't make for great games.

    John Carmack's past pleas for graphics hardware changes have led to a number of developments, including the incorporation of floating-point color formats into DirectX 9-class graphics chips. We'll have to watch and see how the graphics companies addres
    • Jon Carmack is a typical male programmer. Like all males, he can only think of one thing at a time, like a cold, heartless single-threaded computer brain. That's why he can only write programs that way. There should be more female games programmers. Females have a compassionate multi-threaded mind which can be both more intuitive and more practical. Female programmers would naturally relate to multi-threading and reap the benefits in processing power.

      However unfortunately all the games would be about buying

    • I would make one or more physics threads, one or more AI threads, a sound thread, a rendering thread, a resource management thread, and perhaps a culling thread which assisted the VPU with geometry occlusion if the CPU is ahead of the VPU. I'd also put in a semaphore queue mechanism so some of these could get a frame or two ahead without syncing.

      All of these threads, however, may need to interact with each other. When objects collide, you get a callback and that callback may trigger rendering, AI, sound, et
      • "All of these threads, however, may need to interact with each other. When objects collide, you get a callback and that callback may trigger rendering, AI, sound, etc. AI needs to use physics data for line-of-sight checks and so on."

        Communication between agents isn't exactly a new problem. Queues (bounded or unbounded) could be used to implement asynchronous message passing. IIRC, bounded queues don't even need atomic operations.
        • It's not exactly like putting large amounts of data in a queue is a free operation. For a Cell-based machine, of course, they will have to do it. On a dual-core x86 it's not that obvious. We also have to remember the added latency by doing things some frames ahead. That may hurt gaming experience more than the framerate itself.
          • Good point. Still, "putting large amounts of data in a queue" might not happen that often: the (lightweight, not necessarily system, so we can mostly ignore the cost of context switches) threading can be scheduled to help ensure that the queues are emptied as quickly as possible. We can also queue references to large chunks of data instead of the data itself. Of course that may require a more functional way of doing things, in addition to a more complex GC. Fortunately, concurrent or realtime (not sure abou
  • 256 MB of RAM is definitely not enough for games with such extreme demands for graphics and realism (did he say physics?)!

    I doubt that the next generation games will look like movies; except for some graphic demos like the Unreal Engine 3.

    Here's an old quote from Tim Sweeney:

    "Off-the-shelf 32-bit Windows can only tractably access 2GB of user RAM per process. UT2003, which shipped in 2002, installed more than 2GB of data for the game, though at that time it was never all loaded into memory at once. It does
  • Kenote? (Score:2, Funny)

    by Arivia ( 783328 )
    Is this the KDE project's competitor to Microsoft OneNote or something?

  • First and foremost on that list was full virtualization of texture mapping in graphics hardware. Carmack decried the "fallacy" that "procedural synthesis will be worth a damn," arguing that programmers spending hours creating procedural shaders isn't the best way forward. Instead, he said, tools should unleash artists. He called the current tiled texture analogy a crude form of compression, and argued that true unique texturing in graphics would be a massive leap in visual fidelity over current pract
    • It sounds to me like he wants the engine to be able to say "okay, here's the textures for this level," but specify a "stack" of texture sets, with each one slightly different, so that when rendering ten trees, it cycles through that texture in the stack and they all have different leaf patterns or whatever.
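In other words, something like cycling through a small stack of texture variants per instance (the file names here are placeholders, not from any actual engine):

```python
import itertools

# A 'stack' of slightly different leaf textures for the same material slot
leaf_variants = ["leaves_a.png", "leaves_b.png", "leaves_c.png"]
variant_cycle = itertools.cycle(leaf_variants)

# When rendering several trees, each instance pulls the next variant,
# so neighbouring trees don't share an identical leaf pattern
trees = [f"tree{i}" for i in range(5)]
assignment = {tree: next(variant_cycle) for tree in trees}
print(assignment["tree3"])  # the cycle wraps around: leaves_a.png
```

Even three or four variants break up the visible tiling, which is the effect the comment is describing.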
  • I wonder just what will be included...
    Will it include the source code to Q3 Arena and Q3 Team Arena?
    Will it include the ports to other platforms? (i.e. linux, mac)
    Will this release mean that other Quake 3 engine games can go Open Source too? (e.g. Return To Castle Wolfenstein, Enemy Territory, etc.)
