
Quake III Gets Real Time Ray-Tracing Treatment 116

Ozh writes "Did you ever wonder what you could do with a cluster of 20 AMD XP 1800s? Some German students and videogame fans did, and their answer is what they call 'ray-tracing egoshooters': an entirely raytraced game engine which 'runs about 20 fps@36 GHz in 512x512 with 4xFSAA'. The first game to get this treatment is Quake 3 Arena: the screenshots look slightly better than the original 3D engine, but the video (56 MB, 3'19) is quite dramatic."
  • by reality-bytes ( 119275 ) on Monday June 07, 2004 @07:09PM (#9361402) Homepage
    It's a bit hard to tell from the page whether this makes full use of the GPUs per box in the cluster like Chromium [sourceforge.net] does.

    They also mention that it can render entirely in software over the network at 20FPS - not bad considering that each frame's portion of the data presumably has to pass across 2 machines before it is passed to the display!
  • Kinda cool (Score:5, Interesting)

    by Anonymous Coward on Monday June 07, 2004 @07:15PM (#9361430)
    That's pretty nice. Unfortunately, most of the effects can be simulated using tricks and still run on a regular computer, especially with all the stuff you can do on the GPU now.

    They also need to soften the shadows, either with tricks or with radiosity. Right now it looks kinda meh...

    Interesting effort though.
    • Yeah, radiosity, good idea, but how many computers for 20 fps then? One should be able to cache the shadows on the inanimate objects, though.
    • Re:Kinda cool (Score:5, Informative)

      by Have Blue ( 616 ) on Monday June 07, 2004 @09:12PM (#9362048) Homepage
      The real-time curved reflective surfaces (true reflection, not cube mapping) and displacement surfaces currently cannot be done on 3D cards.
    • HEY ... when you are done playing games (i.e. leave at night ... if you don't stay up all night playing! ;-) ... can you turn that cluster of 20 AMD XP 1800s to do some processing for a "do-gooder" cause such as Folding@HOME - my Google Compute [powder2glass.com] team would love to get a boost from that type of horsepower! ;-)
      • You want to calculate stuff fast? Maybe you should ask Google.

        Perhaps Google will have a Google-free day, where for the whole day their 100K computers do Folding@HOME or something else other than Google search. :)
  • by 42forty-two42 ( 532340 ) <bdonlan AT gmail DOT com> on Monday June 07, 2004 @07:22PM (#9361458) Homepage Journal
    Freecache link [freecache.org]. This should hopefully be faster. Anyone have a torrent?
  • Somebody didn't notice the frameset...
    Screen Shots [uni-sb.de]
    Downloads (video) [uni-sb.de]
  • by Ender Ryan ( 79406 ) on Monday June 07, 2004 @07:22PM (#9361461) Journal
    Ok, so this is pretty neat. The screenshots look pretty nice, and I'm downloading the video (ISDN, bleh), but what I really find interesting is the mention of a hardware raytracing GPU, and a link to a working prototype.

    So my question is, for those of us who don't know the first thing about 3D graphics: what are the pros and cons of a raytracing GPU compared to the polygon pushers we currently know and love?

    • by xenocide2 ( 231786 ) on Monday June 07, 2004 @07:41PM (#9361570) Homepage
      Well, it's known to be slower in implementation. It also forces lighting on the rendering; after all, we're tracing rays of light here. But in theory, raytracing is where the future resides. I'm told, though I don't fully understand the logic behind it, that raytracing should scale to higher-poly scenes than rasterizing. The big problem, it seems, is that it's difficult to find hacks to speed up rendering. Most speed enhancements are data restructuring, like subregion partitioning: the idea being that if a ray doesn't intersect a region, it can't intersect anything in the subregion, so you don't need to test it (a sketch follows below).

      The real benefit is free occlusion culling, shadows, lighting, reflections, and essentially a physical simulation of how things actually work in life. There have been a few boards prototyped to do ray-tracing; just google for Real-Time Raytracing. The paper behind it suggests that a hardware raytracer scales nearly linearly with the number of tracer units behind it. These days it's difficult to take a hardware prototype and beat the market standard with a wholly different paradigm, especially when the benchmark is OpenGL-based; OpenGL only guarantees a handful of light sources (eight), and little beyond that. The prototype that exists is incredibly large and not well suited to current small desktop cases. But given the right set of talent, this is an interesting concept that could eventually take over from the poly pushers.
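
      A minimal Python sketch of that subregion test, assuming axis-aligned bounding boxes and a flat list of regions; all names are illustrative, not from the actual engine:

      def ray_hits_box(origin, direction, box_min, box_max):
          # Standard "slab" test: clip the ray against each pair of
          # axis-aligned planes; if the entry/exit intervals stop
          # overlapping, the ray misses the box entirely.
          t_near, t_far = 0.0, float("inf")
          for axis in range(3):
              if abs(direction[axis]) < 1e-12:   # parallel to this slab
                  if not box_min[axis] <= origin[axis] <= box_max[axis]:
                      return False
                  continue
              t0 = (box_min[axis] - origin[axis]) / direction[axis]
              t1 = (box_max[axis] - origin[axis]) / direction[axis]
              if t0 > t1:
                  t0, t1 = t1, t0
              t_near, t_far = max(t_near, t0), min(t_far, t1)
              if t_near > t_far:
                  return False
          return True

      def candidate_objects(origin, direction, regions):
          # regions: list of (box_min, box_max, objects) tuples. Objects in
          # a region the ray never enters are skipped without any per-object
          # intersection test; that skipping is the whole speedup.
          for box_min, box_max, objects in regions:
              if ray_hits_box(origin, direction, box_min, box_max):
                  yield from objects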
      • by 0x0d0a ( 568518 ) on Monday June 07, 2004 @08:19PM (#9361777) Journal
        I have a comment a bit further down with some more explanation.

        I remember seeing an SGI demo of real-time raytracing. They used a kinda neat technique -- the application was a modelling app, and so they "faded" the new image in by rendering random pixels. It let them get away with far fewer FPS.
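
        That fade-in trick is simple to sketch in Python: shuffle the pixel order once, then trace only a slice of it per displayed frame. The tracer call in the usage comment is a hypothetical stand-in.

        import random

        def progressive_schedule(width, height, pixels_per_frame):
            # Yield per-frame batches of pixel coordinates in random order,
            # so a still image sharpens over several frames instead of
            # blocking until it is fully traced.
            pixels = [(x, y) for y in range(height) for x in range(width)]
            random.shuffle(pixels)
            for i in range(0, len(pixels), pixels_per_frame):
                yield pixels[i:i + pixels_per_frame]

        # Usage: trace one batch per displayed frame.
        # for batch in progressive_schedule(512, 512, 8192):
        #     for x, y in batch:
        #         framebuffer[y][x] = trace_primary_ray(x, y)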
        • by photon317 ( 208409 ) on Monday June 07, 2004 @11:56PM (#9362976)

          A friend of mine wrote a real-time raytracing engine as an assembly demo on an 80386 back in 1993 or so. Doing "real time raytracing" isn't that hard; doing it with complex objects, lots of light sources, and high resolution is what becomes a problem. IIRC, his was in 320x240, and was only rendering half the lines on the screen (effectively 160x240), and the scene was just moving through a dullish rocky martian landscape with a setting sun as the only light source.

          My point being, it's not a great feat to do realtime raytracing; the great feat is harnessing enough hardware power, or coming up with enough optimization tricks (without cheating and making it of lesser quality than a real raytrace), to do big nice-looking things with it.
          • Have a look at this real time RT demo (http://www.pouet.net/prod.php?which=9461 [pouet.net]) from 10 years later. Breakpoint'03. :)
      • by mr3038 ( 121693 ) on Tuesday June 08, 2004 @04:16AM (#9363862)
        [Raytracing] also forces lighting on the rendering. After all, we're tracing rays of light, here.

        Umm.. no. In practice, no raytracer traces rays from the light source, because very few of those rays would ever hit the camera. Instead, all raytracers do it backwards: backtrace the ray that would arrive at the camera lens through, say, the top-left pixel. When the ray hits an object (say, a wall), compute backtraces from that location. If you don't need realistic lighting, hitting a wall could always return a preset amount of light (mixed with the object's texture, of course) from that wall, with no scattering of the ray (a sketch of this follows below).

        The problem with full hardware raytracers is that the hardware has to be able to hold the whole scene, or there'll be problems with some ray directions. The GPU and the board on which it resides would limit the complexity of the scene, unlike with OpenGL, which can render scenes as complex as the whole system can store (part of the scene can be streamed from the hard drive...)

        I think the future will be a mix of both systems. Raytracer for curved, reflective surfaces. Multipass raster engine for everything else.

        After looking through the video clip, it seems clear to me that the most important improvement for current games is better shadows. How many reflective surfaces are there in your environment? I'd say glass is the only thing I'd miss reflections from, and if that makes the difference between 2fps and 200fps, then let's forget real reflections and use environment cube mapping instead.
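
        To make the backwards direction concrete, here is a minimal Python sketch of the primary-ray step described above: one ray per pixel, from the camera out through the image plane, plus the cheap preset-light shading. All names are illustrative, not from the actual engine.

        import math

        def primary_ray(x, y, width, height, fov_degrees=90.0):
            # Backward tracing starts here: a ray from the camera at the
            # origin, out through pixel (x, y) on an image plane at z = -1.
            aspect = width / height
            scale = math.tan(math.radians(fov_degrees) / 2.0)
            px = (2.0 * (x + 0.5) / width - 1.0) * aspect * scale
            py = (1.0 - 2.0 * (y + 0.5) / height) * scale
            direction = (px, py, -1.0)
            length = math.sqrt(sum(c * c for c in direction))
            return (0.0, 0.0, 0.0), tuple(c / length for c in direction)

        def shade_hit(texture_sample, preset_light=0.8):
            # Cheapest case described above: a preset amount of light mixed
            # with the object's texture, no scattering of the ray at all.
            return tuple(preset_light * c for c in texture_sample)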

        • I can see reflections right now from a can of soda, a glass plate, 2 monitors, the mouse, the individual keys on the keyboard, the side of my computer (all of it Apple hardware, so it has that smooth plastic finish), the spines of several dozen CDs, a framed painting, and my watch. The real world is still a vastly more complex place than a game engine.

          Also, FWIW, there are more advanced (and expensive) forms of ray tracing that do involve tracing rays from light sources into the environment, to simulate effects like caustics.
    • by cgenman ( 325138 ) on Monday June 07, 2004 @08:55PM (#9361956) Homepage
      what are the pros and cons of a raytracing GPU, compared to the polygon pushers we currently know and love.

      Raytracing is generally more expensive than traditional polygon-based graphics. You get more realistic curvature, far more realistic lighting (including incidental light, diffuse light, etc.), reflections, refractions / transparencies (such as those glass balls everyone loves), etc., etc., etc.

      When Pixar goes rendering, Pixar raytraces. When Cameron goes rendering, Cameron raytraces.

      The downside is that raytracing is a total resource hog. Essentially, for every pixel on the screen you trace the path of the light backwards, discovering every incidental surface and light source that might be affecting it along the way.

      Polygon algorithms put stuff immediately to the screen, only going so far as to cull the faces that aren't visible to the camera. This is a lot more efficient for today's graphics, and will remain so far into the future.

      And every time we get a step closer to using realtime raytracing, we get better polygon algorithms. First we had flat polygons, then we had colored vertices; now we texture a character based upon averages of the normals of the surrounding vertices, creating seamless skins. Originally we had no light, then baked-in fake lighting; now we have multiple light sources with multiple faked shadows on a baked environment. Glass and mirrors, once unheard of in a videogame, are now common. We even sample textures over a given area to try and get a more accurate per-pixel representation.

      So to answer your question, a raytracing GPU would have to be bloody powerful to do in realtime what you can do today with a polygon engine. Again, everyone thinks we'll get there someday, and there is no doubt in my mind that we will, but a realtime raytraced commercial game is such a distant possibility as to be a lifelong aspiration.

      • When Pixar goes rendering, Pixar raytraces. When Cameron goes rendering, Cameron raytraces.

        No, they don't. At least not when they can help it. Renderman didn't even have raytracing capabilities last time I looked, which admittedly was a while ago - for scenes that absolutely need it, "frankenrenders" using mental ray or BMRT were the order of the day.

        You can do a hell of a lot with reflection mapping and custom shaders in a lot less time.

        • by Anonymous Coward
          Pixar used raytracing for the first time in Nemo (sparingly), then released a version of Renderman with ray tracing this year.
        • No, they don't. At least not when they can help it. Renderman didn't even have raytracing capabilities last time I looked, which admittedly was a while ago

          From what I understand of Renderman (haven't used it personally), I believe it breaks your scene up into polygons of subpixel size and renders with those. A major attraction of it is its shader language, whose flexibility we are starting to see to some extent in hardware now.

      • by Anonymous Coward
        There's a prototype [uni-sb.de] of a ray tracing GPU. Small, but fast.
    • > So my question is, for those of us who don't know the first thing about 3D graphics, what are the pros and cons of a raytracing GPU compared to the polygon pushers we currently know and love?

      Raytracing requires a good deal more out of your hardware. They're running a twenty-node cluster and only getting 20 fps, and I bet they're not even doing some of the fancier tricks raytracing is capable of. So the downside is, raytracing is slower, a *lot* slower on the same hardware.

      The advantages are in image quality: accurate shadows, reflections, and refraction essentially come for free.
      • 20 years? In that time, if Moore's law holds, the average processor will be 2^(20/1.5) times more powerful, which is to say 10321x more powerful than today. Even if it doesn't hold true, I suspect we're more likely to see ray-tracing in 10 years. After all, 10 years ago we're talking 1994... we'd only just got Doom from id, and no one had dreamed yet of looking up in an FPS :)
        • > 20 years? In that time, if Moore's law holds, the average processor will be 2^(20/1.5) times more powerful, which is to say 10321x more powerful than today

          Yes, but meanwhile expectations (regarding resolution, object complexity, and so on and so forth) also keep going up as existing game technology develops. It is not enough to have the processing power to raytrace something that would look very impressive today.

          20 years could be longer than is required, but it's very hard to tell ahead of time how long it will take.
    • by j1ggl3x ( 701715 ) on Tuesday June 08, 2004 @03:20AM (#9363696)
      People have been giving good pros and cons to ray tracing, but I haven't seen a comment explain what it is. Now, it's been a while since my graphics course (so this may be terribly wrong), but from what I remember, raytracing works something like this:

      1. Imagine a ray shooting out of your eye through each pixel on your screen. The ray shoots into the 3D world, where it might hit an object. The purpose of the ray is to collect light information.

      2. If it hits an object, it will bounce off at a certain angle (depending on the object). After it bounces around a couple of times, it might eventually hit a light source, or you might set a limit on how many times it can bounce. Each time it bounces off an object, it might lose some intensity depending on the surface of the object.

      3. After all the bouncing, it has collected light information (depending on what it hit, the surfaces, the lighting), and that pixel now has more accurate light info for rendering.

      What this allows is much more realistic mirrors, reflections, lighting, shadows, etc., but as you can imagine, bouncing all those rays around takes lots and lots of computation. Radiosity was mentioned, and that's basically shooting millions of rays FROM the light sources first (instead of from the user's eye through each pixel). But again, lots of calculations.

      If you're wondering how current games look so good without raytracing, it's thanks to lots of clever hacks and approximations for lighting/shading. Raytracing is more of a brute-force, realistic method. Hope that helps someone... (a rough sketch of the steps above follows below).
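
      A rough Python sketch of steps 1-3 above, with the bounce limit and per-bounce intensity loss; scene.intersect and surface.direct_light are hypothetical helpers, not any particular engine's API.

      def trace(origin, direction, scene, lights, depth=0, max_bounces=3):
          # Follow one ray, bounce it off whatever it hits, and accumulate
          # light information until the bounce budget runs out.
          if depth > max_bounces:
              return (0.0, 0.0, 0.0)            # bounce limit reached
          hit = scene.intersect(origin, direction)
          if hit is None:
              return (0.0, 0.0, 0.0)            # ray escaped the scene
          point, normal, surface = hit
          # Light arriving directly from sources visible at the hit point.
          color = surface.direct_light(point, normal, lights)
          # Bounce: reflect the ray about the surface normal and recurse,
          # losing intensity according to the surface's reflectivity.
          dot = sum(d * n for d, n in zip(direction, normal))
          reflected = tuple(d - 2.0 * dot * n
                            for d, n in zip(direction, normal))
          bounced = trace(point, reflected, scene, lights,
                          depth + 1, max_bounces)
          k = surface.reflectivity              # 0..1, intensity lost per hop
          return tuple(c + k * b for c, b in zip(color, bounced))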
      • > Radiosity was mentioned, and that's basically shooting millions of rays FROM the light sources first (instead of from the user's eye through each pixel). But again, lots of calculations.

        Actually, that would be light tracing, which is rather different from radiosity. In radiosity, the scene is divided into small surfaces (patches) of approximately the same size, and then the lighting distribution from diffuse interreflection is calculated. This does not have anything to do with ray tracing (ok, you can do it stochastically).
  • Radiosity! (Score:2, Informative)

    by BRSloth ( 578824 )
    Man, I can't believe they didn't use radiosity [uni-dortmund.de] to render those images. Yes, I know it takes a lot more CPU power, but I would surely steal other people's computers just to play Quake that way :)
    • Re:Radiosity! (Score:5, Informative)

      by j1m+5n0w ( 749199 ) on Monday June 07, 2004 @09:11PM (#9362040) Homepage Journal

      Radiosity would dramatically increase the computational complexity.

      Polygonal rendering: O(N), where N = number of triangles
      Ray tracing: O(log N), where N = number of objects (assuming a good bounding volume hierarchy)
      Photon mapping: O(P log max(P, N)), where P = number of photons, which generally must be inserted into a kd-tree, and N = number of objects
      Radiosity: O(N^2), where N = number of triangles

      Ray tracing could conceivably make a game faster, if the scenes are complicated enough. Radiosity, on the other hand, is very very slow. Photon mapping [ucsd.edu] might be a better choice - it traces rays from the light source, and stores photons at the object intersection points, which are then used by the ray tracing step to approximate global illumination.

      -jim

      • It's been a while since I've had anything to do with graphics and rendering, etc., so this might be a dumb thing to say...

        I was under the impression that ray tracing and radiosity weren't exclusive techniques, and using both produced excellent results.

        i.e. ray tracing is excellent for reflections and refractions but tends to look too harsh when dealing with soft shadows and ambient light, which is where radiosity works well.

        Corrections welcome :)
        • I was under the impression that ray tracing and radiosity weren't exclusive techniques, and using both produced excellent results.

          Yes, radiosity and photon mapping are both usually used to calculate indirect lighting. Usually they're used in conjunction with ray tracing or polygonal rendering.

          -jim

      • Polygon rendering is also O(log N) if you give it the same affordances for visibility preprocessing that you're apparently providing for raytracing, or if you use a hybrid approach like raycast visibility for polygon rendering.

        IMO, the best approach would be a hybrid of polygon+raytracing; use polygon rendering with shaders (and subdivision surfaces or some other form of arbitrary-precision surface tessellation) for most things, and then raytracing for reflections/refractions. There are a lot of things which pixel

      • You use radiosity to compute global illumination levels ahead of time (as opposed to using ambience). Then you can use raytracing in realtime to capture dynamic effects.
        • [you use radiosity] to compute global illumination levels ahead of time (as opposed to using ambience). Then you can use raytracing in realtime to capture dynamic effects.

          Where's the fun in that? I'd like to see dynamic global illumination. (It might take a while before computers are fast enough and people figure out the right algorithms, though.)

          Photon mapping can be precomputed as well, and can simulate a number of effects that can't be simulated with radiosity (subsurface scattering, caustics off of reflective surfaces, and so on).

      • Re:Radiosity! (Score:1, Interesting)

        by Anonymous Coward
        Let's make it somewhat more accurate by introducing another variable S, the screen size in pixels.

        Polygonal: O(max(S, N))
        Ray tracing: O(S log N)
        Photons: O(max(S, P) log max(P, N))
        Radiosity: O(max(S, N^2))

        Note that if N ~= S, polygons keep linear behaviour while ray tracing becomes linearithmic.
  • by complete loony ( 663508 ) <(moc.liamg) (ta) (namekaL.ymereJ)> on Monday June 07, 2004 @07:26PM (#9361472)
    Great, so now I have to lug 20 PCs to a LAN party to get a decent frame rate?
    And I was just thinking about my next upgrade for HL2/Doom3.
    Imagine a cluster... oh wait... um, so is it running Linux? And where is the source code?
  • Soon... (Score:4, Insightful)

    by eyeball ( 17206 ) on Monday June 07, 2004 @07:28PM (#9361482) Journal
    According to Moore's law, we should get this power in our desktops about four and a half years from now.
    • Re:Soon... (Score:3, Informative)

      by 0x0d0a ( 568518 )
      No.

      20 computers running.

      You have one computer.

      log(20)/log(2) * 1.5 yrs = ~6.5 yrs

      A bit longer, if you want a full 75 fps (or 60, if we're all using LCDs in 7 years).

      Plus, technically Moore's law relates to transistor count, not processing power.

      I'm interested in when we can do this [irtc.org] in a game in real-time. l(2hrs*3600secs/hr*60fps)/l(2) * 1.5yrs = 28 years before we see this in real-time (though that's using Pov-Ray, which could probably be sped up a lot if it's made into a game engine rather than a general purpose graphics architecture).
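
      For anyone checking the arithmetic (the l() above is bc's natural log, so l(x)/l(2) is log base 2), a quick Python version of both estimates:

      import math

      # Doubling period assumed to be 1.5 years, as in the posts above.
      catch_up_cluster = math.log2(20) * 1.5     # one box catching 20 boxes
      frames = 2 * 3600 * 60                     # a 2-hour render at 60 fps
      catch_up_render = math.log2(frames) * 1.5  # one frame in 1/60 s
      print(round(catch_up_cluster, 1))          # 6.5 (years)
      print(round(catch_up_render, 1))           # 28.1 (years)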
      • I'm interested in when we can do this in a game in real-time. l(2hrs*3600secs/hr*60fps)/l(2) * 1.5yrs = 28 years before we see this in real-time (though that's using Pov-Ray, which could probably be sped up a lot if it's made into a game engine rather than a general purpose graphics architecture.
        That didn't use Pov-Ray, it used 3D Studio Max.
      • 20 computers running.

        You have one computer.

        log(20)/log(2) * 1.5 yrs = ~6.5 yrs


        Yeah, but those 20 computers are only 1.8 GHz each. Desktops can currently go twice that fast, so take off 1.5 years or so from that total, assuming you go with the best commercially available desktop technology.

        Rob
      • Keep in mind, this is all just rendering in real-time. If you want the reactions to feel right, you've got to be modeling the physical interactions as well.

        Doom and Quake aren't bad, but they still leave a lot to be desired.

        And don't get me started on how corny the sounds are...

        • Keep in mind, this is all just rendering in real-time. If you want the reactions to feel right, you've got to be modeling the physical interactions as well.

          [clip]

          And don't get me started on how corny the sounds are...


          Aureal tried commercializing "wave-tracing" at the consumer level via A3D -- actually tracing out the paths of sound waves.

          Unfortunately, they got hijacked by the incumbent (Creative, who didn't have a lot of incentive to get involved in a research war) who introduced a small reverb model
          • wave-tracing's a nice start, but it's still just recorded clips.

            I can't find it now, but I've seen at least one grad school project on generating a new sound from an interaction, based on things like the interacting materials and the speed and location of the impact. It was darn cool stuff.

            -Zipwow
  • by superultra ( 670002 ) on Monday June 07, 2004 @07:50PM (#9361615) Homepage
    Dammit, don't give Carmack any ideas! I'll barely be able to run Doom 3!
  • freecache link to the 60MB movie posted to /. frontpage [freecache.org]. sheesh.

    pulling down at about 200kb/s

  • Wow. (Score:1, Funny)

    by ikkonoishi ( 674762 )
    Imagine what you could do with a beowulf cluster of these things... oh wait... nevermind.

    It seems my question is already answered.
    • Re:Wow. (Score:1, Informative)

      by Anonymous Coward
      That isn't redundant.

      What is featured in the article IS a beowulf cluster.
  • Looks worse to me (Score:3, Interesting)

    by Jerf ( 17166 ) on Monday June 07, 2004 @08:31PM (#9361823) Journal
    Is it just me, or did that look much worse than standard Q3?

    Q3 isn't designed, let alone optimized, for raytracing, so that's not a major surprise, but I still expected an improvement, not a downgrade.

    I think a custom demo is called for.

    The tech sure is hella cool, though.
    • Looks much jerkier as well. In the AVI, the gun seems to jump around way too much.
      • by schotty ( 519567 )
        Yeah, it should --- it's running at 20fps. Anything below 60ish and you see jitter and skippiness. Hell, I am pretty bad at noticing crap like that; my bottom is around 40fps. OTOH, I have a bud whose eyes are so finicky, anything less than 75-80 and he can see it.
    • "Is it just me, or did that look much worse than standard Q3?"

      Yes. The reason is that they turned off one of the texturing passes. Q3 uses an extra drawing pass with B&W textures to make 'shadows' on the walls. Why'd they do this? I imagine because of the lighting effects in the game. If a flash goes off, you don't want the shadowed area staying dark: just don't draw that shadow texture, and you get a nice brightly lit wall caused by a rocket. (I apologize if that doesn't make sense; for a clearer explanation, look up how Quake 3's lightmaps work.)
  • That video is one of the most impressive things I've seen (as related to PC video/games). The scene that looks like a night club is astounding, and the scenes with the mirrors are extremely cool.

    I can't wait to see this technology in production.

  • ed2k link (Score:3, Informative)

    by 0x0d0a ( 568518 ) on Monday June 07, 2004 @09:30PM (#9362127) Journal
    ed2k link [google.com] (It sucks that Slashcode is broken WRT ed2k direct links.)
  • They're hosting screengrabs and a big video file, and the site's still holding up? Well now we all know what they did with all those machines they used for the demo!
  • And some people [gamezero.com] 10 years ago thought that ray traced games were going to be on LAST generation CONSOLES.
  • Slow already? (Score:2, Informative)

    by sean1121 ( 614907 )
    Here's a mirror of the movie:
    http://mirror.openbarr.com/20040509_egoshooters_q3rt.avi [openbarr.com]
  • ... can be found here [gamestar.de].
  • I guess I'll start saving for that Alienware system after all...
  • Missing some things (Score:4, Interesting)

    by The Moving Shadow ( 603653 ) on Tuesday June 08, 2004 @03:45AM (#9363765)
    Okay, I know raytracing provides far more realistic visual representations of a 3D modelled scene than scanline polygon rendering does. But - and here comes the but - I miss a lot of things in this raytraced Quake movie. All the shadows are really, really crisp; one would expect that when light bounces off walls and objects a few times, its reflected light would soften those crisp shadows, i.e. it would result in soft, gradual shadows.

    I guess they limited the path of the ray they calculated, so it bounced only two or three times off an object before they stopped calculating it. (If they had stopped after one pass, you wouldn't have seen those reflective glass balls like you did, which need multiple passes to look like they do.)

    I also miss colour bleeding on the surfaces. E.g. when you have - let's say - a white surface next to a red surface, some of the red will bleed onto the white, because light coming from the red surface will fall on the white surface and light it in a red hue. You would have seen this with a proper raytracing engine, where the light bounces multiple times off an object and where the colour of the light is affected by the colour of the object.

    I think those are the main reasons why the video doesn't look as realistic as I hoped. (Then again, how realistic is walking through a building decorated with gruesome wallpaper taken from a horror movie, where gigantic brains on mechanical spider legs walk around... ;) )
    • Neither of those effects (soft shadows, color bleeding, etc.) is provided by a straight raytracer.

      When a ray hits a surface, it only spawns a reflection ray and rays to any (point) light sources. Therefore shadows will have sharp edges: either a particular light is visible from a particular point or it is not.
      Moreover, the contribution of non-light sources to a surface's illumination is not modeled (light from OTHER surfaces).
      To accomplish this, one must cast a large number of trial rays in a pattern around the reflection direction (a sketch of the shadow version follows below).
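
      For the shadows specifically, a hedged Python sketch of that idea: jitter many shadow rays toward points spread around an area light and average the visibility, which is what softens the edge. scene.occluded is a hypothetical segment-visibility test.

      import random

      def shadow_fraction(hit_point, light_center, light_radius,
                          scene, samples=16):
          # Fire shadow rays at randomly jittered points near the light and
          # return the visible fraction: 1.0 is fully lit, 0.0 is fully in
          # shadow, and anything in between is the soft penumbra.
          visible = 0
          for _ in range(samples):
              target = tuple(c + random.uniform(-light_radius, light_radius)
                             for c in light_center)
              if not scene.occluded(hit_point, target):
                  visible += 1
          return visible / samples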
  • This looks worse than the original engine. I've played Q3 since it came out, and those screenshots are crappy compared to what the game really looks like... So big deal, someone used tons of processors and some good coding to create graphics that look worse than the original, for a few thousand times the CPU and $$ cost! Sometimes I wonder why people get excited over this stuff.
  • As no one else has mentioned it, I'll be the first to point out that if you want to try raytracing for free on just about any platform, you'll want POV-Ray [povray.org].
  • Feel free to mod me as redundant, but I'm sick of all the incorrect assumptions regarding rendering.

    Raytracing overview (a simple implementation):

    Raytracing involves casting rays from the camera's eye through each pixel on the screen. For each object in the scene (assuming no partitioning optimizations) we calculate whether the ray intersects the object. If it does, we determine the distance from the 'screen' to that object (consider it the Z depth) and save it for later. Once we have checked every object, the nearest intersection tells us which surface is visible through that pixel.
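
    A minimal Python sketch of that loop, keeping the nearest "Z depth" hit; obj.intersect is a hypothetical helper returning a hit distance or None.

    def nearest_hit(origin, direction, objects):
        # Test the ray against every object (no partitioning) and keep the
        # smallest positive distance; whatever owns it is the surface
        # visible through this pixel.
        best_t, best_obj = float("inf"), None
        for obj in objects:
            t = obj.intersect(origin, direction)
            if t is not None and 1e-6 < t < best_t:
                best_t, best_obj = t, obj
        return best_obj, best_t        # (None, inf) means the ray missed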

"Conversion, fastidious Goddess, loves blood better than brick, and feasts most subtly on the human will." -- Virginia Woolf, "Mrs. Dalloway"

Working...