How Quake Wars Met the Ray Tracer
An anonymous reader writes "Intel released the article 'Quake Wars Gets Ray Traced' (PDF) which details the development efforts of the research team that applied a real-time ray tracer to Enemy Territory: Quake Wars. It describes the benefits and challenges of transparency textures with this rendering technology. Further insight is given into what special effects
are most costly. Examples of glass and a 3D water implementation are shown. The outlook points toward freely programmable many-core processors, like Intel's upcoming Larrabee, that might be able to handle such a workload." We mentioned the ray-traced Quake Wars last in June; the PDF here delves into the implementation details, rather than just showing a demo, and explains which parts of the game give the most difficulty in going from rasterization to ray tracing.
Never ending chase... (Score:5, Insightful)
Yet another ray tracing article, and yet again all the same problems as before. Doing yesterday's games in ray tracing is all nifty, but also kind of pointless. For one, we already played them; but more importantly, it doesn't actually use the strengths of ray tracing. Rendering a tree built out of texture quads is a nice accomplishment, but wasn't the whole point of ray tracing that one can have a million polygons and no longer need such hacks? So show me a realistic tree instead of trying to replicate the limitations of rasterization.
I am still waiting for a game/demo that actually is built from the ground up with ray tracing in mind, and by that I mean one that actually looks good. A few shiny spheres might have been impressive back on the Amiga some 20 years ago, but not any more.
Re:before after pictures (Score:3, Insightful)
It's got more obvious special effects, but the other one looks far more realistic.
Re:Never ending chase... (Score:5, Insightful)
You aren't going to see that kind of thing in a game, for many reasons that boil down to this: ray tracing isn't ready for realtime. Making a game that used ray tracing would pretty much doom it to failure.
One problem you have is that the graphics hardware out there isn't built for ray tracing; it's built for rasterization. Now, while I'm sure you can write your own ray tracer on the newer hardware that does GPGPU stuff, I'm also sure it wouldn't run as well. The reason is that current graphics cards are purpose-built rasterizers, designed to do that as fast as possible. So you are left with writing your own ray tracing engine in software, either on the CPU or GPU. This is not going to be fast, especially since ray tracing is fairly computationally intensive.
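To make "a ray tracing engine in software" concrete, here is a toy sketch in Python. The geometry and names are purely illustrative (none of this is from the article): it casts one primary ray per pixel of a tiny 4x4 "image" and tests each ray against a single sphere.

```python
import math

def cast_ray(origin, direction, center, radius):
    """Return distance to the nearest ray-sphere hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t >= 0,
    assuming direction is normalized.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0     # nearer of the two roots
    return t if t >= 0 else None

WIDTH = HEIGHT = 4
sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.5

image = []
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel center to a point on a view plane at z = -1.
        px = (x + 0.5) / WIDTH * 2.0 - 1.0
        py = 1.0 - (y + 0.5) / HEIGHT * 2.0
        length = math.sqrt(px * px + py * py + 1.0)
        direction = (px / length, py / length, -1.0 / length)
        hit = cast_ray((0.0, 0.0, 0.0), direction,
                       sphere_center, sphere_radius)
        row += "#" if hit is not None else "."
    image.append(row)

print("\n".join(image))
# ....
# .##.
# .##.
# ....
```

Even this trivial case does a square root and a handful of multiplies per pixel for a single object; a real scene multiplies that by acceleration-structure traversal, shadow rays, and secondary bounces, which is why doing it in software is so slow.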
Well, then you hit the next problem: pixels. Ray tracers do NOT scale well with resolution. Each pixel has to have its own ray cast, and if you want anti-aliasing, you have to cast even more rays. This is why ray tracing demos tend toward low resolutions: the fewer pixels you have to do, the faster it is. That doesn't compare favorably with rasterizers, which scale extremely well with resolution, and also in terms of anti-aliasing. Many of them can do 4xFSAA with next to no performance penalty, and can do it at full HD resolutions. Not the case with your ray tracer: if it can render 40 FPS at 1920x1200 with no AA, it'll be just 10 FPS with 4x AA, since it now has to do 4 rays per pixel.
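The arithmetic behind that scaling claim is easy to check (the 40 FPS figure is the comment's hypothetical, not a measurement):

```python
def rays_per_frame(width, height, samples_per_pixel=1):
    """One primary ray per pixel, times the anti-aliasing sample count."""
    return width * height * samples_per_pixel

no_aa = rays_per_frame(1920, 1200)        # 2,304,000 rays per frame
with_4x = rays_per_frame(1920, 1200, 4)   # 9,216,000 rays per frame

# If cost is proportional to ray count, 4x AA quarters the frame rate.
fps_no_aa = 40
fps_4x = fps_no_aa * no_aa / with_4x
print(fps_4x)  # 10.0
```

A rasterizer avoids this because multisampling shares one shading computation across the extra coverage samples, while a naive ray tracer pays full price for every additional ray.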
So you aren't going to see it happen any time soon. The net result wouldn't look as good as the equivalent rasterized game. It won't be the sort of thing you see until either there starts to be purpose-built ray tracing hardware (GPUs may start to be made for both) or general-purpose processors are so fast it makes no real difference.
Intel is all up on this because they see GPUs as a threat to their computation market. However, as this demonstrates, there really isn't an advantage at this time. You throw a positively massive system at it and you get poor performance. Even if you redid the game so it used extremely high geometry, nobody would give a shit. It would run way too slow on any normal computer.
Re:Bandwidth & processing, quantum effects? (Score:2, Insightful)
Reality, by definition, is "dirty". We have dust, we have imperfections in every surface, no matter how carefully machined. Houses are never truly square, roads are never perfectly level, and points in a corner are always rounded. Always.
Computers, by definition, are "clean". Squares are always truly square, roads are as perfectly level as they were designed to be, and corners are always razor sharp, no matter how much you "zoom in".
The problem with modern graphics systems is they are computed to extreme levels of precision. If they incorporated a sort of fundamental randomness, if they were intrinsically uncertain, they just might be able to really approximate reality, which is messy, ugly, and imperfect.
You seem to be confusing texture irregularity with material consistency. A house wall is not perfectly "razor sharp", but no matter how many times you look at it, it does not suffer from "randomness", nor is it in any way "uncertain" unless you are looking at the sub-atomic level. Also, the bandwidth would not be that high if you take into account that human eyes have fairly limited resolution, so an extreme amount of detail at a distance would be pretty much irrelevant.
Was it more fun? (Score:4, Insightful)
So was the ray-traced version of the game more fun? Or am I missing the point of games?
Re:Never ending chase... (Score:3, Insightful)
I know I use my share of the foul words in the English language, but think about this: everyone would take your comment more seriously if you didn't use them, at least not to the excess seen in your post.
Re:Was it more fun? (Score:3, Insightful)
In this case, yeah, you were missing the point. It wasn't meant to make the game more fun; it was meant to show that ray tracing was possible on a fairly modern game. It's like modding a toaster to run BSD, or adding a laser turret to your mailbox: a substantial reason for doing it is to see if it's possible.
Mod parent up insightful... (Score:4, Insightful)
Intel isn't really trying to do ray tracing. Their point is to find a way to make GPUs unnecessary, since they are a threat to the CPU market.
They can call it "ray tracing extensions" to the i7 or i8 CPU. It's not like the x86/x86_64 instruction sets are some kind of blushing virgin whose precious architectural purity would be violated by adding instructions like "RT_LOAD_MESH" and "RT_LOAD_SHADER"...
What bothers me is how nVidia is missing the boat.