Wolfenstein Gets Ray Traced
An anonymous reader writes "After showcasing Quake Wars: Ray Traced a few years ago, Intel is now showing their latest graphics research project using Wolfenstein game content. The new and cool special effects are actually displayed on a laptop using a cloud-based gaming approach with servers that have an Intel Knights Ferry card (many-core) inside. Their blog post has a video and screenshots."
I don't get it (Score:4, Insightful)
Why build a ray tracer into a fourth game after doing it for Q3, Q4, and ET:QW? Why not focus on improving the already-existing ray-tracing code in the first three games?
I don't know, but it seems like they're keeping themselves busy for the sake of looking busy.
Nothing to see here (Score:5, Insightful)
Sign of the times... (Score:5, Insightful)
What's the point? (Score:2, Insightful)
I know they just started, but still... what is the point of this? There are no upsides to this kind of rendering. It's slower (you need four servers), and it looks worse (they had no antialiasing, ugly smoke, no complex lighting). You can do some things like reflections, refractions, and portals a bit more easily than with other methods, but most of the time you don't need 100% correct reflections/refractions (simplified models work quite nicely), and security cameras were implemented in Duke Nukem 3D on i486 machines without problems.
Other than selling Intel chips I see no purpose for this project.
Re:What's the point? (Score:4, Insightful)
Cloud gaming and latency (Score:3, Insightful)
The very idea of using the cloud to render a FPS is preposterous and will never work in practice, for obvious latency reasons.
Re:Cloud gaming and latency (Score:3, Insightful)
The very idea of using the cloud to render a FPS is preposterous and will never work in practice, for obvious latency reasons.
How else will you start training for the moment when that computing capacity is on every PC?
You use the cloud, ignore the lag, and build an engine ready for the generation of computers that will arrive in five or ten years. You'll lose a lot of your research investment, but anyone who starts studying RT at that point will be years behind you.
Re:Poor ray tracing (Score:3, Insightful)
The worst example of 3D I've seen so far would be the "shadows on a mirror" trick - nice.
Re:Cloud gaming and latency (Score:3, Insightful)
Do you have anything to back up that "runaway success" claim? As far as I can tell it's been shunned by hardcore gamers due to >100 ms input lag, and I've not seen anything about it having huge uptake.
Ahh Youth (Score:5, Insightful)
"The surveillance station. At a wall in the game you see twelve screens that each show a different location of the level. This can be used by the player to get a tactical gaming advantage. Have you ever seen something similar in a current game? Again - probably not"
Yes, in Duke Nukem 3D... over 15 years ago. And again in about 40 other FPS games that followed, including the Unreal series and more than a few Quake maps, especially capture-and-control maps.
"There is nothing more amusing to watch than some young kid discovering something old and thinking it is new" - that quote in action.
Re:Moore's Law is NOT a tool (Score:2, Insightful)
Re:So many (Score:3, Insightful)
Rasterizers scale as O(triangles) while raytracers scale as O(pixels * log triangles). I don't remember if it was Microsoft Research or something out of Intel, but five or so years ago they did some scalability testing and concluded that about 1 million polygons was the sweet spot where raytracing and rasterization were about equal in efficiency, using the per-iteration constants derived in their testing.
This was based on visible geometry only, so don't pretend that rasterizers being able to use logarithmic data structures for hidden-surface removal makes any bit of difference here.
Since then, triangle counts have remained about the same in games (with more per-pixel processing being done to simulate more geometry), but the number of pixels has quadrupled as higher- and higher-resolution displays have become common. Yet we are reaching the limits of the fakes that can be done with shaders, and resolution is probably not going to go through another quadrupling, so raytracing really is coming... just not quite yet.
When the polygon counts do get high enough, there will be no looking back. Raytracing will be here to stay after that because of the way it scales. At 1 million polygons, a raytracer spends about 20 iterations per ray cast using a logarithmic structure; doubling the number of polygons to 2 million only adds 1 more iteration, or about 5% more processing power required, and doubling again only adds another ~4.5%, and so on. Meanwhile each doubling of polygons on the rasterizer literally doubles the processing power required.
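The scaling argument above can be sketched as a toy cost model. This takes the commenter's O(pixels * log triangles) vs O(triangles) figures at face value; the function names and constants are illustrative only, not from any real renderer:

```python
import math

def raytrace_cost(pixels, triangles):
    """Toy ray-tracer cost: one log-depth acceleration-structure
    traversal per pixel, i.e. O(pixels * log triangles)."""
    return pixels * math.log2(triangles)

def raster_cost(triangles):
    """Toy rasterizer cost: every triangle is processed once,
    i.e. O(triangles)."""
    return triangles

# At 1M triangles, each ray does ~log2(1e6) ~= 20 traversal steps.
steps_1m = math.log2(1_000_000)   # ~19.93
steps_2m = math.log2(2_000_000)   # exactly one more step, ~20.93

# Doubling the triangle count adds only ~5% to the ray-tracing cost...
rt_growth = steps_2m / steps_1m                                   # ~1.05
# ...but doubles the rasterizer's cost.
raster_growth = raster_cost(2_000_000) / raster_cost(1_000_000)   # 2.0
```

Each further doubling adds just one more traversal step per ray, which is why the relative cost of raytracing shrinks as scenes grow.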
Re:Cloud gaming and latency (Score:3, Insightful)
I don't know if latency is any sort of problem here. You're talking about a LAN connection. This technology is not meant to render stuff somewhere out there on the intertubes; it needs to be in the same building, or on the same campus.