
Wolfenstein Gets Ray Traced

An anonymous reader writes "After showcasing Quake Wars: Ray Traced a few years ago, Intel is now showing their latest graphics research project using Wolfenstein game content. The new and cool special effects are actually displayed on a laptop using a cloud-based gaming approach with servers that have an Intel Knights Ferry card (many-core) inside. Their blog post has a video and screenshots."
  • I don't get it (Score:4, Insightful)

    by Yuioup ( 452151 ) on Tuesday September 14, 2010 @03:47AM (#33570568)

    Why build a ray tracer into a 4th game after doing it for Q3, Q4 and ET:QW? Why don't they focus on improving the already existing raytracing code in the first 3 games?

    I dunno, but it seems like they're keeping themselves busy for the sake of looking busy.


  • by dsavi ( 1540343 ) on Tuesday September 14, 2010 @03:57AM (#33570602) Homepage
    It's rendered in the cloud. If they managed to actually get more bang for the buck, i.e. made this run on conventional hardware, then I'd be interested. They're just doing something that has been done before, albeit maybe not in real time (but you never know, seeing these new OpenCL apps), running it on high-end servers, and piping it into a small laptop. I'm not sure how much of an achievement this is; we've all heard of gaming in the cloud before.
  • by rh2600 ( 530311 ) on Tuesday September 14, 2010 @04:13AM (#33570694) Homepage
    When a laptop packing a multi-GHz 64-bit CPU with gigs of RAM gets called a thin client...
  • What's the point? (Score:2, Insightful)

    by pacinpm ( 631330 ) <.pacinpm. .at. .gmail.com.> on Tuesday September 14, 2010 @04:25AM (#33570732)

    I know they just started, but still... what is the point of this? There are no upsides to this rendering. It's slower (you need 4 servers) and it looks worse (they had no antialiasing, ugly smoke, no complex lighting). You can do some things like reflections, refractions and portals a bit more easily than with other methods, but most of the time you don't need 100% correct reflections/refractions (simplified models work quite nicely), and security cameras were implemented in Duke Nukem 3D on i486 machines without problems.

    Other than selling Intel chips I see no purpose for this project.

  • by retroStick ( 1040570 ) on Tuesday September 14, 2010 @04:40AM (#33570790)
    As someone who has dabbled with raytracing before, I would have to agree. It's an interesting tech demo of something that's possible, but not really of practical use. For instance, they showed the chandelier with a million polys - that's all well and good, but it's on the ceiling! If the game was actually being played, the player would never get close enough to see those clever refractions. (And even if they did, the demo shows the frame rate would drop to around 17-20 FPS).
  • by loufoque ( 1400831 ) on Tuesday September 14, 2010 @04:44AM (#33570808)

    The very idea of using the cloud to render a FPS is preposterous and will never work in practice, for obvious latency reasons.

  • by Thanshin ( 1188877 ) on Tuesday September 14, 2010 @04:53AM (#33570844)

    The very idea of using the cloud to render a FPS is preposterous and will never work in practice, for obvious latency reasons.

    How else will you start training for the moment when that computing capacity is on every PC?

    You use the cloud, ignore the lag, and build an engine ready for the generation of computers that will come in five or ten years. You'll lose a lot of your investment, but anyone who starts studying RT at that point will be years behind you.

  • by ciderbrew ( 1860166 ) on Tuesday September 14, 2010 @05:36AM (#33571032)
    This sounds like a John Lasseter talk I saw ages ago. Those guys are scientists, not 3D artists. They can't see why it's wrong; it's job done when the maths works. I've no idea why they don't hire in a guy; most of these problems have been identified and fixed in the pre-rendered market years ago. Maybe extra lights kill the frame rate too much.

    The worst example of 3D I've seen so far would be the "shadows on a mirror" trick - nice.
  • by Fross ( 83754 ) on Tuesday September 14, 2010 @06:33AM (#33571342)

    Do you have anything to back up that "runaway success" claim? As far as I can tell it's been shunned by hardcore gamers due to >100ms input lag, and I've not seen anything about it having huge uptake.

  • Ahh Youth (Score:5, Insightful)

    by kenp2002 ( 545495 ) on Tuesday September 14, 2010 @07:06AM (#33571532) Homepage Journal

    "The surveillance station. At a wall in the game you see twelve screens that each show a different location of the level. This can be used by the player to get a tactical gaming advantage. Have you ever seen something similar in a current game? Again - probably not"

    Yes: in Duke Nukem 3D... over 15 years ago. And again in about 40 other FPS games that followed, including the Unreal series and more than a few Quake maps, especially in capture-and-control maps.

    "There is nothing more amusing to watch than some young kid discover something old and think it is new" - that quote in action.

  • by ksandom ( 718283 ) on Tuesday September 14, 2010 @07:26AM (#33571658) Homepage
    Moore's Law has become an expectation, and thus a design method from a marketing point of view. This is particularly visible in hard disks: they release a hard disk that has been designed to scale up but only contains a single platter, then a little over a year and a half later the same hard disk is released with a second platter. The expectation allows them to get ahead, while the previous iteration is slowly allowed to reach its full potential. Then they work on the next thing while the current platform grows.
  • Re:So many (Score:3, Insightful)

    by Rockoon ( 1252108 ) on Tuesday September 14, 2010 @07:38AM (#33571752)
    This is the true advantage of raytracing. A rasterizer would have to deal with each and every triangle in that chandelier.

    Rasterizers scale on O(triangles) while raytracers scale on O(pixels * log triangles). I don't remember if it was Microsoft Research or something out of Intel, but 5 or so years ago they did some scalability testing and concluded that about 1 million polygons was the sweet spot where raytracing and rasterization were about equally efficient, using the per-iteration constants derived in their testing.

    This was based on visible geometry only, so don't pretend that the fact that rasterizers can use logarithmic data structures for hidden surface removal makes any bit of difference.

    Since then, triangle counts have remained about the same in games (with more per-pixel processing being done to simulate more geometry), but the number of pixels has quadrupled as higher and higher resolution displays have become common. Yet they are reaching the limits of the fakes that can be done with shaders, and resolution is probably not going to go through another quadrupling, so raytracing really is coming... just not quite yet.

    When the polygon counts do get high enough, there will be no looking back. Raytracing will be here to stay after that because of the way it scales. At 1 million polygons, a raytracer spends 20 iterations per ray cast using a logarithmic structure; doubling the number of polygons to 2 million only adds 1 more iteration, or about 5% more processing power required, and doubling again only adds another ~4.5%, and so on. Meanwhile, each doubling of polygons on the rasterizer literally doubles the processing power required.
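The parent's scaling argument can be sketched in a few lines. This is a toy cost model only: the constant factors and the 720p resolution are illustrative assumptions, not numbers from any benchmark.

```python
import math

def rasterizer_cost(triangles, c=1.0):
    # A rasterizer touches every visible triangle: O(triangles).
    return c * triangles

def raytracer_cost(pixels, triangles, c=1.0):
    # Each ray walks a logarithmic acceleration structure (e.g. a BVH):
    # O(pixels * log2(triangles)).
    return c * pixels * math.log2(triangles)

pixels = 1280 * 720  # assumed display resolution
base = raytracer_cost(pixels, 1_000_000)
for tris in (1_000_000, 2_000_000, 4_000_000):
    print(f"{tris:>9} tris: rasterizer x{tris / 1_000_000:.1f}, "
          f"raytracer x{raytracer_cost(pixels, tris) / base:.3f}")
```

Doubling from 1M to 2M triangles doubles the rasterizer's work, but grows the raytracer's by only about 5% (log2 goes from ~19.9 to ~20.9 traversal steps), which is exactly the crossover effect the parent describes.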
  • by tibit ( 1762298 ) on Tuesday September 14, 2010 @08:51AM (#33572362)

    I don't know if latency is any sort of a problem here. You're talking about a LAN connection. This technology is not meant to render stuff somewhere out there on the intertubes. It needs to be in the same building, or on the same campus.
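A back-of-envelope latency budget makes the parent's point concrete. Every number below is an assumption for illustration, not a measurement of Intel's setup.

```python
FRAME_MS = 1000 / 60     # ~16.7 ms budget per frame at 60 fps
RENDER_MS = 10.0         # hypothetical server-side render time

def input_to_photon_ms(rtt_ms, render_ms=RENDER_MS):
    # One round trip (input upstream, encoded frame downstream) plus the
    # render; video encode/decode time is ignored for simplicity.
    return rtt_ms + render_ms

lan = input_to_photon_ms(1.0)    # same-building round trip, assumed ~1 ms
wan = input_to_photon_ms(60.0)   # cross-internet round trip, assumed ~60 ms
print(f"LAN: {lan:.1f} ms ({lan / FRAME_MS:.2f} frames)")
print(f"WAN: {wan:.1f} ms ({wan / FRAME_MS:.2f} frames)")
```

On the assumed LAN numbers the round trip plus render fits inside a single frame; over the internet the same pipeline adds several frames of input lag, consistent with the >100ms complaints about cloud gaming elsewhere in this thread.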

You can measure a programmer's perspective by noting his attitude on the continuing viability of FORTRAN. -- Alan Perlis