
With 8 Cards, Wolfenstein Ray Traced 7.7x Faster

An anonymous reader writes "As Slashdot readers know, Intel's research project on ray tracing for games has recently been shown running at 1080p, using eight Knights Ferry cards built on Intel's 'Many Integrated Core' architecture. Now a white paper goes into more detail, documenting near-linear scaling for the 8-card cloud setup and giving details on the implementation of 'smart anti-aliasing.' It will be interesting to see how many instances of Intel's next MIC iteration — dubbed Knights Corner, with 50+ cores — will be required for the same workload."
  • The scaling with 8 cards is very near linear. These cards are going to be great if we ever see them at retail.
  • and yet (Score:3, Insightful)

    by Osgeld ( 1900440 ) on Saturday March 03, 2012 @04:46AM (#39229899)

    it honestly looks like an old game to me. Yes, there are some impressive features, but I really have to look for them in the images, something that is not going to happen at 60Hz (and if it's not running at real speed, who cares? That's a movie, which can take its sweet ass time rendering frame by frame)

    • Wolfenstein is an old game. The ray-traced version is being used as a "Utah Teapot": a standard object used to develop and compare rendering techniques.

    • Re:and yet (Score:5, Insightful)

      by rhyder128k ( 1051042 ) on Saturday March 03, 2012 @06:25AM (#39230071) Homepage

      Welcome to the current generation of ray-traced game engine demos. They look poor, apart from the metal sphere floating above the water and the odd glass refraction effect.

      You'd think that they could hack something impressive-looking together, particularly as they are competing against whatever the next generation of polygon rendering comes up with.

      • I'm sorry Michael, but you have a poor understanding of rendering technology. They are showing off technology to potential partners and customers; they are not proposing a consumer-facing engine built on it. It won't look pretty until it is developed into an end product.

        What is important is that these rays are being cast in a relatively efficient manner, allowing for realtime feedback. An engineer doesn't care specifically what those rays may be used for, just that they
        • There's another aspect of ray tracing that many don't even get: the military applications, such as being able to efficiently calculate the origin of a shot (backtrace). Think about things like the final action in The Last Starfighter and you suddenly realize that Intel isn't working on this for gamers but for the military. There's lots more money to be made when you consider all of the CIC systems that would benefit from the ability to backtrace incoming fire and take it out with the appropriate weapon (StarS

          • You've been drinking too much. Or too little, I forget how it is with you. Anyway, you haven't been drinking the exact right amount.

      • Re:and yet (Score:5, Insightful)

        by JoeMerchant ( 803320 ) on Saturday March 03, 2012 @10:17AM (#39230759)

        Rent the Pixar Shorts DVD and watch what they did before Toy Story 1. Red's Dream, Tin Toy and The Adventures of André & Wally B. are all pretty crappy-looking short films that demonstrate what a couple of guys in a lab could do at the time. When viewing them, you're supposed to imagine what could be done by a larger studio with funding, not bash them because a couple of guys in a lab given 3-6 months aren't producing something competitive with what hundreds of people given millions of dollars and a couple of years can turn out.

        • That Intel is not just some small outfit, and they are the ones who want to push this change from rasterization to ray tracing. Rasterization works great, looks good, and is what runs well on all the GPUs out there today. It makes AMD and nVidia happy; they make billions doing it. Intel is unhappy: they want you spending less, or rather nothing, on those products, and more on Intel products. So they are on about ray tracing, something that GPUs aren't as good at.

          Well guess what? To convince people the change is wort

          • That Intel is not just some small outfit,...

            Yes, but... this is apparently not a big part of their greater plans at the moment. Not everything that has the Intel name on it is given billion-dollar backing.

            I think that realtime ray tracing is coming: not this year, probably not with 22nm processes, but by the time 6nm processes and 3D packaging are here, there will be way more than 8 cards' worth of transistors on a single chip.

            • Have to compare that to what will be available from nVidia and AMD, though. There really isn't a "right" rendering technology; people are not all in with ray tracing even in the high-end world. 3ds Max uses a scanline renderer by default, and there are plugins for it like the Indigo Renderer, which basically uses various Monte Carlo methods to get really realistic images.

              In terms of realtime rendering, it will be whatever can give the best perceived quality on the least amount of hardware. Maybe that'll end up

              • The evolution I have seen, for better or worse, over the last 30 years is from impossible, to barely possible, to practical, to so easy you can do it with stupid simple algorithms. And most people do, because the hardware is cheaper than writing clever software.

                Clever software will always have a great economy of scale, but when people have the equivalent of a 1990s supercomputer in their cell phone, running for 7 days on a battery that weighs 20 grams, clever software won't matter as much as it used to.

                Ray traci

                • The problem is that ray tracing doesn't do the trick. As I pointed out in another post, ray tracing sucks at indirect lighting. Since you trace back from the display to the light sources, it only does direct lighting well. Thing is, most of the lighting we see in the real world is indirect. So you've got three choices:

                  1) Deal with poor lighting. Suboptimal, particularly since rasterization isn't so problematic with this; you can handle indirect lighting a number of ways and have it work fairly well.

                  2) Use a trick.
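
A toy sketch of that direct-only limitation, using nothing but the standard library (the scene intersection is stubbed out as a boolean, which is the hypothetical part): at each hit point you fire one shadow ray toward the light, so direct light is cheap, while indirect light would need a whole extra round of bounce rays per hit.

import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def direct_light(hit_point, normal, light_pos, occluded):
    # Lambertian shading from a single point light. `occluded` stands in
    # for a shadow-ray intersection test against the scene.
    if occluded:
        return 0.0  # in shadow: the direct term vanishes entirely
    to_light = normalize(tuple(l - p for l, p in zip(light_pos, hit_point)))
    return max(0.0, dot(normal, to_light))  # cosine falloff

# One shadow ray per hit gives direct light and hard shadows only; the
# light a wall reflects onto the floor is simply missing.
print(direct_light((0, 0, 0), (0, 1, 0), (0, 5, 0), occluded=False))  # 1.0
print(direct_light((0, 0, 0), (0, 1, 0), (0, 5, 0), occluded=True))   # 0.0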

                  • I'm fine with the idea of "do it simple" if the hardware can handle it. However, we are a long, long way from that in graphics.

                    My bigger point is just that nothing Intel has produced has convinced me that ray tracing is a better way to go. Never mind that they are still talking about hardware that doesn't exist.

                    Ray tracing may not (or, eventually with photon mapping, may) be the way to go. If by "long, long way" you mean 8 years, then, yes, I'd agree.

                    At my age, 8 years goes pretty quickly, and even when I was younger I only replaced my computers at most every 4 years; I've had a couple of systems for 8 or more years.

                    And if they were not talking about hardware that doesn't exist, I'd be pretty bored. The existing stuff is pretty well understood, and yes, on the existing stuff, realtime ray tracing is pretty sucky com

    • Because there are so many brand new games with tons of mod tools and open source engines.
    • You're only saying this because you don't have a clue how 3D engines work. A ray-traced game could potentially look "real", unlike current games, which just continue to look like more and more sophisticated animated cartoons. Google ray-traced images sometime; a lot of the time it's hard to tell that they are CG and not real.
      • by Osgeld ( 1900440 )

        I know exactly how this works, and I have written a couple of crappy little ray tracers in the past. Ray tracing is one of those things like the space program... yeah, it could do a lot, but it doesn't, because in reality it's not very practical and not very useful. Displaying pixels on a grid, you're always going to have a margin of display error, and who cares if you can see it's a 100% perfect circle as long as the computer knows and correctly calculates it.
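
For what it's worth, the core of one of those "crappy little ray tracers" is just an analytic intersection test; a minimal sketch (names are illustrative, not from any particular engine):

import math

def ray_sphere(origin, direction, center, radius):
    # Nearest positive hit distance t, or None. Solves the quadratic
    # |origin + t*direction - center|^2 = radius^2; `direction` is
    # assumed to be normalized.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t > 0.0 else None

# The hit itself is mathematically exact; the "margin of display error"
# only appears when exact hits are quantized onto a pixel grid.
print(ray_sphere((0, 0, -5), (0, 0, 1), (0, 0, 0), 1.0))  # 4.0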

      • Ray tracing falls down big time in the lighting department. It can't handle indirect lighting well, and you get this situation of everything looking too perfect and shiny. Reflective metal spheres it is great at. Human flesh, not so much.

        Now there are solutions, of course. You do photon mapping or radiosity and you can get some good illumination that can handle diffuse lighting, caustics, and that kind of shit. However, ray tracing by itself? Not so much. Problem is, none of that other shit is free. You don't ju

        • It IS the be-all end-all of computer graphics. Indirect lighting is only a problem due to the limitations of CPU speed. Specifically, when you set up a render, you set the number of "bounces" a ray will make. When you're doing live video, those bounces are set to about 3... it's hard to get ambient lighting with that. Is a ray-tracing engine the solution to computer graphics right now? Probably not. In 100 years, when computers are likely smarter than we are and have us hosted in matrix-like virtual
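
A back-of-the-envelope illustration of that bounce cutoff, as a toy model (the constant 0.5 albedo per bounce is an assumption, not real scene data):

def light_captured(albedo, max_bounces):
    # Fraction of the total indirect energy (an infinite geometric
    # series) captured when recursion stops after max_bounces.
    total = albedo / (1.0 - albedo)
    captured = sum(albedo ** k for k in range(1, max_bounces + 1))
    return captured / total

for bounces in (1, 2, 3, 8):
    print(bounces, "bounces ->", f"{light_captured(0.5, bounces):.0%}")
# 1 -> 50%, 2 -> 75%, 3 -> 88%: a 3-bounce cutoff visibly underestimates
# ambient light, which is exactly the "hard to get ambient lighting" problem.
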
    • This is all still proof of concept. Just the fact that you can ray trace an image like this is impressive. Once realtime ray tracing is a reality, more advanced shading systems will be developed which do more interesting things with those cast rays. An example shown in the article is physically based refraction for glass and water. A more challenging application would be subsurface scattering (light penetrating a surface and bouncing around before it bounces back out, i.e. wax) or light dispersion (c
      • The real advantage of ray tracing is how it scales only logarithmically with scene geometry.

        Games (and so forth) are using more and more on-screen polygons, and rendering cost scales linearly with polygon count under rasterization but logarithmically under ray tracing, given a spatial acceleration structure. Ray tracing will inevitably become as efficient as rasterization for the same quality if things continue as they do, and from then on rasterization will never be able to keep up (just like bubble sort can't keep up with any O(N log N) sort for sufficiently large N).

        But the re
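
The logarithmic claim comes from spatial acceleration structures. A hedged 1D toy (a bounding-interval hierarchy; every name here is made up) that counts how many nodes one query touches, versus the N primitives a brute-force loop would touch:

def build(segs):
    # Recursively build a bounding-interval hierarchy over sorted segments.
    if len(segs) == 1:
        return {"box": segs[0], "leaf": True}
    mid = len(segs) // 2
    left, right = build(segs[:mid]), build(segs[mid:])
    box = (min(left["box"][0], right["box"][0]),
           max(left["box"][1], right["box"][1]))
    return {"box": box, "leaf": False, "l": left, "r": right}

def query(node, x, visited):
    # Test point x against the hierarchy, counting visited nodes.
    visited[0] += 1
    lo, hi = node["box"]
    if not (lo <= x <= hi):
        return False  # an entire subtree is skipped with one test
    if node["leaf"]:
        return True
    return query(node["l"], x, visited) or query(node["r"], x, visited)

for n in (1_000, 100_000):
    tree = build([(i, i + 0.5) for i in range(n)])
    visited = [0]
    query(tree, n // 2 + 0.25, visited)
    print(n, "primitives ->", visited[0], "nodes visited")
# visits grow roughly with log2(N), while brute force grows with N itself
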
  • intel 3d (Score:5, Funny)

    by maestroX ( 1061960 ) on Saturday March 03, 2012 @04:53AM (#39229917)
    so, how does it stack up against a Riva TNT2?
    (ducks)
    • How was your fill rate with the TNT2 at 1080p resolution? :-)

  • by Anonymous Coward

    Jesus guys, how many Slashdot articles do I have to go back through until I can find the original Wolfenstein thing?

    http://blogs.intel.com/research/2010/09/12/wolfenstein_gets_ray_traced_/
    http://www.wolfrt.de/

  • by gentryx ( 759438 ) on Saturday March 03, 2012 @05:22AM (#39229957) Homepage Journal
    ...which is why it's easy to scale [wikipedia.org] up. Thus the speedup isn't that impressive. Scalability on tightly coupled apps would be much more interesting.
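
The "easy to scale" point in code form: a minimal sketch where each scanline is an independent task (shade() is a placeholder for casting real rays, not Intel's implementation):

from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 640, 480

def shade(row):
    # placeholder per-scanline work; a real tracer casts WIDTH rays here
    return [(x * row) % 256 for x in range(WIDTH)]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        image = list(pool.map(shade, range(HEIGHT)))  # rows farmed out
    print(len(image), "rows rendered")
# No task ever talks to another, so near-linear scaling is the expected
# outcome; tightly coupled apps exchange data at every step instead.
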
  • ... if it's all in the same datacentre?
    • by Anonymous Coward

      It's a "cloud" because people don't understand what it means and because it sounds better in marketing.

      For airsoft it's "LiPoly Ready", which means nothing; all it means is that it's physically possible to attach a LiPoly battery to it, that's it.

      For food it's "healthy", "natural", etc.; then you read the label and you find out the so-called natural ingredients make up almost none of the product.

      For cars I can't even think of one, as they try to pull too many things, and I look into the car specs a

    • by gl4ss ( 559668 )

      they forgot where in the datacentre the machines were.

      then it qualifies as cloud.

  • ...read the article, got disappointed. It's a reboot they're ray tracing :(

    and couldn't find a video (YouTube has an older vid..).

    and one of the links in the article is broken.

    shoddy. now someone do a hack to make OnLive's servers do this parallel setup..

  • A really cool article, but why do they spin it as a 'cloud' setup?

    In my experience, the gamers who care about such beautiful graphics are happy to spend a few grand on hardware. They are not happy with jitter due to the internet connection, or with waiting in line for a server.

  • by shish ( 588640 ) on Saturday March 03, 2012 @08:43AM (#39230449) Homepage
    Apparently this is one of the newer Wolfenstein games; I wanted to see what 8 GPUs' worth of fancy effects could do to the original pre-Doom Wolfenstein :(
  • I'm impressed by the ray-tracing speeds and all, but is it surprising that it has near-linear scaling? Ray tracing is very well suited to parallel processing, and scaling is nearly linear on CPUs if the software is well optimized and you're on a good network.
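
To put a number on "nearly linear": 7.7x on 8 cards is 96% parallel efficiency, and inverting Amdahl's law gives the serial fraction such a speedup implies (a quick sanity check, not a figure from the white paper):

def serial_fraction(speedup, n):
    # Amdahl's law: S = 1 / ((1 - p) + p / n). Solve for the parallel
    # fraction p and return the serial remainder 1 - p.
    p = (1.0 - 1.0 / speedup) / (1.0 - 1.0 / n)
    return 1.0 - p

print(f"efficiency: {7.7 / 8:.1%}")                               # 96.2%
print(f"implied serial fraction: {serial_fraction(7.7, 8):.2%}")  # ~0.56%
# A serial fraction near half a percent is consistent with per-pixel work
# that parallelizes almost perfectly across cards.
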
    • hmmm... optimised software. Read: custom code for massively parallel clusters. Oh, yeah. :)
      Good network. Read: 2-ary-4-tree with twin redundant fibre switching. Or for home users with a bit of spare cash rather than a University department with EOY budget to blow, several lengths of cat5, some PCI Gigabit ethernet cards and redundant Gigabit switchgear (what I did with a pair of DLink 24-port Gigabit switches and a boatload of surplus cat5 patch cables. Oh, yeah, that's one fast network).

      IAAG (I Am A Geek).

  • I want my visual porn, not an endless link farm of old /. articles.
