With 8 Cards, Wolfenstein Ray Traced 7.7x Faster 97
An anonymous reader writes "As Slashdot readers know, Intel's research project on ray tracing for games has recently been shown at 1080p, using eight Knights Ferry cards with Intel's 'Many Integrated Core' architecture. Now a white paper goes into more detail, documenting near-linear scaling for the cloud setup with 8 cards, and gives details on the implementation of 'smart anti-aliasing.' It will be interesting to see how many instances of Intel's next MIC iteration — dubbed Knights Corner, with 50+ cores — will be required for the same workload."
Nice scaling (Score:2)
Re: (Score:3)
Yeah, that makes sense.
Re: (Score:1)
My thought was that if it takes 8 Intel GPUs, you should be able to run it on at most 4 nVidia or AMD GPUs.
Re:Nice scaling (Score:4, Informative)
Wrong. Nearly the entire raster pipeline would be ignored for ray tracing, and you don't really need a lot of shading units for the rest (no need to multitexture in the background and whatnot). The main use for the GPUs in ray tracing would be collision detection between rays and geometry, which could be written into shaders as long as the entire scene was loaded into each GPU's memory, so Wolfenstein is actually a good choice; a large scene would have problems because of memory constraints. Ray tracing works very well with lots of parallel CPUs, but in that scenario it is usually memory constrained (dependent on memory access more than anything else). Splitting it off onto multiple GPUs is a way to remove that constraint, but it basically still works like a lot of parallel CPUs accessing the same scene in memory.
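If it helps to picture the kernel involved: below is a minimal, hypothetical C++ sketch of the ray-sphere intersection test at the heart of that "collision detection" (names and structure are mine, not from Intel's implementation).

```
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Distance along the ray to the nearest sphere hit, or nothing on a miss.
// A ray tracer evaluates tests like this millions of times per frame,
// against every object (or acceleration-structure node) in the scene.
std::optional<float> intersectSphere(const Vec3& origin, const Vec3& dir,
                                     const Vec3& center, float radius) {
    Vec3 oc = { origin.x - center.x, origin.y - center.y, origin.z - center.z };
    float b = 2.0f * dot(oc, dir);              // dir is assumed normalized
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.0f * c;              // quadratic discriminant, a == 1
    if (disc < 0.0f) return std::nullopt;       // ray misses the sphere entirely
    float t = (-b - std::sqrt(disc)) * 0.5f;    // nearer of the two roots
    return t > 0.0f ? std::optional<float>(t) : std::nullopt;
}
```

With a full copy of the scene resident on each card, every card can run a loop of tests like this independently over its share of the pixels, which is where the near-linear scaling comes from.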
Comment removed (Score:4, Informative)
Re:Nice scaling (Score:5, Insightful)
Spot on! ION was the best thing that ever happened to the Atom platform. Really, it was the only thing that made it into a usable HTPC or ultra-low-power desktop. Intel really needs to stop shitting down NVidia's throat, because NVidia is precisely the kind of aggressive, performance-driven company that would fit alongside Intel's model.
Re: (Score:3)
Re: (Score:2)
Thanks for the tip! I'll give them a try, I've been wanting a low-power office/surf machine anyway.
Re:Nice scaling (Score:5, Informative)
Note that these are raytracing cards, not rendering. Raytracing is a very different technique that can do cool effects like refraction through glass (shown in the chandeliers and scopes), jaw-dropping water, and realistic lighting effects that rendering cards simply cannot do.
It's also much more demanding on hardware. One of the big drawbacks is that it requires a lot of scattered reads out of memory, making caching much less effective. You need tons of bandwidth to low-latency memory to make it happen. We're still a very long way from having this possible in reasonably-priced consumer GPUs.
Rag on Intel for their integrated graphics if you want (though I consider them a good non-gaming graphics chip with very good open source support), but these cards are not related to those in any way. These are full-featured x86/x64 processors with 32 cores per die. In other words, they created a 256-core system capable of software-raytracing the whole thing at high resolutions.
That is quite an accomplishment, and rest assured, it is top-tier performance in the raytracing world. This isn't meant to be a practical gaming system; this is pretty clearly being done by Intel to show off the benefits of their many-cores processors, and it is an impressive show.
To the GP: They're using Wolfenstein because it's one of very few games that has a ray-traced variant, and it exists only because Intel created it as a testbed. More on that here: https://en.wikipedia.org/wiki/Wolfenstein:_Ray_Traced [wikipedia.org]
Re:Nice scaling (Score:5, Insightful)
The problem with these demos is that they use ray tracing like we did in 1980 (i.e. Whitted style). All computations are highly coherent and efficient. As soon as you want more natural rendering, with diffuse illumination etc., parallelization doesn't scale proportionally anymore. Rays become heavily incoherent, memory access scatters, and you get cache misses. So the real feat would have been if they showed 7.7x speedup with diffuse global illumination.
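To make the coherence point concrete, here is a C++ sketch of my own (not from the paper): primary rays from adjacent pixels are nearly parallel and march through the same memory, while a diffuse bounce picks an essentially random hemisphere direction per hit, so neighbouring rays immediately diverge.

```
#include <algorithm>
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}
static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Cosine-weighted hemisphere sample around the surface normal n.
// Each call returns a different random direction, which is why secondary
// rays from neighbouring pixels stop sharing BVH nodes and cache lines.
Vec3 sampleDiffuseBounce(const Vec3& n, std::mt19937& rng) {
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);
    float r = std::sqrt(uni(rng));              // radius on the unit disc
    float phi = 6.2831853f * uni(rng);          // angle on the unit disc
    float z = std::sqrt(std::max(0.0f, 1.0f - r * r));
    // Orthonormal basis (t, b, n); pick a helper axis not parallel to n.
    Vec3 helper = std::fabs(n.x) > 0.9f ? Vec3{0, 1, 0} : Vec3{1, 0, 0};
    Vec3 t = normalize(cross(helper, n));
    Vec3 b = cross(n, t);
    float tx = r * std::cos(phi), ty = r * std::sin(phi);
    return { t.x * tx + b.x * ty + n.x * z,
             t.y * tx + b.y * ty + n.y * z,
             t.z * tx + b.z * ty + n.z * z };
}
```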
Re: (Score:2)
And this is illustrated beautifully by the pretty shoddy visuals that Intel are showing off. These graphics would have been jaw dropping 15 years or so ago, but frankly today they look amateur and desperately outdated. Reflection and refraction are nice gimmicks but it'll be a rare game that actually makes use of them to improve the gameplay. For all the other titles out there these effects are usually faked to a high enough quality that it doesn't make much of a difference to the gamer.
If Intel really w
Re: (Score:2)
Judging from my limited experience with 3dsmax and real diffuse material pipelines, I would suggest that state-of-the-art RT algos won't come into the real-time scene for, oh, at least two decades. That is, for real implementations. AFAIK you can still 'hack' reflection and refraction behaviors to kind of simulate true diffuse refractions.
I can remember blowing render jobs' times up a thousandfold through misuse of diffuse reflections.
Re: (Score:2)
Re: (Score:1)
Note that these are raytracing cards, not rendering. Raytracing is a very different technique which can do .... and realistic lighting effects that rendering cards simply cannot do.
Nope. Raytraced images are some of the least realistic. Once you get past doing glass balls on a checkered floor it's downhill all the way.
Raytracing has it completely backwards; light doesn't work that way in the real world.
Re:Nice scaling (Score:5, Informative)
I believe you're mistaken. Raytracing IS the technique where you're tracing light much the way it happens in the real world. The techniques usually used in GPUs are quite backward. It hasn't really been all that downhill, though; they've gotten pretty good at faking a lot of the effects, but when it comes to things like shadows, local lighting, radiosity, and refraction, Raytracing is where it's at.
Examples:
https://upload.wikimedia.org/wikipedia/commons/e/ec/Glasses_800_edit.png [wikimedia.org]
http://hof.povray.org/images/chado_Big.jpg [povray.org]
http://hof.povray.org/micra1_09.html [povray.org]
http://hof.povray.org/images/warm_up_Big.jpg [povray.org]
http://hof.povray.org/images/kitchen.jpg [povray.org]
All of those are from POV-Ray. There are plenty more in their gallery over here:
http://hof.povray.org/ [povray.org]
Feel free to send some counterexamples of other techniques doing it better.
Re: (Score:2)
Re: (Score:1)
I believe you're mistaken. Raytracing IS the technique where you're tracing light much the way it happens in the real world.
Nope.
In the real world light comes from a light source, bounces around a bit, then a lucky few photons arrive at your eyes.
In raytracing you trace a ray from your eye to the world and try to get back to the light source. That's the opposite direction.
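For concreteness, the eye-first direction looks roughly like this (a minimal C++ sketch of my own, camera fixed at the origin looking down -z; nothing here is any particular engine's API):

```
#include <cmath>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, dir; };

// Backward ("eye") tracing: one ray per pixel, fired from the eye through
// the image plane and only later connected back to the light sources with
// shadow rays. That is the opposite direction from real photons.
Ray primaryRay(int px, int py, int width, int height, float fovY) {
    float aspect  = float(width) / float(height);
    float tanHalf = std::tan(fovY * 0.5f);
    // Map the pixel centre to [-1, 1] image-plane coordinates, +y up.
    float x = (2.0f * (px + 0.5f) / width - 1.0f) * aspect * tanHalf;
    float y = (1.0f - 2.0f * (py + 0.5f) / height) * tanHalf;
    float len = std::sqrt(x * x + y * y + 1.0f);
    return { {0, 0, 0}, { x / len, y / len, -1.0f / len } };
}
```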
Re: (Score:3)
So what? It's essentially the same end result, with orders of magnitude less computation wasted on photons that never reach the eye.
Where do you get that raytracing is "some of the least realistic"? It's one of the most realistic techniques that's actually in use, far better looking than anything that's currently done in realtime.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Raytracing definitely allows depth of field (focus at a certain distance). They're showing it in TFA, and in many of the examples I linked.
Re: (Score:2)
Nah, it's raytracing; you just scatter the rays you shoot for each pixel, taking into account the lens's circle of confusion (and shoot more rays overall), with biases for things like the number of leaves on the camera's iris for extra realism.
Most of the time a 2D DoF effect using a rendered zbuffer is just fine, but raytracing will give you proper defocusing of reflections and refractions, as well as showing objects that would be completely obscured in the in-focus render.
Like a poster above said, though,
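Roughly, the lens scattering described above looks like this (a thin-lens sketch of my own, assuming a pinhole camera ray pointing into the scene and an aperture radius in world units):

```
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, dir; };

// Thin-lens depth of field: jitter the ray origin across the lens disc
// (the circle of confusion) and re-aim each jittered ray at the same
// point on the focal plane. Points on that plane stay sharp; everything
// nearer or farther smears out, including reflections and refractions.
Ray dofRay(const Ray& pinhole, float aperture, float focalDist,
           std::mt19937& rng) {
    std::uniform_real_distribution<float> uni(-1.0f, 1.0f);
    float lx, ly;                                  // rejection-sample the disc
    do { lx = uni(rng); ly = uni(rng); } while (lx * lx + ly * ly > 1.0f);
    Vec3 focus  = { pinhole.origin.x + pinhole.dir.x * focalDist,
                    pinhole.origin.y + pinhole.dir.y * focalDist,
                    pinhole.origin.z + pinhole.dir.z * focalDist };
    // Offset in the camera's xy plane stands in for the lens surface here.
    Vec3 origin = { pinhole.origin.x + lx * aperture,
                    pinhole.origin.y + ly * aperture,
                    pinhole.origin.z };
    Vec3 d = { focus.x - origin.x, focus.y - origin.y, focus.z - origin.z };
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    return { origin, { d.x / len, d.y / len, d.z / len } };
}
```

Averaging many such rays per pixel gives the defocus; a non-circular sampling pattern gives the bladed-iris bokeh mentioned above.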
Re: (Score:1)
Re: (Score:2)
If you use the simplistic style of raytracing then yes, but there are many additions which make it possible to do extremely realistic scenes.
The fundamental problem is that ray tracing is only half of the puzzle. Typically you trace from each pixel on a "screen" into the 3d scene and look at where that ray intersects with an object. You then calculate the color of the object at that point and this becomes the color of that pixel on the screen. (In a real scenario you typically calculate multiple rays per pi
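A minimal sketch of that per-hit color step, with direct lighting via a shadow ray (illustrative C++ only; sceneOccluded is a stub standing in for whatever intersection routine the renderer uses):

```
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

// Stub: replace with the renderer's real intersection test. Returns true
// if any geometry blocks the segment from p to the light position q.
static bool sceneOccluded(const Vec3& p, const Vec3& q) {
    (void)p; (void)q;
    return false;
}

// Lambertian direct lighting at a hit point: cast a shadow ray toward the
// light; if it's blocked the point is in shadow, otherwise shade by the
// cosine of the angle between the normal and the light direction.
float directLight(const Vec3& hit, const Vec3& normal,
                  const Vec3& lightPos, float lightIntensity) {
    Vec3 toLight = { lightPos.x - hit.x, lightPos.y - hit.y, lightPos.z - hit.z };
    float dist = std::sqrt(toLight.x * toLight.x + toLight.y * toLight.y
                           + toLight.z * toLight.z);
    Vec3 l = { toLight.x / dist, toLight.y / dist, toLight.z / dist };
    if (sceneOccluded(hit, lightPos)) return 0.0f;     // hard shadow
    float cosTheta = std::max(0.0f, normal.x * l.x + normal.y * l.y + normal.z * l.z);
    return lightIntensity * cosTheta / (dist * dist);  // inverse-square falloff
}
```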
Re: (Score:2)
Hybrid engines which combine raytracing, radiosity and photon mapping can give very good results, yes.
OTOH saying that raytracing alone gives realistic lighting is very naive.
Re: (Score:1)
Re: (Score:2)
I think the term you're looking for regarding "normal" graphics cards is "rasterising". Both rasterisation and ray-tracing are examples of rendering, which is the general term for turning data into an image. (Typically 3D data onto a 2D screen.)
Intel have been trying this technique of putting x86 cores on a board for quite some time now. But they still seem to be struggling to figure out a good use for them. One thing traditional GPUs have going for them is that they are rather dumb and limited in their cap
Re: (Score:2)
You are correct. I guess I learned some of the terminology wrong back in the day. :)
Re:Nice scaling (Score:4, Interesting)
It's also much more demanding on hardware. One of the big drawbacks is it requires a lot of scattered reads out of memory making caching much less effective. You need tons of bandwidth to low latency memory to make it happen. We're still a very long ways out from having this possible in reasonably-priced consumer GPUs.
Yes, this is exactly what the Intel MIC cards are awesome for. They are generic x86 cores with 4-way SMT and a buttload of memory bandwidth. I worked with Knights Ferry prototypes and studied the scalability of the worst case of algorithms for scattered memory access: graph algorithms. (The paper will be published soon, but the preprint is available at http://bmi.osu.edu/hpc/papers/Saule12-MTAAP.pdf [osu.edu] .) Basically, we achieve close to optimal scalability on most of our tests.
These MIC cards are designed to scale not only in the good cases (compact memory and SIMDizable operations such as dense matrix-vector multiplication, or image processing) but also in the bad cases (lots of indirection, accessing cache lines in pathological patterns, such as sparse matrix-vector multiplication or graph algorithms).
I am excited to get hold of the commercial cards (we worked on prototypes) to make a CPU/GPU/MIC comparison.
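For readers who haven't met the pattern, here is a generic CSR sparse matrix-vector multiply in C++ (an illustration of the access pattern, not code from the preprint):

```
#include <vector>

// CSR sparse matrix-vector multiply, y = A * x. The indirect load
// x[colIdx[j]] jumps around memory following the sparsity pattern, so
// caches and prefetchers get little traction; raw bandwidth plus SMT
// threads to hide latency (the MIC recipe) is what keeps it fed.
void spmv(const std::vector<float>& vals,     // nonzero values
          const std::vector<int>&   colIdx,   // column of each nonzero
          const std::vector<int>&   rowPtr,   // size rows + 1
          const std::vector<float>& x,
          std::vector<float>&       y) {
    int rows = static_cast<int>(rowPtr.size()) - 1;
    for (int i = 0; i < rows; ++i) {
        float sum = 0.0f;
        for (int j = rowPtr[i]; j < rowPtr[i + 1]; ++j)
            sum += vals[j] * x[colIdx[j]];    // scattered, cache-hostile read
        y[i] = sum;
    }
}
```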
Re: (Score:2)
Intel would have destroyed NVIDIA. Intel gives up on graphics every few years, and they only make enough for the low-end market. They similarly would have killed any progress at NVIDIA after a few years. It only would have caught them up for a while.
Intel doesn't get graphics. It's so bad that I recommended an AMD A4 yesterday over an Atom build, because of the GPU.
The point isn't really raytracing (Score:2)
Raytracing is an example of an embarrassingly parallel vector math problem. It's not the only such example nor the only use these cards are being put to. They're being used in thermodynamic, aerodynamic and hydrodynamic modelling of systems for computer design, for mineral exploration, for climate modelling. It would not surprise me if NASA has a cluster with them for certain space physics applications. No doubt for financial modelling too.
The point of displaying the cards doing real-time 1080p raytrac
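The shape of every one of those workloads is roughly the sketch below (hypothetical C++/OpenMP; shadePixel stands in for whatever per-element work the cards actually do):

```
#include <vector>

struct Color { float r, g, b; };

// Stub for the per-element work: trace a ray, step a fluid cell, price an
// option. The point is that each output depends only on its own inputs.
static Color shadePixel(int x, int y) {
    return { x * 0.001f, y * 0.001f, 0.5f };
}

// No element touches another's output, so there is nothing to lock and
// the loop splits evenly across cores (or across eight cards), which is
// where the near-linear scaling in TFA comes from.
void renderFrame(int width, int height, std::vector<Color>& framebuffer) {
    #pragma omp parallel for schedule(dynamic, 16)
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            framebuffer[y * width + x] = shadePixel(x, y);
}
```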
Re: (Score:3)
Actually, I remember Intel doing a lot of work on a new Wolf 3D engine specifically designed for excellent scaling, rather than raw performance on a single GPU.
This isn't Wolf3D. It's a much better engine (especially if you are Intel), with Wolf3D content.
Re: (Score:2)
and yet (Score:3, Insightful)
It honestly looks like an old game to me. Yes, there are some impressive features, but I really have to look for them in the images, something that is not going to happen at 60Hz. (And if it's not running at real speed, who cares? That's a movie, which can take its sweet-ass time rendering frame by frame.)
This isn't a fucking game announcement (Score:3)
Wolfenstein is an old game. The ray-traced version is being used as a "Utah Teapot", a standard object to develop and compare rendering techniques.
Re:and yet (Score:5, Insightful)
Welcome to the current generation of ray-traced game engine demos. They look poor, apart from the metal sphere floating above the water and the odd glass refraction effect.
You'd think that they could hack something impressive-looking together, particularly as they are competing against whatever the next generation of polygon rendering comes up with.
Re: (Score:2)
What is important is that these rays are being cast in a relatively efficient manner allowing for realtime feedback. An engineer doesn't care specifically what those rays may be used for but just that they
Re: (Score:2)
There's another aspect of raytracing that many don't even get: the military applications, such as being able to efficiently calculate the origin of a shot (backtrace). Think about things like this (say, the final action in The Last Starfighter) and you suddenly realize that Intel isn't working on this for gamers but for the military. Lots more money to be made when you consider all of the CIC systems that would benefit from the ability to backtrace incoming fire and take it out with the appropriate weapon (StarS
Re: (Score:2)
You've been drinking too much. Or too little, I forget how it is with you. Anyway, you haven't been drinking the exact right amount.
Re:and yet (Score:5, Insightful)
Rent the Pixar Shorts DVD and watch what they did before Toy Story 1. Red's Dream, Tin Toy and The Adventures of André and Wally B. are all pretty crappy-looking short films that demonstrate what a couple of guys in a lab could do at the time. When viewing them, you're supposed to imagine what could be done by a larger studio with funding, not bash them because a couple of guys in a lab, given 3-6 months, aren't producing something competitive with hundreds of people given millions of dollars and a couple of years.
Re: (Score:2)
Also, while Red's Dream didn't win any awards, Luxo Jr. from the previous year was a nominee for the best animated short Oscar, and Tin Toy from the following year won. So yes, cutting-edge.
Ummmm except (Score:3)
That Intel is not just some small outfit, and they are the ones who want to push this change from rasterization to ray tracing. Rasterization works great, looks good, and is what runs well on all the GPUs out there today. It makes AMD and nVidia happy; they make billions doing it. Intel is unhappy: they want you spending less, or rather nothing, on those products, and more on Intel products. So they are on about ray tracing, something that GPUs aren't as good at.
Well guess what? To convince people the change is wort
Re: (Score:2)
That Intel is not just some small outfit,...
Yes, but... this is apparently not a big part of their greater plans at the moment. Not everything that has the Intel name on it is given billion dollar backing.
I think that the realtime raytracing thing is coming, not this year, probably not with 22nm processes, but by the time 6nm processes and 3D packaging are here, there are going to be way more than 8 cards worth of transistors on a single chip.
Re: (Score:2)
Have to compare that to what will be available from nVidia and AMD, though. There really isn't a "right" rendering technology; people are not all-in with ray tracing even in the high-end world. 3dsmax uses a scanline renderer by default, and there are plugins for it like the Indigo Renderer, which basically uses various Monte Carlo methods to get really realistic images.
In terms of realtime rendering it will be whatever can give the best perceived quality on the least amount of hardware. Maybe that'll end up
Re: (Score:2)
The evolution I have seen, for better or worse, over the last 30 years is from impossible, to barely possible, to practical, to so easy you can do it with stupid-simple algorithms; and most people do, because the hardware is cheaper than writing clever software.
Clever software will always have a great economy of scale, but when people have the equivalent of a 1990s supercomputer in their cell phone running for 7 days on a battery that weighs 20 grams, clever software won't matter as much as it used to.
Ray traci
Re: (Score:2)
The problem is that ray tracing doesn't do the trick. As I pointed out in another post, ray tracing sucks at indirect lighting. Since you're tracing back from the display to the sources, it only does direct lighting well. Thing is, most of the lighting we see in the real world is indirect. So you've got three choices:
1) Deal with poor lighting. Suboptimal, particularly since rasterization isn't so problematic with this. You can handle indirect lighting a number of ways and have it work fairly well.
2) Use a trick.
Re: (Score:2)
I'm fine with the idea of "do it simple" if the hardware can handle it. However we are a long, long way from that in graphics.
My bigger point is just nothing Intel has produced has convinced me that ray tracing is a better way to go. Never mind that they are still talking about hardware that doesn't exist.
Ray tracing may not (or, eventually with photon mapping, may) be the way to go. If by long, long way you mean 8 years, then, yes, I'd agree.
At my age, 8 years goes pretty quick, and even when I was younger I only replaced my computers at most every 4 years; I've had a couple of systems for 8 or more years.
And, if they weren't talking about hardware that didn't exist, I'd be pretty bored - the existing stuff is pretty well understood, and yes, on the existing stuff, realtime ray tracing is pretty sucky com
Re: (Score:2)
Re: (Score:3)
Re: (Score:1)
I know exactly how this works, and I have written a couple of crappy little ray-tracers in the past. Ray-tracing is one of those things like the space program: yeah, it could do a lot, but it doesn't, because in reality it's not very practical and not very useful. Displaying pixels on a grid, you're always going to have a margin of display error, and who cares if you can see it's a 100% perfect circle, as long as the computer knows and correctly calculates it?
No it is pretty easy (Score:2)
Raytracing falls down bigtime in the lighting department. It can't handle indirect lighting well, and you get this situation of everything looking too perfect and shiny. Reflective metal spheres it is great at. Human flesh, not so much.
Now there are solutions, of course. You do photon mapping or radiosity and you can get some good illumination that can handle diffuse lighting, caustics, and that kind of shit. However, ray tracing by itself? Not so much. Problem is, none of that other shit is free. You don't ju
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Games (and so forth) are using more and more on-screen polygons, which scales linearly with rasterization but logarithmically with ray tracing (given an acceleration structure). Ray tracing will inevitably be as efficient for the same quality as rasterization if things continue as they are, and from then on rasterization will never be able to keep up (just like bubble sort can't keep up with any O(N log N) sort for sufficiently large N).
But the re
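The logarithm comes from a bounding volume hierarchy: whenever a ray misses a node's box, the whole subtree underneath is skipped. A bare-bones C++ sketch of my own (illustrative only, leaf handling omitted):

```
#include <algorithm>
#include <memory>
#include <utility>

struct AABB { float min[3], max[3]; };
struct Ray  { float origin[3], invDir[3]; };   // invDir = 1 / direction

// Slab test: does the ray pass through this axis-aligned box at all?
static bool hitAABB(const Ray& r, const AABB& b) {
    float tmin = 0.0f, tmax = 1e30f;
    for (int a = 0; a < 3; ++a) {
        float t0 = (b.min[a] - r.origin[a]) * r.invDir[a];
        float t1 = (b.max[a] - r.origin[a]) * r.invDir[a];
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
    }
    return tmin <= tmax;
}

struct BVHNode {
    AABB bounds;
    std::unique_ptr<BVHNode> left, right;  // leaves hold triangles (omitted)
};

// A ray that misses a box never visits the N/2, N/4, ... triangles below
// it, so traversal touches O(log N) nodes on average. Rasterization has
// to process all N triangles every frame regardless.
void traverse(const Ray& r, const BVHNode* node) {
    if (!node || !hitAABB(r, node->bounds)) return;  // prune whole subtree
    traverse(r, node->left.get());
    traverse(r, node->right.get());
}
```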
intel 3d (Score:5, Funny)
(ducks)
Re: (Score:2)
Hell, I had a Voodoo Rush, and I think it was faster than a new Intel 3D card.
Re: (Score:2)
I've still got a 32MB Voodoo 3 3k (given to me as surplus to requirements - also the guy couldn't get the driver to work on NT, which I managed to get going on Slackware 8)... still works, too. I'm using it as a head for one of my thin clients.
Another client has a NVidia Riva TNT2 Model 64 32MB dual head AGP (my first AGP card).
The third has an ATI Rage Pro 8MB (upgraded from 4MB). This was the first PCI graphics card I ever bought.
Ridiculously old cards, but they still work as advertised - which is plenty
Re: (Score:2)
Weren't the Voodoo 3 cards 3D accel only? How did you manage to push frame buffers through them?
Re: (Score:1)
No, the Voodoo 3 was both 2D/3D. It was the first of their cards that wasn't 3D-only. (Well, in the main line. I think there was a variant of the 1 or 2 that was less powerful but also did 2D. Not sure about that, though.)
Re: (Score:2)
Re: (Score:2)
How was your fillrate with the TNT2 at 1080p? :-)
Re: (Score:1)
Nested links are nested (Score:1)
Jesus guys, how many Slashdot articles do I have to go back through until I can find the original Wolfenstein thing?
http://blogs.intel.com/research/2010/09/12/wolfenstein_gets_ray_traced_/
http://www.wolfrt.de/
Raytracing is embarrassingly parallel (Score:3)
Re: (Score:1)
Intel doesn't make games. Intel makes hardware. You can use that hardware to play great games, or you can use the same hardware to play bad games. GPUs cannot help with the story, the replayability or the installation, but they can help with the graphics.
Re: (Score:2)
And with the physics simulation.
Re: (Score:1)
Maybe in a few more years, when games give up on supporting the current mid-range GPUs....
Re: (Score:1)
And game makers will look at this demo as something they want to do eventually. It's good to be clear with what we want.
While game makers will probably look at the demo, they will certainly not look at a random comment of an Anonymous Coward in a Slashdot discussion about that demo.
How is it a "cloud" ... (Score:2)
Re: (Score:1)
It's a "cloud" because people don't understand what it means and because it makes it sound better to marketing.
For airsoft it's "Lipoly Ready" which means nothing, all it means is it is physically possible to attach a lipoly battery to it that is it.
For food it's "healthy", "natural" etc, than you read the lable and you find out the so called natural ingredients make up almost nothing of the product.
For cars I can't even think of something as they try to pull too many things, and I look into the car specs a
Re: (Score:2)
If they forgot where in the datacentre the machines were,
then it qualifies as cloud.
came for a 1080p vid of wolfenstein 3d raytraced.. (Score:2)
..read the article, got disappointed. it's a reboot they're raytracing :(
and couldn't find a video (youtube has an older vid..).
and one of the links in the article is broken.
shoddy. now someone do a hack to make OnLive's servers do this parallel setup..
Desperate attempt to make it seem feasible (Score:1)
A really cool article, but why do they spin it as a 'cloud' setup?
In my experience, the gamers who care about such beautiful graphics are happy to spend a few grand on hardware. They are not happy with jitter due to the internet connection, or waiting in line for a server.
Not the original :( (Score:5, Funny)
Is it really that special? (Score:1)
Re: (Score:2)
hmmm... optimised software. Read: custom code for massively parallel clusters. Oh, yeah. :)
Good network. Read: 2-ary 4-tree with twin redundant fibre switching. Or, for home users with a bit of spare cash rather than a University department with EOY budget to blow: several lengths of cat5, some PCI Gigabit ethernet cards and redundant Gigabit switchgear (what I did with a pair of DLink 24-port Gigabit switches and a boatload of surplus cat5 patch cables). Oh, yeah, that's one fast network.
IAAG (I Am A Geek).
What's with the X deep links? (Score:2)