Crytek Bashes Intel's Ray Tracing Plans 151
Vigile writes "Despite all good intentions, Intel continues to see a lot of its work on ray tracing countered not only by its competition, as you'd expect, but also by the very developers that Intel is going to depend on for success in the gaming market. The first major developer to speak on the Intel Larrabee and ray tracing debate was id Software's John Carmack, who basically said that Intel's current plans weren't likely to be implemented soon or ever. This time Cevat Yerli, one of the Crytek developers responsible for the graphically impressive titles Far Cry and Crysis, sees at least 3-5 more years of pure rasterization technology before moving to a hybrid rendering compromise. Intel has previously eschewed the idea of mixed rendering, but with more and more developers chiming in for it, it's likely where gaming will move."
Ray-Tracing Extremely CPU Intensive (Score:4, Insightful)
Re: (Score:2)
Re: (Score:2)
Because the easiest way to increase the power of a computer nowadays is by adding more processor cores. A quad core processor is about as expensive as a mid-range graphics card. A dual processor motherboard and a pair of low end quad core processors is probably about as much money as one high end graphics card.
Re:Ray-Tracing Extremely CPU Intensive (Score:5, Informative)
Re: (Score:3)
Re:Ray-Tracing Extremely CPU Intensive (Score:5, Insightful)
I don't think anyone denies that ray tracing is lovely etc., but it's a question of whether it is remotely feasible to do it on the current generation of CPUs or GPUs. If it takes a cluster of Cell processors (basically super-fast number shovels) to render a simple scene, you can bet we are some way off from it being viable yet.
Maybe in the meantime it is more suitable for lighting / reflection effects and is used in conjunction with traditional techniques.
Re: (Score:2)
Re: (Score:3, Insightful)
If I were to mention a number, I would either want at least ~72 frames per second (where the eye/brain would have a hard time discerning between individual frames) or at least match the sync of an ordinary LCD screen at 60 fps.
Re: (Score:2, Informative)
Why would one want 30 frames per second? If I were to mention a number, I would either want at least ~72 frames per second (where the eye/brain would have a hard time discerning between individual frames) or at least match the sync of an ordinary LCD screen at 60 fps.
That is not useful at all. 30 frames per second suffice to make the eye see something as "moving" instead of as a series of small steps, which is what you describe as "where the eye/brain would have a hard time discerning between individual frames". The reason that one sees flickering on a CRT is that the phosphor dots "cool down" after being hit by the electron beam, so the dots have to be hit time after time. To prevent this from giving a flickering screen, the frequency by which the pixels are "activated" has to have a ce
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
You are wrong. I can tell the difference between a game running at 30 fps and 60 fps, because game rendering does not have the motion blur (temporal anti-aliasing) that movies do.
Re: (Score:3, Interesting)
you want at least 30 frames per second and even that isn't considered great by many gamers.
I've always wondered about the need for a solid 60 fps in every game. A lot of games, especially console games of late, are going for that cinematic experience, and as theatrical movies themselves run at 24 fps, maybe all it would take is today's prettiest graphics and a really sophisticated use of motion-blur to make a good game running at that mere 24 fps. Maybe for first-person shooters and racing games, you want that almost hyper-real 60 fps of unblurred, crystal clear action, but for those other actio
Re: (Score:2, Interesting)
In video games, the frame rate is also the rate at which everything else (physics, etc.) is calculated.
Re:Ray-Tracing Extremely CPU Intensive (Score:5, Interesting)
Re: (Score:2)
Re: (Score:2)
So at 60, 30, or 1000fps, you still move the same speed.
Re: (Score:3, Informative)
It depends on the game. For example, the first releases of Quake 3 had different physics depending on your framerate, due to integer clamping of player positions. They fixed the issue in later patches by adding an option to force everyone to run at 125 Hz, but by default it is off.
This allows a couple jumps that are not possible UNLESS you are running at 125 Hz, such as the megahealth jump on q3dm13.
This guide has more information: http://ucguides.savagehelp.com/Quake3/FAQFPSJumps.html [savagehelp.com]
Re: (Score:2)
Both framerate and the jumping issue are related to the amount of time that has passed.
At 120fps, the time difference is 8.3...ms; at 60fps it is 16.6...ms, etc. Flooring both to get integer values gives you 8 and 16; rounding, 8 and 17. Either way, these bits tend to accumulate and start giving wildly different values. In three f
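A rough sketch (in Python, with made-up numbers, not id's actual code) of how storing each frame's duration as an integer number of milliseconds makes the simulation depend on frame rate:

```python
# Sketch only: a Quake 3-style engine counts floor(1000 / fps) simulated
# milliseconds per frame, so totals drift apart at different frame rates.

def simulated_ms(fps, real_seconds=1.0):
    """Total simulated milliseconds after `real_seconds` of wall time."""
    frame_ms = 1000 // fps              # integer truncation per frame
    frames = int(real_seconds * fps)
    return frame_ms * frames

# At 125 fps each frame counts as exactly 8 ms: 125 * 8 = 1000, no drift.
# At 60 fps each frame counts as 16 ms: 60 * 16 = 960, so the player
# "loses" 40 ms of simulation per second, subtly changing jump arcs.
print(simulated_ms(125))   # 1000
print(simulated_ms(60))    # 960
```

This is why forcing everyone to 125 Hz, where the truncation happens to be exact, levels the playing field.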
Re:Ray-Tracing Extremely CPU Intensive (Score:5, Informative)
Yes, it's not a 'wise decision', but not all decisions can be made based on what's most logical. Sometimes you need to cut corners based on what will work fastest or easiest.
In quake your movespeed and your ability to move/accelerate in the air is based entirely on your fps. Some trick jumps can't be done without a certain framerate.
In quake3 that changes more into your jump height, but the same end result -- Some jumps require certain fps to become possible.
In any HL based game your ability to slide up a steep wall instead of slide down it is impacted by your fps (and also the servers framerate).
In TFC hwguy assault cannon and a few other weapons would fire more often with higher fps.
In Natural Selection(1.x) how quick your jetpack fuel replenishes is based on your fps. Enough FPS and you could fly forever.
There's more, but the tl;dr version: Any game that uses quake's "player.think()" system to do calculations will fire off more
Re: (Score:2)
It is on the server, which is why this isn't a matter of hacking the network and replacing a "jpfuel=10" with "jpfuel=100". It's a matter of tricking the server into running the "regenerateJpFuel()" function for you more often than for anyone else. This would happen if it's run in the player.think() function instead of in some
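The frame-rate-dependent regeneration being described can be sketched like this (hypothetical numbers and function names, not actual HL/NS code):

```python
REGEN_PER_TICK = 1.0      # hypothetical: fuel added on every think() call

def fuel_gained_buggy(fps, seconds, cap=100.0):
    """Regen runs once per frame, so higher fps means faster refills."""
    ticks = int(fps * seconds)
    return min(cap, ticks * REGEN_PER_TICK)

def fuel_gained_fixed(seconds, rate_per_second=30.0, cap=100.0):
    """Frame-rate-independent version: scale by elapsed time, not ticks."""
    return min(cap, seconds * rate_per_second)

# A 100 fps player refills a 100-unit tank in one second; a 30 fps player
# gets less than a third of that. Scaling by elapsed time kills the exploit.
print(fuel_gained_buggy(100, 1.0))   # 100.0
print(fuel_gained_buggy(30, 1.0))    # 30.0
print(fuel_gained_fixed(1.0))        # 30.0
```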
Re:Ray-Tracing Extremely CPU Intensive (Score:4, Interesting)
Re:Ray-Tracing Extremely CPU Intensive (Score:5, Insightful)
Bullshit. Just the same as raster graphics, the amount of time you spend per frame on ray-tracing is adjusted to your needs and desires. Take, say, a Pixar film. Those are mostly done with raster graphics, with key effects done with ray-tracing. How much time do you reckon it takes to render each one of those films' frames? (Pixar films are all drawn with PhotoRealistic RenderMan, which is based on the REYES algorithm, which reads like a fancy raster engine.)
The part about computational power is another fine display of complete misrepresentation of reality. Raster graphics are this fast nowadays for two major reasons. The most obvious is that graphics cards are entire massively parallel processors specialized in drawing raster graphics. It's pretty damn obvious that, given two techniques for the same result, the one for which you use a specialized processor will always be faster, which is no evidence that one technique is inherently faster than the other. The second, less obvious, is that raster graphics have been the focus of lots of research in recent years, which makes them a much more mature technology than ray-tracing. Once again, a more mature technology translates into better results, even if the core technique has no such advantage. What Intel is supposedly aiming for here is getting the specialized hardware and mindshare going for ray-tracing, which might lead to real-time ray tracing becoming a viable alternative to raster graphics.
Re: (Score:2)
hours per frame? (Score:2)
There are some fine real-time ray tracers out there that have interactive frame-rates with moderately complex scenes on reasonable hardware. Try the arauna [igad.nhtv.nl] demo, for instance. (Note: you probably need windows to run it, wine didn't work for me.) There are others as well; Arauna just happens to be one I've tried out recently. I got about 20fps or so on a friend's dual-core laptop at a resolution somewhere around 640x480. Not fast enough to throw out the GPU just yet, but usable. Somewhere between N-64
dynamic scenes (Score:2)
I would say it's the other way around; ray tracers aren't particularly good at dynamic scenes, because then you have to rebuild the acceleration structure when something moves.
That said, there has been a lot of progress lately in addressing this. The Bounding Interval Hierarchy (BIH), for instance, can be updated very quickly using an in-place sort very much akin to quicksort, and produces trees that are, in general, a bit slower to t
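The "update instead of rebuild" idea can be sketched as a plain bottom-up bounding-volume refit (much simpler than the BIH's in-place sort, and using 1-D intervals instead of 3-D boxes, purely for illustration):

```python
class Node:
    """One node of a toy bounding-volume tree over 1-D intervals."""
    def __init__(self, left=None, right=None, prim=None):
        self.left, self.right, self.prim = left, right, prim
        self.bounds = None                       # (lo, hi)

def refit(node, positions):
    """Recompute bounds bottom-up after primitives have moved: O(n),
    with no rebuild of the tree topology required."""
    if node.prim is not None:                    # leaf: box one primitive
        node.bounds = positions[node.prim]
    else:                                        # inner: union of children
        alo, ahi = refit(node.left, positions)
        blo, bhi = refit(node.right, positions)
        node.bounds = (min(alo, blo), max(ahi, bhi))
    return node.bounds

tree = Node(left=Node(prim=0), right=Node(prim=1))
print(refit(tree, {0: (0.0, 1.0), 1: (4.0, 6.0)}))   # (0.0, 6.0)
# ...the objects move, and one refit pass updates every box:
print(refit(tree, {0: (2.0, 3.0), 1: (2.5, 5.0)}))   # (2.0, 5.0)
```

The trade-off the comment hints at is real: a refitted tree keeps its old topology, so its boxes can become loose and a bit slower to traverse than a freshly built one.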
Stop motion movies (Score:4, Interesting)
Enter ILM and go motion [wikipedia.org]. Instead of filming static scenes, the clay was moved slightly during the shot to create a blurry frame. This blurry frame made the scene seem more realistic. The blur is what the eye picks up in the movie frame, so an actor walking in a scene is not a set of pinpoint focus shots but a series of blurs as the man moves.
Ray tracing is great for static scenes. But movement is the key to games that require this much detail, and so each frame should not be a beautifully rendered framebuffer, but a mix of several framebuffers over the span of one frame. Star Wars did it great. Most computer games, not so much.
Re: (Score:2, Interesting)
Where did you get that idea?
Ray tracing can do selective motion blur very inexpensively. You test against a bounding sphere around a triangle's motion span, then intersect against the triangle interpolated along an approximation of its path.
That's a very bad analogy you're using...
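For what it's worth, the "interpolate along the motion span" idea can be sketched like this (a sphere instead of a triangle, purely to keep the intersection code short):

```python
import random

def ray_sphere_t(origin, direction, center, radius):
    """Smallest positive hit distance along a normalized ray, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - disc ** 0.5
    return t if t > 0 else None

def trace_motion_blurred(origin, direction, c0, c1, radius, rng=random):
    """Pick one shutter time per ray, move the object along its linear
    path to that time, then do an ordinary intersection test."""
    time = rng.random()                            # shutter time in [0, 1)
    center = [a + time * (b - a) for a, b in zip(c0, c1)]
    return ray_sphere_t(origin, direction, center, radius)

# A static object (c0 == c1) behaves exactly like a plain intersection:
print(ray_sphere_t((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))   # 4.0
```

Averaging many such time-jittered rays per pixel yields the blur, with the cost concentrated only on objects that actually move.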
Re: (Score:1)
Where did you get that idea?
Are you saying it's not?
Re: (Score:1)
Radiosity is better.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
ftp://ftp.alvyray.com/Acrobat/6_Pixel.pdf [alvyray.com] (from 1995 no less)
Re: (Score:1)
Re: (Score:2)
A
Re: (Score:2)
Re: (Score:2)
Ray tracing is great for static scenes. But movement is the key to games that require this much detail, and so each frame should not be a beautifully rendered framebuffer, but a mix of several framebuffers over the span of one frame.
No, no, no! Mixing several framebuffers together gives you *lousy* motion blur. You'll get severe artifacts from each pixel using the same set of uniform samples in the time domain -- very fast moving objects can appear cloned in multiple places, for example.
Honestly, ray tracing has been getting motion blur right since 1984. [pixar.com] Not to mention that it can even simulate the effect of camera shutters. [ieee.org]
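A tiny 1-D demo of the difference (toy numbers, point-sampled coverage): averaging a few whole framebuffers lights up isolated "clones", while giving each sample its own random shutter time, as distribution ray tracing has done since that 1984 paper, spreads the energy smoothly.

```python
import random

WIDTH = 16     # pixels in a one-dimensional "image"

def covered(px, t):
    """Is pixel center px + 0.5 inside a unit-wide object that slides
    12 pixels to the right during the shutter interval t in [0, 1)?"""
    x = 12.0 * t
    return x <= px + 0.5 < x + 1.0

def average_frames(px, frames=4):
    """Blend a few whole framebuffers: every pixel reuses the same times."""
    return sum(covered(px, (i + 0.5) / frames) for i in range(frames)) / frames

def stochastic(px, spp=256, rng=None):
    """Distribution ray tracing: each sample draws its own shutter time."""
    rng = rng or random.Random(px)          # seeded so the demo is repeatable
    return sum(covered(px, rng.random()) for _ in range(spp)) / spp

row_a = [average_frames(px) for px in range(WIDTH)]
row_b = [stochastic(px) for px in range(WIDTH)]
# row_a is zero everywhere except 4 isolated pixels at 0.25 (the clones);
# row_b is small but nonzero all along the 12-pixel motion path.
```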
you've got it arse about face. (Score:4, Funny)
Re: (Score:2)
Re:you've got it arse about face. (Score:5, Interesting)
In the case of a company like Intel who's pushing a new technology, the developers are the customers. It's not Joe Consumer who's going to be buying into Intel's technology. (At least not until there are games that support it.) It's going to be the developers. Developers who will be taking a gamble on a next generation technology in hopes that they lead the pack. And as history has proven, the first out of the gate often earns the most money. (At least in the short term.)
Of course, history has also proven that new technologies often fail. Thus the risk is commensurate with the reward. There may be a lot to gain, but there is also a lot to lose. A lot of dollar signs, that is.
Re:First out of the gate? (Score:5, Insightful)
No. That dubious distinction belongs to Classmates.com, a site launched in 1995 that did quite well for itself and is still going strong. (Oddly.)
Neverwinter Nights, Ultima Online, and Everquest (nay, Evercrack!) were all highly successful and made their creators a lot of money in the short term.
Consider what? Ford went gangbusters when it released the Model T to the market. In the short term, Ford's assembly-line approach effectively handed it the market. Toyota and Honda weren't competitors for decades!
Re: (Score:2)
As for Ageia, I don't think they actually broke open a new market. All they did was sell a semiconductor that nobody wanted. So they would actually go in the failure category rather than the success category.
The best person to ask? (Score:5, Insightful)
Re: (Score:2)
I saw a demo for Quake 4 done with ray tracing; even with four quad-core CPUs, the game was unplayable. Here is the demo. [youtube.com] Going with ray tracing will definitely not make any game less resource hungry.
Re: (Score:3, Insightful)
Re: (Score:2)
This is why I don't understand why there is a huge debate on this. It's not like GPUs will suddenly vanish because of raytracing. They just won't be mainstream, which may be the reason.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Obviously, this scenario doesn't apply to everyone. Some people will always need faster CPUs. It could easily a
Re: (Score:1)
Is he the same developer who made a game (Crysis) so resource hungry that no gaming platform can handle it?
Are you kidding? Nobody wants to play the 100th Doom clone other than for replay value. For a 'wow' factor, a game needs something new, something that was never done before, or never done that well. A never-seen-before feeling of immersion, a great, unique storyline, artwork that makes existing stuff look old, and sometimes... unique technical features.
To enable innovative technical features, you often need more processing power, whether from CPU, GPU or elsewhere. And for that reason, any game that
Re:The best person to ask? (Score:4, Funny)
Re: (Score:1)
And yes, slightly. Give it three months and there'll be plenty of systems that can run the game very well. (Alas, mine will not be one of them. I hope to game at least a year more on my current rig)
Re: (Score:2)
Then again, we are talking about the 'gamer' crowd, with their window-modded monitors and magic smoke pumps.
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Seriously? Are you this stupid? It's only that resource hungry if you want to have every god-damn feature enabled. Should they have chopped out all of the extra-pretty features so it looked and ran as well as Half-Life 2? Then it would run on four-year-old hardware (like it does now, if you turn stuff off!) but the people that do have fast hardware wouldn't get any benefit. And as you beef up your computer, you'll be able to continue to get extra enjoyment out of the game for years as you dial it up.
Please refer to another clearly "very stupid" poster who has replied to my post above - The Post [slashdot.org]
And the fact that it can't have every feature running is kind of my point: it isn't about whether he can make new, really pretty features, it's that he isn't the best-placed person to say how to optimize them for actual gameplay; there may be more valuable opinions out there. I am not dissing the act of chasing the carrot, I am attacking those who release buggy software which is way too ambitious about the hardware's abil
Re: (Score:3, Insightful)
So not an April fool then? (Score:3, Interesting)
Re: (Score:1)
Re: (Score:3, Informative)
Re: (Score:2)
There is still a lot of confusion around that DX11 "announcement". Time for somebody to set it right!
Crytek? talking of bad performance? (Score:1)
why bash? (Score:5, Insightful)
Re: (Score:3, Insightful)
So why waste it on ray tracing which adds little benefit over current techniques when it could be spent on so many other things?
There are other ways of producing global illumination that are much faster than ray tracing. It's pointless because it's like taking a step backward just because we can now brute-force simple scenes.
Ray Tracing will still be slow on global illumination anyway. The more reflections you have the longer it takes, so it's not going to look as good
Re: (Score:3, Insightful)
Sure, it requires (a lot) more CPU power, but development-wise it should all be much more straightforward. Build the scene and have it rendered.
Right now I'm under the impression that each tim
Re: (Score:3, Insightful)
And it would still require just as much tweaking to make it look good, and make it fast.
Re: (Score:3, Insightful)
Progress doesn't always come in leaps and bounds. Sometimes, it's about baby steps.
Re: (Score:2)
And with the limitations that trying to render in realtime imposes on you, it's no wonder game developers aren't interested.
Ray tracing is good if you want to render silver balls on infinite checkerboards. For real scenes, it's no
Re: (Score:2)
Ray-tracing is just a particularly inefficient hack.
Re: (Score:2)
Ray Tracing will still be slow on global illumination anyway. The more reflections you have the longer it takes, so it's not going to look as good too.
Look as good as what? The magical non-raytracing global illumination algorithm that you are hiding from the rest of the world? I don't think "global illumination" means what you think it means. Reality check: ray tracing methods are the only way to correctly compute GI today. And by "correctly", I mean "does not fail miserably on the first non-trivial case encountered". Look up Ingo Wald's work, and Eric Veach while you're at it. If you can do it better, then do us a favor and write it up in a SIGGRA
Re: (Score:2)
Be smarter, not more forceful! (Score:1)
Why he's not into ray tracing... (Score:1, Offtopic)
Well... duh! (Score:5, Insightful)
Also keep in mind that Intel proposes this as a future way of doing rendering. Its hardware is not even here yet. Given this, any prediction below 3 years would be quite surprising.
Re: (Score:2, Interesting)
Re: (Score:2)
At some point, when the 3D models get sufficiently complex, ray tracing will become a lot more attractive. With enough complexity you can model the small details that are currently faked with textures. Those small details would be hard t
Re:Well... duh! (Score:4, Informative)
The point of raytracing is that instead of having a 100,000-polygon cloth animation to rasterize, you could have a smoother result with about 1000 control points on a mathematical surface.
Today, game makers and modelers have the habit of breaking everything into triangles because of rasterization but the raytracing approach isn't limited to triangles; it can use any shape for which a collision with a ray can be computed. It is a very powerful approach but new tools have to be developed to use it to its full extent.
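To illustrate the point about arbitrary shapes: any primitive that can answer "how far is this point from your surface?" can be hit by a ray without ever being triangulated. A minimal sphere-tracing sketch (my own toy example, not from any engine):

```python
def sdf(p):
    """Signed distance from point p to a sphere of radius 1 at (0, 0, 3).
    Swap in any distance function: blobs, fractals, exact CSG, etc."""
    x, y, z = p
    return (x * x + y * y + (z - 3.0) ** 2) ** 0.5 - 1.0

def sphere_trace(origin, direction, max_steps=128, eps=1e-4):
    """March along the ray by the safe distance until the surface is hit."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t               # hit distance along the ray
        t += d
        if t > 100.0:
            break
    return None                    # miss

print(sphere_trace((0, 0, 0), (0, 0, 1)))   # 2.0 (near surface at z = 2)
print(sphere_trace((0, 5, 0), (0, 0, 1)))   # None (ray passes above)
```

No triangles anywhere, yet the surface is exact to within `eps`; this is the kind of tooling shift the comment says would be needed.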
1. Consoles 2. ??? = Ray Tracing! 3. Profit? (Score:5, Interesting)
The problem with ray tracing, as Carmack said, is that it will always be much slower than raster-based graphics with a given amount of computing power. He pointed out that there's nothing impressive about Intel's demo of a game from two generations ago running sort of acceptably at moderate resolution on an overpowered Intel demo system. He said that they'll never be able to produce a ray traced engine competitive with the state of the art raster-based games, so the ray tracing, while technically satisfying, will in every case offer poor performance for inferior graphics.
All of this boils down to a time lag. If raster graphics can do something in 2008, ray tracing can do it in 2012, etc. What if raster graphics stopped progressing for four years? Then ray tracing would have a chance to catch up, perhaps leading to new engines and APIs based on ray tracing, which would ensure long term use.
But wait...raster graphics have already been at a standstill for two years, for the first time since their inception. When the 360 came out and then the 8800 line showed up to put it firmly in its technical place, progress in gaming graphics capability suddenly stopped. Not only did nVidia have its first unassailable lead over ATI in a long time, but suddenly the PC gaming market showed very strong signs of finally dying. Most of the remaining PC game developers shifted development to consoles, leading to (again as Carmack pointed out) a stationary graphical hardware target for new games. The overall number of PC gamers managed to stay high, but literally almost all of them were playing World of Warcraft, which has very low graphics card requirements.
Now two years have gone by, and WoW still dominates PC gaming, while only a few games have shown up that really push current hardware, with few people buying them. It's a pity that the most graphically impressive game is also quite mediocre when it comes to gameplay. There's very little market pressure on nVidia outside of the small enthusiast community, and they've managed to milk a 4x hardware lead over consoles for an unprecedented length of time. The graphics card industry used to beat the living crap out of Moore's Law, but now they've managed to eke out a 10% improvement in over two years, which is just sad. The next generation parts may or may not be coming soon, may or may not bring a large performance boost, and may or may not have any software at all to really justify their purchase.
Going waaaaay back to the beginning, CPU speeds over this same time period have been keeping up with their normal exponential increase in power. At this rate, it would only take two more generations of PC gaming failure for ray tracing on the CPU to catch up with rastering on the GPU, and if that happens, it could end up going to consoles. Hell, it might even be good for PC gaming's health. Currently most console players have a PC, but with its Intel integrated graphics it's only suited to playing games from 6-8 years ago. Already those same PCs can probably match that with ray tracing. If games were only dependent on CPU speed, they'd be a lot more accessible and easily played by a much larger part of the population.
Re: (Score:2)
So you're saying there is no market pressure on nvidia because everyone keeps playing WoW and are happy with their current gfx card?
How exactly is the lack of need for better gfx going to create a market for raytracing? In this situation the only reason to switch to raytracing is when your gfx card breaks down and you already own a cpu with 128 cores.
Also, the cpu is not idle in games, there are other things to do besides rendering, like collision detection and AI.
Hey, someone read my weird rant. Good for me.
Yes, the scenario I describe is one in which graphics stagnate, removing demand for higher-end discrete graphics cards until eventually CPUs catch up and can meet gaming needs without a GPU. If everyone can get the same experience pegged at 60 fps with just the CPU, why pay nVidia for a graphics card?
In case you didn't notice, I hate this scenario. I think ray tracing is never going to be technically competitive. I also think that my story is weak because
Intel pushing ray tracing... is like Exxon ... (Score:1)
-----
Aside from the horrible metaphor used to explain my point, I am basically saying that it sounds very much like ray tracing is something Intel wants everyone to use
simplicity wins (Score:4, Insightful)
Re: (Score:2)
Perhaps OT (Score:3, Interesting)
I played the Crysis demo on a recent graphics card, and was suitably impressed for ten minutes. After that, it was the same old boring FPS that I stopped playing five years ago. Graphics seem stuck in the exponential curve of the uncanny valley, where incremental improvements in rendering add nothing to the image except to heighten that sense of 'almost there' that signals to the brain that it's *not* photorealism.
This isn't meant to be the same old "it's the gameplay, stupid" rant that we get here. It's simply to question why any real work is being done on rendering engines when we seem to long have passed the point of diminishing returns.
Re: (Score:2)
Not working on new technology for rendering would be something akin to the patent office declaring everything had already been invented so they might as well close down... which didn't happen.
Personally, having grown up on 8-bit systems (Atari mostly), then 16-, 32- and now 64-Bit processors... in my lifetime... I can't
Re: (Score:2)
I'd rather developers concentrated on making games look better with artwork, animation, modelling, scenery etc, rather than just throwing endless buzzwords at the
Re: (Score:2)
Uncharted especially has beautiful animations and excellent textures.
Re: (Score:2)
Because people keep spending money to buy the new tech. It sells. There's money in it. No other reason is necessary for the companies to want to improve. (There may be other reasons, but that one is sufficient.)
A word about raytracing purism. (Score:5, Interesting)
Further, raytracing cannot handle advanced refraction and reflection effects, like the surface of water causing uneven illumination at the bottom of a pool, or a bright red ball casting a red spot on a white piece of paper, without preemptive "photon mapping", which is another cheat.
In short, we have not been able to improve upon the original raytracing algorithms without "cheating reality". Modern raytracing that includes photon mapping is a hybrid anyway. So the raytracing purists really have nothing to stand on until there's enough hardware to accurately calculate the paths of quadrillions of photons at high resolution sixty times a second. I'm not saying we won't get there, I'm saying probably not within this decade.
The reality is, the only advantage raytracing has over rasterization is its ability to compute reflection, refraction, and some atmospheric effects (e.g. a spotlight or a laser causing a visible halo in its path) with "physical" accuracy. The capabilities of rasterization have grown by leaps and bounds since the 1960s, roughly linearly in proportion to available hardware.
Purists be damned. A hybrid of each technique utilizing what it's good at (raytracing for reflection, refraction, and atmospheric halos, rasterization for drawing the physical objects, "photon mapping" for advanced reflection and refraction effects) is likely the best approach here.
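A hypothetical sketch of that hybrid split (made-up names and numbers): assume the rasterizer has already produced the primary hit for a pixel (position and normal in a G-buffer), and spend rays only on the one term rasterization fakes badly, mirror reflection.

```python
def reflect(d, n):
    """Reflect direction d about unit normal n."""
    k = 2.0 * sum(a * b for a, b in zip(d, n))
    return tuple(a - k * b for a, b in zip(d, n))

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a normalized ray, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - disc ** 0.5
    return t if t > 0 else None

def shade_pixel(pos, normal, view_dir, mirrors, base_color):
    """Raster supplied pos/normal; one traced bounce adds the mirror term."""
    r = reflect(view_dir, normal)
    for center, radius, color in mirrors:
        if ray_sphere(pos, r, center, radius) is not None:
            # crude 50/50 blend of the surface and the reflected color
            return tuple(0.5 * a + 0.5 * b for a, b in zip(base_color, color))
    return base_color

# A grey floor point viewed straight down, with a red sphere overhead:
# the reflection ray (0, 1, 0) hits the sphere, tinting the pixel red.
px = shade_pixel((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, -1.0, 0.0),
                 [((0.0, 3.0, 0.0), 1.0, (1.0, 0.0, 0.0))],
                 (0.2, 0.2, 0.2))
```

The division of labor is the whole point: the expensive per-pixel visibility stays on rasterization hardware, and rays are spent only where they buy physical accuracy.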
Re: (Score:2)
Re: (Score:2)
The cheats that must be enabled for rasterization-based reflection and refraction can be overcome simply and elegantly using raytracing algorithms. So rather than argue for rendering method purism,
That timeline sounds perfect... (Score:2)
Growth of technology. (Score:2)
While some of today's games certainly look impressive, they've still got a long way to go before they can be deemed realistic. Actually, I find photo-realism to be bland. It's kind of like photo-realistic paintings. Certainly, the techniqu
Hrm... eight cores... in-order only processing... (Score:2)
Same reason banks want you to invest (Score:2, Insightful)
Bankers push investments, not because it benefits you, but because it benefits them! Intel, as a corporation, is interested in your money, not your best interests.
mmm ray tracing (Score:2)
Re: (Score:2)