Crytek Bashes Intel's Ray Tracing Plans

Vigile writes "Despite its good intentions, Intel continues to see much of its work on ray tracing countered not only by its competition, as you'd expect, but also by the very developers Intel will depend on for success in the gaming market. The first major developer to weigh in on the Intel Larrabee and ray tracing debate was id Software's John Carmack, who basically said that Intel's current plans weren't likely to be implemented soon, or ever. This time Cevat Yerli, one of the Crytek developers responsible for the graphically impressive titles Far Cry and Crysis, sees at least 3-5 more years of pure rasterization technology before a move to a hybrid rendering compromise. Intel has previously eschewed the idea of mixed rendering, but with more and more developers chiming in for it, that's likely where gaming will move."
  • by foxalopex ( 522681 ) on Friday April 11, 2008 @09:09AM (#23035194)
    It's no surprise that Intel is being bashed over its idea of real-time CPU ray tracing. As anyone who has ever ray-traced will realize, it's extremely slow. At times you're talking about HOURS PER FRAME, while realistically you want at least 30 frames per second, and even that isn't considered great by many gamers. It's going to take a HUGE, and I mean HUGE, increase in computational power before that happens. Rasterization techniques are tremendously faster and they look nearly as good as ray tracing for the most part. Considering that we have yet to reach a point in rasterization where we don't need more processing power (Crysis at high resolution), I don't see us moving away from it yet. The day we declare that we have graphics cards more powerful than we need for rasterization is the day we start moving toward ray tracing. Unfortunately, that day isn't anytime soon.
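    As a rough back-of-the-envelope sketch of what "real time" would demand (the resolution, frame rate, and rays-per-pixel figures below are illustrative assumptions, not measurements):

        // Rough estimate of the ray throughput a real-time ray tracer would need
        // just to cover one frame. All inputs here are assumptions.
        #include <cstdio>

        int main() {
            const double width  = 1920;     // assumed display resolution
            const double height = 1080;
            const double fps    = 30;       // the bare-minimum target mentioned above
            const double raysPerPixel = 10; // primary ray plus a few shadow/reflection rays

            double raysPerFrame  = width * height * raysPerPixel;
            double raysPerSecond = raysPerFrame * fps;

            std::printf("rays per frame:  %.0f\n", raysPerFrame);   // ~20.7 million
            std::printf("rays per second: %.0f\n", raysPerSecond);  // ~622 million
            return 0;
        }

    Even with those conservative assumptions you land in the hundreds of millions of rays per second, before any anti-aliasing or global illumination extras.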
  • by Don_dumb ( 927108 ) on Friday April 11, 2008 @09:13AM (#23035234)

    Cevat Yerli, one of the Crytek developers responsible for the graphically impressive titles Far Cry and Crysis
    Is he the same developer who made a game (Crysis) so resource-hungry that no gaming platform can handle it? Shouldn't we be asking someone who knows how to make a game look great on current hardware, such as Valve, perhaps?
  • why bash? (Score:5, Insightful)

    by damnfuct ( 861910 ) on Friday April 11, 2008 @09:32AM (#23035424)
    Yeah, so it's going to take 3-5 years before anything real comes out of it. Do you think the process of using high-k hafnium in 45 nm microprocessors was developed overnight? I'm sure Intel is used to the R&D cycle, and 3-5 years is not unheard of. Besides, how much longer can you use rasterization "band-aids" to address rendering issues (reflections, shadows, light sources)? Rasterization is just a hack to try to implement features that simply "fall out" of ray tracing. Sure, it's going to take computational power, but we're not going to be using Pentium 75s.
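    To put a concrete shape on "falling out", here's a bare-bones sketch (one hard-coded sphere, one point light, no attempt at performance or completeness) in which the shadow and the mirror reflection each come from shooting one extra ray rather than from a per-effect rasterization trick:

        // Minimal recursive ray tracing sketch: the shadow and the reflection each
        // come from shooting one more ray, not from per-effect rasterization tricks.
        // Purely illustrative: one sphere, one point light, greyscale "shading".
        #include <cmath>
        #include <cstdio>

        struct Vec { double x, y, z; };
        static Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
        static Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
        static Vec operator*(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
        static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
        static Vec normalize(Vec a) { return a * (1.0 / std::sqrt(dot(a, a))); }

        static const Vec sphereCenter = {0, 0, -5};
        static const double sphereRadius = 1.0;
        static const Vec lightPos = {5, 5, 0};

        // Ray/sphere intersection: distance along a normalized ray, or -1 on a miss.
        static double hitSphere(Vec origin, Vec dir) {
            Vec oc = origin - sphereCenter;
            double b = 2.0 * dot(oc, dir);
            double c = dot(oc, oc) - sphereRadius * sphereRadius;
            double disc = b * b - 4.0 * c;
            if (disc < 0.0) return -1.0;
            double t = (-b - std::sqrt(disc)) / 2.0;
            return t > 1e-4 ? t : -1.0;
        }

        // Returns a brightness value for one ray. Note how little extra machinery the
        // shadow and the reflection need: one occlusion ray, one recursive call.
        static double trace(Vec origin, Vec dir, int depth) {
            double t = hitSphere(origin, dir);
            if (t < 0.0) return 0.1;                          // background
            Vec hit = origin + dir * t;
            Vec n = normalize(hit - sphereCenter);
            Vec toLight = normalize(lightPos - hit);

            // Shadow "falls out": re-shoot a ray toward the light; any hit occludes.
            double direct = hitSphere(hit + n * 1e-3, toLight) > 0.0
                                ? 0.0
                                : std::fmax(0.0, dot(n, toLight));

            // Reflection "falls out": recurse along the mirrored direction.
            double reflected = 0.0;
            if (depth < 2) {
                Vec r = dir - n * (2.0 * dot(dir, n));
                reflected = trace(hit + n * 1e-3, r, depth + 1);
            }
            return 0.7 * direct + 0.3 * reflected;
        }

        int main() {
            // One sample ray aimed roughly at the sphere from the origin.
            double v = trace({0, 0, 0}, normalize({0.1, 0.1, -1.0}), 0);
            std::printf("brightness of one sample: %.3f\n", v);
            return 0;
        }

    In a rasterizer, each of those two effects would instead need its own dedicated technique (shadow maps, environment maps, and so on), which is the "band-aid" situation described above.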
  • Re:why bash? (Score:3, Insightful)

    by LingNoi ( 1066278 ) on Friday April 11, 2008 @09:45AM (#23035584)

    Sure, it's going to take computational power
    So why waste it on ray tracing which adds little benefit over current techniques when it could be spent on so many other things?

    There are other ways of producing global illumination that are much faster than ray tracing. It's pointless because it's like taking a step back just because we can now brute-force simple scenes.

    Ray tracing will still be slow at global illumination anyway. The more reflections you have, the longer it takes, so it's not going to look as good either.
  • by Yetihehe ( 971185 ) on Friday April 11, 2008 @09:50AM (#23035624)
    I would really like to see Quake 4 with pixel shading on just the CPU. You people forget that current games use specialized graphics processors, which currently go up to 128 shading units working in parallel. What if I had 128 specialized ray tracing units? We would see results THEN.
  • by -noefordeg- ( 697342 ) on Friday April 11, 2008 @10:02AM (#23035774)
    Why would one want 30 frames per second?

    If I were to mention a number, I would want either at least ~72 frames per second (where the eye/brain has a hard time discerning individual frames) or at least enough to match the refresh rate of an ordinary LCD screen at 60 fps.
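    Put as per-frame time budgets (just the arithmetic, nothing engine-specific):

        // Per-frame time budgets for the frame rates discussed in this thread.
        #include <cstdio>

        int main() {
            const double targets[] = {30, 60, 72};
            for (double fps : targets)
                std::printf("%3.0f fps -> %5.2f ms per frame\n", fps, 1000.0 / fps);
            // 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, 72 fps -> 13.89 ms
            return 0;
        }

    Moving the target from 30 to 72 fps cuts the per-frame budget from about 33 ms to under 14 ms.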
  • Re:why bash? (Score:3, Insightful)

    by deroby ( 568773 ) <deroby@yucom.be> on Friday April 11, 2008 @10:09AM (#23035874)
    On the other hand, ray tracing would be much less of a hack. Things simply look great the way they are, not because you niftily put a semi-transparent, super-texturized, shader-magic polygon in that corner of the field of view whenever the light source is just so and the so-called mirror is in that position, etc.

    Sure, it requires (a lot) more CPU power, but development-wise it should all be much more straightforward. Build the scene and have it rendered.

    Right now I'm under the impression that each time you want wow-factor, things go like this: build the scene, render it, look for awkward stuff caused by incomplete technology, tweak the scene, render again... Repeat the process until it all looks good from all angles. If that isn't feasible within the given time frame: either prevent the user from walking out of the prepared spaces, drop the idea altogether, or leave it in half-baked and blame it on the drivers.
  • Well... duh! (Score:5, Insightful)

    by Yvanhoe ( 564877 ) on Friday April 11, 2008 @10:22AM (#23036028) Journal
    Carmack didn't really bash it, and neither did Crytek. They just made it clear that you can't have rasterization on day N and ray tracing on day N+1. A 3-5 year transition period is very reasonable. Using ray tracing optimally requires changing the whole data structure of the virtual world. It would mean new modeling tools, new rendering engines, and integrating new possibilities into the game design.
    Also keep in mind that Intel proposes this as a future way of doing rendering. Their hardware isn't even here yet. Given that, any prediction below three years would be quite surprising.
  • Re:why bash? (Score:3, Insightful)

    by Goaway ( 82658 ) on Friday April 11, 2008 @10:57AM (#23036518) Homepage
    Ray tracing is nearly as much of a hack as rasterizing polygons is. It's miles away from anything like a realistic model of lighting.

    And it would still require just as much tweaking to make it look good, and make it fast.
  • by DrXym ( 126579 ) on Friday April 11, 2008 @11:01AM (#23036570)
    Those PS3 tech demos are cool but could more accurately be called ray casting. They bounce a primary and maybe a secondary ray off some fairly simple scenes. I expect that if you looked close up there would be jaggies all over the shop, and things like reflections & shadows would be brutal. Proper ray tracing requires sub-pixel sampling with jitter and recursion to look even remotely acceptable (see the sketch at the end of this comment).

    I don't think anyone denies that ray tracing is lovely, etc., but it's a question of whether it is remotely feasible on the current generation of CPUs or GPUs. If it takes a cluster of Cell processors (basically super-fast number shovels) to render a simple scene, you can bet we are some way off from it being viable.

    Maybe in the meantime it is more suitable for lighting / reflection effects and is used in conjunction with traditional techniques.
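    For the curious, here's a rough sketch of what sub-pixel sampling with jitter amounts to (the shade() stand-in and the 4x4 sample grid are placeholder assumptions, not any particular engine's code):

        // Jittered supersampling sketch: each pixel is shaded several times at
        // randomly perturbed sub-pixel positions and the results are averaged,
        // which is what smooths the "jaggies" at the cost of extra rays per pixel.
        #include <cstdio>
        #include <cstdlib>

        // Stand-in for "trace a ray through image-plane point (x, y) and shade it".
        // Here it is just a hard edge, so the averaging effect is easy to see.
        double shade(double x, double y) { return x + y < 10.0 ? 1.0 : 0.0; }

        double shadePixel(int px, int py, int grid) {
            double sum = 0.0;
            for (int sy = 0; sy < grid; ++sy)
                for (int sx = 0; sx < grid; ++sx) {
                    // Regular grid cell plus a random jitter inside that cell.
                    double jx = (sx + std::rand() / (double)RAND_MAX) / grid;
                    double jy = (sy + std::rand() / (double)RAND_MAX) / grid;
                    sum += shade(px + jx, py + jy);
                }
            return sum / (grid * grid);   // average of all sub-samples
        }

        int main() {
            // A pixel straddling the edge gets a soft value near 0.5 instead of 0 or 1.
            std::printf("pixel (5,4): %.2f\n", shadePixel(5, 4, 4));
            return 0;
        }

    Every one of those sub-samples is another ray into the scene, which is exactly why properly anti-aliased ray tracing costs so much more than one ray per pixel.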

  • simplicity wins (Score:4, Insightful)

    by sgt scrub ( 869860 ) <[saintium] [at] [yahoo.com]> on Friday April 11, 2008 @11:08AM (#23036642)
    Like all technology races, simplicity wins. If Intel provides tools that make it easier to develop ray tracing games, the GPU will be displaced.
  • Re:why bash? (Score:3, Insightful)

    by steelfood ( 895457 ) on Friday April 11, 2008 @11:59AM (#23037324)
    It's a good first step to true global illumination.

    Progress doesn't always come in leaps and bounds. Sometimes, it's about baby steps.
  • Your comment doesn't make a lick of sense. I mentioned that the early entrants into a new market make the most money in the short term. You then try to refute my argument with a long-term argument. Logic error. Danger Will Robinson. Danger!

    Was MySpace the first social networking site?

    No. That dubious distinction belongs to Classmates.com, a site launched in 1995 that did quite well for itself and is still going strong. (Oddly.)

    Was World of Warcraft the first MMORPG?

    Neverwinter Nights, Ultima Online, and Everquest (nay, Evercrack!) were all highly successful and made their creators a lot of money in the short term.

    Consider Ford versus Toyota/Honda/etc.

    Consider what? Ford went gangbusters when it released the Model T to the market. In the short term, Ford's assembly-line approach effectively handed it the market. Toyota and Honda weren't competitors for nearly 80 years!
  • by xouumalperxe ( 815707 ) on Friday April 11, 2008 @12:18PM (#23037548)

    Bullshit. Just as with raster graphics, the amount of time you spend per frame on ray tracing is adjusted to your needs and desires. Take, say, a Pixar film. Those are mostly done with raster graphics, with key effects done with ray tracing. How much time do you reckon it takes to render each one of those films' frames? (Pixar films are all drawn with PhotoRealistic RenderMan, which is based on the REYES algorithm, which reads like a fancy raster engine.)

    The part about computational power is another fine display of complete misrepresentation of reality. Raster graphics are this fast nowadays for two major reasons. The most obvious is that graphics cards are entire massively parallel processors specialized in drawing raster graphics. It's pretty damn obvious that, given two techniques for the same result, the one for which you use a specialized processor will always be faster, which is not evidence that one technique is inherently faster than the other. The second, less obvious, is that raster graphics have been the focus of lots of research in recent years, which makes it a much more mature technology than ray tracing. Once again, a more mature technology translates into better results, even if the core technique has no such advantage. What Intel is supposedly aiming for here is getting the specialized hardware and mindshare going for ray tracing, which might lead to real-time ray tracing becoming a viable alternative to raster graphics.

  • by monoqlith ( 610041 ) on Friday April 11, 2008 @12:27PM (#23037648)
    It would seem so at first, yes. But then, I would argue, the person who has made a game that was meant to run on hardware that doesn't exist yet might be more qualified to comment on rendering methods that run on hardware that doesn't exist yet.
  • by Anonymous Coward on Friday April 11, 2008 @01:59PM (#23038662)
    So you're saying there is no market pressure on Nvidia because everyone keeps playing WoW and is happy with their current gfx card?
    How exactly is the lack of need for better gfx going to create a market for ray tracing? In that situation, the only reason to switch to ray tracing is when your gfx card breaks down and you already own a CPU with 128 cores.

    Also, the CPU is not idle in games; there are other things to do besides rendering, like collision detection and AI.
  • by Anonymous Coward on Friday April 11, 2008 @02:05PM (#23038740)
    No. Movies use blurring and careful camera movement to stay tolerable at low frame rates. They also double-play frames at the theater. The human visual system doesn't have a framerate; different areas of visual perception respond to changes with different latencies.
  • by billcopc ( 196330 ) <vrillco@yahoo.com> on Friday April 11, 2008 @02:53PM (#23039480) Homepage
    Intel is pushing raytracing, not because it's the right thing to do, but rather because it directly benefits Intel by increasing demand for fast multi-core processors.

    Bankers push investments, not because it benefits you, but because it benefits them! Intel, as a corporation, is interested in your money, not your best interests.
  • by Anonymous Coward on Friday April 11, 2008 @04:57PM (#23041110)
    The accuracy with which you can maneuver is determined by framerate. Actual velocity is not. In Quake and all Quake-related products, you tell the server how you want to move, and the server updates your position on each of its frame ticks in proportion to the time elapsed since it last calculated it, then sends you a packet containing your position. Do you really think a player with 5 fps runs thirty times slower than a player with 150 fps? Nobody would take the game seriously. The player with 5 fps will move jerkily, since he's only telling the server where he wants to go 5 times a second (basically), but he and the player with 150 fps both have the same total speed (see the sketch at the end of this comment). Likewise, jetpack fuel is stored on the server. If it weren't, it would have been hacked ten minutes after the game was released. In Quake-style games, everything important is server-side.

    As to the issue of speedhacks, that's exploitation of code designed to make up for network lag. It too is completely independent of client fps.
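    A stripped-down sketch of that server-side model (just the idea, not Quake's actual code; the tick rate and speed numbers are made up): the client only supplies an input direction, and the server integrates position by wall-clock time, so a 5 fps client and a 150 fps client cover ground at exactly the same rate.

        // Server-authoritative movement sketch: position advances by wish-direction
        // times elapsed wall-clock time, no matter how many frames the client renders.
        #include <cstdio>

        struct Player {
            double x = 0.0;
            double wishDir = 1.0;   // latest input from the client: -1, 0, or +1
            double speed = 320.0;   // server-defined units per second
        };

        // Called once per server tick with the real time elapsed since the last tick.
        void serverTick(Player& p, double dtSeconds) {
            p.x += p.wishDir * p.speed * dtSeconds;
        }

        int main() {
            Player slowClient, fastClient;
            // Simulate one second on a 20 Hz server. The 5 fps client refreshes its
            // input rarely and the 150 fps client often, but the server integrates
            // the same total elapsed time for both.
            for (int tick = 0; tick < 20; ++tick) {
                serverTick(slowClient, 0.05);
                serverTick(fastClient, 0.05);
            }
            std::printf("5 fps client:   %.1f units\n", slowClient.x);   // 320.0
            std::printf("150 fps client: %.1f units\n", fastClient.x);   // 320.0
            return 0;
        }

    The only thing the low-framerate client loses is how often its wish direction gets refreshed, which is the jerky movement described above.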
