
Real-Time, Movie-Quality CGI For Games (184 comments)

An anonymous reader writes "An Intel-owned development team can now render CGI-quality graphics in real time. 'Their video clips show artists pulling together 3D elements like a jigsaw puzzle (see for example this video starting at about 3:38), making movie-level CG look as easy as following a recipe.' They hope that the simplicity of 'Project Offset' could ultimately give them the edge in the race to produce real-time graphics engines for games."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Wow (Score:5, Funny)

    by binarylarry ( 1338699 ) on Monday February 22, 2010 @07:55PM (#31238376)

    They've discovered the hidden secrets to rendering Academy Award winning films such as "Gears of War" and "Crysis."

    Congrats Intel dev team!

    • Re:Wow (Score:5, Funny)

      by Yvan256 ( 722131 ) on Monday February 22, 2010 @08:44PM (#31238866) Homepage Journal

      Those aren't award winning films.

      They're award winning slideshows.

    • by socsoc ( 1116769 )
      I prefer the cutscenes of Super Mario Bros and ExciteBike. When are films gonna achieve that quality? I mean Avatar tried, but didn't quite nail it.
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        And then there's South Park, which appears to have been created with PowerPoint.

    • Re:Wow (Score:4, Informative)

      by stephanruby ( 542433 ) on Tuesday February 23, 2010 @04:11AM (#31241950)
      This is lame. The guy doesn't even claim the video was made in real time. He claims that the editing of the game can be done in real time. That distinction is important, because most of the time I see someone demoing a 3-D editing tool on YouTube, they've accelerated the demo by a huge factor just to make the video look cool (and it does look cool that way, but it's also misleading). By the way, here is the same demo "teaser" [projectoffset.com] referenced through YouTube. There is actually no need to wait 3 minutes and 38 seconds on that other video for the boring guy to stop droning on; it's essentially the same teaser (with the same building and the same shading), just spliced into the interview in small pieces (as if to imply that the teaser was made at the same speed the interview was videotaped at).
    • Doesn't everyone know the secret behind rendering Gears of War? I thought it was simply to draw a black rectangle over your screen.

  • ... now they can pump out crappy movies that have quality CG faster than ever before?
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Hating on the current quality of movies/games/music automatically gets you karma points even if you haven't the least bit of idea of what you're talking about....

      • by biryokumaru ( 822262 ) * <biryokumaru@gmail.com> on Monday February 22, 2010 @09:51PM (#31239550)

        I hate it when people hate on people hating on something they hate just to get karma points just to get karma points.

        It's almost as bad as people hating on people hating on people hating on something they hate just to get karma points just to get karma points just to get karma points.

        Grammar works like nesting things, right?

      • by tyrione ( 134248 )

        Hating on the current quality of movies/games/music automatically gets you karma points even if you haven't the least bit of idea of what you're talking about....

        That you stretched "crappy movies" into multiple genres to earn an Interesting mod says something about the sad state of Slashdot. The GP focused on movies but unfortunately didn't take the time to elaborate on what they meant by crappy. I'm betting they were singling out shallow screenplays and low casting budgets being covered up by the Wow factor.

    • Re: (Score:3, Interesting)

      by Korin43 ( 881732 )
      Wouldn't this mean that this is just another step in the direction of letting anyone make movies (without needing a billion dollars worth of computers and another billion dollars worth of actors)?
      • by biryokumaru ( 822262 ) * <biryokumaru@gmail.com> on Monday February 22, 2010 @10:58PM (#31240098)

        I read a really great short story once about a future where all films are made completely on computers, with AI actors. Then one guy starts filming movies with a real girl in them, just with computerized scenery, and doesn't tell anyone. It blows people away just how "real" his films feel compared to normal movies.

        Anyone else read that? It was pretty good.

        • Was the realistic nature of that live action film seen as a good thing? If so, I found the premise of the story a bit odd. I mean heck, we still create black & white, and silent films. I'm sure even if CG movies become the norm, as long as realism is seen as a positive, there will always be at least a niche that'll prefer and create movies with real actors.

        • I find it amusing that half of the comments on this story have devolved into one of two diametrically opposite opinions which neither party find contradictory to one another:

          1) "Films are looking all CG and crappy."
          2) "So what if games are looking great the graphics don't matter."

        • by Tukz ( 664339 )

          I may need to hand over my geek card, but is that a reference to something I, as a geek, should know about?

      • Re: (Score:3, Insightful)

        by Pseudonym ( 62607 )

        Anyone can already make movies without a billion dollars worth of computers and a billion dollars worth of actors. The difficulty is finding a million dollars worth of animators and fifty thousand dollars worth of screenwriters.

      • Wouldn't this mean that this is just another step in the direction of letting anyone make movies (without needing a billion dollars worth of computers and another billion dollars worth of actors)?

        The big problem there is not graphics quality - that's already there; take a look at Fallout 3 or Arkham Asylum at maximum graphical settings - it's the quality of tools. You'd need "digital actors" able to move, react and emote without you having to put every eyebrow into place manually. You'd also need good-quali

    • Here's the thing. Normal people don't want to spend hours and hours creating detailed 3D models in Blender or whatever. They just want the easiest way to turn their ideas into reality. Reducing the implementation time for a high quality end product, and eliminating the tedious tasks is a worthy goal. It's the same reason normal people don't program in assembly anymore. With the exception of some very specific programs, higher levels of abstraction are almost always better, and this is no exception.
      • It's been a while since I've heard from these guys. They are following a trend a lot of indie game developers have latched onto: what they lack in terms of budget can be made up with brains. The simple fact is we no longer need to have models crammed with millions of polygons in order to make high-quality assets. I shall have to make the obligatory demoscene reference, here; consider exhibit A: http://www.demoscene.tv/prod.php?id_prod=13374 [demoscene.tv] Also check out the works of Introversion, Eskil Steenberg, and (mor
  • Great... (Score:5, Funny)

    by Beelzebud ( 1361137 ) on Monday February 22, 2010 @07:57PM (#31238390)
    Now maybe they can get to work on shipping on-board graphics cards that can actually play games released within the past couple of years...
    • Re:Or... (Score:2, Insightful)

      Or maybe just start supporting OpenGL hardware acceleration? Any day now, Intel...
  • by Anonymous Coward

    How can there be any doubt that realtime rendering will approach the quality of today's offline rendering when computing power grows exponentially?

    • by binarylarry ( 1338699 ) on Monday February 22, 2010 @08:01PM (#31238448)

      Unfortunately, the faster processors get, the fancier the rendering features that become possible in the offline space as well.

      Realtime rendering will never be on par with offline rendering of the same vintage.

      • Re: (Score:3, Insightful)

        by ardor ( 673957 )

        However, there is a point where CGI is "good enough" for most purposes. Yes, the maximum scene complexity may grow, but even there you may reach a "good enough" point, where you can easily fake the bits that cannot be done. Example: an outdoor scene with a forest in the distance. If the scene is rather static, with little action, the forest in the background may be just a picture. If more movement is involved, but the forest is always far away, impostors can be used. These tricks are cheap to implement and
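
A minimal sketch of the impostor trick described in the comment above, assuming a hypothetical `choose_representation` helper and an arbitrary distance threshold (both illustrative, not taken from any real engine): beyond the threshold, a distant tree is drawn as a flat, camera-facing picture instead of full geometry.

```python
# Hypothetical sketch of distance-based impostor selection; the names and
# the 150-unit threshold are illustrative, not from any actual engine.
from dataclasses import dataclass
import math

@dataclass
class Tree:
    x: float
    y: float
    z: float

def choose_representation(tree, camera, impostor_distance=150.0):
    """Return which representation to draw for this tree."""
    dist = math.dist((tree.x, tree.y, tree.z), camera)
    # Close trees get full geometry; far trees get a flat, camera-facing
    # billboard ("impostor") showing a pre-rendered picture of the tree.
    return "full_mesh" if dist < impostor_distance else "impostor_billboard"

print(choose_representation(Tree(10, 0, 20), (0, 0, 0)))    # full_mesh
print(choose_representation(Tree(500, 0, 900), (0, 0, 0)))  # impostor_billboard
```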

  • by Anonymous Coward on Monday February 22, 2010 @08:01PM (#31238446)

    now there we have an accurate statement: "Computer Generated Imagery" quality graphics

  • "Movie-Quality" (Score:4, Insightful)

    by nitehawk214 ( 222219 ) on Monday February 22, 2010 @08:05PM (#31238482)

    "Movie-Quality" is basically a worthless statement. Which movie? Avatar, Final Fantasy, Toy Story, Tron? The quality of digitally produced movies, and the quality of game graphics power are constantly moving targets.

    • Re: (Score:3, Funny)

      by Beelzebud ( 1361137 )
      It could even be a Sci-Fi Channel movie. I have games with better graphics.
        • True, and while the Meteor [projectoffset.com] video looked impressive, it's nothing we haven't seen before in Crysis, and it's nowhere near real people walking around and speaking. I think they jumped the gun a bit on "Movie Quality"; it needs a few more years.
          • Actually, I found the video to be much better than Crysis. Foliage is, IMO, one of the weak points of any outdoor game, and even though Crysis did a fair job, the video had much more planty-plants.

            PS: I didn't play Crysis until long after the release date, and did so on maxed settings.
    • Re:"Movie-Quality" (Score:4, Insightful)

      by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday February 22, 2010 @08:07PM (#31238500) Homepage Journal

      This is basically what I was going to say. The latest crop of "funny fuzzy animal" movies have graphics about as good as the best video games — the secret to making games look as good as movies is apparently to make movies look shitty. I just can't sit through a movie that doesn't look as good as playing a game. I also can't sit through a movie with a worse plot than nethack, but that's a separate issue. Unfortunately, the aforementioned movies suffer from both of these failings.

        You seem to forget that those funny fuzzy animal movies are only marketing tools for the related computer games and McDonald's toys. The games are about the same in terms of look as the movies themselves because they're often based on movie assets.

        For example, see an interview with the Avatar game developers [worthplaying.com] where they talk about getting the models from Lightstorm Entertainment (who were responsible for the movie graphics).

      • Re: (Score:2, Offtopic)

        It is like the Heisenberg Uncertainty Principle. In order to determine a particle's position to a high degree of accuracy, you merely need to do a shitty job measuring its velocity.

      • by Xyde ( 415798 )

        Watch a recent Pixar movie in HD and come back and say that.

    • Re: (Score:3, Interesting)

      by MBCook ( 132727 )

      Can anyone tell me how close we are to being able to render Toy Story in real time? Say 1080p?

      I know the state of the art keeps moving (Avatar is far better looking than the original Toy Story), but with the limited visual "feature set" used in Toy Story, are we very far from being able to do something close-looking in real time?

      Can we do it with rasterization, now that we have so many GPU-based effects?

      • Re:"Movie-Quality" (Score:4, Informative)

        by Anonymous Coward on Monday February 22, 2010 @08:33PM (#31238756)

        Not sure, but I can tell you that we're nowhere near rendering state of the art movie CGI in real time. Vertex and pixel shaders have enabled a class of effects that were previously impossible in real time, but those are all direct lighting effects or crude approximations of indirect lighting. Shadows are not really smooth, they're just blurred. Realistic smooth shadows depend on the size of the light source and are computationally prohibitive on current hardware under real time constraints. Movie-quality CGI includes a class of light interactions which is currently impossible in real time, for example caustics: A caustic is light which is reflected or refracted onto a surface which reflects diffusely. Light being refracted by the surface of a swimming pool is an effect which can be faked but not simulated in real time. Render farms use an algorithm called Photon Mapping to simulate this and other complicated light interactions. This algorithm is conceptually related to Raytracing but even more computationally intensive. It does not map well to the hardware which is currently used in the real time rendering pipeline.
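
To give a feel for the two-pass idea behind photon mapping that the comment above describes, here is a deliberately tiny toy (all names and numbers are illustrative, not from any real renderer): photons are scattered from the light in a first pass and stored where they land, then irradiance at a point is estimated by gathering nearby photons. Real photon mappers trace photons through actual BRDFs and store them in a kd-tree; this sketch only fakes a lens-like bend to show how a caustic concentrates light.

```python
# Toy two-pass photon-map estimate; purely illustrative, not a real renderer.
import math
import random

random.seed(1)
NUM_PHOTONS = 100_000
PHOTON_POWER = 1.0 / NUM_PHOTONS          # total emitted power normalised to 1

# Pass 1: "trace" photons from the light down onto a floor plane. The 0.4
# scale factor crudely stands in for refraction focusing rays toward the
# axis, which is what produces a bright caustic spot.
photons = []
for _ in range(NUM_PHOTONS):
    x = random.gauss(0.0, 0.3) * 0.4
    y = random.gauss(0.0, 0.3) * 0.4
    photons.append((x, y))

# Pass 2: estimate irradiance at a point by gathering photons within a disc.
def irradiance(px, py, radius=0.05):
    hits = sum(1 for x, y in photons
               if (x - px) ** 2 + (y - py) ** 2 <= radius ** 2)
    return hits * PHOTON_POWER / (math.pi * radius ** 2)

print("inside caustic :", irradiance(0.0, 0.0))   # much brighter
print("outside caustic:", irradiance(0.8, 0.0))   # essentially zero
```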

      • Re: (Score:3, Interesting)

        by afidel ( 530433 )
        Toy Story isn't particularly difficult to render; even at the time you could render scenes of better quality in a matter of minutes, so with a decade and a half of doubling every 18 months I'm pretty sure it could be done by your average gaming GPU in realtime. The biggest problem was having enough memory for texture and model detail, but with 2 GB of RAM available on consumer-level video cards I don't think that's such a big deal these days.
        • Re: (Score:3, Interesting)

          by Jonner ( 189691 )

          If Pixar had been able to render scenes with better quality in a matter of minutes, they wouldn't have needed over 100 machines [findarticles.com] in their render farm. In fact, each frame took "from two to 13 hours."

          • Re: (Score:3, Interesting)

            by afidel ( 530433 )
            Well, I was talking about a year after the movie came out; obviously the hardware used *before* the movie came out, over the multi-year project, would have been less powerful. Figure 120 minutes per frame, then three doublings of CPU power, so divide by eight and you get 15 minutes. Increase RAM and you can use better textures or more complex polys.
            • Re: (Score:3, Informative)

              by afidel ( 530433 )
              Just found some numbers: the SPARC CPUs in the SS 20s used for Toy Story were capable of 15 MFLOPS peak, while an Alpha 21164 at 433 MHz, which came out about 6 months after the movie, could do over 500 MFLOPS peak, or about 30 times more. Even the PPro 200 could do 150 MFLOPS.
              • Re:"Movie-Quality" (Score:4, Informative)

                by beelsebob ( 529313 ) on Tuesday February 23, 2010 @04:44AM (#31242086)

                And just to put this in perspective, current GPUs manage somewhere in the region of 2 TFLOPS, so assuming we can encode Pixar's raytracing/radiosity algorithm into OpenCL that will actually run on one of these cards and not drop to software, the hard-to-render frames would still take 1.17 seconds to spit out. We need about another 2 orders of magnitude of improvement before we're there. That will only take a few years from now though, so we're close, but no cigar.

          • But GPUs are about 100x faster than CPUs at rendering. Imperfect rendering, but with how much they've advanced, they'd do fine for something like Toy Story.

            Factor in the doubling of speed every X months, and a high-end modern GPU could probably render Toy Story in realtime at 1600p, no problem.

            The guy below you says those machines have a theoretical speed of 15 MFLOPS. Pretty soon GPUs will be approaching ~2-3 TFLOPS (theoretical), so estimating low... 1,500,000 MFLOPS / 15 MFLOPS = 100,000 times faster than each of th

            • 100x or even 100,000x faster isn't fast enough.

              The article cited an average 7 hour render time per frame.

              7 hours = 420 minutes = 25,200 seconds per frame; at 24 fps, you'd need to be 604,800x faster in order to render in real-time.

              That also makes the enormous (and inaccurate) assumption that GPUs can handle PRMan-quality sampling/rendering. They can't. Especially not at 100x faster than CPU speeds.
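
A quick arithmetic check of the figures quoted above (the 7-hour-per-frame number comes from that comment, not from an independent source):

```python
# Back-of-the-envelope check using the parent comment's numbers.
seconds_per_frame_offline = 7 * 60 * 60      # 7 hours = 25,200 s per frame
seconds_per_frame_realtime = 1 / 24          # film plays back at 24 fps
print(seconds_per_frame_offline / seconds_per_frame_realtime)  # 604,800.0x
```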

          • Re: (Score:3, Interesting)

            by Pseudonym ( 62607 )

            Blinn's Law states that the amount of time it takes to compute a frame of film remains constant over time, because audience expectation rises at the same speed as computer power.

            I think it was Tom Duff who commented that one eyeball in a modern Pixar film requires roughly the same amount of work as a frame of Toy Story.

      • Re:"Movie-Quality" (Score:4, Interesting)

        by Zerth ( 26112 ) on Monday February 22, 2010 @10:45PM (#31240022)

        According to this [fudzilla.com], the original Toy Story needed about 7 TFLOPS to render in real time, although I've seen higher estimates.

        87 dual-processor and 30 quad-processor 100-MHz SPARCstation 20s took 46 days to render ~75 minutes of film, so you need to be 883.2 times as fast to render in realtime. Anyone overclock a quad-core processor to 8 GHz? I suppose a setup with 4 quad-core CPUs @ 2 GHz isn't out of reach.

        But then again, the machines might have been I/O-bound instead of CPU-bound, needing to send 7.7 gigabytes per second.
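
Reproducing the ratio above from the comment's own figures (46 days of render-farm time for roughly 75 minutes of film; nothing here is independently verified):

```python
# Ratio of wall-clock render time to running time, per the parent comment.
render_minutes = 46 * 24 * 60          # 46 days of farm time = 66,240 minutes
film_minutes = 75
print(render_minutes / film_minutes)   # 883.2x speedup needed for real time
```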

    • Blake's 7.
  • I'd wager that their solution is way more CPU-intensive than GPU-intensive. Or maybe I'm just paranoid.
    • As long as it gets the job done it's an interesting innovation. Real time rendering of game or modern movie quality CGI would be a good thing regardless of how it's implemented.

    • I'd wager that their solution is way more CPU-intensive than GPU-intensive.

      I'd bet you're right... and you'll be able to do this stuff in realtime at home as soon as you have thousands of cores [cnet.com]. More seriously, though, a future without GPUs would be a good thing, if we could get the same performance (or better) without them. Why? Because in order to use the full power of a computer with a big GPU, you have to do two kinds of programming. A computer where all the powerful processing elements were identical would be much easier to fully utilize, and that means less wasted money.

  • by whiplashx ( 837931 ) on Monday February 22, 2010 @08:26PM (#31238686)

    4 or 5 years ago, it was basically comparable to Unreal 3. The motion blur was probably the best feature I saw. Fine graphics, but nothing really mind-blowing. Having said that, I have not seen what they've done since Intel bought them, but I'm guessing it's basically support for Intel's research projects.

    As a developer of modern console and PC games, My Professional Opinion is that there's nothing new to see here.

    • Exactly. I've been following now for about four years and they occasionally throw out a few interesting videos and such, but ultimately I haven't seen anything new from their team in quite some time. It was an interesting choice selling out to Intel of all places... I only hope they don't turn it into another Duke Nukem: Forever.

    • Re: (Score:3, Interesting)

      I used to work with Sam Mcgrath and I consider him an old friend. I was fortunate enough to be there from the very start of his new engine and see it develop back when there was no company or anything...

      He blew me away years ago with the very basics of its shader editing and render quality. I haven't seen newer versions of it in years, but... Sam was kicking ass from the start of it... trust me.

      Sam is an incredibly talented coder, perhaps one of the best and most hard-working out there. Sammy, best of luck to

  • Attention, developers: graphics are not the most important thing.

    For example, the two Sonic Adventure games for the Dreamcast were imperfect but very enjoyable. Now check Sonic The Hedgehog for PS3/X360. It looked far better, but it had craploads of game-breaking glitches, long loading times, and overall poor design, so the reviews were mostly negative. Another example: Doom. Everyone loved the first two games... then came Doom 3, which looked stunning but played more like a survival horror game. How can som

    • Who said graphics are the most important thing? Why do people always defensively trot out this argument when advances are made in graphics?
    • Re: (Score:3, Funny)

      Oh, so *Doom 3* played like a survival horror game.

      I see.

      • Oh, so *Doom 3* played like a survival horror game.

        In the GP's defense, maybe they finally made that version bright enough so you could see what was going on in the game.

    • by Tynin ( 634655 )

      Doom. Everyone loved the first two games... then came Doom 3, which looked stunning but played more like a survival horror game. How can someone take such a wild, frantic, exhilarating series and make something so boring out of it?

      That really is a sad statement on how far survival horror games have fallen when someone thinks Doom 3 fits that genre. Doom 3 was just a crappy FPS... walk into dark room, shoot the bad guy that is always positioned in an out-of-the-way corner, rinse and repeat. It was never survival horror... if I told my wife what you said she would be quaking with fury, and her ranting would be epic. She and I both miss the glory days of survival horror...

      That said, the rest of your point still stands and I agree w

  • So I went to the link in the summary to see the video, and I MUST be too jaded. It looked *exactly* like a level from Unreal Tournament 3. I love that game, so that's all well and good. I'm sure my laptop could render that youtube clip in realtime without a problem. It still seemed fake to me. The movement of the foliage was too "calculated", as was much of the debris when it fell. The camera motion was "too perfect" and looks exactly like what my camera moves look like in After Effects, which bear very lit

    • by gnud ( 934243 )
      The camera path can be set beforehand, and the scene can still be rendered in real time.
      • Agreed; that seems to be common practice in video games. The point I was getting at is that the paths seem a little "too perfect", and the motion itself seems a bit linear and calculated. I'm not saying that they need to have Michael Bay program the cameras, but for true photorealism, the camerawork needs to be less computationally convenient.
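
One common way to make a precomputed camera path feel less "computationally convenient" is to layer small, low-pass-filtered random offsets onto it, the way hand-held footage drifts. A toy sketch of that idea (entirely illustrative; not how the demo or After Effects actually does it):

```python
# Add smoothed random jitter to a perfectly straight dolly move.
import random

random.seed(0)
path = [(i * 0.1, 1.7, 0.0) for i in range(120)]   # straight dolly along x

shaky_path = []
jx = jy = 0.0
for x, y, z in path:
    # Low-pass filter the noise so the camera drifts rather than buzzes.
    jx = 0.9 * jx + 0.02 * random.uniform(-1.0, 1.0)
    jy = 0.9 * jy + 0.02 * random.uniform(-1.0, 1.0)
    shaky_path.append((x + jx, y + jy, z))

print(shaky_path[:3])
```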

  • by poly_pusher ( 1004145 ) on Monday February 22, 2010 @09:05PM (#31239096)
    As stated by other posters, "film quality" is misleading. Primarily it refers to resolution; remember, many cameras record at up to 4K, so the ability to render in real time at ultra-high res is definitely sought after.

    Currently, the big push in 3d rendering is towards physically based raytrace or pathtrace rendering.
    http://en.wikipedia.org/wiki/Path_tracing [wikipedia.org]
    http://en.wikipedia.org/wiki/Ray_tracing_(graphics) [wikipedia.org]
    Physically based rendering produces a much more accurate representation of how light interacts with and between surfaces. It has always taken a long time to render using physically based techniques, due to the huge number of calculations necessary to produce a grain-free image. This has changed somewhat recently with multi-core systems, and with GPGPU languages such as CUDA and OpenCL we are about to see a big and sudden increase in the performance of these rendering technologies.

    While this game looks great, the engine is by no means going to be capable of rendering scenes containing hundreds of millions of polygons, ultra-high-res textures, physically accurate lighting and shaders, and high render resolution. We are still pretty far away from real-time physically based rendering, which is the direction film is currently headed. So that is what "Movie-Quality CGI" would have to mean, and this game does not live up to that definition.
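
To see why physically based rendering stays "grainy" until the sample counts get huge, here is a minimal Monte Carlo sketch (a toy integrand standing in for light gathered over a hemisphere; nothing renderer-specific): the noise in a pixel estimate only falls off as 1/sqrt(N), so halving the grain costs four times the samples.

```python
# Noise of a Monte Carlo pixel estimate versus sample count.
import random
import statistics

def shade_pixel(num_samples):
    # Toy stand-in for integrating incoming light; the true value is 1/3.
    return statistics.mean(random.random() ** 2 for _ in range(num_samples))

for n in (16, 64, 256, 1024):
    estimates = [shade_pixel(n) for _ in range(200)]
    # Standard deviation roughly halves each time the sample count quadruples.
    print(n, round(statistics.stdev(estimates), 4))
```
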
    • Re: (Score:3, Interesting)

      It's also misleading because films can cheat. You can't see something from every angle, and cameras don't always have to move through a space, so a lot of what you see are flat cards carefully hand-painted and positioned in 3D space.

      In the end, what really holds back video games is memory. A small scene can consume in excess of 8 GB of memory. That's fine on the CPU, where you have a lot of RAM and can swap back and forth from the HDD. With a GPU you have to load everything into memory which is

      • Re: (Score:3, Interesting)

        by Xyde ( 415798 )

        >As long as games can't go through a post-process hand tuned by a team of artists for weeks

        Well, I don't know anything about movie production, but I highly doubt they do this. Are you really saying they take their pristine movie output and begin to photoshop it and make adjustments at the frame level? Do you know how laborious that is when you could just, oh, I don't know, adjust the model you already have and rerender those frames? D

  • Pictures? (Score:2, Interesting)

    by incubbus13 ( 1631009 )

    Okay, so this is slightly off-topic, but something I've always wondered about.

    I can take a 12-megapixel picture and reduce it down to a 12 KB GIF, or 120 KB, or whatever the compression results are.

    At that point, it's just a .gif. (or .jpg or whatever). The computer doesn't know it's any different than a .gif I created in MSPaint, right?

    So if I open GameMaker 7 and use that photo as one of the frames in my character's animation, then by repetition I could create a character moving and walking frame by frame.

    Right?

    • by h4rr4r ( 612664 )

      At one time people did that; see the famous game Myst.

      These days people like moving wherever they want.

    • Re:Pictures? (Score:5, Interesting)

      by am 2k ( 217885 ) on Monday February 22, 2010 @09:52PM (#31239564) Homepage

      Actually, that's how the characters in the older Myst games worked (except that they used this great new technology called "video camera" to get moving pictures into them).

      This was fine in those games, because the viewpoint was always fixed. That's a restriction you don't want to have in current games.

    • Because you can never anticipate every situation you'll need to photograph.

      Let's say you want a walk cycle of your character. Let's say a loop of about 1 second at 30 fps. Now you have:
      30 frames.
      Now you want your character to turn, so you need it to rotate. Now you have to shoot about 1080 more angles. So now you're up to:
      32,400 frames.
      Oops, but now you also need to see this walking character from above or below. Let's say 200 degrees and assume nobody will ever see it from right below the ground. Ok n
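
The frame counts above, multiplied out (using the comment's own assumptions of a 1-second cycle at 30 fps and 1080 horizontal viewing angles):

```python
# Sprite-sheet explosion using the parent comment's assumed numbers.
frames_per_cycle = 30          # 1-second walk cycle at 30 fps
horizontal_angles = 1080       # rotation steps the comment assumes
print(frames_per_cycle * horizontal_angles)   # 32,400 frames, before adding
                                              # vertical angles or lighting
```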

    • The problem comes down to dynamic lighting and shading, animation and interaction.

      You can't fake lighting, shadows, and objects dynamically interacting with each other with image sequences. It's just impossible.

      I'm not sure you have thought this through fully.

  • Boof (Score:2, Interesting)

    by Windwraith ( 932426 )

    So this means we are going to see games with movie budgets and no gameplay at all... we already do, but the balance will tilt against gameplay even further as the manpower goes into the visuals.

  • Game graphics seem to be getting better and better while the games seem to be getting more and more dull. Mass Effect 2 and Bioshock 2 are hardly games at all anymore; they are little more than movies with a fast-forward button.

  • CGI (Score:3, Funny)

    by Lorens ( 597774 ) on Tuesday February 23, 2010 @02:57AM (#31241540) Journal

    CGI is awful, they could at least have tried for EGA

  • by AlgorithMan ( 937244 ) on Tuesday February 23, 2010 @04:11AM (#31241946) Homepage
    FINALLY we can have CGI-quality in computer games!
    It was such a pain when computers couldn't achieve the quality of COMPUTER GENERATED IMAGES
  • One thing about Avatar annoyed me: they somehow claimed there was CGI in it so good that you couldn't tell it was CGI. That isn't true at all if you ask me; all the creatures and many plants looked as flawed as most CGI generally does. Heck, the uniform colors were actually worse than what I've seen in several earlier animations (for instance, the raven in the WCIII intro). Also, I'll never get why games should look real; I prefer them to look stylized, which is more interesting.
  • by George_Ou ( 849225 ) on Tuesday February 23, 2010 @07:29AM (#31242782)
    Anyone can claim "movie quality" if we're going by Star Wars Episode 4 (original version) standards. The problem is that movies have obviously gotten a lot better, though they're still not completely realistic. The fact is that real-time rendering will always be vastly inferior to slow rendering, because movie production can throw at least 100 times the hardware and 100 times the time at each frame compared with any gaming computer. Furthermore, you only need 24 fps for movie making, while you need a minimum of 60 fps, with an average of 120 fps, for a good gaming experience.
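
Multiplying out the comment's own factors (the 100x figures are its assumptions, not measurements) gives a rough sense of the per-frame compute gap it is describing:

```python
# Rough per-frame compute-budget gap, per the parent comment's assumptions.
hardware_factor = 100      # offline render farm vs. one gaming PC
time_factor = 100          # hours per frame vs. a real-time frame budget
fps_factor = 60 / 24       # games need >= 60 fps, film plays at 24 fps
print(hardware_factor * time_factor * fps_factor)   # ~25,000x per frame
```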
