Real-Time, Movie-Quality CGI For Games 184

An anonymous reader writes "An Intel-owned development team can now render CGI-quality graphics in real time. 'Their video clips show artists pulling together 3D elements like a jigsaw puzzle (see for example this video starting at about 3:38), making movie-level CG look as easy as following a recipe.' They hope that the simplicity of 'Project Offset' could ultimately give them the edge in the race to produce real-time graphics engines for games."
  • by WrongSizeGlass ( 838941 ) on Monday February 22, 2010 @07:57PM (#31238388)
    ... now they can pump out crappy movies that have quality CG faster than ever before?
  • by binarylarry ( 1338699 ) on Monday February 22, 2010 @08:01PM (#31238448)

    Unfortunately, the faster processors get, the fancier the rendering features that become possible in the offline space as well.

    Realtime rendering will never be on par with offline rendering of the same vintage.

  • "Movie-Quality" (Score:4, Insightful)

    by nitehawk214 ( 222219 ) on Monday February 22, 2010 @08:05PM (#31238482)

    "Movie-Quality" is basically a worthless statement. Which movie? Avatar, Final Fantasy, Toy Story, Tron? The quality of digitally produced movies and the power of game graphics are both constantly moving targets.

  • Re:"Movie-Quality" (Score:4, Insightful)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday February 22, 2010 @08:07PM (#31238500) Homepage Journal

    This is basically what I was going to say. The latest crop of "funny fuzzy animal" movies have graphics about as good as the best video games — the secret to making games look as good as movies is apparently to make movies look shitty. I just can't sit through a movie that doesn't look as good as playing a game. I also can't sit through a movie with a worse plot than nethack, but that's a separate issue. Unfortunately, the aforementioned movies suffer from both of these failings.

  • by PotatoFarmer ( 1250696 ) on Monday February 22, 2010 @08:41PM (#31238832)

    At what point will the hardware capabilities exceed the software we can write?

    Never. More hardware means programmers can get away with writing less efficient code.

  • by poly_pusher ( 1004145 ) on Monday February 22, 2010 @09:05PM (#31239096)
    As stated by other posters, "film quality" is misleading. Primarily it refers to resolution; remember that many cameras record at up to 4K, so the ability to render in real time at ultra-high resolution is definitely sought after.

    Currently, the big push in 3d rendering is towards physically based raytrace or pathtrace rendering.
    http://en.wikipedia.org/wiki/Path_tracing [wikipedia.org]
    http://en.wikipedia.org/wiki/Ray_tracing_(graphics) [wikipedia.org]
    Physically based rendering produces a much more accurate representation of how light interacts with and between surfaces. Producing a grain-free image with physically based techniques has always taken a long time because of the huge number of calculations required. This has changed somewhat recently with multi-core systems, and with GPGPU languages such as CUDA and OpenCL we are about to see a big, sudden increase in performance for these rendering technologies.

    While this game looks great, the engine is by no means going to be capable of rendering scenes containing hundreds of millions of polygons, ultra-high-res textures, physically accurate lighting and shaders, and high render resolution. We are still pretty far away from real-time physically based rendering, which is the direction film is currently headed. That would have to be the definition of "Movie-Quality CGI," and this game does not live up to it.
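    The grain mentioned above is Monte Carlo noise: a path tracer averages many random light-path samples per pixel, and the error only shrinks with the square root of the sample count. A toy Python sketch of that scaling (the "scene" here is a stand-in with a known true brightness of 0.5, not a real renderer):

    ```python
    import random

    def avg_error(samples, trials=50):
        """Average absolute error of a Monte Carlo pixel estimate.

        Stand-in scene: the true pixel brightness is 0.5, and each path
        sample returns a noisy value uniform in [0, 1]. Real path tracers
        average hundreds or thousands of such samples per pixel.
        """
        errors = []
        for trial in range(trials):
            rng = random.Random(trial)
            estimate = sum(rng.random() for _ in range(samples)) / samples
            errors.append(abs(estimate - 0.5))
        return sum(errors) / trials

    # Error falls roughly as 1/sqrt(N): 64x the samples ~ 8x less grain.
    coarse = avg_error(16)
    fine = avg_error(16 * 64)
    ```

    This is why "just add cores" helps so much here: the per-sample work is independent, but you need a lot of samples before the grain disappears.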
  • Re:Or... (Score:2, Insightful)

    by QuaveringGrape ( 1573239 ) on Monday February 22, 2010 @09:27PM (#31239310)
    Or maybe just start supporting OpenGL hardware acceleration? Any day now, Intel...
  • by Pseudonym ( 62607 ) on Monday February 22, 2010 @10:59PM (#31240106)

    Anyone can already make movies without a billion dollars worth of computers and a billion dollars worth of actors. The difficulty is finding a million dollars worth of animators and fifty thousand dollars worth of screenwriters.

  • Re:Wow (Score:2, Insightful)

    by Anonymous Coward on Monday February 22, 2010 @11:09PM (#31240168)

    And then there's South Park, which appears to have been created with PowerPoint.

  • by dredwerker ( 757816 ) on Tuesday February 23, 2010 @05:32AM (#31242280)

    I think those are Mike & Ike's...

    There's a meme that is gonna stick if only I had mod points :)

  • by ardor ( 673957 ) on Tuesday February 23, 2010 @05:44AM (#31242320)

    However, there is a point where CGI is "good enough" for most purposes. Yes, the maximum scene complexity may grow, but even there you may reach a "good enough" point, where you can easily fake the bits that cannot be done. Example: an outdoor scene with a forest in the distance. If the scene is rather static, with little action, the forest in the background may be just a picture. If more movement is involved, but the forest is always far away, impostors can be used. These tricks are cheap to implement and very effective.

    You don't need hyper-realistic spectral rendering for typical CGI movies. You could even get away with the CryEngine 2 or 3 for several low/mid budget flicks, provided you do some work on the animation. In my opinion, *animation* is the one factor that is still significantly better in offline rendering.
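    The distance-based fallbacks described above amount to a level-of-detail choice. A minimal sketch in Python (the thresholds and names are made up for illustration; real engines tune these per scene and cross-fade between levels to hide the switch):

    ```python
    def pick_forest_representation(distance_m, camera_moving):
        """Choose the cheapest representation that still looks right."""
        if distance_m > 2000 and not camera_moving:
            return "static backdrop image"   # a painted matte / skybox
        if distance_m > 500:
            return "impostors"               # camera-facing billboards
        return "full 3D geometry"            # actual tree meshes

    # Far away and static: a flat picture is indistinguishable.
    far = pick_forest_representation(3000, camera_moving=False)
    # Moving camera, but forest still distant: billboards suffice.
    mid = pick_forest_representation(800, camera_moving=True)
    # Up close: only real geometry holds up.
    near = pick_forest_representation(100, camera_moving=False)
    ```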

  • by George_Ou ( 849225 ) on Tuesday February 23, 2010 @07:29AM (#31242782)
    Anyone can claim "movie quality" if we're going by Star Wars Episode 4 (original version) standards. The problem is that movies have obviously gotten a lot better, though still not completely realistic. The fact is that real-time rendering will always be vastly inferior to slow rendering, because movie production can throw at least 100 times the hardware and 100 times the time at a frame compared to any gaming computer. Furthermore, a movie only needs 24 fps, while a good gaming experience needs a minimum of 60 fps and an average of 120 fps.
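    The per-frame budget gap is easy to put numbers on. Using the comment's 60 fps figure, plus an assumed (illustrative) ten minutes of render-farm time per film frame:

    ```python
    # A game at 60 fps must finish each frame in about 16.7 ms.
    game_budget_ms = 1000 / 60

    # An offline renderer can spend minutes or hours per frame; even a
    # modest (assumed) 10 minutes per frame dwarfs the real-time budget.
    film_budget_ms = 10 * 60 * 1000

    # How many times longer the film renderer gets to work on one frame:
    budget_ratio = film_budget_ms / game_budget_ms   # ~36,000x
    ```

    And that is before multiplying by the extra hardware a render farm brings to bear on each frame.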
  • by Anonymous Coward on Tuesday February 23, 2010 @10:08AM (#31243900)

    Yo Dawg, I heard you like posting hate, so I posted some hate to your hate post, so you can post hate while you post hate.....
