Real-Time, Movie-Quality CGI For Games 184
An anonymous reader writes "An Intel-owned development team can now render CGI-quality graphics in real time. 'Their video clips show artists pulling together 3D elements like a jigsaw puzzle (see for example this video starting at about 3:38), making movie-level CG look as easy as following a recipe.' They hope that the simplicity of 'Project Offset' could ultimately give them the edge in the race to produce real-time graphics engines for games."
Wow (Score:5, Funny)
They've discovered the hidden secrets to rendering Academy Award winning films such as "Gears of War" and "Crysis."
Congrats Intel dev team!
Re:Wow (Score:5, Funny)
Those aren't award winning films.
They're award winning slideshows.
Re: (Score:2)
Re: (Score:2, Insightful)
And then there's South Park, which appears to have been created with PowerPoint.
Re:Wow (Score:4, Informative)
Re: (Score:2)
Doesn't everyone know the secret behind rendering gears of war? I thought it was simply draw a black rectangle over your screen.
What this really means is ... (Score:2, Insightful)
Re: (Score:2, Interesting)
Hating on the current quality of movies/games/music automatically gets you karma points, even if you haven't the slightest idea what you're talking about....
Re:What this really means is ... (Score:5, Funny)
I hate it when people hate on people hating on something they hate just to get karma points just to get karma points.
It's almost as bad as people hating on people hating on people hating on something they hate just to get karma points just to get karma points just to get karma points.
Grammar works like nesting things, right?
Re: (Score:2)
Grammar works like nesting things, right?
Don't anthropomorphize grammar works... they hate it when you do that.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
You need to include the brackets for readability.
Re: (Score:2)
Hating on the current quality of movies/games/music automatically gets you karma points, even if you haven't the slightest idea what you're talking about....
Extending "crappy movies" to multiple genres just to get an Interesting mod seems to be the sad state of Slashdot. The GP focused on movies but unfortunately didn't take the time to elaborate on what they meant by crappy. I'm betting they were singling out shallow screenplays and low casting budgets being covered up by the Wow factor.
Re: (Score:3, Interesting)
Re:What this really means is ... (Score:5, Interesting)
I read a really great short story once about a future where all films are made completely on computers, with AI actors. Then one guy starts filming movies with a real girl in them, just with computerized scenery, and doesn't tell anyone. It blows people away just how "real" his films feel compared to normal movies.
Anyone else read that? It was pretty good.
Re: (Score:2)
Was the realistic nature of that live-action film seen as a good thing? If so, I find the premise of the story a bit odd. I mean heck, we still create black & white and silent films. I'm sure even if CG movies become the norm, as long as realism is seen as a positive, there will always be at least a niche that'll prefer and create movies with real actors.
Re: (Score:2)
I find it amusing that half of the comments on this story have devolved into one of two diametrically opposed opinions, which neither party finds contradictory:
1) "Films are looking all CG and crappy."
2) "So what if games are looking great the graphics don't matter."
Re: (Score:2)
I may need to hand over my geek card, but is that a reference to something I, as a geek, should know about?
Re: (Score:2)
Re: (Score:3, Insightful)
Anyone can already make movies without a billion dollars worth of computers and a billion dollars worth of actors. The difficulty is finding a million dollars worth of animators and fifty thousand dollars worth of screenwriters.
Re: (Score:2)
The big problem there is not graphics quality - that's already there, take a look at Fallout 3 or Arkham Asylum at maximum graphical settings - it's the quality of tools. You'd need "digital actors" able to move, react and emote without you having to put every eyebrow into place manually. You'd also need good-quali
reducing implementation time is a good thing (Score:2)
Re: (Score:2)
Great... (Score:5, Funny)
Re:Or... (Score:2, Insightful)
Re: (Score:2)
Re: (Score:2, Informative)
Cool, please tell where I can get one that has GPL drivers.
I'll wait.
Re: (Score:2)
As long as Moore's law holds (Score:2, Interesting)
How can there be any doubt that realtime rendering will approach the quality of today's offline rendering when computing power grows exponentially?
Re:As long as Moore's law holds (Score:5, Insightful)
Unfortunately, the faster processors get, the fancier the rendering features that become possible in the offline space as well.
Realtime rendering will never be on par with offline rendering of the same vintage.
Re: (Score:3, Insightful)
However, there is a point where CGI is "good enough" for most purposes. Yes, the maximum scene complexity may grow, but even there you may reach a "good enough" point, where you can easily fake the bits that cannot be done. Example: an outdoor scene with a forest in the distance. If the scene is rather static, with little action, the forest in the background may be just a picture. If more movement is involved, but the forest is always far away, impostors can be used. These tricks are cheap to implement and
Re: (Score:2)
My bad, I misread your post.
Although I don't blame myself. I blame the drugs, like a good pilgrim.
Re:As long as Moore's law holds (Score:5, Funny)
Re: (Score:2, Insightful)
I think those are Mike & Ike's...
There's a meme that is gonna stick if only I had mod points :)
Re:As long as Moore's law holds (Score:4, Informative)
Absolutely true, but there is an apex that both strive to reach, which is photorealistic rendering.
No, because "photorealism" is not a goal that visual effects aspires to. If you can take a photo of something, then it's almost always cheaper and better to do that, even though it usually requires spending many thousands of dollars on crew, makeup, sets and lighting.
CGI is used for things that you can't take a photo of, such as a Na'vi or a talking ant. If the space ship can travel faster than light, or the penguin can dance, then "realistic" is not a goal.
(Disclaimer: I used to work in visual effects.)
Re: (Score:2)
You're mostly right but what's different with CG is that you can build it faster. What would have taken a model builder 1 year to manufacture can be done on a computer in a fraction of the time.
We're slowly moving towards the Avatar model of doing things, even for romantic comedies. You don't need generators. You don't need street clearances. You don't need condors. You don't need grips and gaffers and camera assistants. If the DP wants a big soft light source 6' behind the actor: click, click, bam. Ther
CGI-quality graphics (Score:5, Funny)
now there we have an accurate statement: "Computer Generated Imagery" quality graphics
Re:CGI-quality graphics (Score:4, Informative)
Obviously he's a member of the Tautology Club that has him as a member.
Re: (Score:2)
He's not just a member of the club, he's the Officer of Redundancy Officer.
Re: (Score:2)
"Movie-Quality" (Score:4, Insightful)
"Movie-Quality" is basically a worthless statement. Which movie? Avatar, Final Fantasy, Toy Story, Tron? The quality of digitally produced movies, and the quality of game graphics power are constantly moving targets.
Re: (Score:3, Funny)
Re: (Score:2)
Re: (Score:2)
PS: I didn't play Crysis until long after the release date, and did so on maxed settings.
Re:"Movie-Quality" (Score:4, Insightful)
This is basically what I was going to say. The latest crop of "funny fuzzy animal" movies have graphics about as good as the best video games — the secret to making games look as good as movies is apparently to make movies look shitty. I just can't sit through a movie that doesn't look as good as playing a game. I also can't sit through a movie with a worse plot than nethack, but that's a separate issue. Unfortunately, the aforementioned movies suffer from both of these failings.
Re: (Score:2)
You seem to forget that those funny fuzzy animal movies are only marketing tools for the related computer games and McDonalds toys. The games are about the same in terms of look as the movies themselves because they're often based on movie assets.
For example, see an interview with the Avatar game developers [worthplaying.com] where they talk about getting the models from Lightstorm Entertainment (who were responsible for the movie graphics).
Re: (Score:2, Offtopic)
It is like the Heisenberg Uncertainty Principle. In order to determine a particle's position to a high degree of accuracy, you merely need to do a shitty job measuring its velocity.
Re: (Score:2)
Watch a recent Pixar movie in HD and come back and say that.
Re: (Score:3, Interesting)
Can anyone tell me how close we are to being able to render Toy Story in real time? Say 1080p?
I know the state of the art keeps moving, Avatar is far better looking than the original Toy Story, but with the limited visual "feature set" used in Toy Story, are we very far from being able to do something close looking in real time?
Can we do it raster, now that we have so many GPU based effects?
Re:"Movie-Quality" (Score:4, Informative)
Not sure, but I can tell you that we're nowhere near rendering state-of-the-art movie CGI in real time. Vertex and pixel shaders have enabled a class of effects that were previously impossible in real time, but those are all direct lighting effects or crude approximations of indirect lighting. Shadows are not really smooth, they're just blurred. Realistic soft shadows depend on the size of the light source and are computationally prohibitive on current hardware under real-time constraints.

Movie-quality CGI includes a class of light interactions which is currently impossible in real time, for example caustics: a caustic is light which is reflected or refracted onto a surface which reflects diffusely. Light being refracted by the surface of a swimming pool is an effect which can be faked but not simulated in real time.

Render farms use an algorithm called photon mapping to simulate this and other complicated light interactions. This algorithm is conceptually related to raytracing but even more computationally intensive, and it does not map well to the hardware currently used in the real-time rendering pipeline.
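The swimming-pool caustic described above can be illustrated with a toy 2D sketch (my own illustration, not photon mapping proper, and every number in it is made up): photons fall straight down onto a rippled water surface, refract per Snell's law, and are binned where they land on the pool floor. The bins that end up far brighter than average are the caustic bands.

```python
import math

def refract(dx, dy, nx, ny, eta):
    """Refract unit direction (dx, dy) through a surface with unit normal
    (nx, ny) pointing toward the incident side; eta = n1/n2 (Snell's law)."""
    cos_i = -(dx * nx + dy * ny)
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:                  # total internal reflection (won't occur here)
        return None
    k = eta * cos_i - math.sqrt(1.0 - sin2_t)
    return (eta * dx + k * nx, eta * dy + k * ny)

def caustic_histogram(n_photons=200_000, n_bins=100, depth=1.0, eta=1.0 / 1.33):
    """Drop photons straight down onto a sinusoidal water surface and count
    where the refracted rays hit the pool floor at y = -depth."""
    bins = [0] * n_bins
    for i in range(n_photons):
        x = i / n_photons                       # evenly spaced photons over [0, 1)
        h = 0.05 * math.sin(2 * math.pi * x)    # water surface height
        slope = 0.05 * 2 * math.pi * math.cos(2 * math.pi * x)
        inv = 1.0 / math.sqrt(1.0 + slope * slope)
        nx, ny = -slope * inv, inv              # unit normal, pointing up
        d = refract(0.0, -1.0, nx, ny, eta)
        if d is None:
            continue
        t = (-depth - h) / d[1]                 # travel down to the floor
        bins[int(((x + d[0] * t) % 1.0) * n_bins) % n_bins] += 1
    return bins

bins = caustic_histogram()
mean = sum(bins) / len(bins)
print(max(bins) / mean)   # focusing makes some bins much brighter than average
```

Real photon mapping scatters photons in 3D, stores them in a kd-tree and estimates radiance from nearby hits, which is why it costs so much more than this one-bounce toy.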
Re: (Score:2)
iRay still does not replace the core rendering engine of Mental Ray ... it's just a toy renderer for quick visualization, not a rendering engine for movie quality CGI.
Re: (Score:3, Interesting)
Re: (Score:3, Interesting)
If Pixar had been able to render scenes with better quality in a matter of minutes, they wouldn't have needed over 100 machines [findarticles.com] in their render farm. In fact, each frame took "from two to 13 hours."
Re: (Score:3, Interesting)
Re: (Score:3, Informative)
Re:"Movie-Quality" (Score:4, Informative)
And just to put this in perspective, current GPUs manage somewhere in the region of 2TFlops, so assuming we can encode Pixar's raytracing/radiosity algorithm into OpenCL that will actually run on one of these cards and not drop to software, the hard-to-render frames would still take 1.17 seconds to spit out. We need about another 2 orders of magnitude improvement before we're there. That will only take a few years from now though, so we're close, but no cigar.
Re: (Score:3, Informative)
At 5 TFlops you're still talking 0.5 seconds to render a single frame from toy story, even assuming we can encode their rendering algorithm efficiently onto a graphics card in such a way that it reaches peak performance (unlikely).
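For what it's worth, the two estimates above (1.17 s at 2 TFLOPS, ~0.5 s at 5 TFLOPS) imply roughly the same amount of work per frame. A quick back-of-the-envelope sketch, with the per-frame figure simply back-derived from the numbers in these comments:

```python
# Back-derive the implied floating-point work per hard frame from the
# "2 TFLOPS -> 1.17 s" figure, then see what other hardware speeds and a
# 24 fps real-time budget would require.
work_per_frame = 2e12 * 1.17               # ~2.34e12 floating-point ops

for gpu_flops in (2e12, 5e12):
    print(f"{gpu_flops / 1e12:.0f} TFLOPS -> {work_per_frame / gpu_flops:.2f} s/frame")

needed_flops = work_per_frame * 24         # budget of 1/24 s per frame
print(f"real time at 24 fps needs ~{needed_flops / 1e12:.0f} TFLOPS")
```

The 5 TFLOPS case lands at about 0.47 s, consistent with the parent's "0.5 seconds".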
Re: (Score:2)
But GPUs are about 100x faster than CPUs at rendering. Imperfect rendering, but given how much they've advanced, they'd do fine for something like Toy Story.
Factor in the doubling of speed every X months, and a high-end modern GPU could probably render Toy Story in realtime at 1600p, no problem.
The guy below you says those machines have a theoretical speed of 15mflops. Pretty soon GPUs will be approaching ~2-3tflops (theoretical), so estimating low... 1500000mflops / 15mflops = 100,000 times faster than each of th
Re: (Score:2)
100x or even 100,000x faster isn't fast enough.
The article cited an average 7 hour render time per frame.
7 hours = 420 minutes = 25,200 seconds per frame; at 24 fps, that means you need to be 604,800x faster to render in real time.
That even makes the enormous (and inaccurate) assumption that GPUs can handle PRMan-quality sampling/rendering. They can't. Especially not at 100x CPU speeds.
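The arithmetic above checks out; spelled out, with the numbers taken straight from the comment:

```python
# 7 hours of offline render time per frame, versus a 24 fps real-time target.
render_seconds = 7 * 60 * 60      # 25,200 seconds per frame
fps = 24
speedup = render_seconds * fps    # each second of film is 24 frames
print(speedup)                    # -> 604800
```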
Re: (Score:3, Interesting)
Blinn's Law states that the amount of time it takes to compute a frame of film remains constant over time, because audience expectation rises at the same speed as computer power.
I think it was Tom Duff who commented that one eyeball in a modern Pixar film requires roughly the same amount of work as a frame of Toy Story.
Re:"Movie-Quality" (Score:4, Interesting)
According to this [fudzilla.com], the original Toy Story needed about 7 TFLOPS to render in real time, although I've seen higher estimates.
87 dual-processor and 30 quad-processor 100-MHz SPARCstation 20s took 46 days to do ~75 minutes, so you need to be 883.2 times as fast to render in realtime. Anyone overclock a quad-core processor to 8 GHz? I suppose a setup with 4 quad-core CPUs @ 2 GHz isn't out of reach.
But then again, the machines might have been IO bound instead of CPU bound, needing to send 7.7 gigabytes per second.
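Spelling out the farm numbers above (taken at face value from the comment):

```python
# Render-farm figures quoted above: 87 dual- and 30 quad-processor
# 100 MHz SPARCstation 20s, 46 days of rendering for ~75 minutes of film.
processors = 87 * 2 + 30 * 4           # total CPUs in the farm
render_minutes = 46 * 24 * 60          # wall-clock render time in minutes
film_minutes = 75
speedup_needed = render_minutes / film_minutes
print(processors, speedup_needed)      # -> 294 883.2
```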
Re: (Score:2)
If I were a betting man (Score:2)
So what if it is? (Score:2)
As long as it gets the job done it's an interesting innovation. Real time rendering of game or modern movie quality CGI would be a good thing regardless of how it's implemented.
Re: (Score:2)
I'd wager that their solution is way more CPU-intensive than GPU-intensive.
I'd bet you're right... and you'll be able to do this stuff in realtime at home as soon as you have thousands of cores [cnet.com]. More seriously, though, a future without GPUs would be a good thing, if we could get the same performance (or better) without them. Why? Because in order to use the full power of a computer with a big GPU, you have to do two kinds of programming. A computer where all the powerful processing elements were identical would be much easier to fully utilize, and that means less wasted money.
As a former (contract) developer on Project Offset (Score:5, Interesting)
4 or 5 years ago, it was basically comparable to Unreal 3. The motion blur was probably the best feature I saw. Fine graphics, but nothing really mind-blowing. Having said that, I have not seen what they've done since Intel bought them, but I'm guessing it's basically support for Intel's research projects.
As a developer of modern console and PC games, My Professional Opinion is that there's nothing new to see here.
Re: (Score:2)
Exactly. I've been following it for about four years now, and they occasionally throw out a few interesting videos and such, but ultimately I haven't seen anything new from their team in quite some time. It was an interesting choice, selling out to Intel of all places... I only hope they don't turn it into another Duke Nukem Forever.
Re: (Score:3, Interesting)
I used to work with Sam Mcgrath and I consider him an old friend. I was fortunate enough to be there from the very start of his new engine and see it develop back when there was no company or anything...
He blew me away years ago with the very basics of its shader editing and render quality. I haven't seen newer versions of it in years, but... Sam was kicking ass from the start of it. Trust me.
Sam is an incredibly talented coder, perhaps one of the best and most hard working out there. Sammy, best of luck to
Re: (Score:2)
In all fairness, we do tend to browse the pictures.
Re: (Score:2)
I assume they use Havok, both Intel owned ya know.
Priorities first! (Score:2)
Attention, developers: graphics are not the most important thing.
For example, the two Sonic Adventure games for the Dreamcast were imperfect but very enjoyable. Now check Sonic the Hedgehog for PS3/X360. It looked far better, but it had craploads of game-breaking glitches, long loading times, and overall poor design, so the reviews were mostly negative. Another example: Doom. Everyone loved the first two games... then came Doom 3, which looked stunning but played more like a survival horror game. How can som
Who are you arguing with? (Score:2)
Re: (Score:3, Funny)
Oh, so *Doom 3* played like a survival horror game.
I see.
Re: (Score:2)
Oh, so *Doom 3* played like a survival horror game.
In the GP's defense, maybe they finally made that version bright enough so you could see what was going on in the game.
Re: (Score:2)
Doom. Everyone loved the first two games... then came in Doom 3, that looked stunning, but played more like a survival horror game. How can someone take such a wild, frantic, exhilarating series and make something so boring out of it?
That really is a sad statement on how far survival horror games have fallen when someone thinks Doom 3 fits that genre. Doom 3 was just a crappy FPS... walk into a dark room, shoot the bad guy who is always positioned in an out-of-the-way corner, rinse and repeat. It was never survival horror... if I told my wife what you said she would be quaking with fury, and her ranting would be epic. She and I both miss the glory days of survival horror...
That said, the rest of your point still stands and I agree w
Re: (Score:2)
Wait, have you played the original Sonic games on the Genesis? If so, how can you say that ANY 3D Sonic game is good?
The 2D games are overrated anyway: Sonic 1 was fantastic, sure, but Sonic 2 was quite mediocre, Sonic 3 was pretty good, and Sonic & Knuckles was terrible (especially the soundtrack).
I must be jaded (Score:2)
So I went to the link in the summary to see the video, and I MUST be too jaded. It looked *exactly* like a level from Unreal Tournament 3. I love that game, so that's all well and good. I'm sure my laptop could render that youtube clip in realtime without a problem. It still seemed fake to me. The movement of the foliage was too "calculated", as was much of the debris when it fell. The camera motion was "too perfect" and looks exactly like what my camera moves look like in After Effects, which bear very lit
Re: (Score:2)
Re: (Score:2)
Agreed; that seems to be common practice in video games. The point I was getting at is that the paths seem a little "too perfect", and the motion itself seems a bit linear and calculated. I'm not saying that they need to have Michael Bay program the cameras, but for true photorealism, the camerawork needs to be less computationally convenient.
define movie quality (Score:3, Insightful)
Currently, the big push in 3d rendering is towards physically based raytrace or pathtrace rendering.
http://en.wikipedia.org/wiki/Path_tracing [wikipedia.org]
http://en.wikipedia.org/wiki/Ray_tracing_(graphics) [wikipedia.org]
Physically based rendering produces a much more accurate representation of how light interacts with and between surfaces. It has always taken a long time to render using physically based techniques, due to the huge number of calculations necessary to produce a grain-free image. This has changed somewhat recently with multi-core systems, and with GPGPU languages such as CUDA and OpenCL we are about to experience a big and sudden increase in performance for these rendering technologies.
While this game looks great, the engine is by no means going to be capable of rendering scenes containing hundreds of millions of polygons, ultra-high res textures, physically accurate lighting and shaders, and high render resolution. We are still pretty far away from real-time physically-based rendering, which is the direction film is currently headed. So that would have to be what "Movie-Quality CGI" is defined as and this game does not live up to that definition.
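The huge number of calculations needed for a grain-free image is a consequence of Monte Carlo variance, which shrinks only as 1/sqrt(N) in the sample count. A minimal sketch (my own illustration, not any particular renderer's code): estimate the irradiance of a uniform unit-radiance sky on a diffuse surface, whose exact value is π.

```python
import math
import random

def irradiance_estimate(n_samples, rng):
    """Monte Carlo estimate of irradiance E = ∫ L·cosθ dω over the
    hemisphere for a uniform sky of radiance L = 1 (exact answer: π).
    For a direction sampled uniformly on the hemisphere, cosθ is
    uniform on [0, 1], and the sampling pdf is 1/(2π)."""
    total = sum(rng.random() for _ in range(n_samples))  # sum of cosθ values
    return 2 * math.pi * total / n_samples

rng = random.Random(42)
for n in (16, 256, 4096, 65536):
    est = irradiance_estimate(n, rng)
    print(n, est, abs(est - math.pi))   # error shrinks roughly as 1/sqrt(n)
```

Quadrupling the sample count only halves the noise, which is why path-traced frames need thousands of samples per pixel and why the render times stay long even as hardware improves.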
Re: (Score:3, Interesting)
It's also misleading because films can cheat. You can't see something from every angle and cameras don't always have to move through a space so a lot of what you see are flat cards carefully hand painted and positioned in 3D space.
In the end what really holds back video games is their memory. A small scene can consume in excess of 8GB of memory. That's fine on the CPU where you have a lot of RAM and you can swap back and forth from the HDD. With a GPU you have to load everything into memory which is
Re: (Score:3, Interesting)
>As long as games can't go through a post-process hand tuned by a team of artists for weeks
Well, I don't know anything about movie production, but I highly doubt they do this. Are you really saying they take their pristine movie output and begin to photoshop it, making adjustments at the frame level? Do you know how laborious that is when you could just, oh, I don't know, adjust the model you already have and rerender those frames? D
Pictures? (Score:2, Interesting)
Okay, so this is slightly off-topic, but something I've always wondered about.
I can take a 12-megapixel picture and reduce it down to a 12k GIF. Or 120k, or whatever the compression results are.
At that point, it's just a .gif. (or .jpg or whatever). The computer doesn't know it's any different than a .gif I created in MSPaint, right?
So I could open GameMaker 7 and use that photo as one of the frames in my character's animation. By repetition, I could create a character moving and walking frame by frame.
Right?
Re: (Score:2)
At one time people did that; see the famous game Myst.
These days people like moving wherever they want.
Re:Pictures? (Score:5, Interesting)
Actually, that's how the characters in the older Myst games worked (except that they used this great new technology called "video camera" to get moving pictures into them).
This was fine in those games, because the viewpoint was always fixed. That's a restriction you don't want to have in current games.
Re: (Score:2)
Because you can never anticipate every situation you'll need to photograph.
Let's say you want a walk cycle of your character. Let's say a loop of about 1 second at 30fps. Now you have:
30 frames.
Now you want your character to turn, so you need it to rotate: you have to shoot about 1080 more angles. So now you're up to:
32,400 frames.
Oops but now you also need to see this walking character from above or below. Let's say 200 degrees and assume nobody will ever see it from right below the ground. Ok n
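The combinatorial explosion this comment describes can be tallied directly. The sprite size in the memory estimate is my own illustrative assumption; the other numbers are the comment's:

```python
# Frame counts explode as you add viewing freedom to pre-rendered sprites.
walk_frames = 30                  # one-second walk cycle at 30 fps
yaw_angles = 1080                 # horizontal viewing angles (comment's figure)
frames = walk_frames * yaw_angles
print(frames)                     # -> 32400

# Illustrative memory cost, assuming (my assumption) 256x256 RGBA sprites.
bytes_per_frame = 256 * 256 * 4
print(frames * bytes_per_frame // 2**20, "MiB uncompressed")
```

Adding elevation angles multiplies the count yet again, which is why games render characters from polygons instead of storing photographs of every pose.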
Re: (Score:2)
The problem comes down to dynamic lighting and shading, animation and interaction.
You can't fake lighting, shadows, and objects dynamically interacting with each other with image sequences. It's just impossible.
I'm not sure you've thought this through fully.
Boof (Score:2, Interesting)
So this means we are going to see games with movie budgets and no gameplay at all... we already do, but the balance will tilt away from gameplay even further as manpower shifts to graphics.
great graphics, dull games (Score:2)
Game graphics seem to be getting better and better while the games seem to be getting duller and duller. Mass Effect 2 and Bioshock 2 are hardly games at all anymore; they're little more than movies with a fast-forward button.
CGI (Score:3, Funny)
CGI is awful, they could at least have tried for EGA
FINALLY!!!one (Score:5, Funny)
It was such a pain, when computers couldn't achieve the quality of COMPUTER GENERATED IMAGES
A single comment (Score:2)
"Movie quality" is a relative term (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
>At what point will the hardware capabilities exceed the software we can write?
With the state that the education system is in, I'd say not far off at all.
Re:Who will write the software for the bird? (Score:5, Insightful)
Never. More hardware means programmers can get away with writing less efficient code.
Re: (Score:2)
>> At what point will the hardware capabilities exceed the software we can write?
> Never. More hardware means programmers can get away with writing less efficient code.
Which is good. Even if you believe in writing good code, this translates to allowing more layers of abstraction. Additional layers of abstraction are generally less efficient, and make it easier to wire together complex components. Consider inverse kinematics and physics engines as examples.
Consider how much you could improve the imm
Re: (Score:2)
Re: (Score:2)