The Quest For Frames Per Second In Games
VL writes "Ever wondered why it is exactly everyone keeps striving for more frames per second from games? Is it simply for bragging rights, or is there more to it than that? After all, we watch TV at 30fps, and that's plenty." This editorial at ViperLair.com discusses motion blur, the frame-rate difference between movies and videogames, and why "...the human eye is a marvelous, complex and very clever thing indeed, but... needs a little help now and then."
Some serious flaws render the piece useless (Score:5, Informative)
This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself. (Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog. (Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.) In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)
(Note in the portion I italicized how he jumps from the "vision cortex" to "the eye"; the two are NOT the same and can't be lumped together like that in this context.)
This simple error renders the entire second page actively wrong.
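The accumulation argument can be sketched numerically. This is a toy model of my own (the receptor layout and numbers are purely illustrative, not from the article): a bright point moves across a 1-D row of receptors while each receptor integrates light over a fixed exposure window, so the recorded "image" is smeared even though the point is sharp at every instant, and no later processing stage can undo that.

```python
# Toy 1-D model: a moving point of light integrated over an exposure
# window produces motion blur. Receptors are indexed 0..9; the point
# moves one receptor per time step during the exposure.

def exposed_image(n_receptors, path, brightness=1.0):
    """Accumulate light on each receptor over the whole exposure."""
    image = [0.0] * n_receptors
    for pos in path:            # one position per time step
        image[pos] += brightness
    return image

sharp = exposed_image(10, path=[3])             # stationary point: no blur
blurred = exposed_image(10, path=[3, 4, 5, 6])  # moving point: smeared

print(sharp)    # all the light lands on receptor 3
print(blurred)  # spread across receptors 3-6: the sharp position is lost
```

The point of the sketch: `blurred` contains strictly less positional information than `sharp`, which is why the cortex can't simply "remove" the blur afterwards.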
Here's another, referring to interlacing:
Uh, wrong wrong wrong. Interlacing was a cheap hack to save bandwidth. "Progressive scan" is universally considered superior to interlacing (in terms of quality alone), and many (such as myself) consider keeping interlaced video modes in HDTV to be a serious long-term mistake. It has nothing to do with convincing you you are seeing motion; in fact it has a strongly deleterious effect, because you can frequently see the "combing"; that's why TVs have "anti-comb" filters. You don't see it as "motion", you see it as weird "tearing".
ALSO wrong. The computer monitor and video card will pump out X frames per second, period. It has to. If the CRT is going at 60 fps and the video card (as in the 3D hardware) is only pumping at 30 fps, every frame will be shown for two CRT cycles. What else is the video card (as in the rasterizer) going to display? You'd notice if the screen were blank every other cycle!
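The double-display behavior described here is just a mapping from display refreshes to rendered frames. A minimal sketch (the 60 Hz / 30 fps rates are the ones from the comment; the function name is mine):

```python
# If the CRT refreshes at 60 Hz but the renderer only delivers 30 fps,
# each rendered frame is simply scanned out twice; the screen is never
# blank on the "missing" refreshes.

REFRESH_HZ = 60
RENDER_FPS = 30

def frame_shown(refresh_index):
    """Rendered-frame number displayed on a given CRT refresh."""
    return refresh_index * RENDER_FPS // REFRESH_HZ

shown = [frame_shown(i) for i in range(8)]
print(shown)  # [0, 0, 1, 1, 2, 2, 3, 3]: every frame held for two refreshes
```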
Wrong again. CRTs at that frequency are "flicker free" because they exceed the frequency that the parts of our eyes more sensitive to motion can detect (actually the peripheral vision, not the "primary" vision we're using).
uh (Score:1, Informative)
Re:Motion Pictures (Score:3, Informative)
Wrong. They show 24 fps. (There's also a bit of black in between each frame, otherwise the eye would register a blur; but it's still 24fps.)
If the projector were run at normal speed and showed each frame twice, it would look like choppy slow motion. If it were run faster, at 48 fps, the motion would be sped up, the way old silent pictures often look.
You would need a print with every frame printed twice in a row for it to work, and then a faster projector than is safe for most film.
There are certain camera systems under development which would shoot film at 48 fps, and you'd then need a projector that could show the film at 48fps, but the standard rate for cameras and projectors for the last fifty years, everything you've ever seen in a cinema, has been 24fps.
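The speed claims above come down to simple arithmetic. A sketch (the capture/projection rates are the ones discussed in the thread; the silent-film rate of 16 fps is a typical illustrative value for hand-cranked cameras, and the function name is mine):

```python
# Perceived playback speed relative to real time:
# speed = transport rate / (capture rate * repeats of each frame on film).

def playback_speed(capture_fps, transport_fps, repeats_per_frame=1):
    return transport_fps / (capture_fps * repeats_per_frame)

# Each frame printed twice, projector at the normal 24 fps transport:
print(playback_speed(24, 24, repeats_per_frame=2))  # 0.5 -> choppy slow motion

# 24 fps footage run through a 48 fps transport:
print(playback_speed(24, 48))  # 2.0 -> sped-up motion

# Silent film shot at roughly 16 fps but projected at 24 fps:
print(playback_speed(16, 24))  # 1.5 -> the familiar fast silent-film look

# Film both shot and projected at 48 fps:
print(playback_speed(48, 48))  # 1.0 -> normal speed
```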
Cinematographers also avoid certain shots, like rapidly panning from left to right, which look terrible on a movie screen.
It's called a swish pan, and it makes for a nice transition, if you cut in between two of them. But you don't have to, and it doesn't look "terrible."
Whoever modded you up is embarrassingly ignorant of the topic at hand.
sorry murdock, but you are wrong. (Score:2, Informative)
And no, the film doesn't have to have the frames on it multiple times. The transport mechanism in a projector works like this: light off, move the film forward to the next frame, stop, light on, light off, move forward, stop, light on, and so on.
Now, instead of having ONE phase of light during a frame, modern projectors have TWO or THREE of them:
light off, move forward, stop, light on, light off, light on, light off, move forward
at least that's what I learned in school
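The double-shutter cycle described above can be written out as a small state sequence. This is a sketch of my own (the event names are illustrative, not standard projector terminology): the film advances once per frame, but the shutter opens twice per frame, so a 24 fps print flashes at 48 Hz without any frame appearing on the film twice.

```python
# Double-shutter projector: one film advance per frame (in the dark),
# then two flashes of the same stationary frame.

def shutter_events(n_frames, flashes_per_frame=2):
    events = []
    for frame in range(n_frames):
        events.append(("advance", frame))    # pull-down while the light is off
        for _ in range(flashes_per_frame):
            events.append(("flash", frame))  # same frame lit again
    return events

events = shutter_events(2)
print(events)
# [('advance', 0), ('flash', 0), ('flash', 0),
#  ('advance', 1), ('flash', 1), ('flash', 1)]

flashes = [e for e in events if e[0] == "flash"]
print(len(flashes))  # 4 flashes for 2 frames: the flicker rate is doubled
```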