The Quest For Frames Per Second In Games 72
VL writes "Ever wondered exactly why everyone keeps striving for more frames per second from games? Is it simply for bragging rights, or is there more to it than that? After all, we watch TV at 30fps, and that's plenty." This editorial at ViperLair.com discusses motion blur, the frame-rate difference between movies and videogames, and why "...the human eye is a marvelous, complex and very clever thing indeed, but... needs a little help now and then."
Re:it plays better (Score:3, Interesting)
No (Score:5, Interesting)
2: Remember, FPS is the *average* framerate. It may dip well below that mark. My goal is not to have the most FPS but to have a reasonably high resolution with FSAA and AF on, all the detail settings to full, and to never have the game dip below my monitor's refresh rate (75Hz).
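The gap between an average and a worst-case framerate is easy to see with some made-up frame times (the numbers here are hypothetical, just for illustration):

```python
# Hypothetical per-frame render times in milliseconds: nine quick frames
# and one slow one.
frame_times_ms = [10, 11, 10, 12, 10, 45, 10, 11, 10, 12]

# Average FPS looks healthy...
avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
# ...but the slowest frame is what you actually feel as a stutter.
worst_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} FPS, worst frame: {worst_fps:.0f} FPS")
# average: 71 FPS, worst frame: 22 FPS
```

A benchmark reporting "71 FPS average" hides that 22 FPS dip, which is exactly the parent's point.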
Consistency also matters (Score:2, Interesting)
I'm still waiting for the day when machines are good enough and code works well enough for games to be considered "real-time" (meaning fixed steps at, say, 60Hz - and the game is NOT ALLOWED to take longer than that 1/60th sec to render a frame).
- Disgruntled Planetside player who wishes that game always ran at 60Hz.
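What the parent is asking for is essentially a fixed-timestep loop; here's a minimal sketch in Python (the names and numbers are illustrative, not from any actual engine):

```python
FIXED_DT = 1.0 / 60.0  # the simulation is only ever stepped by exactly 1/60 s

def run_fixed_timestep(num_refreshes, real_frame_time, update):
    """Classic fixed-timestep loop: wall-clock time goes into an
    accumulator, and the simulation advances in whole 1/60 s steps,
    running extra catch-up steps when the renderer falls behind."""
    accumulator = 0.0
    steps = 0
    for _ in range(num_refreshes):
        accumulator += real_frame_time  # time the last frame really took
        while accumulator >= FIXED_DT:
            update(FIXED_DT)            # every update sees the same dt
            accumulator -= FIXED_DT
            steps += 1
    return steps

# A machine keeping up at 60 Hz runs one update per refresh...
print(run_fixed_timestep(60, 1.0 / 60.0, lambda dt: None))  # 60
# ...while one rendering at only 30 Hz runs two updates per refresh.
print(run_fixed_timestep(60, 1.0 / 30.0, lambda dt: None))  # 120
```

The simulation stays deterministic either way; only the rendering degrades when the machine can't keep up.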
Re:Some serious flaws render the piece useless (Score:4, Interesting)
This is the visual cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery flows smoothly from one point to the next, with no jumps or flickering to be seen. If the eye didn't add this motion blur, we would still see all of the detail, but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out at different points. It's pretty simple to test this.
>>This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself.
It does. It's about 7000 FPS (+ or - for each individual).
The way bio-psychs tested this is by taking a high-speed controllable projector that ranged from 30 FPS to 20000 FPS. Subjects were led into a totally black room with a mic, then directed to look at the projector screen, marked by a red dot. Once the pattern started, the projector ran a 3-second sequence and flashed a number on a single frame. The average FPS at which subjects failed to notice the number was about 7000 FPS.
>>>>(Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog.
You just can't say that. The ion channels are directly countable and lead to a time-based binary system like that of Morse code. Not even biologists are sure about that.
>>>>>(Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.)
It's not that the rods/cones (rods are black-and-white, cones are color) react quickly, it's that the chemical breakdown takes a while. Take the simple theater test: go from the sunny outdoors into a theater. You pretty much can't see anything. It takes about 15 minutes to FULLY 'charge up' the rods back to full usage. But when you walk out of that sucky movie, the reverse happens much faster: daylight is briefly blinding until your eyes re-adapt.
Another side effect of bright light is that you cannot see absolute 'white' or 'black'. Similarly, in dark rooms you cannot easily see color, as the cones need far more light than the rods before they respond.
>>>>>In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)
Relative motion (Score:3, Interesting)
If you have something travelling at a velocity of 600 pixels/s on your screen (not uncommon for objects in FPS games) it is much easier to track it at 100 FPS (relative motion of 6 pixels per frame) than 30 FPS.
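The arithmetic behind that is trivial but worth seeing spelled out:

```python
speed = 600  # pixels per second, a typical flick across the screen in an FPS

# How far the object "teleports" between consecutive frames at each rate.
for fps in (30, 60, 100):
    print(f"{fps:3d} FPS -> {speed / fps:.0f} px jump per frame")
# 30 FPS -> 20 px
# 60 FPS -> 10 px
# 100 FPS -> 6 px
```

A 20-pixel jump per frame reads as discrete teleporting; a 6-pixel jump is much closer to continuous motion your eye can track.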
Timing is important (Score:3, Interesting)
When you have chosen a refresh rate, the optimal FPS is exactly that same number. Generating more FPS is wasted effort that just gives worse quality: you would either be skipping frames, which harms animation, or showing parts of different frames at the same time, which produces visible horizontal tear lines where the two parts don't match. And yes, you will spot those broken images even when they are shown for only 1/100th of a second.
But generating 100 FPS and showing 100 FPS is not enough; you have to ensure each frame is shown exactly once. That requires a little help from the graphics hardware, but nothing that is hard to implement. Having a little extra processing power is important, too: you must be able to produce every frame fast enough. You don't want to miss a deadline because occasionally one frame takes just a little more CPU time to render.
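The deadline point can be sketched with some hypothetical render times; one slow frame ruins the pacing even when the average looks comfortable:

```python
REFRESH = 1.0 / 100.0  # 10 ms budget per refresh at 100 Hz

def missed_deadlines(render_times):
    """Count frames that overran the refresh budget; each miss means a
    stale frame is repeated on screen - a visible hitch."""
    return sum(1 for t in render_times if t > REFRESH)

# 100 quick 9 ms frames plus one 15 ms spike: the average is a
# comfortable ~110 FPS, yet one refresh still misses its deadline.
times = [0.009] * 100 + [0.015]
avg_fps = len(times) / sum(times)
print(f"avg {avg_fps:.0f} FPS, missed deadlines: {missed_deadlines(times)}")
```

So "can it average 100 FPS?" is the wrong question; "does every single frame fit in 10 ms?" is the one that matters.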
Is there anything correct in that article? (Score:3, Interesting)
TV has an effective framerate of 60fps*, movies are 24 and cartoons are usually 12 fps. Those can all show motion just fine as long as you don't move things too fast for the medium. The average PC monitor has a refresh rate under 90hz, not really much better than the 60hz of television, so you still can't let an object move as quickly from one side of the screen to the other as we can perceive it in real life. As someone mentioned, setting the refresh rate at 72 or 80 or whatever doesn't make your eyes hurt has nothing to do with our motion perception. In normal office use you want to set it as low as possible while still avoiding flicker, so that you don't waste cycles on displaying that one character you just typed into emacs a few ms faster. If you are playing a game you want to set it as high as your monitor will take (up to 120hz at decent resolution on some monitors), while still keeping this number below the number of frames the game can render per second so that it doesn't have to show the same frame twice and mess up the motion.
Film in a projector does not flicker like a monitor running at 24hz. The reason a monitor flickers is that the phosphor brightness decays. A film screen is fully lit while the film is in front of the light; it flickers simply because the time it takes to change frames is not zero, and doubling the rate to 48 frames per second would increase the time the screen was dark between frames.
*Yes, TV has 30 'frames', but this is just how many times you redraw the phosphors; as far as motion is concerned you have 60 separate images representing 60 different snapshots in time (assuming this was really shot as TV and not up-converted film). Your eyes don't care that the samples are offset; it is not like you ever see one moment with the same receptors as the next, since they need a regeneration time before they can sample again. And they are not synchronized at a specific FPS, so the flicker explanation was all wacky. The reason you see those nasty line artifacts when watching TV on your computer without a decent TV application like 'tvtime' is that simple TV apps like XawTV show two fields sampled at different times at the same time, often for a variable 2-3 refreshes if your refresh rate is between 61 and 89hz. If you show those at the standard 60hz, interlaced, at a TV-compatible resolution, you won't see those artifacts outside a freeze frame, though you will get more flicker than a regular TV because the phosphors in monitors decay faster to avoid ghosting at the higher frequency and contrast they deal with.
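The two-fields-woven-together artifact described above can be mocked up in a few lines (toy resolution and motion, purely illustrative):

```python
WIDTH = 8  # a toy picture: 8 pixels wide, 4 scanlines tall

def scanline(obj_x):
    """One scanline with a 1 px bright object at column obj_x."""
    return "".join("#" if x == obj_x else "." for x in range(WIDTH))

# The object sits at x=2 when field A (even lines) is sampled and has
# moved to x=4 by the time field B (odd lines) is sampled 1/60 s later.
# Naively weaving both fields into one frame combs the object across
# alternating scanlines - the "nasty line artifacts".
frame = [scanline(2 if y % 2 == 0 else 4) for y in range(4)]
print("\n".join(frame))
# ..#.....
# ....#...
# ..#.....
# ....#...
```

A deinterlacer like tvtime avoids this by interpolating or bobbing the fields instead of weaving two different moments in time together.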
Again, CRT flicker has nothing to do with frames rendered per second (FPS), and everything to do with how long-lasting the phosphors are with respect to the screen refresh rate. A film projector's flicker is a completely different beast. Heck, LCD flicker is completely unrelated to refresh rate and has everything to do with your backlight's ballast (fluorescent) or temperature (halogen). FPS above about 50-60 fps is all about res