The Quest For Frames Per Second In Games
VL writes "Ever wondered why exactly it is that everyone keeps striving for more frames per second from games? Is it simply for bragging rights, or is there more to it than that? After all, we watch TV at 30fps, and that's plenty." This editorial at ViperLair.com discusses motion blur, the frame-rate difference between movies and videogames, and why "...the human eye is a marvelous, complex and very clever thing indeed, but... needs a little help now and then."
it plays better (Score:3, Insightful)
Re:it plays better (Score:3, Interesting)
Constant Frame Rate (Score:2)
uh (Score:1, Informative)
Re:it plays better (Score:3, Insightful)
Re:it plays better (Score:1)
I've played Project IGI (the first one), which had smooth gameplay compared to some of the more modern FPSes. It wasn't perfectly smooth, but it was hard to detect any significant jumps between frames.
I was surprised that it ran at 30 FPS, constantly. There wasn't even a loading delay between indoor and outdoor areas. It was even smoother than D1X running at 30 FPS.
Ugh (Score:5, Insightful)
Author seems to understand about as much about the primate visual system as... well... anyone else that's never studied it. The visual cortex doesn't "add blur."
His general point is probably correct, but his reasoning is fucked.
Re:Ugh (Score:1)
However, when writing an article or paper that you expect to be published in some format, you are expected to take a little more care. If someone were to post an article criticizing someone for spelling
Re:Ugh (Score:1)
Motion Pictures (Score:3, Insightful)
Re:Motion Pictures (Score:3, Informative)
Wrong. They show 24 fps. (There's also a bit of black in between each frame, otherwise the eye would register a blur; but it's still 24fps.)
If the projector was run at normal speed and showed each frame twice, it would look like choppy slow motion. If it was run faster at 48 fps, the motion would be fast, like how you often see old silent pictures.
You would need a print with every frame printed
sorry murdock, but you are wrong. (Score:2, Informative)
And no, the film doesn't have to have the frames on it multiple times. The transport mechanism in a projector works like this: light off, move the film forward to the next frame, stop, light on, light off, move forward, stop, light on, and so on.
Now, instead of having ONE phase of light during each frame, the shutter gives you two (or sometimes three).
No, I'm not. (Score:2)
The film is pulled frame by frame in front of the lens, and you may get the impression of flicker, but that's only because of a misaligned shutter that's in front of the bulb-- it lets light through when the frame is aligned, and blocks the light as the next frame is being pulled down. This happens 24 times per second.
You may want to consult this article [howstuffworks.com] at How Stuff Works [howstuffworks.com], specifically the fourth page [howstuffworks.com], which deals with bulbs, shutters, etc.
Re:No, I'm not. (Score:1)
Of course not. My bad, I didn't mention the shutter, but the effect is the same (light on and off).
The school was the Polytechnical University for Media Technology and Design in Hagenberg, Upper Austria, and while I think howstuffworks is a great resource, I'm sure what I learned there is correct.
If you check the first few paragraphs of this [grand-illusions.com], you'll see that the concept isn't new either:
Re:No, I'm not. (Score:2)
In any case there are usually 48 blinks of light per second (sometimes 72, but I believe that may only be on very old projectors designed to project silent films at 18 fps). The trick is that the same frame is shown in more than one blink.
A light blinking 24 times a second is quite obvious, while 48 times per second is approaching the limits of what people can perceive.
Swish Pan (Score:2)
The projector does open the shutter twice for each frame to reduce the sensation of flicker.
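To make the two-blade-shutter arithmetic concrete, here is a minimal sketch (illustrative only, not from any projector spec): 24 frames per second, each flashed twice, gives 48 light pulses but still only 24 distinct images.

```python
# Illustrative only: enumerate shutter events for one second of 24 fps film
# with a two-blade shutter (each frame is flashed twice).
FPS = 24
FLASHES_PER_FRAME = 2  # two shutter openings per frame -> 48 flashes/sec

events = []  # (time in seconds, which frame is on screen)
for frame in range(FPS):
    for flash in range(FLASHES_PER_FRAME):
        t = (frame * FLASHES_PER_FRAME + flash) / (FPS * FLASHES_PER_FRAME)
        events.append((round(t, 4), frame))

assert len(events) == 48   # 48 light pulses per second...
print(events[:4])          # ...but each image appears twice:
# [(0.0, 0), (0.0208, 0), (0.0417, 1), (0.0625, 1)]
```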
Re:Motion Pictures (Score:2)
Do a swish pan across a row of vertical lines (like a fence or vertical blinds) and it will, indeed, look horrible. The 24 fps isn't adequate to the job.
It gets even worse on NTSC/PAL though, since the interlaced nature of the picture starts breaking things up horribly.
For true pain, take a movie that does a horrible swish pan like that and then transfer i
Re:They must like to part with their money. (Score:2)
Re:They must like to part with their money. (Score:1)
No (Score:5, Interesting)
2: Remember, FPS is the *average* framerate. It may dip well below that mark. My goal is not to have the most FPS but to have a reasonably high resolution with FSAA and AF on, all the detail settings to full, and to never have the game dip below my monitor's refresh rate (75Hz).
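To see why an average can hide the dips that actually hurt, here's a toy calculation with made-up frame times (the numbers are invented purely for illustration):

```python
# Hypothetical frame-time capture (ms): mostly fast frames plus a few stalls.
frame_times_ms = [10.0] * 190 + [50.0] * 10   # 190 smooth frames, 10 stalls

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s
worst_fps = 1000.0 / max(frame_times_ms)

print(f"average:     {avg_fps:.0f} fps")      # ~83 fps, looks fine on paper
print(f"worst frame: {worst_fps:.0f} fps")    # 20 fps, what you actually feel
```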
Re:No (Score:2)
The truth is that TV programs and movies are filmed WITH motion blur. That means that every frame (for films) is made during 1/24th of a second. It's actually a superposition of all the trillions of images that were projected onto the camera during that time. Our eye gets all the light that was destined for it, the only thing tha
Consistency also matters (Score:2, Interesting)
I'm still waiting for the day when machines are good enough and code works well enough for games to be considered "real-time" (meaning having fixed steps at, say, 60Hz - and the game is NOT ALLOWED to take longer than that)
movies are too slow (Score:1)
Re:movies are too slow (Score:1)
Re:movies are too slow (Score:1)
Good god.
You make it sound like there's an FPS101 at Gamer College that's mandatory.
Re:movies are too slow (Score:2)
I flunked out of Crate Opening 356. Ruined my entire academic career.
Some serious flaws render the piece useless (Score:5, Informative)
This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself. (Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog. (Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.) In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)
(Note in the portion I italicized how he jumps from the "vision cortex" to "the eye"; the two are NOT the same and can't be lumped together like that in this context.)
This simple error renders the entire second page actively wrong.
Here's another, referring to interlacing:
Uh, wrong wrong wrong. Interlacing was a cheap hack to save bandwidth. "Progressive scan" is universally considered superior to interlacing (in terms of quality alone), and many (such as myself) consider keeping interlaced video modes in HDTV to be a serious long-term mistake. It has nothing to do with convincing you you are seeing motion; in fact it has a strongly deleterious effect, because you can frequently see the "combing"; that's why TVs have "anti-comb" filters. You don't see it as "motion", you see it as weird "tearing".
ALSO wrong. The computer monitor and video card will pump out X frames per second, period. It has to. If the CRT is going at 60 fps and the video card (as in the 3D hardware) is only pumping at 30 fps, every frame will be shown for two CRT cycles. What else is the video card (as in the rasterizer) going to display? You'd notice if the screen were blank every other cycle!
Wrong again. CRTs at that frequency are "flicker free" because they exceed the frequency that the parts of our eyes more sensitive to motion can detect (actually the peripheral vision, not the "primary" vision we're using).
Re:Some serious flaws render the piece useless (Score:2)
A single use of the apostrophe key would do wonders for his prose. Maybe he thumbed the entire article. That would explain a lot.
Re:Some serious flaws render the piece useless (Score:2)
Re:Some serious flaws render the piece useless (Score:4, Interesting)
This is the Visual Cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery is smoothly flowing from one point to the next and there are no jumps or flickering to be seen. If the eye wasn't to add this motion blur, we would get to see all of the details still but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out to different points. It's pretty simple to test this.
>>This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself.
It does. It's about 7000 FPS (+ or - for each individual).
The way bio-psychs tested this is by taking a high-speed controllable projector that ranged from 30FPS to 20000FPS. Subjects were led into a totally black room with a mic, then directed by a red dot to look at the projector screen. Once the pattern started, the projector took a spread of 3 seconds and, for 1 frame, put a number on screen. The average FPS at which the subjects did NOT notice the number was about 7000FPS.
>>(Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog.
You just can't say that. The ion channels are directly countable and lead to a time-based binary system like that of Morse code. Not even biologists are sure about that.
>>(Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.)
It's not that the rods/cones (rods are black-white, cones are color) react quickly; it's that the chemical breakdown takes a while. Take the simple theater test. Go from a sunny outside into a theater room. You pretty much can't see anything. It takes about 15 minutes to FULLY 'charge up' the rods back to full usage. But when you walk out of that sucky movie into daylight, everything is painfully bright until your eyes adjust again.
Another side effect of bright light is that you cannot see absolute 'white' or 'black'. Similarly, in dark rooms you cannot easily see color, as it takes high-energy photons to allow you to see it.
>>In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)
Re:Some serious flaws render the piece useless (Score:1)
Because the only thing adding more energy to a red photon is going to get you is a green photon.
Re:Some serious flaws render the piece useless (Score:1)
Re:Some serious flaws render the piece useless (Score:1)
Damn, I thought I got motion blur because of all the shroooms.
Re:Some serious flaws render the piece useless (Score:2)
Relative motion (Score:3, Interesting)
If you have something travelling at a velocity of 600 pixels/s on your screen (not uncommon for objects in FPS games) it is much easier to track it at 100 FPS (relative motion of 6 pixels per frame) than 30 FPS.
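The per-frame arithmetic, spelled out in a quick sketch (same numbers as above):

```python
# Per-frame jump of an object moving at 600 px/s at a few frame rates.
speed_px_per_s = 600
for fps in (30, 60, 100):
    print(f"{fps:>3} fps -> {speed_px_per_s / fps:.0f} px jump per frame")
# 30 fps -> 20 px, 60 fps -> 10 px, 100 fps -> 6 px
```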
Re:Relative motion (Score:1)
Except that most gamers aren't using monitors that run at 100Hz at their gaming resolution, so they're not going to see every frame, and aren't going to see 6 pixels per frame. Never mind that it is uncommon for objects to move 600 pixels/sec unless you are moving your view quickly, which most people don't do.
Re:Relative motion (Score:1)
Yeah, most people. Gamers on the other hand will turn their view a lot *and* track what's going on around them. A lot of this happens subconsciously (i.e. it's fast) - for this a good framerate helps a lot. And an object crossing the screen in under one second isn't that uncommon; just think quake3 and jump pads.
the truth is... (Score:4, Funny)
Timing is important (Score:3, Interesting)
When you have chosen a refresh rate, the optimal FPS is exactly the same number. Generating more FPS is wasteful because it just gives worse quality. You would either be skipping frames, which harms animations, or you would be showing parts of different frames at the same time, which gives visible horizontal lines where the two parts don't match. And yes, you will spot those broken images even when they're only shown for 1/100th of a second.
But generating 100 FPS and showing 100 FPS is not enough; you have to ensure each frame is shown exactly once. It requires a little help from the graphics hardware, but nothing that is hard to implement. Having a little extra processing power is important: you must be able to produce every frame fast enough. You don't want to miss a deadline because occasionally one frame takes just a little more CPU time to render.
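A minimal sketch of the pacing idea (my illustration: time.sleep stands in for the vsync wait a real renderer would get from the graphics driver):

```python
import time

REFRESH_HZ = 100
FRAME_BUDGET = 1.0 / REFRESH_HZ

def render_frame():
    pass  # stand-in for simulation + draw work

deadline = time.perf_counter()
for _ in range(REFRESH_HZ):            # one second's worth of frames
    render_frame()
    deadline += FRAME_BUDGET           # fixed cadence: drift never accumulates
    delay = deadline - time.perf_counter()
    if delay > 0:
        time.sleep(delay)              # made the deadline: wait for the slot
    else:
        deadline = time.perf_counter() # missed it: resync instead of spiraling
```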
Re:Timing is important (Score:1)
I have to have a minimum of 85Hz in most lighting environments. I can tolerate refresh rates down to almost 60Hz with very low or no light, but once a light comes on it starts to interfere and the rate needs to come back up (I start getting headaches after about 30 minutes with 60-75Hz in a lit room).
When you have chosen a refresh rate, the optimal FPS is exactly the same number. Generating more FPS is w
Is there anything correct in that article? (Score:3, Interesting)
TV has an effective framerate of 60fps*, movies are 24 and cartoons are usually 12 fps. Those can all show motion just fine as long as you don't move things too fast for the medium. The average PC monitor has a refresh rate under 90Hz, not really much better than the 60Hz of television, so you still can't let an object move as quickly from one side of the screen to the other as we can perceive it in real life. As someone mentioned, setting the refresh rate to 72 or 80 or whatever rate doesn't make your eyes hurt has nothing to do with our motion perception. In normal office use you want to set this as low as possible while still avoiding flicker, so that you don't waste cycles on displaying that one character you just typed into emacs a few ms faster. If you are playing a game you want to set it as high as your monitor will take it (up to 120Hz at decent resolution on some monitors), while still keeping this number below the number of frames the game can render per second, so that it doesn't have to show some frames twice and mess up the motion.
Film in a projector does not flicker like a monitor running at 24 Hz. The reason a monitor flickers is that the phosphor brightness decays. A film screen is fully lit while the film is in front of the light; it flickers simply because the time it takes to change frames is not zero, and doubling the frames to 48 frames per second would increase the time the screen was dark between frames.
*yes TV has 30 'frames', but this is just how many times you redraw the phosphors; as far as motion is concerned you have 60 separate images representing 60 different snapshots in time (assuming this is really shot as TV and not an up-converted film). Your eyes don't care that the samples are offset; it is not like you ever see one moment with the same receptors as the next, they need a regeneration time before they can sample again. And they are not synchronized at any specific FPS, so the flicker explanation was all wacky. The reason you see those nasty line artifacts when watching TV on your computer without a decent TV application like 'tvtime' is because simple TV apps like XawTV are showing two fields sampled at different times at the same time, often for a variable 2-3 frames if your refresh rate is between 61 and 89 Hz. If you show those in the standard 60 Hz interlaced with a TV-compatible resolution you won't see those artifacts outside a freeze frame, though you will get more flicker than a regular TV because the phosphors in monitors decay faster to avoid ghosting at the higher frequency and contrast they deal with.
Again, CRT flicker has nothing to do with frames rendered per second (fps), and everything to do with how long-lasting the phosphors are with respect to the screen refresh rate. A film projector's flicker is a completely different beast. Heck, LCD flicker is completely unrelated to refresh rate and has everything to do with your backlight's ballast (fluorescent) or temperature (halogen). FPS above about 50-60 fps is all about res
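The "variable 2-3 frames" cadence mentioned above is easy to reproduce on paper. A sketch (my illustration, using integer math to avoid float rounding) mapping 30 fps content onto a 72 Hz monitor:

```python
# Each source frame f becomes visible at the first refresh at or after its
# capture time f/30 s, i.e. refresh number ceil(f * 72 / 30).
REFRESH_HZ, SOURCE_FPS = 72, 30
starts = [(f * REFRESH_HZ + SOURCE_FPS - 1) // SOURCE_FPS   # integer ceil
          for f in range(SOURCE_FPS + 1)]
durations = [b - a for a, b in zip(starts, starts[1:])]
print(durations[:10])   # [3, 2, 3, 2, 2, 3, 2, 3, 2, 2] -- uneven 2-3 cadence
```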
Re:Is there anything correct in that article? (Score:2)
Real bitch is we finally get cards that can push games past 30fps to 100fps, and then you enable AA and it drops like a rock.
Then GFX cards that rock at AA are released, then the games push the polygons up so even 3GHz CPUs and 256MB cutting-edge GFX cards can only pump 30fps in a firefight.
CS with 6x AA and high-poly skins looks awesome. Can't wait to see how HL2/Doom3 work on normal hardware.
Re:Is there anything correct in that article? (Score:1)
The REAL difference between film and games. (Score:3, Insightful)
The reason film projection can smoothly present video is the blur on film caused by movement of the target on a slow-shutter camera. This blur actually helps, because when displayed among 24 frames in one second (all having the blur effect themselves) it looks rather fluid. Even digital movie cameras accomplish their video quality using the same trick.
Video cards, however, do not have the luxury of using this trick for video games. To show the movement of an avatar, for example, every single measurable instant of movement must be rendered separately. Those instants are misleadingly called "frames". Achieving higher framerates is actually critical for good gameplay because there are more instants in a given amount of time. That's why low fps feels sluggish in some games: 15/20/25/etc. instants are simply not enough to show fluid movement. I myself feel right at home at around 75 fps in any first person shooter or what not. This is because the human brain registers information from the eyes at about 75 Hz (at least that's what I was taught).
So, next time you hear "24 fps is all you should need!", you can tell them why it's not.
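If you want to see the "superposition" idea in code, here's a toy sketch (my construction, not from the article): a film-style frame averages many sub-frame exposures, the way a 1/24 s open shutter integrates light, while a game-style frame is a single instantaneous sample.

```python
# Toy 1-D "screen": a single bright dot moving left to right.
WIDTH, FPS, SUBSAMPLES = 24, 24, 8
SPEED = 240.0  # dot speed in pixels per second -> 10 px per 1/24 s frame

def dot_image(x):
    """One instantaneous exposure: a single lit pixel."""
    img = [0.0] * WIDTH
    img[int(x) % WIDTH] = 1.0
    return img

def blurred_frame(t0):
    """Average SUBSAMPLES exposures across the frame's 1/FPS shutter time."""
    acc = [0.0] * WIDTH
    for i in range(SUBSAMPLES):
        img = dot_image(SPEED * (t0 + i / (SUBSAMPLES * FPS)))
        acc = [a + p / SUBSAMPLES for a, p in zip(acc, img)]
    return acc

sharp = dot_image(0.0)        # game-style: one instant, one lit pixel
blurred = blurred_frame(0.0)  # film-style: energy smeared along the path
print(sum(1 for p in sharp if p), sum(1 for p in blurred if p))  # 1 vs 8
```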
Re:The REAL difference between film and games. (Score:1)
Re:The REAL difference between film and games. (Score:1)
Motion blur is a natural effect on film, but on a computer it'd have to be specifically computed, which would only make things worse. If you only get 30 fps already, and motion blur slows it down to 10, then it's going to be too slow for the motion blur to be of much use.
Re:The REAL difference between film and games. (Score:2)
This was the next big thing for 3dfx; their 'T-buffer' was designed to do things like motion blur.
The idea being, of course, that yes, thirty or sixty FPS really is all you need *so long as you get the same blurring effects you get from film/video.*
Having 200 frames per second merely means that the jump from position to position gets smaller and smaller, in effect building in motion blur.
As an example, roll a ball across a table, at a speed that it takes one second. Film that with a film camera at fi
Grammar? (Score:4, Insightful)
I tried to RTFA, but I fainted mid-way during the first paragraph.
(They're all from the one paragraph introduction...)
Re:Grammar? (Score:2)
Well for me personally... (Score:3, Insightful)
psxndc
altered physics with high FPS is the reason (Score:1)
Physics approximations tied to frame rate (Score:2)
The games development algorithms mailing list [sourceforge.net] has recently covered this topic in some depth. (Apologies, the archives don't seem to be working properly at the moment.)
The problem can lie in the collision detection working with sampled points along the player's trajectory during a jump, checking for collisions between those points. The lower the frame rate, the larger the gaps between those sampled points.
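A toy illustration of that sampling gap (hypothetical numbers, not from the mailing list): a fast projectile versus a thin wall, comparing per-frame point checks against a swept check over the whole segment travelled between frames.

```python
# A wall 1 unit thick at x=100, and a projectile moving at 900 units/s.
WALL_NEAR, WALL_FAR, SPEED = 100.0, 101.0, 900.0

def point_sampled_hit(fps, duration=1.0):
    """Check only the positions the game actually steps through."""
    step, x, t = SPEED / fps, 0.0, 0.0
    while t < duration:
        if WALL_NEAR <= x <= WALL_FAR:
            return True
        x, t = x + step, t + 1.0 / fps
    return False

def swept_hit(fps, duration=1.0):
    """Check the whole segment travelled between frames (continuous detection)."""
    step, x, t = SPEED / fps, 0.0, 0.0
    while t < duration:
        if x <= WALL_FAR and x + step >= WALL_NEAR:  # segment overlaps wall
            return True
        x, t = x + step, t + 1.0 / fps
    return False

print(point_sampled_hit(30), swept_hit(30))      # False True: 30 fps tunnels
print(point_sampled_hit(1000), swept_hit(1000))  # True True
```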
Monitor refresh rates (Score:1)
I play Quake 3 at 800x600, so my monitor can refresh at 160Hz. With everything at minimum detail I get about 200fps average, so my real fps is a solid 160fps (except in very crowded areas).
This makes a very big difference, especially in 1v1 with railguns. Here every few milliseco
Summary (Score:2)
Go Back To School (Score:1)
Stupid. (Score:2)
His attempt to explain away the fact that 24-30 fps works fine for movies and television is an utter failure. Surrounding darkness is not why movies look smooth, and the feeling of continuity here has nothing to do with the afterimage effect. The refresh rate of televisions, resulting in "each frame being drawn twice", does not double the framerate of the
Re:Stupid. (Score:2)
This is undoubtedly true, but anyone is capable of telling the difference between 25 and 100 fps with a little training, and would suffer from 25 fps in a video game without knowing why. Then again, I consider myself pretty good at eyeing it, yet I'm sometimes off on my guess of the framerate by 60%. You need to be able to move an object fast enough for the motion to break down to get a good estimate of the framerate.
Translation (Score:2)
TV and movies can get away with 24 to 30 FPS mainly because the CAMERA that took the actor's picture kept the shutter open for almost the entire 1/24th of a second, so moving objects are blurry. That hides the fact that the actor moved quite a bit from one frame to the next, since the area between point A and point B is filled in with the blur.
The Power of Suggestion (Score:1)
It's all hype and power of suggestion.
Take a 30 fps scene, tell someone it's running at 75, and they will tell you, yes, it looks m-u-c-h better.
Some other points (Score:1)
1. A fixed frame-rate is better than even a high but variable frame-rate. A solid 30Hz can actually be better than 30-120 fluctuating. A ton of research has gone into how to make the graphics pipeline effectively lock to a fixed rate, and there's a good reason: variable frame-rates make people sick; fixed ones don't.
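The usual way to get that lock is a fixed simulation timestep decoupled from rendering. A minimal sketch (my illustration; the research above presumably covers fancier variants):

```python
import time

DT = 1.0 / 60.0   # simulation always advances in exact 1/60 s steps

def simulate(dt):
    pass  # physics step; behaviour no longer depends on render speed

def render():
    pass  # drawing, however long it takes

accumulator = 0.0
last = time.perf_counter()
for _ in range(600):             # stand-in for the main loop
    now = time.perf_counter()
    accumulator += now - last
    last = now
    while accumulator >= DT:     # catch up in fixed steps after a slow frame
        simulate(DT)
        accumulator -= DT
    render()
```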
Poorly researched (and thought out) article :P (Score:1)
Off the top of my head, I don't know where he's getting this "displayed twice" business. The closest thing I could think of to that is the technique of interlacing frames used for displaying images on a TV screen. But when a series of images are interlaced, they're definitely not being displayed "twice"... more like one half at a time.
Also, motion blur is not "added" in the visual cortex.
How to calculate minimum desirable fps (Score:1)
First, the bare minimum fps should be the rate at which flicker is no longer detected (according to the article it's 72 fps). In reality, the decay rate and decay curve of the display device are the real factors here, but this will soon be irrelevant, as you'll see.
If you have an object moving across the screen with no motion blur (such as with 3D games) at a low fps, you see multiple separate instances of that object rather than a single smoothly moving object.
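Putting the two criteria together, a sketch of the calculation this comment is heading toward (the 4-pixel gap is an assumed tolerance, not from the comment):

```python
# Minimum desirable fps: whichever is higher, the flicker threshold or the
# rate that keeps a moving object's frame-to-frame jump under MAX_GAP_PX.
FLICKER_FLOOR = 72   # fps at which flicker is no longer detected (per article)
MAX_GAP_PX = 4       # assumed largest acceptable jump between frames

def min_fps(speed_px_per_s):
    return max(FLICKER_FLOOR, speed_px_per_s / MAX_GAP_PX)

for speed in (100, 600, 1200):
    print(f"{speed:>5} px/s -> {min_fps(speed):.0f} fps minimum")
# 100 px/s -> 72 fps, 600 px/s -> 150 fps, 1200 px/s -> 300 fps
```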
High End Cards + New Games (Score:1)