
The Quest For Frames Per Second In Games

VL writes "Ever wondered exactly why everyone keeps striving for more frames per second from games? Is it simply for bragging rights, or is there more to it than that? After all, we watch TV at 30fps, and that's plenty." This editorial at ViperLair.com discusses motion blur, the frame-rate difference between movies and videogames, and why "...the human eye is a marvelous, complex and very clever thing indeed, but... needs a little help now and then."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Re:it plays better (Score:3, Interesting)

    by ctr2sprt ( 574731 ) on Sunday September 07, 2003 @01:04AM (#6891399)
    If you get a constant, true 30fps and the game action isn't tied to that framerate (rounding errors), then that would be okay. Of course, that's like physicists talking about frictionless surfaces or perfectly spherical objects, and about as attainable.
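
    Roughly what "not tied to that framerate" means in practice, as a minimal C++ sketch (update() and render() are made-up placeholders, not any particular engine's API): the simulation advances in fixed steps no matter how fast or slowly frames are drawn.

        #include <chrono>
        #include <iostream>

        // Hypothetical stand-ins for a real engine's functions.
        void update(double dt) { /* advance the game world by dt seconds */ }
        void render(double blend) { std::cout << "frame (blend " << blend << ")\n"; }

        int main() {
            using clock = std::chrono::steady_clock;
            const double dt = 1.0 / 60.0;   // fixed simulation step, independent of render speed
            double accumulator = 0.0;
            auto previous = clock::now();

            for (int frame = 0; frame < 300; ++frame) {   // bounded here; a game would loop forever
                auto now = clock::now();
                accumulator += std::chrono::duration<double>(now - previous).count();
                previous = now;

                // Run as many fixed steps as real time demands, so a slow or fast
                // framerate never changes how far the game world advances.
                while (accumulator >= dt) {
                    update(dt);
                    accumulator -= dt;
                }
                render(accumulator / dt);   // optionally interpolate between sim states
            }
        }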
  • No (Score:5, Interesting)

    by RzUpAnmsCwrds ( 262647 ) on Sunday September 07, 2003 @01:18AM (#6891439)
    1: 30 frames per second is simply not enough. It's fine for movies and TV, but that is only because TV shows and movies are designed around the limits of the medium. Ever notice how TV shows and movies don't have a lot of quick, jerky movements? Those movements lead to motion sickness on TV and in movies, and they are exactly the movements you get in 3D games. 30fps makes me sick; I can tolerate 60fps.

    2: Remember, FPS is the *average* framerate. It may dip well below that mark. My goal is not to have the most FPS but to have a reasonably high resolution with FSAA and AF on, all the detail settings to full, and to never have the game dip below my monitor's refresh rate (75Hz).
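
    To see why the *average* hides the dips, here is a tiny C++ sketch with made-up frame times; the average looks fine while a couple of frames fall far below a 75Hz refresh:

        #include <algorithm>
        #include <iostream>
        #include <vector>

        int main() {
            // Hypothetical frame times (ms) for a short stretch of gameplay.
            std::vector<double> frame_ms = {12, 11, 13, 12, 45, 11, 12, 60, 11, 12};

            double total = 0.0;
            for (double ms : frame_ms) total += ms;
            double average_fps = 1000.0 * frame_ms.size() / total;

            double worst_ms = *std::max_element(frame_ms.begin(), frame_ms.end());

            std::cout << "average: " << average_fps << " fps, "
                      << "worst frame: " << 1000.0 / worst_ms << " fps\n";
            // Prints an average near 50 fps, but the worst frame is under 17 fps --
            // exactly the kind of dip below the monitor's refresh the poster wants to avoid.
        }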
  • by Anonymous Coward on Sunday September 07, 2003 @01:21AM (#6891445)
    If I can get a *SOLID* 30fps, I'd prefer that to a framerate that peaks at 60 and swoops down to 15 in places. I also can't stand it when vsync is turned off in games - tearing is horrible. A nice compromise is to keep vsync on when the framerate is high and turn it off if it drops below, say, 30fps (a rough sketch of this follows at the end of this comment).

    I'm still waiting for the day when machines are good enough and code works well enough that games can be considered "real-time" (meaning having fixed steps at, say, 60Hz - and the game is NOT ALLOWED to take longer than that 1/60th sec to render a frame).

    - Disgruntled Planetside player who wishes that game always ran at 60Hz. :(
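
    A rough C++ sketch of that vsync compromise, assuming a hypothetical set_vsync() that stands in for whatever swap-interval call the graphics API provides:

        #include <chrono>
        #include <iostream>
        #include <thread>

        // Placeholder: a real game would call its graphics API's swap-interval
        // function here (this stub just reports the change).
        void set_vsync(bool on) { std::cout << "vsync " << (on ? "on" : "off") << "\n"; }

        int main() {
            using clock = std::chrono::steady_clock;
            bool vsync_on = true;
            set_vsync(vsync_on);
            auto last = clock::now();

            for (int frame = 0; frame < 120; ++frame) {
                // Stand-in for rendering one frame: fast at first, then a heavy stretch.
                std::this_thread::sleep_for(std::chrono::milliseconds(frame < 60 ? 10 : 40));

                auto now = clock::now();
                double fps = 1.0 / std::chrono::duration<double>(now - last).count();
                last = now;

                // Keep vsync on while the framerate is comfortably high; drop it
                // (accepting some tearing) once it falls below 30fps, so vsync
                // doesn't halve an already low framerate.
                bool want_vsync = fps >= 30.0;
                if (want_vsync != vsync_on) {
                    vsync_on = want_vsync;
                    set_vsync(vsync_on);
                }
            }
        }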
  • by Creepy Crawler ( 680178 ) on Sunday September 07, 2003 @02:27AM (#6891611)
    I messed up the 'quote' delineation by putting open-brackets instead of close-brackets. Sorry for the gibberish post.

    > This is the Visual Cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery is smoothly flowing from one point to the next and there are no jumps or flickering to be seen. If the eye didn't add this motion blur, we would get to see all of the details still, but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out at different points. It's pretty simple to test this.

    >>This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself.

    It does. It's about 7000 FPS (+ or - for each individual).

    The way bio-psychs tested this is by taking a high-speed controllable projector that ranged from 30FPS to 20000FPS. Subjects were led into a totally black room with a mic. Then they were directed to look at the projector screen, marked by a red dot. Once the pattern started, the projector ran a 3-second spread and put a number on screen for a single frame. The average rate at which the subjects failed to notice the number was about 7000FPS.

    >>>>(Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog.

    You just can't say that. The ion channels are directly countable and lead to a time-based binary system like that of Morse code. Not even biologists are sure about that.

    >>>>>(Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.)

    It's not just that the rods/cones (rods are black-and-white, cones are color) can only react so quickly; it's that the chemical breakdown takes a while. Take the simple theater test. Go from the sunny outside into a theater. You pretty much can't see anything. It takes about 15 minutes to FULLY 'charge up' the rods back to full usage. But when you walk out of that sucky movie ;-), your eyes hurt (due to rapid depletion of rods) and your cones take effect very rapidly.

    Another side effect of bright light is that you cannot see absolute 'white' or 'black'. Similarly, in dark rooms you cannot easily see color, as it takes high-energy photons for you to see it.

    >>>>>In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)
  • Relative motion (Score:3, Interesting)

    by alyandon ( 163926 ) on Sunday September 07, 2003 @02:29AM (#6891619) Homepage
    FPS is important to FPS gamers because of one simple fact... relative motion.

    If you have something travelling at 600 pixels/s on your screen (not uncommon for objects in FPS games), it is much easier to track at 100 FPS (relative motion of 6 pixels per frame) than at 30 FPS (20 pixels per frame).
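
    The arithmetic is easy to check; a tiny C++ sketch with the same numbers:

        #include <iostream>

        int main() {
            const double speed = 600.0;   // on-screen speed in pixels per second
            for (double fps : {30.0, 60.0, 100.0}) {
                std::cout << fps << " fps -> " << speed / fps
                          << " pixels of movement between frames\n";
            }
            // 30 fps means 20-pixel jumps; 100 fps means 6-pixel jumps.
            // The smaller the jump, the easier the target is to track.
        }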
  • Timing is important (Score:3, Interesting)

    by kasperd ( 592156 ) on Sunday September 07, 2003 @03:02AM (#6891720) Homepage Journal
    If you are going to look at a CRT screen for a long time, you certainly want a high refresh rate. How much is enough probably depends on who you are, but 75Hz is not enough for me. I can hardly tell the difference between 85Hz and 100Hz, though; I think 100Hz is enough for most people.

    When you have chosen a refresh rate, the optimal FPS is exactly the same number. Generating more FPS is a waste because it just gives worse quality: you would either be skipping frames, which harms animations, or showing parts of different frames at the same time, which gives a visible horizontal line where the two parts don't match. And yes, you will spot those broken images even when they are only shown for 1/100th of a second.

    But generating 100 FPS and showing 100 FPS is not enough; you have to ensure each frame is shown exactly once. It requires a little help from the graphics hardware, but nothing that is hard to implement. Having a little extra processing power is important: you must be able to produce every frame fast enough. You don't want to miss a deadline because occasionally one frame takes just a little more CPU time to render.
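
    A small C++ sketch of that deadline point: give every frame a budget of one refresh period and flag any frame that overruns it (render_frame() is just a stand-in that burns some time):

        #include <chrono>
        #include <iostream>
        #include <thread>

        // Hypothetical stand-in for drawing one frame.
        void render_frame(int frame) {
            // Most frames are cheap; every 7th one is deliberately slow.
            std::this_thread::sleep_for(std::chrono::milliseconds(frame % 7 == 0 ? 14 : 6));
        }

        int main() {
            using clock = std::chrono::steady_clock;
            const double refresh_hz = 100.0;
            const std::chrono::duration<double> budget(1.0 / refresh_hz);

            for (int frame = 0; frame < 50; ++frame) {
                auto start = clock::now();
                render_frame(frame);
                auto elapsed = clock::now() - start;

                if (elapsed > budget) {
                    // This frame isn't ready for the next refresh: the previous frame
                    // gets shown twice and the animation visibly hitches.
                    std::cout << "frame " << frame << " missed its "
                              << budget.count() * 1000.0 << " ms budget\n";
                }
            }
        }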
  • by zenyu ( 248067 ) on Sunday September 07, 2003 @03:40AM (#6891801)
    I don't even play video games and I know the reason you need high FPS has nothing to do with the framerate at which you meld separate frames into motion. It's all about response time. When the game can render at 500 fps, it means you have to wait 1/76 + 1/500 + 'AI time' seconds for a response to something you do on the controller (a rough version of this sum is sketched at the end of this comment). This assumes your refresh rate is 76Hz. The 1/76 is fixed by your refresh rate, because unless you can do the entire redraw in the vertical retrace period and have dual-ported RAM on the video card, you need to double buffer. Some rendering engines, not designed for games, are actually triple buffered for better throughput. Video games are all about response time, and here you will sacrifice 1000 fps for that 500 fps to avoid adding an extra 1/76 to that timing sum. There is of course a certain point at which that number is high enough that you don't need to double buffer; in reality, those nv FX-2000s and ATI 98xxs are way too slow to approach that kind of framerate with the visual quality people want.

    TV has an effective framerate of 60fps*, movies are 24 and cartoons are usually 12 fps. Those can all show motion just fine as long as you don't move things too fast for the medium. The average PC monitor has a refresh rate under 90Hz, not really much better than the 60Hz of television, so you still can't let an object move as quickly from one side of the screen to the other as we can perceive it in real life. As someone mentioned, setting the refresh rate at 72 or 80 or whatever doesn't make your eyes hurt, but that has nothing to do with our motion perception. In normal office use you want to set it as low as possible while still avoiding flicker, so that you don't waste cycles on displaying that one character you just typed into emacs a few ms faster. If you are playing a game you want to set it as high as your monitor will take (up to 120Hz at decent resolution on some monitors), while still keeping this number below the number of frames the game can render per second, so that it doesn't have to show some frames twice and mess up the motion.

    Film in a projector does not flicker like a monitor running at 24Hz. The reason a monitor flickers is that the phosphor brightness decays. A film screen is fully lit while the film is in front of the light; it flickers simply because the time it takes to change frames is not zero, and doubling the rate to 48 frames per second would increase the time the screen is dark between frames.

    *Yes, TV has 30 'frames', but this is just how many times you redraw the phosphors; as far as motion is concerned you have 60 separate images representing 60 different snapshots in time (assuming this is really shot as TV and not an up-converted film). Your eyes don't care that the samples are offset; it is not like you ever see one moment with the same receptors as the next, and they need a regeneration time before they can sample again. And they are not synchronized at a specific FPS, so the flicker explanation was all wacky. The reason you see those nasty line artifacts when watching TV on your computer without a decent TV application like 'tvtime' is that simple TV apps like XawTV show two fields sampled at different times at the same time - often for a variable 2-3 frames if your refresh rate is between 61 and 89Hz. If you show those in the standard 60Hz interlaced mode with a TV-compatible resolution you won't see those artifacts outside a freeze frame, though you will get more flicker than a regular TV, because the phosphors in monitors decay faster to avoid ghosting at the higher frequency and contrast they deal with.

    Again, CRT flicker has nothing to do with frames rendered per second (fps), and everything to do with how long-lasting the phosphors are with respect to the screen refresh rate. A film projector's flicker is a completely different beast. Heck, LCD flicker is completely unrelated to refresh rate and has everything to do with your backlight's ballast (fluorescent) or temperature (halogen). FPS above about 50-60 fps is all about res
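
    A back-of-the-envelope C++ sketch of that latency sum (one refresh wait, plus one frame of render time, plus a made-up "AI time"), showing how pushing the render rate up shaves the response time:

        #include <iostream>

        int main() {
            const double refresh_hz = 76.0;    // refresh rate used in the comment above
            const double ai_seconds = 0.010;   // made-up per-frame game-logic time

            for (double render_fps : {76.0, 100.0, 500.0}) {
                // Double-buffered case: wait up to one refresh, plus one frame of
                // rendering, plus whatever the game logic itself needs.
                double latency = 1.0 / refresh_hz + 1.0 / render_fps + ai_seconds;
                std::cout << render_fps << " fps -> ~" << latency * 1000.0
                          << " ms from input to response on screen\n";
            }
        }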

"No matter where you go, there you are..." -- Buckaroo Banzai

Working...