The Quest For Frames Per Second In Games

VL writes "Ever wondered why exactly everyone keeps striving for more frames per second from games? Is it simply for bragging rights, or is there more to it than that? After all, we watch TV at 30fps, and that's plenty." This editorial at ViperLair.com discusses motion blur, the frame-rate difference between movies and videogames, and why "...the human eye is a marvelous, complex and very clever thing indeed, but... needs a little help now and then."
This discussion has been archived. No new comments can be posted.

  • by Jerf ( 17166 ) on Sunday September 07, 2003 @01:38AM (#6891491) Journal
    I like the ideas behind this article (I couldn't immediately Google for a good replacement so there may be room on the web for an article like this) but the author (and there is no nice way to put this) is talking out of his ass. For instance, from the second page:

    This is the Visual Cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery is smoothly flowing from one point to the next and there are no jumps or flickering to be seen. If the eye wasn't to add this motion blur, we would get to see all of the details still but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out to different points. It's pretty simple to test this.

    This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself. (Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog. (Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.) In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)

    (Note in the portion I italicized how he jumps from the "visual cortex" to "the eye"; the two are NOT the same and can't be lumped together like that in this context.)

    This simple error renders the entire second page actively wrong.

    Here's another, referring to interlacing:

    Using a succession of moving images, the two refreshes per frame fool us into believing there is two frames for every one frame. With the motion blur the eye believes we are watching a smoothly flowing picture.

    Uh, wrong wrong wrong. Interlacing was a cheap hack to save bandwidth. "Progressive scan" is universally considered superior to interlacing (in terms of quality alone), and many (such as myself) consider keeping interlaced video modes in HDTV to be a serious long-term mistake. It has nothing to do with convincing you you are seeing motion; in fact it has a strongly deleterious effect, because you can frequently see the "combing"; that's why TVs have "anti-comb" filters. You don't see it as "motion", you see it as weird "tearing".

    Like the TV, your Computer Monitor (if it's a Cathode Ray Tube) refreshes by drawing the screen line by line horizontally, but unlike the TV, a Monitor and Video Card doesn't add extra frames. If your screen draws at 30 fps, you will GET 30 fps.

    ALSO wrong. The computer monitor and video card will pump out X frames per second, period. It has to. If the CRT is going at 60 fps and the video card (as in the 3D hardware) is only pumping at 30 fps, every frame will be shown for two CRT cycles. What else is the video card (as in the rasterizer) going to display? You'd notice if the screen were blank every other cycle!

    CRT Monitors are considered 'Flicker Free' at about 72Hz for a reason, and simply put it's to compensate for the lack of motion blur, afterimages and other trickery we live with every day in TV and Films.

    Wrong again. CRTs at that frequency are "flicker free" because they pass the frequency the parts of our eyes more sensitive to motion (actually the peripheral vision, not the "primary" vision we're using to look directly at the screen) can still detect flicker at.
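
Jerf's point above about the video card repeating frames can be made concrete with a small sketch (a purely illustrative bit of Python with assumed numbers, not anything from the article or the comments): each display refresh simply scans out the last frame the renderer finished, so a game rendering 30 fps on a 60 Hz display shows every frame for two refreshes rather than ever leaving the screen blank.

    # Illustrative sketch (assumed numbers): a 60 Hz CRT showing a game
    # that only renders 30 frames per second.  Each refresh scans out the
    # most recently completed frame, so no refresh is ever blank --
    # frames are simply repeated.
    REFRESH_HZ = 60
    RENDER_FPS = 30

    def frame_shown_on_refresh(refresh_index):
        # Integer math: index of the last frame the renderer has finished
        # by the time this refresh starts.
        return (refresh_index * RENDER_FPS) // REFRESH_HZ

    for r in range(6):
        print("refresh %d shows rendered frame %d" % (r, frame_shown_on_refresh(r)))
    # Refreshes 0 and 1 show frame 0, refreshes 2 and 3 show frame 1, and so on:
    # at 30 rendered fps on a 60 Hz display, every frame appears for two cycles.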

  • uh (Score:1, Informative)

    by Anonymous Coward on Sunday September 07, 2003 @01:52AM (#6891532)
    The problem with that is that if you're in a detailed indoor environment, and then suddenly step outside, the game will look like ass because of constantly dropping or adding detail.
  • Re:Motion Pictures (Score:3, Informative)

    by Murdock037 ( 469526 ) <tristranthorn.hotmail@com> on Sunday September 07, 2003 @02:07AM (#6891565)
    Movie projectors cheat by displaying every frame twice, which doubles the frame rate from 24 fps to 48 fps.

    Wrong. They show 24 fps. (There's also a bit of black in between each frame, otherwise the eye would register a blur; but it's still 24fps.)

    If the projector was run at normal speed and showed each frame twice, it would look like choppy slow motion. If it was run faster at 48 fps, the motion would be fast, like how you often see old silent pictures.

    You would need a print with every frame printed twice in a row for it to work, and then a faster projector than is safe for most film.

    There are certain camera systems under development which would shoot film at 48 fps, and you'd then need a projector that could show the film at 48fps, but the standard rate for cameras and projectors for the last fifty years, everything you've ever seen in a cinema, has been 24fps.

    Cinematographers also avoid certain shots, like rapidly panning from left to right, which look terrible on a movie screen.

    It's called a swish pan, and it makes for a nice transition, if you cut in between two of them. But you don't have to, and it doesn't look "terrible."

    Whoever modded you up is embarrassingly ignorant of the topic at hand.
  • by LittleBigLui ( 304739 ) on Sunday September 07, 2003 @02:40AM (#6891664) Homepage Journal
    Nowadays, movies are INDEED filmed with 24 frames per second and therefore are also projected with 24 frames per second. But the frames are shown multiple times (I think two is standard, but I've heard about three, too).

    And no, the film doesn't have to have the frames on it multiple times. The transport mechanism in a projector works like this: light off, move the film forward to the next frame, stop, light on, light off, move forward, stop, light on, .....

    Now, instead of having ONE phase of light during a frame, modern projectors have TWO or THREE of them:

    light off, move forward, stop, light on, light off, light on, light off, move forward ....

    at least that's what I learned in school ;)
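
LittleBigLui's shutter cycle can be written out as a tiny simulation (again an illustrative Python sketch with assumed numbers, not something from the thread): the film still advances only 24 times per second, but opening the shutter two (or three) times per frame raises the flash rate to 48 (or 72) flashes per second, so the flicker becomes imperceptible even though the motion itself is still only sampled 24 times a second.

    # Illustrative sketch: one frame's worth of the projector cycle described
    # above, with a double-bladed shutter (2 flashes per frame is assumed here;
    # a triple-bladed shutter would use 3).
    FRAMES_PER_SECOND = 24
    FLASHES_PER_FRAME = 2

    def shutter_cycle_for_one_frame():
        # Pull the film down in the dark, then open and close the shutter
        # FLASHES_PER_FRAME times before the next pulldown.
        cycle = ["light off", "move film forward", "stop"]
        cycle += ["light on", "light off"] * (FLASHES_PER_FRAME - 1)
        cycle += ["light on"]
        return cycle

    print(" -> ".join(shutter_cycle_for_one_frame()))
    print("distinct images per second: %d" % FRAMES_PER_SECOND)
    print("flashes per second: %d" % (FRAMES_PER_SECOND * FLASHES_PER_FRAME))
    # 24 distinct images but 48 flashes per second: the flicker rate doubles
    # while the motion itself is still captured and shown at only 24 fps.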
