The Quest For Frames Per Second In Games

VL writes "Ever wondered why exactly everyone keeps striving for more frames per second from games? Is it simply for bragging rights, or is there more to it than that? After all, we watch TV at 30fps, and that's plenty." This editorial at ViperLair.com discusses motion blur, the frame-rate difference between movies and videogames, and why "...the human eye is a marvelous, complex and very clever thing indeed, but... needs a little help now and then."
  • it plays better (Score:3, Insightful)

    by Song for the Deaf ( 608030 ) on Sunday September 07, 2003 @12:58AM (#6891379)
    With a higher framerate, a game just feels and plays better, it's as simple as that. 30 fps is just *not enough* to have good action and feel on most pc first person shooters.
    • Re:it plays better (Score:3, Interesting)

      by ctr2sprt ( 574731 )
      If you get a constant, true 30fps and the game action isn't tied to that framerate (rounding errors), then that would be okay. Of course, that's like physicists talking about frictionless surfaces or perfectly spherical objects, and about as attainable.
      • You could design a game to always preserve a constant frame rate. The program would synchronize itself to vertical retrace or another periodic signal. It would have to be able to build the next frame buffer in a bounded amount of time. This might require it to drop detail or features when a lot of things were going on.
        • uh (Score:1, Informative)

          by Anonymous Coward
          The problem with that is that if you're in a detailed indoor environment, and then suddenly step outside, the game will look like ass because of constantly dropping or adding detail.
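
          A minimal sketch, in Python, of the constant-frame-rate idea in the parent comment, assuming a hypothetical render_scene(detail) call in place of a real renderer. Nudging the detail level gradually, rather than snapping it, is one way to soften the popping objection raised here:

          import time

          TARGET_FRAME_TIME = 1.0 / 30.0   # lock the whole game to a 30 fps budget

          def render_scene(detail):
              """Hypothetical renderer: more detail costs more time (simulated)."""
              time.sleep(0.010 + 0.030 * detail)

          def run(frames=90):
              detail = 1.0
              for _ in range(frames):
                  start = time.perf_counter()
                  render_scene(detail)
                  elapsed = time.perf_counter() - start
                  # Nudge detail toward the budget instead of snapping it, so a
                  # sudden indoor-to-outdoor change degrades gradually.
                  error = TARGET_FRAME_TIME - elapsed
                  detail = max(0.0, min(1.0, detail + 0.5 * error / TARGET_FRAME_TIME))
                  # Burn off any slack so every frame lasts exactly one budget period.
                  slack = TARGET_FRAME_TIME - (time.perf_counter() - start)
                  if slack > 0:
                      time.sleep(slack)

          if __name__ == "__main__":
              run()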
    • Re:it plays better (Score:3, Insightful)

      by trompete ( 651953 )
      My largest problem isn't the graphics card's frame rate ability. It is that damned speed of light that is keeping me from getting a low ping when I play on European servers. Seriously... you can play most games with an average machine, but your frame rate is really limited by the propagation delay and all the hops between you and the server. Get me a lower ping, and I'll be one happy guy!
    • 30 fps is just *not enough* to have good action and feel on most pc first person shooters.


      I've played Project IGI (the first one), which had smooth gameplay compared to some of the more modern FPS games. It wasn't perfectly smooth, but it was hard to detect any significant jumps between frames.

      I was surprised that it ran at 30FPS - constantly. There wasn't even a loading delay between indoor and outdoor areas. It was even smoother than D1X running at 30FPS.
  • Ugh (Score:5, Insightful)

    by Elwood P Dowd ( 16933 ) <judgmentalist@gmail.com> on Sunday September 07, 2003 @01:10AM (#6891415) Journal
    That article reminds me of the TV ads with scientists explaining how our patented hydro-oxytane reaches deep into your pores and assassinates uglificating bacteria.

    Author seems to understand about as much about the primate visual system as... well... anyone else that's never studied it. The visual cortex doesn't "add blur."

    His general point is probably correct, but his reasoning is fucked.
    • The primary thing he missed (although he almost got it) was that the biggest factor is the difference between your highest and lowest framerate in a given time frame. If you're running fat & happy at 100 fps and then (as someone mentioned earlier) walk outside, so to speak, everything slows to a crawl as it loads textures and tries to render an image much more complex (or at least with a much larger visible range). Until the card catches up you could be running below 30 fps, or you could be running 45 f
  • Motion Pictures (Score:3, Insightful)

    by Detritus ( 11846 ) on Sunday September 07, 2003 @01:10AM (#6891416) Homepage
    Movie projectors cheat by displaying every frame twice, which doubles the frame rate from 24 fps to 48 fps. Cinematographers also avoid certain shots, like rapidly panning from left to right, which look terrible on a movie screen.
    • Re:Motion Pictures (Score:3, Informative)

      by Murdock037 ( 469526 )
      Movie projectors cheat by displaying every frame twice, which doubles the frame rate from 24 fps to 48 fps.

      Wrong. They show 24 fps. (There's also a bit of black in between each frame, otherwise the eye would register a blur; but it's still 24fps.)

      If the projector was run at normal speed and showed each frame twice, it would look like choppy slow motion. If it was run faster at 48 fps, the motion would be fast, like how you often see old silent pictures.

      You would need a print with every frame printed
        Nowadays, movies are INDEED filmed at 24 frames per second and therefore are also projected at 24 frames per second. But the frames are shown multiple times (I think two is standard, but I've heard about three, too).

        And no, the film doesn't have to have the frames on it multiple times. The transport mechanism in a projector works like this: light off, move the film forward to the next frame, stop, light on, light off, move forward, stop, light on, .....

        Now, instead of having ONE phase of light durin
        • The bulb in the projector doesn't turn on and off continuously.

          The film is pulled frame by frame in front of the lens, and you may get the impression of flicker, but that's only because of a misaligned shutter that's in front of the bulb-- it lets light through when the frame is aligned, and blocks the light as the next frame is being pulled down. This happens 24 times per second.

          You may want to consult this article [howstuffworks.com] at How Stuff Works [howstuffworks.com], specifically the fourth page [howstuffworks.com], which deals with bulbs, shutters, etc.
          • The bulb in the projector doesn't turn on and off continuously.

            Of course not. My bad, I didn't mention the shutter, but the effect is the same (light on and off).

            The school was the Polytechnical University for Media Technology and Design in Hagenberg, Upper Austria, and while I think howstuffworks is a great resource, I'm sure what I learned there is correct.

            If you check the first few paragraphs of this [grand-illusions.com], you'll see that the concept isn't new either:

            For much of the 'silent' period, films were shot at

          • On those projectors pictured, the shutter rotates more than once per frame. Projectors I'm familiar with have more than one blade and rotate once per frame.

            In any case there are usually 48 blinks of light per second (sometimes 72, but I believe that may only be on very old projectors designed to project silent films at 18 fps). The trick is that the same frame is shown in more than one blink.

            A light blinking 24 times a second is quite obvious, while 48 times per second is approaching the limits of what peo
        A swish pan may be a recognized effect but it wasn't what I was referring to. I was thinking of a shot of a landscape, where everything is in sharp focus, combined with a pan that is slow enough that instead of blurring, the landscape jerks across the screen, klunk, klunk, klunk, at 24 painfully obvious frames per second.

        The projector does open the shutter twice for each frame to reduce the sensation of flicker.

      • It's called a swish pan, and it makes for a nice transition, if you cut in between two of them. But you don't have to, and it doesn't look "terrible."

        Do a swish pan across a row of vertical lines (like a fence or vertical blinds) and it will, indeed, look horrible. The 24 fps isn't adequate to the job.

        It gets even worse on NTSC/PAL though, since the interlaced nature of the picture starts breaking things up horribly.

        For true pain, take a movie that does a horrible swish pan like that and then transfer i
  • No (Score:5, Interesting)

    by RzUpAnmsCwrds ( 262647 ) on Sunday September 07, 2003 @01:18AM (#6891439)
    1: 30 frames per second is simply not enough. It's fine for movies and TV, but that is only because TV shows and movies are designed around the limits of the medium. Ever notice how TV shows and movies don't have a lot of quick, jerky movements? Those movements lead to motion sickness on TV and in movies, and they are exactly the movements you get in 3D games. 30fps makes me sick; I can tolerate 60fps.

    2: Remember, FPS is the *average* framerate. It may dip well below that mark. My goal is not to have the most FPS but to have a reasonably high resolution with FSAA and AF on, all the detail settings to full, and to never have the game dip below my monitor's refresh rate (75Hz).
    • by danila ( 69889 )
      Sure, action movies have no quick and jerky movements whatsoever, and the only reason the Wachowski brothers introduced the slo-mo effects was to work around the limitations of the human eye.

      The truth is that TV programs and movies are filmed WITH motion blur. That means that every frame (for films) is made during 1/24th of a second. It's actually a superposition of all the trillions of images that were projected onto the camera during that time. Our eye gets all the light that was destined for it, the only thing tha
  • by Anonymous Coward
    If I can get a *SOLID* 30fps, I'd prefer that to a framerate that peaks at 60 and swoops down to 15 in places. I also can't stand it when vsync is turned off in games - tearing is horrible. A nice compromise is to keep VSync on when the framerate is high, turn it off if it drops below, say, 30fps.

    I'm still waiting for the day when machines are good enough and code works well enough for games can be considered "real-time" (meaning having fixed steps at, say, 60Hz - and the game is NOT ALLOWED to take longer
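
    A rough sketch of the vsync compromise suggested above, assuming hypothetical render_frame() and set_vsync() hooks rather than any real engine API: keep vsync on while the measured framerate is comfortably high, and drop it only when the game falls below the threshold.

    import time

    THRESHOLD_FPS = 30.0
    SMOOTHING = 0.9            # weight for an exponential moving average of fps

    def adaptive_vsync_loop(render_frame, set_vsync, frames=100):
        avg_fps = 60.0
        vsync_on = True
        set_vsync(vsync_on)
        for _ in range(frames):
            start = time.perf_counter()
            render_frame()
            dt = time.perf_counter() - start
            avg_fps = SMOOTHING * avg_fps + (1 - SMOOTHING) / max(dt, 1e-6)
            want_vsync = avg_fps >= THRESHOLD_FPS   # tolerate tearing only when slow
            if want_vsync != vsync_on:
                vsync_on = want_vsync
                set_vsync(vsync_on)

    if __name__ == "__main__":
        # Stubs standing in for the engine: a 'frame' that takes about 8 ms to draw.
        adaptive_vsync_loop(lambda: time.sleep(0.008),
                            lambda on: print("vsync", "on" if on else "off"))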
  • The well-trained FPS player can practically see the individual frames in a standard 24fps movie. It's just too slow.
  • by Jerf ( 17166 ) on Sunday September 07, 2003 @01:38AM (#6891491) Journal
    I like the ideas behind this article (I couldn't immediately Google for a good replacement so there may be room on the web for an article like this) but the author (and there is no nice way to put this) is talking out of his ass. For instance, from the second page:

    This is the Visual Cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery is smoothly flowing from one point to the next and there are no jumps or flickering to be seen. If the eye wasn't to add this motion blur, we would get to see all of the details still but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out to different points. It's pretty simple to test this.

    This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself. (Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog. (Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.) In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)

    (Note in the portion I italicized how he jumps from the "vision cortex" to "the eye"; the two are NOT the same and can't be lumped together like that in this context.)

    This simple error renders the entire second page actively wrong.

    Here's another, referring to interlacing:

    Using a succession of moving images, the two refreshes per frame fool us into believing there is two frames for every one frame. With the motion blur the eye believes we are watching a smoothly flowing picture.

    Uh, wrong wrong wrong. Interlacing was a cheap hack to save bandwidth. "Progressive scan" is universally considered superior to interlacing (in terms of quality alone), and many (such as myself) consider keeping interlaced video modes in HDTV to be a serious long-term mistake. It has nothing to do with convincing you you are seeing motion; in fact it has a strongly deleterious effect, because you can frequently see the "combing"; that's why TVs have "anti-comb" filters. You don't see it as "motion", you see it as weird "tearing".

    Like the TV, your Computer Monitor (if it's a Cathode Ray Tube) refreshes by drawing the screen line by line horizontally, but unlike the TV, a Monitor and Video Card doesn't add extra frames. If your screen draws at 30 fps, you will GET 30 fps.

    ALSO wrong. The computer monitor and video card will pump out X frames per second, period. It has to. If the CRT is going at 60 fps and the video card (as in the 3D hardware) is only pumping at 30 fps, every frame will be shown for two CRT cycles. What else is the video card (as in the rasterizer) going to display? You'd notice if the screen were blank every other cycle!

    CRT Monitors are considered 'Flicker Free' at about 72Hz for a reason, and simply put it's to compensate for the lack of motion blur, afterimages and other trickery we live with every day in TV and Films.

    Wrong again. CRTs at that frequency are "flicker free" because they pass the frequency the parts of our eyes more sensitive to motion (actually the peripheral vision, not the "primary" vision we're us
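
    A toy illustration of the frame-duplication point above, with example numbers only (a renderer finishing 30 frames a second scanned out by a 60 Hz CRT): every rendered frame is simply shown for two refreshes, and the screen is never blank.

    REFRESH_HZ = 60      # CRT refreshes per second
    RENDER_FPS = 30      # frames the 3D hardware actually finishes per second

    def visible_frame_indices(seconds=1):
        shown = []
        for refresh in range(REFRESH_HZ * seconds):
            t = refresh / REFRESH_HZ
            shown.append(int(t * RENDER_FPS))   # newest frame completed by time t
        return shown

    print(visible_frame_indices()[:8])   # [0, 0, 1, 1, 2, 2, 3, 3]: each frame scanned out twice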


    • A single use of the apostrophe key would do wonders for his prose. Maybe he thumbed the entire article. That would explain a lot.
    • This is the Visual Cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery is smoothly flowing from one point to the next and there are no jumps or flickering to be seen. If the eye wasn't to add this motion blur, we would get to see all of the details still but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out to different
    • by Creepy Crawler ( 680178 ) on Sunday September 07, 2003 @02:27AM (#6891611)
      I messed up the 'quote' delineation by putting open-brackets instead of close-brackets. Sorry for the gibberish post.

      This is the Visual Cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery is smoothly flowing from one point to the next and there are no jumps or flickering to be seen. If the eye wasn't to add this motion blur, we would get to see all of the details still but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out to different points. It's pretty simple to test this.

      >>This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself.

      It does. It's about 7000 FPS (+ or - for each individual).

      The way bio-psychs tested this is by taking a high-speed controllable projector that ranged from 30FPS to 20000FPS. Subjects were led into a totally black room with a mic. Then they were directed to look at the projector screen by a red dot. Once the pattern started, the projector took a spread of 3 seconds and at 1 frame put a number on screen. The average FPS for the subjects NOT to notice the number was about 7000FPS.

      >>>>(Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog.

      You just can't say that. The ion channels are directly countable and lead to a time-based binary system like that of Morse code. Not even biologists are sure about that.

      >>>>>(Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.)

      It's not that the rods/cones (rods are black-white, cones are color) react quickly, it's that the chemical breakdown takes a while. Take the simple theater test. Go from the sunny outside into a theater. You pretty much can't see anything. It takes about 15 minutes to FULLY 'charge up' the rods back to full usage. But when you walk out of that sucky movie ;-) , your eyes hurt (due to rapid depletion of rods) and your cones take effect very rapidly.

      Another side effect of bright light is that you cannot see absolute 'white' or 'black'. Similarly with dark rooms, you cannot easily see color, as it takes high energy photons to allow you to see it.

      >>>>>In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)
      • >>>>>>Another side effect of bright light is that you cannot see absolute 'white' or 'black'. Similarly with dark rooms, you cannot easily see color, as it takes high energy photons to allow you to see it.

        By 'high energy' I presume you mean a large volume of photons? Or perhaps a 'higher amount of total energy, caused by a greater number of photons'.

        Because the only thing adding more energy to a red photon is going to get you is a green photon. . .(depending of course on how much higher
    • TVs don't have anti-comb filters, they have COMB filters. A comb filter picks up several equally spaced frequency ranges, and has a frequency response that looks like a comb turned bristle-side-up. This is needed to separate the Y and C channels, since the C channel is made up of harmonics riding on the color subcarrier that sit in the middle of the Y signal's frequency spectrum.
    • You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors.

      Damn, I thought I got motion blur because of all the shroooms.
    • Fortunately, my eyes just ignored the bullshit in this article and it wasn't even passed to my visual cortex. :) But thanks for the rebuttal, hopefully this will help some readers.
  • Relative motion (Score:3, Interesting)

    by alyandon ( 163926 ) on Sunday September 07, 2003 @02:29AM (#6891619) Homepage
    FPS is important to FPS gamers because of one simple fact... relative motion.

    If you have something travelling at a velocity of 600 pixels/s on your screen (not uncommon for objects in FPS games) it is much easier to track it at 100 FPS (relative motion of 6 pixels per frame) than 30 FPS.
    • If you have something travelling at a velocity of 600 pixels/s on your screen (not uncommon for objects in FPS games) it is much easier to track it at 100 FPS (relative motion of 6 pixels per frame) than 30 FPS.

      Except that most gamers aren't using monitors that run at 100Hz at their gaming resolution, so they're not going to see every frame, and aren't going to see 6 pixels per frame. Never mind that it is uncommon for objects to move 600 pixels/sec unless you are moving your view quickly, which most peop
      • Never mind that it is uncommon for objects to move 600 pixels/sec unless you are moving your view quickly, which most people will ignore outright except to scan for basic images in the mess that goes by.

        Yeah, most people. Gamers on the other hand will turn their view a lot *and* track what's going on around them. A lot of this happens subconsciously (ie. it's fast) - for this a good framerate helps a lot. And an object crossing the screen in under one second isn't that uncommon, just think quake3 and jump
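
        The relative-motion arithmetic from this sub-thread, written out; 600 pixels/s is the figure used above, and the frame rates are just examples:

        SPEED_PX_PER_S = 600          # on-screen speed of the target

        for fps in (30, 60, 100):
            print(f"{fps:3d} fps -> {SPEED_PX_PER_S / fps:5.1f} pixels of movement per frame")
        # 30 fps -> 20.0, 60 fps -> 10.0, 100 fps -> 6.0: smaller jumps are easier to track.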
  • by edmz ( 118519 ) on Sunday September 07, 2003 @02:50AM (#6891699) Homepage
    ...they are trying to compensate for something.
  • Timing is important (Score:3, Interesting)

    by kasperd ( 592156 ) on Sunday September 07, 2003 @03:02AM (#6891720) Homepage Journal
    If you are going to look at a CRT screen for a long time, you certainly want a high refresh rate. How much is required to be enough probably depends on who you are, but 75Hz is not enough for me. But I can hardly tell the difference between 85Hz and 100Hz. I think 100Hz is enough for most people.

    When you have chosen a refresh rate, the optimal FPS is exactly the same number. Generating more FPS is a waste because it just gives worse quality. You would either be skipping frames, which harms animation, or you would be showing parts of different frames at the same time, which gives visible horizontal lines where the two parts don't match. And yes, you will spot those broken images even when only shown for 1/100th of a second.

    But generating 100 FPS and showing 100 FPS is not enough, you have to ensure each frame is shown exactly once. It requires a little help from the graphics hardware, but nothing that is hard to implement. Having a little extra processing power is important: you must be able to produce every frame fast enough. You don't want to miss a deadline because occasionally one frame takes just a little more CPU time to render.
    • But I can hardly tell the difference between 85Hz and 100Hz. I think 100Hz is enough for most people.

      I have to have a minimum of 85Hz in most lighting environments. I can tolerate refresh rates down to almost 60Hz with very low or no light, but once a light comes on it starts to interfere and the rate needs to come back up (I start getting headaches after about 30 minutes with 60-75Hz in a lit room).

      When you have chosen a refresh rate, the optimal FPS is exactly the same number. Generating more FPS is w
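
      A toy model of the deadline argument in the parent comment, assuming a 100 Hz refresh and made-up render times; the only point is that a frame that occasionally runs a little long misses its retrace, so the previous frame gets shown twice.

      import random

      REFRESH_INTERVAL = 1.0 / 100.0     # 10 ms budget at a 100 Hz refresh

      def count_missed_deadlines(frames=1000, seed=1):
          rng = random.Random(seed)
          missed = 0
          for _ in range(frames):
              render_time = 0.009        # normally well inside the budget
              if rng.random() < 0.02:    # but a few frames take a touch longer
                  render_time = 0.011
              if render_time > REFRESH_INTERVAL:
                  missed += 1            # previous frame repeats for one refresh
          return missed

      print(count_missed_deadlines(), "of 1000 frames missed the 10 ms deadline")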
  • by zenyu ( 248067 ) on Sunday September 07, 2003 @03:40AM (#6891801)
    I don't even play video games and I know the reason you need high FPS has nothing to do with the framerate at which you meld separate frames into motion. It's all about response time. When the game can render at 500 fps it means you have to wait 1/76 + 1/500 + 'AI time' seconds for a response to something you do on the controller. This assumes your refresh rate is 76 hz. The 1/76 is fixed by your refresh rate because unless you can do the entire redraw in the vertical retrace period and have dual-ported RAM on the video card, you need to double buffer. Some rendering engines, not designed for games, are actually triple buffered for better throughput. Video games are all about response time, and here you will sacrifice 1000 fps for that 500 fps to avoid adding an extra 1/76 to that timing sum. There is of course a certain point at which that number is high enough that you don't need to double buffer; in reality those nv FX-2000's and ATI 98xx's are way too slow to approach that kind of framerate with the visual quality people want.

    TV has an effective framerate of 60fps*, movies are 24, and cartoons are usually 12 fps. Those can all show motion just fine as long as you don't move things too fast for the medium. The average PC monitor has a refresh rate under 90hz, not really much better than the 60hz of television, so you still can't let an object move as quickly from one side of the screen to the other as we can perceive it in real life. As someone mentioned, setting the refresh rate at 72 or 80 or whatever doesn't make your eyes hurt has nothing to do with our motion perception. In normal office use you want to set this as low as possible while still avoiding flicker, so that you don't waste cycles on displaying that one character you just typed into emacs a few ms faster. If you are playing a game you want to set it as high as your monitor will take it (up to 120hz at decent resolution on some monitors), while still keeping this number below the number of frames the game can render per second so that it doesn't have to show some frames twice and mess up the motion.

    Film in a projector does not flicker like a monitor running at 24 hz. The reason a monitor flickers is because the phosphor brightness decays. A film screen is fully lit while the film is in front of the light. It flickers simply because the time it takes to change frames is not zero; doubling the frames to 48 frames per second would increase the time the screen was dark between frames.

    *Yes, TV has 30 'frames', but this is just how many times you redraw the phosphors; as far as motion is concerned you have 60 separate images representing 60 different snapshots in time (assuming this is really shot as TV and not an up-converted film). Your eyes don't care that the samples are offset; it is not like you ever see one moment with the same receptors as the next, they need a regeneration time before they can sample again. And they are not synchronized at a specific FPS, so the flicker explanation was all wacky. The reason you see those nasty line artifacts when watching TV on your computer without a decent TV application like 'tvtime' is because simple TV apps like XawTV are showing two fields sampled at different times at the same time. Often for a variable 2-3 frames if your refresh rate is between 61 and 89 hz. If you show those in the standard 60 hz interlaced with a TV-compatible resolution you won't see those artifacts outside a freeze frame, though you will get more flicker than a regular TV because the phosphors in monitors decay faster to avoid ghosting at the higher frequency and contrast they deal with.

    Again, CRT flicker has nothing to do with frames rendered per second (fps), and everything to do with how long-lasting the phosphors are with respect to the screen refresh rate. Film projector flicker is a completely different beast. Heck, LCD flicker is completely unrelated to refresh rate and has everything to do with your backlight's ballast (fluorescent) or temperature (halogen). FPS above about 50-60 fps is all about res
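
    The latency sum from the comment above (1/76 + 1/500 + 'AI time'), written out as a rough calculation; the 2 ms of game logic is an arbitrary placeholder, and the formula is only the simplified model the comment describes:

    def input_to_photon_latency(refresh_hz, render_fps, ai_time, buffers=2):
        swap_wait = (buffers - 1) / refresh_hz    # queued frame(s) waiting for retrace
        return swap_wait + 1.0 / render_fps + ai_time

    for fps in (60, 100, 500):
        ms = input_to_photon_latency(76, fps, ai_time=0.002) * 1000
        print(f"render at {fps:3d} fps -> ~{ms:.1f} ms from input to screen")
    # Going from 60 to 500 fps shaves the render term from ~17 ms down to 2 ms,
    # which is the response-time gain being described.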
    • True. 100hz refresh is flicker free. 100fps has smooth action.

      The real bitch is we finally get cards that can push games past 30fps to 100fps, and then you enable AA and it drops like a rock.

      Then GFX cards that rock at AA are released, then the games push the polygons up so even 3GHz CPUs and 256MB cutting-edge GFX cards can only pump 30fps in a firefight.

      CS with 6x AA and high-poly skins looks awesome. Can't wait to see how HL2/Doom3 work on normal hardware.
    • AFAIK your explanation of TV is backwards. NTSC is 60Hz interlaced. That means there are effectively 30 actual images per second and it draws half an image in 1/60 of a second. The reason TV on a monitor looks weird is because your monitor is usually not running a 60Hz interlaced mode. Thus the app has to fill in the blank lines with something, which can either be just the data from the previous frame, or some sort of filter applied to it.
  • The argument that 24 FPS should be enough for every medium is false, and here's why:

    The reason film projection can smoothly present video is because of the blur on film caused by movement of the target on a slow-shutter camera. This blur actually helps because when displayed with 24 other frames in one second (all having the blur effect themselves) it looks rather fluid. Even digital movie cameras accomplish their video quality using the same trick.

    Video cards, however, do not have the luxury of using this trick for video games. To show the movement of an avatar, for example, every single measurable instant of movement must be rendered for each measurable instant. Those instants are misleadingly called "frames". Achieving higher framerates is actually critical for good gameplay because there are more instants in a given amount of time. That's why low fps feels sluggish in some games: 15/20/25/etc. instants are certainly not enough to show fluid movement. I myself feel right at home right around 75 fps on any first person shooter or what not. This is because the human brain registers information from the eyes at about 75 Hz (at least that's what I was taught).

    So, next time you hear "24 fps is all you should need!", you can tell them why it's not.
    • What I wonder is why a video card can't do the same thing to show more fluid motion. For ex, suppose your card is only capable of 30fps. Why can't it just add motion blur in each frame based on what the previous one was so it looks fluid instead of like a high-speed slide-show?
      • Because motion blur requires quite a lot of CPU time, that's why.

        Motion blur is a natural effect on film, but on a computer it'd have to be specifically computed, which would only make things worse. If you only get 30 fps already, and motion blur slows it down to 10, then it's going to be too slow for the motion blur to be of much use.
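
        A sketch of why computed motion blur multiplies the rendering cost, as this comment argues: the straightforward accumulation approach renders several sub-frames per displayed frame and averages them. render_at() is a hypothetical stand-in that returns a frame as a flat list of pixel values.

        def motion_blurred_frame(render_at, t, dt, subframes=4):
            acc = None
            for i in range(subframes):
                img = render_at(t + dt * i / subframes)   # N full renders per shown frame
                acc = img if acc is None else [a + b for a, b in zip(acc, img)]
            return [v / subframes for v in acc]           # averaging the sub-frames = blur

        # Example with a one-pixel "scene" whose brightness depends on time:
        print(motion_blurred_frame(lambda t: [t * 100.0], t=0.0, dt=1 / 30, subframes=4))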
      • This was the next big thing for 3dfx; their 'T-buffer' was designed to do things like motion blur.

        The idea being, of course, that yes, thirty or sixty FPS really is all you need *so long as you get the same blurring effects you get from film/video.*

        Having 200 frames per second merely means that the jump from position to position gets smaller and smaller, in effect building in motion blur.

        As an example, roll a ball across a table, at a speed that it takes one second. Film that with a film camera at fi

  • Grammar? (Score:4, Insightful)

    by JonoPlop ( 626887 ) <me.JonathonMah@com> on Sunday September 07, 2003 @04:29AM (#6891888) Homepage

    I tried to RTFA, but I fainted mid-way during the first paragraph.

    ...computers are tested for there ability to improve frame rates in games.
    ...heard from your friends about the latest drivers for there system...
    ...gave them an extra 30 fps over there old card...

    (They're all from the one paragraph introduction...)

    • Yep, the guy consistently uses 'there' instead of 'their'. Actually the word 'their' does not appear in the article at all. The word 'there' is used correctly only once. Combined with the factual inaccuracies, this is a pretty lousy article.
  • by psxndc ( 105904 ) on Sunday September 07, 2003 @04:51AM (#6891934) Journal
    Higher FPS means it can just handle more stuff happening on the screen at once. I don't need super whopping detail to start, I just need the game to not turn into a slideshow when 5 grenades explode around me at the same time. A video card that generates higher FPS means instead of 5 grenades it can handle 7, or 9, or 11, ad nauseam. Once it can handle a good amount of "stuff" on the screen, bump the resolution up a little or add more detail and we're back to only handling 7 grenades. Is this acceptable? Personal preference. Tweak up or down, lather, rinse, repeat.

    psxndc

  • The reason for getting the highest FPS possible in the Quake line of games is physics. All(?) of them have enabled one to run faster and jump higher with FPS above a certain limit, making certain trick jumps easier and otherwise-impossible jumps possible. I think in Q2 it was 90 FPS, and in Q3 there were various limits, one being 125 FPS and another being 300+ FPS. // ville
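
    A toy model (not Quake's actual code) of how framerate-dependent integration can change jump height: integrating gravity in discrete steps with a small per-step rounding of the velocity gives slightly different peaks at different frame rates. The 1/8-unit rounding is an assumption made purely for illustration.

    def jump_peak(fps, jump_velocity=270.0, gravity=800.0):
        dt = 1.0 / fps
        z, vz, peak = 0.0, jump_velocity, 0.0
        while vz > 0 or z > 0:
            vz -= gravity * dt
            vz = round(vz * 8) / 8      # pretend velocity is stored in 1/8-unit steps
            z = max(0.0, z + vz * dt)
            peak = max(peak, z)
        return peak

    for fps in (30, 60, 125, 333):
        print(f"{fps:3d} fps -> jump peaks at {jump_peak(fps):.2f} units")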
    • Absolutely. Looks like everyone's got bogged down in details about perception and the biology of the eye, and overlooked some of the more mundane points.

      The games development algorithms mailing list [sourceforge.net] has recently covered this topic in some depth. (Apologies, the archives don't seem to be working properly at the moment.)

      The problem can lie in the collision detection working with sampled points along the player's trajectory during a jump, checking for collisions between those points. The lower the frame rat
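
      A sketch of the sampling problem described here: collision is tested only at discrete points along the trajectory, so at a low framerate the per-frame step can be wide enough to hop straight over a thin obstacle (all numbers are made up).

      def hits_wall(start_x, speed, wall_x, wall_thickness, fps, duration=1.0):
          dt = 1.0 / fps
          x = start_x
          for _ in range(int(duration * fps)):
              x += speed * dt
              if wall_x <= x <= wall_x + wall_thickness:   # point test only, no sweep
                  return True
          return False

      for fps in (20, 60, 125):
          result = "collides with" if hits_wall(0.0, 600.0, 100.0, 5.0, fps) else "tunnels through"
          print(f"{fps:3d} fps: player {result} the 5-unit wall")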
  • One thing I don't understand is why some people play FPSs with low monitor refresh rates. If your monitor is retracing at 75Hz, it doesn't matter that your graphics card can render 300fps, you're only going to see 75 of them.

    I play Quake 3 at 800x600, so my monitor can refresh at 160Hz. With everything at minimum detail I get about 200fps average, so my real fps is a solid 160fps (except in very crowded areas).

    This makes a very big difference, especially in 1v1 with railguns. Here every few milliseco
  • by tsa ( 15680 )
    The average person needs at least 72 fps. Not 30. And that is because TVs work a little bit differently than computer monitors.
  • Not one instance of "their" or "you're" in the whole article.
  • The article is filled with obvious factual errors and is a badly-done apologetic for the obsessive and nonsensical quest for ludicrously high framerates.

    His attempt to explain away the fact that 24-30 fps works fine for movies and television is an utter failure. Surrounding darkness is not why movies look smooth, and the feeling of continuity here has nothing to do with the afterimage effect. The refresh rate of televisions, resulting in "each frame being drawn twice", does not double the framerate of the
    • not very many people can tell the difference between a 25 fps framerate and a 100 fps framerate.

      This is undoubtedly true, but anyone is capable of telling the difference between 25 and 100 fps with a little training. And would suffer from 25 fps in a video game without knowing why. Then again I consider myself pretty good at eyeing it, yet I'm sometimes off on my guess of the framerate by 60%. You need to be able to move an object fast enough for the motion to break down to get a good estimate of the fram
  • Ok, this guy is mostly correct, but it is sure hard to read and get the real point. So here is a (hopefully more understandable) summary of what he was trying to say.

    TV and movies can get away with 24 to 30 FPS mainly because the CAMERA that took the actor's picture kept the shutter open for almost the entire 1/24th of a second, so moving objects are blurry. That hides the fact that the actor moved quite a bit from one frame to the next, since the area between point A and point B is filled in with the bl
  • Test it yourself: if people are not told beforehand what the framerate is, 99% cannot distinguish whether a game is being displayed at 20 or 30 or 60 or 75 or whatever fps.

    It's all hype and power of suggestion.

    Take a 30 fps scene, tell someone it's running at 75, and they will tell you, yes, it looks m-u-c-h better.
  • The article and some of the comments here are all over the map. So to avoid redundancy, I'll point out a few common sense things and a few places where research since the 60s has already addressed these issues.

    1. A fixed frame-rate is better than even high but variable frame-rate. A solid 30hz can actually be better than 30-120 fluctuating. A ton of research has gone into how to make the graphics pipeline effectively lock to a fixed rate and there's a good reason: variable frame-rates make people sick; fix
  • Unfortunately, I don't think the author of this article was ever able to really wrap his head around this concept.

    Off the top of my head, I don't know where he's getting this "displayed twice" business. The closest thing I can think of to that is the technique of interlacing frames used for displaying images on a TV screen. But when a series of images are interlaced, they're definitely not being displayed "twice"... more like one half at a time.

    Also, motion blur is not "added" in the visual corte

  • Ok, I'm just speculating, but I think I see some things people are missing here...

    First, the bare minimum fps should be the rate at which flicker is no longer detected (according to the article it's 72 fps). In reality, the decay rate and decay curve of the display device are the real factors here, but this will soon be irrelevant, as you'll see.

    If you have an object moving across the screen, with no motion blur (such as with 3d games) at a low fps, you see multiple separate instances of that object rather
  • Am I the only one bothered by the fact that the latest generation of graphics cards still can't output high framerates on new games such as UT2k3 and Doom III?? The Radeon 9800 Pro 256MB and the GeForce FX 5900 Ultra run these games fine. Yet if you take a 9700 Pro or a 5800 Ultra, the games get sub-100 FPS, even sub-60 in 1024x768! These cards were released after the games were developed! This is just wrong. A high-end card should be able to output 100+ FPS at 1600x1200 resolutions on the latest game that i
