Framerates Matter 521

An anonymous reader writes "As more and more games move away from 60fps, the myth of the human eye only being able to detect 30fps keeps popping up. What's more, most people don't seem to realize the numerous advantages of a high framerate, and there's plenty of those."
  • by sopssa ( 1498795 ) * <sopssa@email.com> on Wednesday January 06, 2010 @01:26PM (#30671832) Journal

    The article mentions motion blurring and links to NVidia's page about its technology [nvidia.com]. The last figure [nvidia.com] shows a terrain with a full-screen motion blur effect, which in my opinion is pretty important in games for creating that feeling of speed. People usually object to this and to bloom effects and just want a sharp picture, though maybe some games have taken it too far. It's important nonetheless, even if the picture isn't all sharp, because the image your eye produces isn't all that sharp either; you experience the same blur.

    • Doom always appeared to draw textures at a much lower resolution while you were moving, and only displayed the full texture when you stopped and looked directly at an object, as a way of speeding up rendering. This gave the appearance of "motion blur" without a lot of additional processing required.
      • Doom 1? (Score:3, Interesting)

        by tepples ( 727027 )
        By "Doom" do you mean Doom (1993) or Doom 3? If the former, I never saw this effect while playing the game on MS-DOS (vanilla version), Mac (Ultimate Doom), or GBA.
        • Re:Doom 1? (Score:5, Insightful)

          by JackDW ( 904211 ) on Wednesday January 06, 2010 @02:16PM (#30672586) Homepage

          I've worked with the Doom source code recently, and can confirm that there was no motion blur at all. In fact, blur of any kind couldn't really be implemented, because Doom's graphics were 8-bit indexed colour.

          Also, there were no engine changes at all between Doom 1 and 2.

          Perhaps the GP is referring to the bobbing effect that occurs when the Doom guy runs. That moves the camera nearer to and away from the ground, changing the appearance of the texture.

      • Perhaps you're thinking of mipmapping [wikipedia.org], which was implemented at least as early as Quake 1.

    • Re: (Score:3, Insightful)

      by Hatta ( 162192 )

      It's important nonetheless, even if the picture isn't all sharp, because the image your eye produces isn't all that sharp either; you experience the same blur.

      If my eye creates the blur, why do I need artificial motion blur?

      • by sopssa ( 1498795 ) *

        Because you're still looking at a single object, your monitor, and the picture and movement in it are artificially created. If you look at real objects moving, or move or shake your head, you'll notice there's a huge motion blur effect. If you do the same in a game that has no motion blur effect, you notice how the view instantly jumps to wherever you look.

        • Re: (Score:3, Informative)

          That just means we should strive for a higher framerate until our eyes blur things on their own. Reality is not inherently blurry (unless you need glasses...); our eyes and brain do that internally.

          Making movement in a game inherently blurry when your head is already going to blur it for you internally is just a shortcut to motion sickness for a whole lot of people.

          • What are you talking about?

            Ever waved your hand back and forth so fast that it creates a blur? Ever seen those little things that sweep back and forth fast enough to display an image?

            Reality is indeed inherently blurry. It's just hard to accurately portray blur when you're staring at something that's not moving.

            • by takev ( 214836 ) on Wednesday January 06, 2010 @02:37PM (#30672894)

              But if you follow the hand with your eyes, your hand will appear sharp. You'd be surprised how quickly and stably eyes can track moving objects.

              The BBC has been experimenting with high frame rate TV, running at 300 frames per second. Moving objects appear much sharper in such a broadcast compared to the standard 50 frames per second (not fields). They showed a side-by-side example; both were 1080 progressive scan. Great for sports broadcasting.

              Also, Silicon Graphics (when they were still called that) did tests with fighter pilots when designing flight simulators. Motion sickness is a problem with those flight simulators, compared to an actual jet plane. When they got a constant frame rate above 80 frames per second (160 frames per second when doing stereo imaging), the motion sickness was greatly reduced. They solved the processing power problem by reducing the rendering resolution on each frame.

          • Re: (Score:3, Interesting)

            by sopssa ( 1498795 ) *

            I don't think we will get to a point where the framerate would be fast enough. The 3D monitors only generate up to 120fps too, and there are still lots of hardware limits to generating framerates over that with current games at good resolutions. And there is no framerate in the real world; you're taking in images in real time. Some argue (like in the battle between 30fps and 60fps) that the human eye can't process more than a certain number of "frames" per second. The natural motion blurring effect and its absence with video g

            • Re: (Score:3, Insightful)

              by brianosaurus ( 48471 )

              I agree with you...

              Some argue (like in the battle between 30fps and 60fps) that the human eye can't process more than a certain number of "frames" per second.

              Isn't the reason movies use 24 fps (and similarly TV uses ~30fps) historical technical limitations? That's right about the minimum rate at which your eyes and brain can smooth out the annoying flicker. 30fps isn't the upper limit of what the eye can process, but rather a lower limit that makes the image sequence appear as motion without causing stutter, headaches, or otherwise detracting from the visual experience. It's a compromise to allow movies to fit on reasonably sized rolls of

      • Re: (Score:3, Interesting)

        by spun ( 1352 )

        Just a guess, but perhaps because the frame rate isn't high enough for your eye to generate the blur? That is to say, if the scene were real, the frame rate would be well-nigh infinite, and your eye, capable of only a certain frame rate, would blur together all the frames. With discrete frames, you need to put in the blur the eye would generate from the frames in-between.

        Or something like that.

        • Re: (Score:3, Insightful)

          by Moryath ( 553296 )

          Not quite.

          The eye blur happens for two reasons. The first is the fact that the human eye is "assembling" an analog reading of the light taken over a specific time, very similar to how a camera exposure works. We aren't "digital" beings, in the sense that there is allowance forward and back in our visual processing, but we DO assemble "frames" for the rest of our brain to analyze.

          The second is focusing. A fast-moving object moves into, and out of, the focused field quite quickly. Either we keep tracking it (

      • by Shin-LaC ( 1333529 ) on Wednesday January 06, 2010 @01:45PM (#30672132)
        Your eyes introduce blur due to the reaction time of the light-sensitive cells in the retina. Fortunately, the image processing area in your brain treats blur introduced by the eyes and blur built into the frame more or less the same, so you can use blur to give the impression of smooth motion with a lower frame rate than would otherwise be necessary. This is used to good effect in cinema, where the camera's exposure time naturally introduces blur that is quite similar to the one introduced by your eye.

        In the case of video games, however, it is not so clear that rendering effective artificial motion blur saves much processing time compared to simply rendering more frames. Then again, there is a limit to how fast your monitor can update its image, so rendering more frames is no longer an option past that point.
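The trade-off discussed above (blur baked into each frame versus simply rendering more frames) can be sketched with a toy accumulation renderer. This is a generic illustration only; `render`, `blurred_frame`, and all the numbers are invented for the example:

```python
def render(position):
    """Toy 'renderer': a 1-D image, 10 pixels wide, with a single
    bright pixel at the object's (integer) position."""
    img = [0.0] * 10
    img[int(position) % 10] = 1.0
    return img

def blurred_frame(start, velocity, dt, samples=4):
    """Average several instants spread across the frame interval dt,
    imitating the blur a camera exposure of length dt would capture."""
    acc = [0.0] * 10
    for i in range(samples):
        img = render(start + velocity * dt * i / samples)
        acc = [a + p / samples for a, p in zip(acc, img)]
    return acc

# An object crossing 4 pixels during one 1/24 s frame leaves a streak:
print(blurred_frame(start=0, velocity=96, dt=1 / 24))
# pixels 0-3 each end up at 0.25: a streak rather than a single dot
```

Each displayed frame then summarizes the whole interval it represents, which is exactly what a single instantaneous render cannot do.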
    • Essentially, people want these effects to be done by their eyes, though, not by the game. Why can't the game/computer/monitor produce fast enough frame rates that it's my eyes creating the blur, not post-rendering effects?

      Don't get me wrong, I like the realism these effects give, but some people see them as kind of fake and it detracts from their experience. Perhaps some people's eyes can perceive frame rates slightly faster than others' and thus don't actually see as much blur when moving fast.

      • Re: (Score:3, Interesting)

        by LordKazan ( 558383 )

        Why can't the game/computer/monitor produce fast enough frame rates that it's my eyes creating the blur, not post-rendering effects?

        Physics. Monitors cannot change fast enough, and in the right way, to do this; they simply don't work that way.

        Speaking of physics: a game's physics engine has the properties of a Riemann sum where n = fps, so the higher your FPS, the more accurate your physics simulation, even if your monitor cannot discretely display all those frames.

        [note: only

        • Most physics engines simulate best when the timestep is the same every update - larger timesteps result in less accuracy of the simulation, to name just one issue. Rendering time varies every frame depending on the number of polys rendered, etc. So it is standard practice to decouple the physics engine from rendering, which allows the physics engine to run at whatever fixed timestep is desired. Multiple physics updates can occur for a single rendered frame and vice versa. Interpolation of position is used
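The decoupling described above is commonly implemented as a fixed-timestep loop with an accumulator. The sketch below is a generic illustration (the names `run` and `DT` and all values are placeholders, not any particular engine's code):

```python
# Fixed-timestep loop: physics always advances by the same DT,
# however fast or slow frames are rendered.
DT = 1.0 / 120.0  # fixed physics timestep (120 updates per second)

def run(frame_times, position=0.0, velocity=60.0):
    """Advance a 1-D body through a list of (variable) frame durations.

    Returns one rendered position per frame, interpolated between the
    last two physics states so motion looks smooth even when the
    physics rate and render rate differ.
    """
    accumulator = 0.0
    prev_pos = position
    rendered = []
    for frame_dt in frame_times:
        accumulator += frame_dt
        # Run as many fixed physics steps as fit in the banked time.
        while accumulator >= DT:
            prev_pos = position
            position += velocity * DT      # the "physics update"
            accumulator -= DT
        # Blend between the previous and current physics states.
        alpha = accumulator / DT
        rendered.append(prev_pos * (1 - alpha) + position * alpha)
    return rendered

# Whether the renderer delivers 30 or 120 frames in a simulated second,
# the physics trajectory is the same; only the interpolation differs:
print(run([1 / 30.0] * 30)[-1])    # ~59.5
print(run([1 / 120.0] * 120)[-1])  # ~59.5
```

Because every physics step uses the same DT, the simulation's outcome no longer depends on the framerate, which avoids exactly the FPS-dependent behavior complained about elsewhere in this thread.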

      • by sopssa ( 1498795 ) *

        Also, bloom [imageshack.us] and lighting effects [imageshack.us] still have to be done in-game, because they rely on the game world (they can hide objects behind that bloom or make other objects dark) and because the monitor just shows the color data you give it.

  • Really? (Score:5, Informative)

    by non0score ( 890022 ) on Wednesday January 06, 2010 @01:30PM (#30671876)
    • Re: (Score:3, Insightful)

      by Sockatume ( 732728 )

      Graphics are sold by screenshots and by box shots. YouTube and so on might make a difference, but ultimately you'll get more players to swoon with half the framerate and twice the geometry, than vice versa.

  • In fighting games you need 30FPS, period.
    There are books for Tekken and the like that have frame data for every move.
    Introduce any lag into the equation and what might be safe on block might not be, costing you the game.
    • I think what you're getting at is that consistency matters more than maximum frame rate. For different reasons than the one you state, I'd rather play a game at a constant 20 Hz than at 30 (or even 60) Hz most of the time but dropping to 15 during the most intense moments. It's the large changes in framerate that are noticeable; your brain can fill in the missing pieces if the framerate is constant.

    • by spun ( 1352 )

      You don't understand how frame rate works, do you? The pictures drawn on the screen aren't the real model the game uses. Adding frames in between other frames won't generate lag (if the processing speed is high enough). So, if activating a block at a given frame works at 30fps, it will work at 15fps, 60fps, or 300fps. The frames aren't the 'real thing'; the game's unseen internal model is the real thing. The frames are drawn as often as is possible, given the hardware, and are drawn to comply with the cu

      • Sorry but, you are dead wrong. Input will lag behind with any other lag.
        • by spun ( 1352 )

          How so? What lag are you talking about? What, in your theory, does 'lag' mean?

          Heck, could you rephrase the sentence, "Input will lag behind with any other lag," so it actually makes sense?

      • Re: (Score:3, Interesting)

        by maxume ( 22995 )

        That isn't always the case. I recall a game in the past where gravity had less effect on players with faster hardware, or something like that. Anyway, the logic was mixed in with the rendering, so frame rate had an impact on what the player could do.

        • Yes, that's entirely possible if it's programmed so that you fall x meters for each frame rendered. What should be done is to say you fall x meters per second, taking into account how long it's been since you last calculated the value.

          (I'm simplifying the effect of acceleration above--many games could get along without it and produce a decent result, though it's not hard to factor in if you want.)
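The difference between the two styles can be shown in a few lines (a toy sketch; the function names and numbers are invented for illustration):

```python
def fall_per_frame(frames, fall_per_frame_m=0.5):
    """Buggy style: fall a fixed distance each rendered frame,
    so faster hardware literally means faster falling."""
    return frames * fall_per_frame_m

def fall_per_second(seconds, speed_mps=15.0):
    """Time-based style: distance depends only on elapsed time,
    independent of how many frames were rendered."""
    return seconds * speed_mps

# One simulated second at two different framerates:
print(fall_per_frame(30))    # 15.0 m when running at 30fps
print(fall_per_frame(60))    # 30.0 m at 60fps -- twice as far!
print(fall_per_second(1.0))  # 15.0 m regardless of framerate
```

In the time-based version the framerate only changes how often you sample the motion, not where the object ends up.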

          • Which is what the parent was getting at: a lot of fighting games go by frames, not by seconds. Sounds ridiculous, but it makes for easier programming and it's a lot less resource intensive.

          • by Kreigaffe ( 765218 ) on Wednesday January 06, 2010 @02:09PM (#30672506)

            It's way, way way more than that.

            The old HL engine -- at least in Natural Selection, but most likely any game on that engine -- didn't just have your framerate affect your gravity (which meant that at certain framerates you could literally jump further, which made BHopping sicker)..

            it also changed the DPS of weapons. Yep. Weapon firing rate was tied to FPS in a very, very odd way. Some dudes did a lot of testing. Insane.

            And you can, visually, tell the difference between 100fps, 50fps, and 25fps. Very easily. It takes a few minutes of playing, but there's a clear difference, and anybody saying otherwise eats paint chips.

            Graphics don't make games good. Graphics can cripple good games. Graphics never make bad games good.

      • Re: (Score:3, Interesting)

        by Speare ( 84249 )
        In many embedded apps, like coin-op arcade games, the "model" is indeed tied to the frame rate. The main loop assumes a fixed dt, and pipelines the input, update, render tasks. Often this is done without threading, just while (!dead) { do_input(); do_update(); do_render(); } in the main function. Even with threads or co-processors, they often tie the rates 1:1:1. Some have no room for adjustment, and some will at least update their dt if the render took too long.
        • by spun ( 1352 )

          Ah, so faster hardware will actually update the model more quickly. But does this change the way the model acts? In some physics models, I guess it would. More frames would, in fact, be more accurate. But in most simple models, would calculating more time-slices actually change anything? I kind of doubt it, so even though you are right, and visual frame rate (in a non-threaded game) is tied to model frame rate, more frames would not change the outcome.

          Basically, the original poster was making it sound as if

      • You don't understand the games he's talking about.

        For something like Street Fighter at EFO, they take extra steps to make sure that the framerate is consistent at all times: times when the players are just standing there, and times when players are attempting to break blocks for their Hypercombofinishes.

        Like many Flash games, there is code that is actually executed ON THE FRAME. It is done as the frame is being rendered. When you get intensive moments with people putting in a lot of input, lots o

      • Re: (Score:3, Informative)

        by Spazmania ( 174582 )

        The pictures drawn on the screen aren't the real model the game uses.

        That's not necessarily true. There's a long history of games relying on the graphics processor to determine when two objects overlap or otherwise meet specific conditions relative to each other. Goes all the way back to the 8-bit days when the graphics processor could tell you whether the non-transparent parts of two sprites overlapped.

  • Counter-Strike... (Score:3, Informative)

    by Manip ( 656104 ) on Wednesday January 06, 2010 @01:35PM (#30671962)

    I myself used to play Counter-Strike (classic), and I can tell you both FPS and ping made a HUGE difference in that game, to the point that my score would increase as I connected to servers closer to home and used OpenGL instead of DirectX (since OpenGL almost doubled the FPS at the time).

    Now, I wasn't an expert, but I did play a whole lot. I think if you asked most serious players, they would agree on the impact of both...

    • by hitmark ( 640295 )

      I wonder how much that had to do with the engine design. As in having the render engine and the game logic joined at the hip, so that higher fps meant more repeats of the game logic per second.

    • Agreed. Most players would notice a difference of a few milliseconds of network latency more than a dozen frames per second, but it's undeniable that extra frames per second give you a distinct advantage.

      If I see you and you see me, and you're running at twice my frames per second, you will have a smoother "turn and shoot" motion than me, which means you'll either notice your reticule over my head a slight bit faster than I do, or you won't make the mistake of over- or under-compensating your aim, since your moti

  • by ForestHill ( 958172 ) on Wednesday January 06, 2010 @01:36PM (#30671982)
    she's dead, Jim
  • Cached Version (Score:5, Informative)

    by sabre86 ( 730704 ) on Wednesday January 06, 2010 @01:36PM (#30671994)
    Looks like it's Slashdotted already. Here's the cached page: []
    • by Zocalo ( 252965 )
      Nah, that's not Slashdotted. It's proving the point by showing the importance of a higher framerate.

      Than zero.
  • by gurps_npc ( 621217 ) on Wednesday January 06, 2010 @01:37PM (#30672020) Homepage
    The human eye can clearly detect frame rates far greater than 30. So can the human brain.


    The human mind is evolutionarily designed to make instant assumptions. Cat in mid air facing us = DANGER. No "Is it dead and being thrown at us?" No "Is it a picture?" As such, video games can quite easily take advantage of these evolutionary assumptions and trick the MIND, if not the brain, into thinking something is real.

    So while a higher frame rate will increase the quality of the game, it is not essential. It's like getting gold-plated controls on your car's dashboard. Yes, it is a real increase in quality, but most people would rather spend the money on a GPS device, real leather, or a plug-in-hybrid engine before getting around to putting gold in the car.

    • by TheCarp ( 96830 ) <[ten.tenaprac] [ta] [cjs]> on Wednesday January 06, 2010 @02:03PM (#30672408) Homepage

      > The human mind is evolutionarily designed to make instant assumptions. Cat in mid air facing us = DANGER. No "Is it dead
      > and being thrown at us?" No "Is it a picture?" As such, video games can quite easily take advantage of these evolutionary
      > assumptions and trick the MIND, if not the brain, into thinking something is real.

      Sort of. It's actually less "cat in mid air" and more "this sets off a trigger based on something that happened before and hurt me".

      Most adults, if you chuck a rock at their face, will toss up their arms to block, or move their head/body to dodge. This is completely learned. Do the same trick with a young child who has never played "catch" before, and your rock is going to bean him right off his skull.

      From my own experience: in my first motorcycle accident, I was on the ground so fast I had to think afterwards about what had happened. The first two spills, actually.

      The one after those... whole different story. The adrenalin hit as soon as I felt the bike start to turn sideways; by the time the bike was fully 90 degrees to my momentum vector and the wheels were sliding out from under me, I was already calmly kicking my legs backwards and positioning myself for the impact. I hit the ground and slid 150 feet while watching my bike spark and slide away. I thought "shit, I am in traffic", jumped to my feet, ran to the bike, picked it up and pushed it into a parking lot.

      All I am saying is, it's more complicated than that. The memory of such things and the whole "fight or flight" response is an evolving, learning response. It's more than just visual; it encompasses all the senses. I doubt "cat facing us in mid air" is going to trigger much beyond anything in mid air moving towards us.

    • I would say it stops at 120fps, but I haven't been able to run tests past that point so far.

  • Absolutely (Score:5, Funny)

    by occamsarmyknife ( 673159 ) on Wednesday January 06, 2010 @01:39PM (#30672036)

    I couldn't agree more. That Internal Server Error looks way better at 120 Hz on my 45" HD display.

  • Headroom... (Score:2, Insightful)

    by Anonymous Coward

    The biggest reason to go for the highest frame rate possible is headroom. If your framerate is 30 at best, it'll dip down to 10 sometimes. If it's 120 at its optimum, it can dip down to 30 and still be playable.

  • by Monkeedude1212 ( 1560403 ) on Wednesday January 06, 2010 @01:41PM (#30672070) Journal

    You can tell the difference between 30 FPS and 60 FPS.

    The way I tested this: I made a 2-second video in Flash, a circle moving from the left side of the screen to the right side. 60 frames. Ran it at 30 FPS.

    Then I made a second 2-second video, same exact positions. 12 Frames. Ran it at 60 FPS. Asked me and all of my surrounding classmates, which was about 24 students IIRC.

    100% of us noticed a visible difference in the smoothness. Whether our eyes were making out each individual frame perfectly or blurring some together to create a smoother effect was irrelevant, since there WAS a noticeable difference. I was going to slowly bump the 30 and 60 FPS up higher and higher to see at what point the difference is not distinguishable, but I got lazy (high school student at the time).

    The point I think most gamers would agree on is that more frames per second are nice, but 30 frames per second are necessary. You can occasionally dip down to 24 and be alright (24 is supposedly the speed most movie theatres play at), but when you get to around 20 or so it really does take away from the experience.

    • 120 Frames*, I mean. Sheesh. Not proofreading even though there's a preview button.

    • Time Splitters was the first game I played which was locked at 60fps: it was quite a remarkable transition, even from games which were locked at 30fps, never mind games that fluctuated (I'll take 30fps and locked over 30-60fps any day). Gran Turismo had a "Hi-Spec" mode which doubled the resolution and framerate too, albeit at an obvious graphical cost, and it looked like The Future.

      On the subject of movie theatres, 24fps was chosen because it's pretty much as low as you can go before people notice problems

    • by jeffmeden ( 135043 ) on Wednesday January 06, 2010 @02:04PM (#30672434) Homepage Journal

      You can occasionally dip down to 24 and be alright (24 is supposedly the speed most movie theatres play at), but when you get to around 20 or so it really does take away from the experience.

      If by 'supposedly' you mean 'definitely' and if by 'most movie theaters' you mean 'all theaters and even all motion picture production processes in recent years', then yes. The difference is lost on most people, but the reason 24fps is acceptable in movies is that the frame you see isn't what happened at the instant it's displayed; it's everything that happened in the last 1/24th of a second, since it's recorded on film that was exposed for that 24th of a second to derive the image. When a computer does it, it only cares about what is happening at that exact 24th of a second, so the difference between a series of exact instants of motion and a series of frames that include the blur of what happens between frames is HUGE.

      However, this nuance is lost on pretty much everyone who fires up a computer game, notes the FPS indicator, and goes "OMG I CAN TOTALLY TELL ITS ONLY 30FPSZZZZ!!!! HOW INFERIOR!!!". Whine about framerates all you want, but they are only a small part of the experience.

      • Re: (Score:3, Informative)

        If by 'supposedly' you mean 'definitely' and if by 'most movie theaters' you mean 'all theaters and even all motion picture production processes in recent years', then yes.

        I'm sorry, but that's not quite correct. I worked as a movie projectionist for several years, so I know this from experience. While 24fps works, and is what used to be used in cinemas, it is noticeably flickery. As with most advancements in cinema technology, they came up with a bit of a hack: while there are still only 24 frames of film per second, the projector shows each frame twice, giving an effective flicker rate of 48 flashes per second.

    • Re: (Score:3, Informative)

      by Aladrin ( 926209 )

      Actually, the lower limit varies by person. My eyes are apparently a bit wonky and my lower limit is 15 fps, which would drive most people insane in a video game. Below that, it drives me insane. As for telling the difference between 30 and 60... I can do it... barely. Compare 60fps and anything higher and it's absolutely pointless for me to try. However, I've met people who can definitely tell the difference between 100fps and 60fps.

  • Whoa, the motion blur image with the birds and the mountain is nice, what game is that screenshot from??!!1
    • Whoa, the motion blur image with the birds and the mountain is nice, what game is that screenshot from??!!1

      Reality 1.0, very popular MMO with billions of users. Excellent graphics but the developers have been very unresponsive to bug reports.

  • I had a friend who was bothered by anything less than 60fps.

    The screen looked "stuttery". He would take a lower resolution to maintain 60fps.

    We could verify this in one game with a built in frame rate command.

    This is like the "myth of the g spot" post a few days ago. sheesh.

    • I would not doubt it for a second. Many people can see the difference, maybe not that small of an increment. There are a few people like your friend who can probably see even more than just a 3FPS difference.
  • Personally I get annoyed by the fact that although they've invented HD (woohoo) they're still shoving it out at only 24 or 25 FPS. To me, this looks really jittery! I wish they'd go up to 50FPS for HD.

    Watching Avatar in 3D seemed to accentuate that problem. I'm not sure how they do the whole left/right thing in terms of projection, but it seemed to me that the left/right image was shown alternately and at nothing like a high enough speed for me to perceive it as fluid motion. Did anyone else notice this?

  • The 30-fps-is-all-you-can-see myth was probably born of the notion that the illusion of continuous movement starts to set in around 25-30fps (in film for example). Therefore actually 30fps is the minimum you need rather than the maximum you can perceive.
    I could tell at a glance the difference between 72fps and 100fps (both common refresh rates that translate to the max fps when v-sync is on) in Counter-Strike just by briefly moving the mouse to pan the scene.
    This site has had the definitive explanation
    • by Chris Burke ( 6130 ) on Wednesday January 06, 2010 @02:42PM (#30672972) Homepage

      The 30-fps-is-all-you-can-see myth was probably born of the notion that the illusion of continuous movement starts to set in around 25-30fps (in film for example). Therefore actually 30fps is the minimum you need rather than the maximum you can perceive.

      I think it's more likely born of the notion that film gives a completely convincing illusion of motion that is not greatly improved by higher frame rates, because the process by which it is created automatically includes motion blur: the camera records continuous data, just broken up into 24 fps. Computer games display discrete moments in time, not many moments blurred together into one picture. That's why film can look smoother than computer games running at 3 times the framerate.

      Nevertheless, the illusion of continuous movement is apparent at much lower framerates than even film's, even in a computer game. Quake's models were animated at 10 fps, and they gave a convincing illusion of movement, and you could probably make do with a lot less since the brain fills in so much. But it's not a completely convincing illusion, and neither is 30, 60, or even 100 when using static instants in time.

      But the basic myth comes from the fact that film is so convincing and thus you don't "need" more... as long as each frame is a blurred representation of the full period of time it is displayed for.

      • Re: (Score:3, Informative)

        by Psyborgue ( 699890 )

        But the basic myth comes from the fact that film is so convincing and thus you don't "need" more... as long as each frame is a blurred representation of the full period of time it is displayed for.

        Not quite. Film cameras, because of the way they work, expose each frame for only about half of the time it is displayed (a 180-degree shutter [tylerginter.com]). 24fps is usually shot at a 1/48 second exposure time per frame. The full time (a 360-degree shutter) would be far too blurry.
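The exposure arithmetic is easy to check (a generic calculation, not tied to any particular camera; the function name is invented for illustration):

```python
def shutter_exposure(fps, shutter_angle_deg=180.0):
    """Exposure time per frame for a rotary-shutter film camera.

    A 360-degree shutter would expose for the full frame duration;
    the common 180-degree shutter exposes for half of it.
    """
    return (shutter_angle_deg / 360.0) / fps

print(shutter_exposure(24))          # 1/48 s with a 180-degree shutter
print(shutter_exposure(24, 360.0))   # 1/24 s with a (too blurry) 360-degree shutter
```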

  • by 1u3hr ( 530656 ) on Wednesday January 06, 2010 @01:56PM (#30672308)
    the numerous advantages of a high framerate, and there's plenty of those.

    Brought to you by the Department of Redundancy Department.

  • Now I can justify another $1000 worth of hardware to my wife, to play the same game I can get on a $300 console.

    She gets it. I'm the Computer Guy. I know how it works. I know what is needed. I know how to keep her from not being able to play Farmtown. Or is it Fishville? hard to keep up with the Facebook privacy violations/games.

    Ya gotta have priorities.

  • The 30fps myth is simply an oversimplification. The eye+brain starts to naturally perceive movement at around 10fps, usually a little lower. Motion usually starts to appear smooth somewhere between 15 and 25fps, though it depends on many factors other than just the framerate (smoothness of the frame rate, relative velocities of objects (or parts thereof) in the image, absolute colour and tone, colour and tone contrasts within the image, the existence or not of dropped frames and other inconsistencies

  • Nearly everyone these days uses LCD monitors that have a pathetic maximum of 60Hz at HD resolutions (I think because of DVI spec/bandwidth limitations; whatever moron invented DVI needs to be shot because of that).
    I still have an analog CRT monitor that supports much higher refresh rates at HD resolutions, which gives a very noticeable edge when playing twitch games like Unreal Tournament.
    I never understood why people claim framerates above 60Hz are better when their monitor is only capable of displayi

  • The brightness of the image and ambient lighting makes a difference. The more light that goes into your eye, the faster it responds. I run 1600x1200 @ 62 Hz interlaced, and sometimes I notice flicker. When that happens I close the shades, and the flicker goes away.
  • 30 Fps myth (Score:2, Interesting)

    by ggendel ( 1061214 )

    There were a lot of studies done a long time ago, and there are some very accurate psycho-visual computer models of the human visual system. I had the pleasure of working with the Jeff Lubin model when I worked at Sarnoff Corp, which won an Emmy Award back in 2000.

    The 30 fps requirement is not a fixed point, but depends upon a lot of other factors, including viewing distance, field of view, and lighting conditions. The reason that film operates at 24 fps is because it is expected to be viewed in a darkene

  • by DeskLazer ( 699263 ) on Wednesday January 06, 2010 @02:08PM (#30672478) Homepage
    15 FPS vs 30 FPS vs 60 FPS [boallen.com]. This is a visual representation. There are points made, however, that when you watch a movie, the image is "softened" and runs at a lower framerate [something like 24 or 25 FPS?] because your brain helps "fill in the gaps" or something of that sort. Pretty interesting stuff.
  • Sorry, you lost me (Score:4, Insightful)

    by nobodyman ( 90587 ) on Wednesday January 06, 2010 @02:09PM (#30672484) Homepage

    As more and more games move away from 60fps *snip*

    Hmm... I don't accept that premise, either on the PC (where midrange graphics cards can easily pull 60fps with any game on the market now) or on the consoles (where framerates are only going up as PS3 and 360 development matures).

    I think that this article (or at least the summary) is a bit of a strawman. Most of the gamers I know recognize that good framerates are important.

  • Same with audio... (Score:5, Insightful)

    by QuietLagoon ( 813062 ) on Wednesday January 06, 2010 @02:14PM (#30672568)
    Everyone says a "framerate" (i.e., sample frequency) of 44.1kHz is all that is needed. Yet many people hear better imaging, depth and transparency at higher sample rates.
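    The sampling analogy can be made concrete with the Nyquist limit: a given sample rate can only represent frequencies up to half that rate. A minimal sketch (the 44.1 kHz and 96 kHz figures are standard audio rates used for illustration, not taken from the comment):

```python
# Nyquist limit: the highest frequency a sample rate can represent
# is half the sample rate.
def nyquist_limit_hz(sample_rate_hz: float) -> float:
    return sample_rate_hz / 2.0

print(nyquist_limit_hz(44_100))  # CD audio -> 22050.0
print(nyquist_limit_hz(96_000))  # high-res audio -> 48000.0
```

    Whether content above 22 kHz is audible is exactly the contested point of the comment; the arithmetic only says what each rate can encode.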
  • by Improv ( 2467 ) <pgunn01@gmail.com> on Wednesday January 06, 2010 @02:30PM (#30672788) Homepage Journal

    It may be true that high framerates are a good thing, but the linked article is rubbish - the author's arguments are really very stupid.

  • by pz ( 113803 ) on Wednesday January 06, 2010 @02:43PM (#30672982) Journal

    I am a visual neuroscientist (IAAVNS). The standard idea of refresh rate comes from CRT based monitors where the image is drawn by a scanning electron beam. If you use an instrument to measure the instantaneous brightness at a given point on the screen it will rapidly peak as the beam swings by, and then decay as the phosphor continues to release absorbed energy in the form of photons. Different monitors have different decay rates, and, typically, CRTs that were designed for television use have pretty slow decay rates. CRTs that were designed for computer monitors typically have faster decay rates. If the decay rate were very very fast, then the hypothetical point on the screen would be dark most of the time and only occasionally very bright as the beam sweeps by on each frame.

    As you can imagine this highly impulsive temporal profile is hard to smooth out into something closer to the constant brightness of the world around us. The human retina has an inherent dynamic response rate to it, but it's actually quite fast, and there have been studies showing clear responses in higher order visual areas of the brain up to 135 Hz. But standard phosphors used in CRTs have a somewhat smoother response, and so at more-or-less 80 Hz, the brain stops seeing the flicker (at 60 Hz most people see flicker on a computer monitor). The exact refresh rate where perceptual blurring happens (so the flickering goes away) varies widely between individuals, and with the exact details of the environment and what is being shown on the screen. More-or-less at 100 Hz refresh, no one sees the flicker anymore (although the brain can be shown to be still responding).

    Contemporary screens, however, are LCD based (I'm going to ignore plasma screens since the field is still working out how they interact with the visual system). Making the same experiment as above, the temporal profile of brightness at a given spot on the screen will look more like a staircase, holding a value until the next frame gets drawn. This is a far, far smoother stimulus for the visual system, so a 60 Hz frame rate produces a perceptually far more flicker-free experience. That's why most CRTs at 60 Hz make your eyes bleed, while LCDs at 60 Hz are just fine.

    Except that newer LCDs have LED backlighting which is no longer constant, but flashed (WHY? WHY? WHY? Just to save some power? Please, computer manufacturers, let *me* make that decision!), so the experience is somewhat more like a CRT.

    So that's one part of the equation: flicker.

    The other part of the equation is update rate, which still applies even when there is no flicker at all. Here, we have the evidence that the brain is responding at up to 135 Hz. In measurements made in my lab, I've found some responses up to 160 Hz. But the brain is super good at interpolating static images and deducing the motion. This is called "apparent motion" and is why strings of lights illuminated in sequence seem to move around a theater marquee. The brain is really good at that. Which is why even a 24 Hz movie (with 48 Hz frame doubling) in a movie theater is perceptually acceptable, but a 200 Hz movie would look much more like a window into reality. On TV you can see the difference between shows that have been shot on film (at 24 Hz) versus on video (at 30 or 60 Hz). Video seems clearer, less movie-like.

    For games, 60 Hz means 16 ms between frame updates -- and that can be a significant delay for twitch response. Further, modern LCD monitors have an inherent two or three frame processing delay, adding to the latency. As we know, long latency leads to poor gameplay. Faster updates mean potentially shorter latency, since it is a frame-by-frame issue.

    So, just as with audio equipment where inexpensive low-fidelity equipment can produce an acceptable experience, while a more expensive setup can create the illusion of being at a concert, so too inexpensive video equipment (from camera to video board to monitor) can produce an acceptable experience, while a more expensive setup can create the illusion of visual reality.
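    The latency arithmetic above can be sketched in a few lines (the three-frame pipeline delay is the commenter's figure; treating it as whole refresh intervals is an assumption made here for illustration):

```python
# Back-of-envelope frame timing, not a measurement.
# A 60 Hz display draws a new frame every 1000/60 ~= 16.7 ms; an LCD's
# multi-frame processing pipeline adds whole multiples of that interval.
def frame_interval_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

def display_latency_ms(refresh_hz: float, pipeline_frames: int = 3) -> float:
    # Up to one interval waiting for the next refresh, plus pipeline frames.
    return frame_interval_ms(refresh_hz) * (1 + pipeline_frames)

for hz in (60, 120, 200):
    print(f"{hz} Hz: {frame_interval_ms(hz):.1f} ms/frame, "
          f"up to {display_latency_ms(hz):.1f} ms display latency")
```

    The point carries over directly: raising the refresh rate shrinks every term in the latency budget at once.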

    • by smellsofbikes ( 890263 ) on Wednesday January 06, 2010 @06:12PM (#30675784) Journal
      For the record (as an ex-LED-backlight hardware designer) the LEDs are waaay too bright to run full-out, both visually and from a power usage and heat generation standpoint, and the only good way to dim an LED is by cycling it on and off rapidly to approximate the desired brightness. The reason I say 'the only good way' is because LEDs are constant-current devices and all the drivers I'm familiar with are designed around that, so you can't just go varying the voltage to try and dim them: the drivers aren't really voltage devices.

      With THAT said, I have absolutely zero idea why any sane LED driver dimmer would run anywhere near frequencies that any human could see. LEDs can turn on and off in nanoseconds, so a reasonable dim signal should be in the kilohertz range, at least, not the 100 Hz range. It's *possible* to put a 100 Hz dim signal on an LED driver, but it seems really dumb to me.
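      The duty-cycle arithmetic behind this kind of PWM dimming can be sketched as follows (the 20 kHz and 100 Hz rates are illustrative, not from any real driver):

```python
# PWM dimming sketch: perceived brightness tracks the duty cycle, while
# flicker visibility depends on the PWM frequency itself.
def on_time_us(pwm_hz: float, duty_cycle: float) -> float:
    """Microseconds the LED is lit during each PWM period."""
    period_us = 1_000_000.0 / pwm_hz
    return period_us * duty_cycle

# 20 kHz at 50% duty: 25 us on per 50 us period -- far too fast to see.
# The same duty cycle at 100 Hz produces 5 ms on-pulses, slow enough
# that some viewers notice flicker.
print(on_time_us(20_000, 0.5))  # 25.0
print(on_time_us(100, 0.5))     # 5000.0
```

      Either rate yields the same average brightness; only the kilohertz-range rate keeps the switching invisible, which is the parent comment's point.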

  • Outside Looking In (Score:5, Informative)

    by DynaSoar ( 714234 ) on Wednesday January 06, 2010 @03:14PM (#30673432) Journal

    I'm a neuroscientist who covers sensation and perception and their bidirectional interaction with cognition, particularly attention. I've got comments and questions and very few answers after reading this. I'm seeing a lot of things stated as facts that I've never heard of before. Some of them make sense, and some don't. Some of them are correct, some not, and many more than the others combined I have no experience in and can't say. Those seem to be well supported, or at least well known, particularly among those who've obviously done their homework. I can find references to these among the publications (like ACM) that are most applicable to the field in question, but I can find precious little in my customary pubs and books. That's not to say the stuff in the technically oriented pubs is wrong, just that some may not be covered much (i.e. 'not of interest') in my field. My field is very cautious about experimental evidence, but I suspect in gaming's perception area there are common-knowledge kinds of things that came from hearsay (we have many of those in rocketry too). It might do well for both fields to compare works.

    What catches my eye at first is this "myth". As stated it's overly simplistic. Which humans' eye? Some have different reaction times. Those who could probably detect 30 fps discontinuity are those who see the TV screen jiggle and waver when they chew something crunchy while watching (you know who you are, here's a place to own up to it). What part of the visual field, central or peripheral? They operate differently. Jittering or blurring of objects attended to or not? Betcha it happens more to those not attended to, but that's not noticed for the same reason (hypnosis can bring that out right nicely). And how is it frame rates matter when the visual system evolved as a constant flow analog system? If a phenomenon that shouldn't make a difference does, and that frame rate is strictly due to technical considerations, how do we know that a variable frame rate might not give even better results? Since the visual system does not have full-field frames that refresh, why should artificial presentations? Why not present faster moving objects at a high change rate, slower moving at a slower rate, more or less a timing equivalent to some video compression techniques? Some of this makes good sense from my perspective, some appears goofy but may not be, and some clearly is whack according to well supported experimental evidence from my side, not sure about yours.

    Here's an interesting one, apparent motion from blurring, occurring at the retina, ostensibly due to 'reaction time' of light receptor cells (rods and cones). I can see how this might occur. But if it's a time lag that causes blurring, everything should be blurred, because the layers of cells of different types in the retina between the receptors and those firing down the optic nerve operate strictly by slow potentials -- there's not a 'firing' neuron among them. Or, if their processing, though slow, accounts for motion and compensates, preventing it from adding to the blurring, how can that be used to increase apparent motion?

    A last point which I'm fairly certain isn't covered in gaming and graphics presentation because very few know much about it and we don't understand it well: 10% of the optic nerve is feed-forward, top-down control or tuning of the retina and its processing. Motion perception can be primed, can suffer from habituation, and has variance in efficacy according to several factors. What cognitive factors have an influence on this, and how can that be used to improve motion perception and/or produce motion perception that's as adequate as what's being used now but requiring less external computational effort, because internal computation is being stimulated?

    It's probable that both fields have things of interest and use to the other, including things the other isn't aware of. I've said much the same following another article on a different subject. From this one I can see it's probable there's a few peoples' careers worth of work here.

  • 24 fps (Score:3, Interesting)

    by Sir Holo ( 531007 ) on Wednesday January 06, 2010 @03:55PM (#30673922)
    Movies are 24 fps because film is expensive.
