
Framerates Matter

An anonymous reader writes "As more and more games move away from 60fps, the myth that the human eye can only detect 30fps keeps popping up. What's more, most people don't seem to realize the numerous advantages of a high framerate, and there are plenty of them."


  • by Hatta ( 162192 ) on Wednesday January 06, 2010 @01:30PM (#30671874) Journal

    It's important nonetheless, even if it's not all a sharp picture, because what your eye sees isn't all that sharp either and you experience the same blur.

    If my eye creates the blur, why do I need artificial motion blur?

  • by gurps_npc ( 621217 ) on Wednesday January 06, 2010 @01:37PM (#30672020) Homepage
    The human eye can clearly detect frame rates far greater than 30. So can the human brain.

    HOWEVER

    The human mind is evolutionarily designed to make instant assumptions. Cat in mid-air facing us = DANGER. No "Is it dead and being thrown at us?" No "Is it a picture?" As such, video games can quite easily take advantage of these evolutionary assumptions and trick the MIND, if not the brain, into thinking something is real.

    So while a higher frame rate will increase the quality of the game, it is not essential. It's like getting gold-plated controls on your car's dashboard. Yes, it is a real increase in quality, but most people would rather spend the money on a GPS device, real leather, or a plug-in hybrid engine before getting around to putting gold in the car.

  • Headroom... (Score:2, Insightful)

    by Anonymous Coward on Wednesday January 06, 2010 @01:40PM (#30672056)

    The biggest reason to go for the highest frame rate possible is headroom. If your framerate is 30 at best, it'll dip down to 10 sometimes. If it peaks at 120, it can dip down to 30 and still be playable.

  • Re:Really? (Score:3, Insightful)

    by Sockatume ( 732728 ) on Wednesday January 06, 2010 @01:55PM (#30672280)

    Graphics are sold by screenshots and box shots. YouTube and so on might make a difference, but ultimately you'll get more players to swoon with half the framerate and twice the geometry than vice versa.

  • by TheCarp ( 96830 ) <sjc@NospAM.carpanet.net> on Wednesday January 06, 2010 @02:03PM (#30672408) Homepage

    > The human mind is evolutionarily designed to make instant assumptions. Cat in mid-air facing us = DANGER. No "Is it dead
    > and being thrown at us?" No "Is it a picture?" As such, video games can quite easily take advantage of these evolutionary
    > assumptions and trick the MIND, if not the brain, into thinking something is real.

    Sort of. It's actually less "Cat in mid-air" and more "This sets off a trigger based on something that happened before and hurt me".

    Most adults, if you chuck a rock at their face, will toss up their arms to block, or move their head/body to dodge. This is completely learned. Do the same trick with a young child who has never played "catch" before, and your rock is going to bean him right off his skull.

    From my own experience: in my first motorcycle accident I was on the ground so fast I had to think afterwards about what had happened. The first two spills, actually.

    The one after those.... whole different story. The adrenaline hit as soon as I felt the bike start to turn sideways. By the time the bike was fully 90 degrees to my momentum vector and the wheels were sliding out from under me, I was already calmly kicking my legs backwards and positioning myself for the impact. I hit the ground and slid 150 feet while watching my bike spark and slide away. I thought "shit, I am in traffic," jumped to my feet, ran to the bike, picked it up, and pushed it into a parking lot.

    All I am saying is, it's more complicated than that. The memory of such things and the whole "fight or flight" response is an evolving, learned response. It's more than just visual; it encompasses all the senses. I doubt "cat facing us in mid-air" is going to trigger much beyond what anything in mid-air moving towards us would.

  • Sorry, you lost me (Score:4, Insightful)

    by nobodyman ( 90587 ) on Wednesday January 06, 2010 @02:09PM (#30672484) Homepage

    As more and more games move away from 60fps *snip*

    Hmm... I don't accept that premise, either on the PC (where midrange graphics cards can easily pull 60fps with any game on the market now) or on the consoles (where framerates are only going up as PS3 and 360 development matures).

    I think that this article (or at least the summary) is a bit of a strawman. Most of the gamers I know recognize that good framerates are important.

  • by Kreigaffe ( 765218 ) on Wednesday January 06, 2010 @02:09PM (#30672506)

    It's way, way, way more than that.

    In the old HL engine -- at least in Natural Selection, but most likely any game on that engine -- your framerate didn't just affect your gravity (which meant that at certain framerates you could literally jump further, which made BHopping even sicker)...

    It also changed the DPS of weapons. Yep. Weapon firing rate was tied to FPS in a very, very odd way. Some dudes did too much testing. Insane. (A rough sketch of how framerate-sized timesteps cause this sort of thing follows below.)

    And you can, visually, tell the difference between 100fps, 50fps and 25fps. Very easily. It takes a few minutes of playing, but there's a clear difference, and anybody saying otherwise eats paint chips.

    Graphics don't make games good. Graphics can cripple good games. Graphics never make bad games good.
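
    The GoldSrc quirks described above are what you get when game logic is integrated once per rendered frame, so the step size changes with framerate. A minimal sketch of the effect, with made-up gravity and jump numbers (this illustrates the general problem, not the actual Half-Life code):

        # Sketch: physics stepped once per rendered frame, so the step size
        # depends on framerate. Constants are hypothetical, not GoldSrc's.
        GRAVITY = -800.0       # units/s^2 (assumed)
        JUMP_VELOCITY = 268.0  # units/s (assumed)

        def jump_apex(fps):
            """Integrate a jump with dt = one frame at the given framerate."""
            dt = 1.0 / fps
            z, vz, apex = 0.0, JUMP_VELOCITY, 0.0
            while vz > 0 or z > 0:
                vz += GRAVITY * dt          # Euler step sized by the frame time
                z = max(0.0, z + vz * dt)
                apex = max(apex, z)
            return apex

        for fps in (30, 60, 100):
            print(f"{fps:>3} fps -> jump apex {jump_apex(fps):.2f} units")

        # The apex differs with framerate because the integration error depends
        # on dt. Running the simulation on a fixed logic timestep, independent
        # of the render rate, removes that dependence.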

  • Same with audio... (Score:5, Insightful)

    by QuietLagoon ( 813062 ) on Wednesday January 06, 2010 @02:14PM (#30672568)
    Everyone says a "framerate" (i.e., sample frequency) of 44.1kHz is all that is needed. Yet many people hear better imaging, depth and transparency at higher sample rates.
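
    For context, the 44.1 kHz figure comes from the Nyquist criterion: a sample rate fs can only represent frequencies below fs/2 (about 22 kHz here, just past the usual limit of human hearing), and anything above that folds back down as an alias. A quick, self-contained illustration with a made-up tone frequency:

        import numpy as np

        fs = 44_100                  # sample rate, Hz
        f_tone = 25_000              # a tone above the Nyquist limit of fs / 2 = 22_050 Hz
        t = np.arange(0, 0.01, 1 / fs)                 # 10 ms worth of sample instants

        sampled = np.sin(2 * np.pi * f_tone * t)
        alias = np.sin(2 * np.pi * (fs - f_tone) * t)  # a 19_100 Hz tone

        # At the sample points the 25 kHz tone is indistinguishable from a
        # 19.1 kHz tone (same samples, opposite sign), i.e. it aliases.
        print(np.allclose(sampled, -alias))            # True
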
  • Re:Doom 1? (Score:5, Insightful)

    by JackDW ( 904211 ) on Wednesday January 06, 2010 @02:16PM (#30672586) Homepage

    I've worked with the Doom source code recently, and can confirm that there was no motion blur at all. In fact, blur of any kind couldn't really be implemented, because Doom's graphics were 8-bit indexed colour.

    Also, there were no engine changes at all between Doom 1 and 2.

    Perhaps the GP is referring to the bobbing effect that occurs when the Doom guy runs. That moves the camera nearer to and further from the ground, changing the appearance of the texture.
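
    To spell out why indexed colour rules this out: each pixel stores an index into a 256-entry palette, not an RGB value, so you can't average two pixels and get a meaningful in-between colour. A toy illustration (the palette entries are made up, not Doom's actual PLAYPAL):

        # Toy 8-bit indexed colour: a pixel is a palette index, not an RGB value.
        palette = {
            10: (255, 0, 0),   # red
            11: (0, 0, 255),   # blue
            12: (0, 255, 0),   # green
        }

        a, b = 10, 12                  # a red pixel next to a green pixel
        naive = (a + b) // 2           # "blurring" by averaging the indices...
        print(palette[naive])          # ...yields (0, 0, 255): pure blue, nonsense

        # A real blur has to mix in RGB space and then find the nearest palette
        # entry for every blended pixel, which a 1993-era software renderer had
        # no per-pixel budget for.
        blended = tuple((ca + cb) // 2 for ca, cb in zip(palette[a], palette[b]))
        print(blended)                 # (127, 127, 0): the colour the blur "should" produce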

  • by Moryath ( 553296 ) on Wednesday January 06, 2010 @02:16PM (#30672596)

    Not quite.

    The eye blur happens for two reasons. The first is that the human eye "assembles" an analog reading of the light gathered over a period of time, very similar to how a camera exposure works. We aren't "digital" beings, in the sense that there is some allowance forwards and backwards in our visual processing, but we DO assemble "frames" for the rest of our brain to analyze.

    The second is focusing. A fast-moving object moves into, and out of, the focused field quite quickly. Either we keep tracking it (in which case the unfocused foreground and background areas alter) or we don't, and it goes out of focus. We mentally render this as blurring. Directors in 2D movies use depth-of-field to do a quick transition between two speaking characters and ensure the right one has prominence, by keeping the speaker in focus and then quickly shifting focus in/out to bring the other to prominence when the dialogue turns.

    The real sin, and unalterable problem currently, for 3D technology is that everything renders in-focus. Motion blurs work to some degree, but a large-scale image with "background" objects sharply in focus gives us headaches. We follow the other visual cues, try to "focus" to distance, try to "refocus" for the fuzziness it causes, and then wobble back and forth till we have sore, tired eye muscles.

    The 3D Brendan Fraser Journey to the Center of the Earth was the closest done so far, because they did introduce some background blur, but it still had problems should the viewer decide to focus on something other than what the director wanted them to focus on, visually. Avatar commits the same sin as well, and doesn't even try to do it properly. It's like watching some big pixely, perfect-focus-for-miles video game.

    As for the other items they mention - "The framerate of a game is usually directly tied to the processing of its logic." Not true. Indeed, it's only true if you've got shoddy programmers (the fix for one of the most notorious examples, the jumping-height differences between various iterations of the Quake engine, was simply to lock the calculations to assume a static framerate [savagehelp.com]; the id Software programmers, who instead chose to discard "erroneous" round-off errors, wound up widely criticized for STILL making the jumps somewhat randomly framerate-dependent). The truth is that the visual rendering framerate of a game simply does not have to be the same as the internal calculation "frame" rate.

    As for input lag... the difference in "lag" between a 30fps framerate and a 60fps framerate is about 17 ms. Even if you get to 120fps and have a monitor capable of displaying it at your chosen resolution, your difference from 30fps is only 25 ms (the arithmetic is sketched below). Human reaction to visual stimuli is generally in the neighborhood of 150-300 ms.

    Even playing on a LAN in the same building, you're looking at random lag times longer than the difference between 120fps and 30fps.
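
    For reference, the figures above are just differences between frame intervals; a trivial check:

        # Frame interval at common framerates, and the differences the parent
        # comment quotes (display interval only; engine, driver and monitor
        # delays come on top of this).
        for fps in (30, 60, 120):
            print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")

        print(f" 60 vs 30 fps: {1000 / 30 - 1000 / 60:.1f} ms")    # ~16.7 ms
        print(f"120 vs 30 fps: {1000 / 30 - 1000 / 120:.1f} ms")   # 25.0 ms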

  • by Aladrin ( 926209 ) on Wednesday January 06, 2010 @02:23PM (#30672684)

    Congratulations. That -is- incredibly nitpicky. I'm amazed.

    He is not a scientist and this is not a paper he is writing for publication. He is using the word 'designed' as the unwashed masses do all the time, and as such, he is not incorrect in his statement. Everyone knew exactly what he meant, and nobody had to stop and try to figure it out. He accomplished his task without getting excessively wordy or having to explain himself three times. As far as communication goes, he scored perfectly.

  • by Improv ( 2467 ) <pgunn01@gmail.com> on Wednesday January 06, 2010 @02:30PM (#30672788) Homepage Journal

    It may be true that high framerates are a good thing, but the linked article is rubbish - the author's arguments are really very stupid.

  • by Chris Burke ( 6130 ) on Wednesday January 06, 2010 @02:42PM (#30672972) Homepage

    The 30-fps-is-all-you-can-see myth was probably born of the notion that the illusion of continuous movement starts to set in around 25-30fps (in film for example). Therefore actually 30fps is the minimum you need rather than the maximum you can perceive.

    I think it's more likely born of the notion that film gives a completely convincing illusion of motion that is not greatly improved by higher frame rates, because the process that creates it automatically includes motion blur: it records continuous data, just broken up into 24 fps. Computer games display discrete moments in time, not many moments blurred together into one picture. That's why film looks smoother than computer games running at three times the framerate.

    Nevertheless, the illusion of continuous movement is apparent at much lower framerates than even film's, even in a computer game. Quake's models were animated at 10 fps, and they gave a convincing illusion of movement, and you could probably make do with a lot less since the brain fills in so much. But it's not a completely convincing illusion, and neither is 30, 60, or even 100 when using static instants in time.

    But the basic myth comes from the fact that film is so convincing and thus you don't "need" more... as long as each frame is a blurred representation of the full period of time it is displayed for.
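
    One way to make that film-vs-game distinction concrete: a film frame integrates light over its whole exposure, while a game frame shows a single instant. A renderer can approximate the former by averaging several sub-frame samples per displayed frame. A minimal one-dimensional sketch (the scene, speed and names are invented for illustration):

        import numpy as np

        WIDTH = 16            # pixels in a 1-D "screen"
        SPEED = 40.0          # object speed, pixels per second
        FRAME_TIME = 1 / 24   # film-style frame duration

        def render_instant(t):
            """Game-style frame: the object occupies exactly one pixel at instant t."""
            img = np.zeros(WIDTH)
            img[int(SPEED * t) % WIDTH] = 1.0
            return img

        def render_blurred(t, subsamples=8):
            """Film-style frame: average several instants across the exposure."""
            times = t + np.linspace(0, FRAME_TIME, subsamples, endpoint=False)
            return np.mean([render_instant(ti) for ti in times], axis=0)

        print("instant:", render_instant(0.0).round(2))   # all energy in one pixel
        print("blurred:", render_blurred(0.0).round(2))   # energy smeared across ~2 pixels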

  • by smitty97 ( 995791 ) on Wednesday January 06, 2010 @02:47PM (#30673046)

    Because the framerate is high.

    There, I've taken it full circle.

  • by brianosaurus ( 48471 ) on Wednesday January 06, 2010 @03:55PM (#30673920) Homepage

    I agree with you...

    Some argue (like the battle between 30fps vs 60fps) that the human eye can't process more than a certain number of "frames" per second.

    Isn't the reason movies use 24 fps (and TV uses ~30fps) historical technical limitations? That is right about the minimum rate at which your eyes and brain can smooth out the annoying flicker. 30fps isn't the upper limit the eye can process, but rather a lower limit that makes the image sequence appear as motion without causing stutter, headaches, or otherwise detracting from the visual experience. It's a compromise that lets movies fit on reasonably sized rolls of film, let TV fit "good enough" video quality into the bandwidth available at the time, and avoids frequency-beating artifacts from lights running on 60Hz AC power (or 50Hz and 25fps in Europe, etc.).

    For an easy example that 30fps isn't enough, run iTunes, play some music and turn on the "iTunes Classic Visualizer" full screen. Hit "F" to display the frame rate, then use "T" to toggle the 30fps limit on and off. Tell me you don't see a big difference.

    I'm sure there's an upper threshold where you can't distinguish a difference as the frame rate increases, but it's much higher than 30 or 60 fps, and as the parent said it is probably higher than we can achieve in hardware for the near future.

  • by Eraesr ( 1629799 ) on Thursday January 07, 2010 @04:21AM (#30680058) Homepage

    The debate about 30fps vs 60fps isn't about whether people can actually notice the difference. I don't think I've ever seen a developer claim the difference is not noticeable. The thing is that if they render at 30fps rather than 60fps, they have twice as much time to render each frame, allowing for much more detail and many more effects in each scene. So the question isn't whether people can see the difference in framerate, but what level of detail the developer wants to achieve and whether or not that's possible at 60fps.

    People interested in the subject should take a look at Eurogamer's Digital Foundry (http://www.eurogamer.net/digitalfoundry/). They have loads of technical game reviews and articles about this very subject.

  • by TheLink ( 130905 ) on Thursday January 07, 2010 @09:39AM (#30681558) Journal
    But why? Motion blur is overrated. Sure, put it in scenes where it is "important to the story/gameplay", but using it whenever there is fast motion is stupid.

    Why? Because people aren't staring at the same spot on the screen all the time. And nowadays screens are getting bigger.

    Say, in real life, you're in a room where two objects are moving around at fast but eye-trackable speeds in different directions.

    If you are staring at something else, both objects are blurry.

    But if you start to look at one, that particular object becomes _sharp_, the other object becomes blurry.

    Look at the other one, and it becomes sharp while the first becomes blurry.

    When a game or movie blurs moving stuff, it just makes the thing you are looking at look out of focus, even if it is moving at a speed your eye can track. You can't focus on it even though in real life you could!

    With motion blur, I often experience eye strain when I try to track moving objects/backgrounds that have been blurred.

    Then there are the artificial "out of focus" shots in static scenes. These effects should also be restricted to scenes where it is important to the story that only a few items are in focus.

    In Avatar (2D), my eyes were often trying to focus on blurry images and it wasn't pleasant - initially I was wondering what was wrong with my eyes - felt like I had difficulty focusing on stuff.

    When I watched it in 3D, I realized that a lot of stuff was actually blurry and it wasn't my eyes. In some fairly static scenes the focal range was low - only a few objects were in focus. Then in some scenes the moving objects were blurry. Whereas in other scenes most stuff was in focus. In Avatar 3D it was easier to figure out where I "should" be looking and avoid the eyestrain bits :).

    If you ask me, I prefer as much of each frame to be sharp and in focus as possible, and then let the limitations of my eyes blur it.

    Artificial blurring (motion or defocus) is like listening to artificially degraded music/audio. While there are some cases that call for it (distance effect) it's just silly if you use it a lot.
  • by gerryn ( 1416389 ) on Thursday January 07, 2010 @11:03AM (#30682464)
    As briefly mentioned in the linked article, it's not only about whether the picture is perceived as smooth or not. When playing FPS games, the mouse's responsiveness is directly linked to the FPS. A good example of this is the V-SYNC option available in many games: even though the game runs very smoothly with V-SYNC on, the controls are anything but smooth, and you end up with a very unresponsive, "rigid" camera. At least that's how it is for me.

    I have discussed this issue with quite a lot of people; some say they don't notice anything while others say they do. I have always noticed it, and it does not matter what kind of hardware I use (it happens with all hardware and all games). I think some people have the ABILITY to notice these kinds of things, while others don't (I might be wrong...).

    The nightmare scenario is 1) 30 FPS and the mouse responsiveness that comes with it, and 2) the visual experience, which is also greatly reduced. With 60 FPS (at least) it feels good, and I have noticed that with even higher framerates the responsiveness increases further, which makes sense since mouse DPI has also increased greatly over the years. I don't know the ratio of mouse DPI to framerate needed for a good experience, but it's obvious that there must be one. /G
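
    A very rough model of why V-sync makes the lag worse at low framerates (the one-frame queue and the numbers are assumptions for illustration, not measurements of any particular game or driver):

        # With V-sync, input sampled just after a frame starts can only affect the
        # next frame, which is then held until the following refresh. This crude
        # model assumes one additional queued frame and ignores driver, engine and
        # monitor latency.
        def worst_case_input_lag_ms(fps, queued_frames=1):
            frame = 1000.0 / fps
            return frame * (1 + queued_frames)

        for fps in (30, 60, 120):
            print(f"{fps:>3} fps with V-sync: up to ~{worst_case_input_lag_ms(fps):.0f} ms "
                  "before the camera responds")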
