Framerates Matter

An anonymous reader writes "As more and more games move away from 60fps, the myth that the human eye can only detect 30fps keeps popping up. What's more, most people don't seem to realize that a high framerate brings numerous advantages."
  • Really? (Score:5, Informative)

    by non0score ( 890022 ) on Wednesday January 06, 2010 @01:30PM (#30671876)
  • Counter-Strike... (Score:3, Informative)

    by Manip ( 656104 ) on Wednesday January 06, 2010 @01:35PM (#30671962)

    I myself used to play Counter-Strike (classic), and I can tell you both FPS and Ping made a HUGE difference in that game to the point that my score would increase as I connected to servers closer to home and used OpenGL instead of DirectX (since OpenGL almost doubled the FPS at the time).

    Now, I wasn't an expert but I did play a whole lot. I think if you ask most serious players, they would agree about the impact of both...

  • Cached Version (Score:5, Informative)

    by sabre86 ( 730704 ) on Wednesday January 06, 2010 @01:36PM (#30671994)
    Looks like it's Slashdotted already. Here's the cached page: http://74.125.47.132/search?hl=en&q=cache%3Awww.significant-bits.com%2Fframerates-do-matter&aq=f&oq=&aqi= [74.125.47.132]
  • by Shin-LaC ( 1333529 ) on Wednesday January 06, 2010 @01:45PM (#30672132)
    Your eyes introduce blur due to the reaction time of the light-sensitive cells in the retina. Fortunately, the image processing area in your brain treats blur introduced by the eyes and blur built into the frame more or less the same, so you can use blur to give the impression of smooth motion with a lower frame rate than would otherwise be necessary. This is used to good effect in cinema, where the camera's exposure time naturally introduces blur that is quite similar to the one introduced by your eye.

    In the case of video games, however, it is not so clear that rendering effective artificial motion blur saves much processing time compared to simply rendering more frames. Then again, there is a limit to how fast your monitor can update its image, so rendering more frames is no longer an option past that point.
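
    As a rough illustration of the trade-off just described, here is a minimal Python sketch of accumulation-style motion blur: several sub-frame renders are averaged into one displayed frame. The render_at() function, the numpy image format, and the sub-sample count are hypothetical stand-ins, not anything from the comment or the article.

        # Minimal sketch: fake motion blur by averaging several sub-frame renders
        # into one displayed frame. render_at(t) is a hypothetical function that
        # returns the scene at time t as a float image (numpy array).
        import numpy as np

        def blurred_frame(render_at, frame_start, frame_time, subsamples=4):
            """Average `subsamples` renders spread across one frame interval.

            This trades extra render cost for blur that approximates what a
            camera's exposure (or the eye) would produce at a low display rate.
            """
            acc = None
            for i in range(subsamples):
                t = frame_start + frame_time * (i / subsamples)
                img = render_at(t).astype(np.float64)
                acc = img if acc is None else acc + img
            return acc / subsamples

    Whether this is cheaper than simply rendering more frames depends entirely on how expensive each sub-frame render is, which is the comment's point.
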
  • by Shadow of Eternity ( 795165 ) on Wednesday January 06, 2010 @01:48PM (#30672178)

    That just means we should strive for a higher framerate until our eyes blur things on their own. Reality is not inherently blurry (unless you need glasses...), our eyes and brain do that internally.

    Making movement in a game inherently blurry when your head is already going to blur it for you internally is just a shortcut to motion sickness for a whole lot of people.

  • by Iyonesco ( 1482555 ) on Wednesday January 06, 2010 @02:15PM (#30672570)

    For a high speed game like Quake even 60fps is totally unplayable and there's a massive difference between 90fps and 120fps. I consider 120fps the minimum for Quake and for that reason I continue to use a CRT. If you put my CRT at 120Hz+120fps next to a 60Hz+60fps LCD, the difference is night and day and the LCD looks extremely choppy. You don't even have to do a side by side comparison: if you're used to playing at 120fps on a daily basis, you'll instantly see the difference when you see the game running at 60fps.

    People who think you can't tell above 60fps have obviously never done any sort of valid comparisons because the difference is extremely pronounced. Research done by Sony found that "240Hz is the perception limit for the degradation of motion image quality for the human eye in following natural images" (Journal of the Society for Information Display Vol 15.1). I suspect there would be a noticeable difference between 240fps and 120fps but I've never had the opportunity to compare.

    These comments are all in the context of playing Quake which is a very fast moving game so there is a large difference between each frame. If you play a much slower game then the difference between each frame will be significantly less, in which case 30fps might look absolutely fine. However, just because some games look fine at 30fps doesn't justify the whole "the human eye can't perceive above 30fps" idiocy.

  • by Spazmania ( 174582 ) on Wednesday January 06, 2010 @02:24PM (#30672700) Homepage

    The pictures drawn on the screen aren't the real model the game uses.

    That's not necessarily true. There's a long history of games relying on the graphics processor to determine when two objects overlap or otherwise meet specific conditions relative to each other. Goes all the way back to the 8-bit days when the graphics processor could tell you whether the non-transparent parts of two sprites overlapped.

  • by Aladrin ( 926209 ) on Wednesday January 06, 2010 @02:25PM (#30672738)

    Actually, the lower limit depends on the person. My eyes are apparently a bit wonky and my lower limit is 15 fps, which would drive most people insane in a video game. Below that, it drives me insane. As for telling the difference between 30 and 60... I can do it... Barely. Compare 60fps and anything higher and it's absolutely pointless for me to try. However, I've met people who can definitely tell the difference between 100fps and 60fps.

  • by takev ( 214836 ) on Wednesday January 06, 2010 @02:37PM (#30672894)

    But, if you follow the hand with your eyes, your hand will appear sharp. You'll be surprised how quickly and stably the eyes can track moving objects.

    The BBC has been experimenting with fast frame rate TV, running at 300 frames-per-second. Moving objects will appear much sharper with such a broadcast compared to the standard 50 frames-per-second (not fields). They showed a side by side example, both were 1080 progressive scan. Great for sports broadcasting.

    Also, Silicon Graphics (back when they were still called that) did tests with fighter pilots when designing flight simulators. Motion sickness is a problem with those flight simulators, compared to an actual jet plane. When they got a constant frame rate above 80 frames per second (160 frames per second when doing stereo imaging), the motion sickness was greatly reduced. They solved the processing power problem by being able to reduce the rendering resolution on each frame.
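
    The resolution trick mentioned above is essentially what engines now call dynamic resolution scaling. A minimal Python sketch of the idea, with an assumed 80 fps target and made-up scaling factors and bounds (none of these numbers come from the SGI work):

        # Sketch: hold a target frame rate by scaling the render resolution
        # each frame. Target and bounds are arbitrary, illustrative values.
        TARGET_FRAME_TIME = 1.0 / 80.0   # 80 fps target, as in the comment
        MIN_SCALE, MAX_SCALE = 0.5, 1.0

        def adjust_render_scale(scale, last_frame_time):
            """Shrink the render target when a frame ran long, grow it when
            there is headroom. A real engine would smooth this over many frames."""
            if last_frame_time > TARGET_FRAME_TIME * 1.05:
                scale *= 0.95
            elif last_frame_time < TARGET_FRAME_TIME * 0.90:
                scale *= 1.02
            return max(MIN_SCALE, min(MAX_SCALE, scale))
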

  • by pz ( 113803 ) on Wednesday January 06, 2010 @02:43PM (#30672982) Journal

    I am a visual neuroscientist (IAAVNS). The standard idea of refresh rate comes from CRT based monitors where the image is drawn by a scanning electron beam. If you use an instrument to measure the instantaneous brightness at a given point on the screen it will rapidly peak as the beam swings by, and then decay as the phosphor continues to release absorbed energy in the form of photons. Different monitors have different decay rates, and, typically, CRTs that were designed for television use have pretty slow decay rates. CRTs that were designed for computer monitors typically have faster decay rates. If the decay rate were very very fast, then the hypothetical point on the screen would be dark most of the time and only occasionally very bright as the beam sweeps by on each frame.

    As you can imagine, this highly impulsive temporal profile is hard to smooth out into something closer to the constant brightness of the world around us. The human retina has an inherent dynamic response rate to it, but it's actually quite fast, and there have been studies showing clear responses in higher order visual areas of the brain up to 135 Hz. But standard phosphors used in CRTs have a somewhat smoother response, and so at more-or-less 80 Hz, the brain stops seeing the flicker (at 60 Hz most people see flicker on a computer monitor). The exact refresh rate where perceptual blurring happens (so the flickering goes away) varies widely between individuals, and with the exact details of the environment and what is being shown on the screen. More-or-less at 100 Hz refresh, no one sees the flicker anymore (although the brain can be shown to be still responding).

    Contemporary screens, however, are LCD based (I'm going to ignore plasma screens since the field is still working out how they interact with the visual system). Making the same experiment as above, the temporal profile of brightness at a given spot on the screen will look more like a staircase, holding a value until the next frame gets drawn. This is a far, far smoother stimulus for the visual system, so a 60 Hz frame rate produces a perceptually far more flicker-free experience. That's why most CRTs at 60 Hz make your eyes bleed, while LCDs at 60 Hz are just fine.

    Except that newer LCDs have LED backlighting which is no longer constant, but flashed (WHY? WHY? WHY? Just to save some power? Please, computer manufacturers, let *me* make that decision!), so the experience is somewhat more like a CRT.

    So that's one part of the equation: flicker.

    The other part of the equation is update rate, which still applies even when there is no flicker at all. Here, we have the evidence that the brain is responding at up to 135 Hz. In measurements made in my lab, I've found some responses up to 160 Hz. But the brain is super good at interpolating static images and deducing the motion. This is called "apparent motion" and is why strings of lights illuminated in sequence seem to move around a theater marquee. The brain is really good at that. Which is why even a 24 Hz movie (with 48 Hz frame doubling) in a movie theater is perceptually acceptable, but a 200 Hz movie would look much more like a window into reality. On TV you can see the difference between shows that have been shot on film (at 24 Hz) versus on video (at 30 or 60 Hz). Video seems clearer, less movie-like.

    For games, 60 Hz means 16 ms between frame updates -- and that can be a significant delay for twitch response. Further, modern LCD monitors have an inherent two or three frame processing delay, adding to the latency. As we know, long latency leads to poor gameplay. Faster updates mean potentially shorter latency, since it is a frame-by-frame issue.

    So, just as with audio equipment where inexpensive low-fidelity equipment can produce an acceptable experience, while a more expensive setup can create the illusion of being at a concert, so too inexpensive video equipment (from camera to video board to monitor) can produce an acceptable experience, while a more expensive setup can create the illusion of visual reality.
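
    The 16 ms figure above generalizes easily. A back-of-envelope Python sketch of input-to-screen delay, assuming an average wait of half a frame plus the two-to-three frame display lag the post mentions; the numbers are illustrative, not measurements:

        # Rough latency arithmetic: frame interval plus an assumed
        # two-to-three frame display processing delay.
        def input_to_photon_ms(fps, display_lag_frames=2.5):
            frame_ms = 1000.0 / fps
            # Average half a frame waiting for the next update, plus display lag.
            return 0.5 * frame_ms + display_lag_frames * frame_ms

        for fps in (30, 60, 120):
            print(f"{fps:>3} fps -> ~{input_to_photon_ms(fps):.0f} ms to the screen")
        # 30 fps -> ~100 ms, 60 fps -> ~50 ms, 120 fps -> ~25 ms
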

  • by Animaether ( 411575 ) on Wednesday January 06, 2010 @03:02PM (#30673260) Journal

    more accurately - most film cameras don't have a notion of a shutter 'speed'.

    The film roll still goes by at 24fps, but the actual shutter is a wheel. That wheel can have gaps of various sizes (to increase/decrease exposure *time*) and shapes (to produce specific motion blur effects; e.g. an object leading its own motion blur path requires a small shutter opening at first, ending in a large shutter opening). Use fairly sensitive film and a small shutter gap, and you'll get nearly motion blur-less shots like those in Saving Private Ryan (watch the explosions in that film: every speck of dirt that gets thrown about appears almost razor-sharp; some find this objectionable). Heck, you can even expose twice per frame if you want to get all experimental and stuff.

    That said, you can't - short of electronic shutters - expose for longer than one frame period. A bit under 1/24th of a second is the most you'll get (that 'bit' being the time required to transport the film to the next frame).

    Anyway.. wiki: http://en.wikipedia.org/wiki/Rotary_disc_shutter [wikipedia.org]

  • by Psyborgue ( 699890 ) on Wednesday January 06, 2010 @03:13PM (#30673392) Journal

    But the basic myth comes from the fact that film is so convincing and thus you don't "need" more... as long as each frame is a blurred representation of the full period of time it is displayed for.

    Not quite. Film cameras, because of the way they work, max out at exposing each frame for about half of the time it is displayed (a 180 degree shutter [tylerginter.com]). 24fps is usually shot at 1/48 second exposure time per frame. The full time (a 360 degree shutter) would be far too blurry.
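
    For reference, the shutter-angle arithmetic in that comment works out as follows (a trivial Python sketch; 24 fps and 180 degrees are just the example values from the post):

        # Exposure time per frame as a fraction of the frame interval,
        # set by the shutter angle.
        def exposure_time(fps, shutter_angle_deg=180.0):
            return (shutter_angle_deg / 360.0) / fps

        print(exposure_time(24))         # 1/48 s, about 0.0208
        print(exposure_time(24, 360.0))  # full-period exposure, 1/24 s
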

  • Outside Looking In (Score:5, Informative)

    by DynaSoar ( 714234 ) on Wednesday January 06, 2010 @03:14PM (#30673432) Journal

    I'm a neuroscientist who covers sensation and perception and their bidirectional interaction with cognition, particularly attention. I've got comments and questions and very few answers after reading this. I'm seeing a lot of things stated as facts that I've never heard of before. Some of them make sense, and some don't. Some of them are correct, some not, and with many more than the others combined I have no experience and can't say. Those seem to be well supported, or at least well known, particularly among those who've obviously done their homework. I can find references to these among the publications (like ACM) that are most applicable to the field in question, but I can find precious little in my customary pubs and books. That's not to say the stuff in the technically oriented pubs is wrong, just that some may not be covered much (i.e. 'not of interest') in my field. My field is very cautious about experimental evidence, but I suspect that in gaming's perception area there are common-knowledge kinds of things that came from hearsay (we have many of those in rocketry too). It might do well for both fields to compare works.

    What catches my eye at first is this "myth". As stated it's overly simplistic. Which humans' eye? Some have different reaction times. Those who could probably detect 30 fps discontinuity are those who see the TV screen jiggle and waver when they chew something crunchy while watching (you know who you are, here's a place to own up to it). What part of the visual field, central or peripheral? They operate differently. Jittering or blurring of objects attended to or not? Betcha it happens more to those not attended to, but that's not noticed for the same reason (hypnosis can bring that out right nicely). And how is it frame rates matter when the visual system evolved as a constant flow analog system? If a phenomenon that shouldn't make a difference does, and that frame rate is strictly due to technical considerations, how do we know that a variable frame rate might not give even better results? Since the visual system does not have full-field frames that refresh, why should artificial presentations? Why not present faster moving objects at a high change rate, slower moving at a slower rate, more or less a timing equivalent to some video compression techniques? Some of this makes good sense from my perspective, some appears goofy but may not be, and some clearly is whack according to well supported experimental evidence from my side, not sure about yours.

    Here's an interesting one, apparent motion from blurring, occurring at the retina, ostensibly due to 'reaction time' of light receptor cells (rods and cones). I can see how this might occur. But if it's a time lag that causes blurring, everything should be blurred, because the layers of cells of different types in the retina between the receptors and those firing down the optic nerve operate strictly by slow potentials -- there's not a 'firing' neuron among them. Or, if their processing, though slow, accounts for motion and compensates, preventing adding to the blurring, how can that be used to increase apparent motion?

    A last point, which I'm fairly certain isn't covered in gaming and graphics presentation because very few know much about it and we don't understand it well: 10% of the optic nerve is feed-forward, top-down control or tuning of the retina and its processing. Motion perception can be primed, can suffer from habituation, and has variance in efficacy according to several factors. What cognitive factors have an influence on this, and how can that be used to improve motion perception and/or produce motion perception that's as adequate as what's being used now but requires less external computational effort because internal computation is being stimulated?

    It's probable that both fields have things of interest and use to the other, including things the other isn't aware of. I've said much the same following another article on a different subject. From this one I can see it's probable there's a few peoples' careers worth o

  • by TheEvilOverlord ( 684773 ) on Wednesday January 06, 2010 @03:42PM (#30673760) Journal

    If by 'supposedly' you mean 'definitely' and if by 'most movie theaters' you mean 'all theaters and even all motion picture production processes in recent years', then yes.

    I'm sorry but that's not quite correct. I worked as a movie projectionist for several years, so I know this from experience. While 24fps works, and is what used to be used in cinemas, it is noticeably flickery. As with most advancements in cinema technology, they came up with a bit of a hack. While there are still only 24 frames of film per second, the projector shows each frame twice, giving an effective frame rate of 48fps.

  • Re:Doom 1? (Score:1, Informative)

    by Anonymous Coward on Wednesday January 06, 2010 @04:01PM (#30674008)

    Oh indeed. Motion blur existed in the Magic Carpet games which were from Doom's era, 1994-95.

    They also had SVGA, full deformable terrain, and reflective water too.

  • by LaminatorX ( 410794 ) <sabotage@praeca n t a t o r . com> on Wednesday January 06, 2010 @04:05PM (#30674066) Homepage

    At very short exposure times, the length of the blur due to motion becomes smaller than the circle of confusion of the reproduced image, eventually falling beneath even the circle of confusion of the image capture medium. Generally, though, if you increase the magnification enough, you still see blur.

    For reference, when examining negatives under a microscope Ansel Adams could no longer detect a difference between a handheld shot and a tripod shot of the same scene at exposures shorter than 1/500 of a second with a 50mm lens. The motion blur from his hands at that speed was smaller than his film and lens could resolve.

    However, with a 300mm lens, he'd have had to shoot much faster to achieve the same equivalence, due to the higher lens magnification.
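
    A rough way to model that point in Python: image-plane blur is approximately angular shake rate times focal length times exposure time, compared against a circle of confusion. The 0.03 mm CoC and the 15 deg/s hand-shake rate are assumptions for illustration, not figures from the comment or from Adams.

        import math

        # Rough model: blur on film ~= angular shake rate * focal length * exposure.
        # The 0.03 mm circle of confusion and 15 deg/s shake rate are assumptions.
        COC_MM = 0.03

        def blur_on_film_mm(shake_deg_per_s, focal_mm, exposure_s):
            return math.radians(shake_deg_per_s) * focal_mm * exposure_s

        # A 50 mm lens at 1/500 s with ~15 deg/s of shake stays under the CoC...
        print(blur_on_film_mm(15, 50, 1/500) < COC_MM)    # True
        # ...but a 300 mm lens at the same speed and shake does not.
        print(blur_on_film_mm(15, 300, 1/500) < COC_MM)   # False
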

  • by sperxios10 ( 848382 ) on Wednesday January 06, 2010 @04:21PM (#30674322) Homepage

    Speaking of physics - a game's physics engine has the properties of a Riemann sum where n = fps, so the higher your FPS, the more accurate your physics simulation, even if your monitor cannot discretely display all those frames.

    [note: only applies in games where physics ticks/sec are tied to framerate... which is almost all games]

    Actually all decent FPS engines have geometry/physics engines quite distinct from the graphics-pipeline!

    The geometry/physics engines work on body bounding-boxes and their respective velocity-vectors describing their trajectories, and they try to solve the intersection-problem among all bodies with regard to time, by responding with a timestamp - the collision-timestamp - to questions like this:

    "When is body A going to hit body B?"

    And on that collision-timestamp an event is scheduled, for the game-logic to kick in, to calculate the new body-trajectories, or deaths, new body births, shrapnel, whatever.

    The physics/geometry usually runs on the game-server *simultaneously* with the clients to avoid sending excessive info back and forth over the network. The server is only authoritative for the game-logic decisions. The client additionally runs the graphics-pipeline, which uses the next frame's timestamp to calculate the body positions in 3D space.

    But sometimes there is a slight delay between the collision-timestamp and the response from the server about what to do next (the game-logic's decision), which may allow a body to be drawn past its collision point, and this is what makes us think that FPS affects physics.

    To sum it up, fps has nothing to do with physics, even if it sometimes seems that way.
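
    A minimal Python sketch of the "when is body A going to hit body B?" query described above, for the simple case of two spheres moving at constant velocity (bounding boxes work similarly but with more cases). Everything here is illustrative, not taken from any particular engine:

        import math

        # Earliest time two constant-velocity spheres touch, by solving the
        # quadratic |dp + t*dv| = ra + rb. Positions/velocities are 3-tuples.
        def time_of_collision(pa, va, ra, pb, vb, rb):
            """Return the earliest t >= 0 at which the spheres touch, or None."""
            dp = [a - b for a, b in zip(pa, pb)]      # relative position
            dv = [a - b for a, b in zip(va, vb)]      # relative velocity
            r = ra + rb
            a = sum(v * v for v in dv)
            b = 2 * sum(p * v for p, v in zip(dp, dv))
            c = sum(p * p for p in dp) - r * r
            if a == 0:
                return 0.0 if c <= 0 else None        # no relative motion
            disc = b * b - 4 * a * c
            if disc < 0:
                return None                           # they never meet
            t = (-b - math.sqrt(disc)) / (2 * a)
            return t if t >= 0 else (0.0 if c <= 0 else None)

        # Two spheres of radius 0.5 closing head-on at 2 units/s from 10 units apart:
        print(time_of_collision((0, 0, 0), (1, 0, 0), 0.5,
                                (10, 0, 0), (-1, 0, 0), 0.5))  # 4.5

    Note that nothing in this query depends on the rendering frame rate, which is the comment's point.
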

  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Wednesday January 06, 2010 @04:21PM (#30674330) Homepage Journal

    Smaller lenses make for higher magnification.

    For example, the macro mode on my 62mm lens is much less powerful than the macro on my 50mm lens. The 50mm lens also has a longer zoom range.

  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Wednesday January 06, 2010 @04:39PM (#30674570) Homepage Journal

    The human eye can detect FAR MORE than 30FPS.

    And here's a simple way to prove it - find yourself some 60Hz fluorescent lighting. Look up into the light and wave your hand in front of it. Note the strobe effects, and if you're good enough you can count the distinct hand images and do some math to figure out your eyes' average response time/FPS. Do the same thing in front of an incandescent light bulb and notice you don't get the same effect.

    The average calculated human response is approximately 72 FPS.

    You also 'predict the future' as it takes about 1/10 of a second for the signal from your eyes to be processed by the brain. When you play baseball and make a swing, your brain is automatically doing lots of lag compensation so you can actually hit such a fast moving object.

  • by vikstar ( 615372 ) on Wednesday January 06, 2010 @05:19PM (#30675048) Journal

    You didn't even read your own link. So for the benefit of people who may stumble upon your misinformed post let me say that the wagon wheel effect is visible with the naked eye under continuous illumination, which happens to be mentioned in your own link.

  • by Dan East ( 318230 ) on Wednesday January 06, 2010 @05:21PM (#30675062) Journal

    Most physics engines simulate best when the timestep is the same every update - larger timesteps result in less accuracy of the simulation, to name just one issue. Rendering time varies every frame depending on the number of polys rendered, etc. So it is standard practice to decouple the physics engine from rendering, which allows the physics engine to run at whatever fixed timestep is desired. Multiple physics updates can occur for a single rendered frame and vice versa. Interpolation of position is used so objects still appear to move smoothly even though the rendering update is seldom, if ever, exactly in sync with a physics update.

    So while the parent's post is right in theory, in practice rendering and physics update rates typically have nothing to do with one another.

    More info here on implementation details:
    http://gafferongames.com/game-physics/fix-your-timestep/ [gafferongames.com]
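
    For the curious, here is a minimal Python sketch of the decoupled loop that kind of article describes: a fixed physics timestep, an accumulator, and interpolation between the last two physics states at render time. update(), render(), and interpolate() are hypothetical callables, and the 120 Hz step is arbitrary.

        import time

        DT = 1.0 / 120.0            # fixed physics step (120 Hz here, arbitrary)

        def run(initial_state, update, render, interpolate):
            """Fixed-timestep physics with render-time interpolation (sketch)."""
            prev_state = state = initial_state
            accumulator = 0.0
            last = time.perf_counter()
            while True:
                now = time.perf_counter()
                accumulator += min(now - last, 0.25)   # clamp to avoid a death spiral
                last = now
                while accumulator >= DT:               # zero or more physics steps...
                    prev_state, state = state, update(state, DT)
                    accumulator -= DT
                alpha = accumulator / DT               # ...then draw between the last two
                render(interpolate(prev_state, state, alpha))

    The interpolation step is what keeps motion looking smooth even when render frames and physics steps never line up exactly, as the post explains.
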

  • by smellsofbikes ( 890263 ) on Wednesday January 06, 2010 @06:12PM (#30675784) Journal
    For the record (as an ex-LED-backlight hardware designer) the LEDs are waaay too bright to run full-out, both visually and from a power usage and heat generation standpoint, and the only good way to dim an LED is by cycling it on and off rapidly to approximate the desired brightness. The reason I say 'the only good way' is because LEDs are constant-current devices and all the drivers I'm familiar with are designed around that, so you can't just go varying the voltage to try and dim them: the drivers aren't really voltage devices.

    With THAT said, I have absolutely zero idea why any sane LED driver dimmer would be anywhere near frequencies that any human could see. LEDs can turn on and off in nanoseconds, so a reasonable dim signal should be in the kilohertz range, at least, not the 100 Hz range. It's *possible* to put a 100 Hz dim signal on an LED driver, but it seems really dumb to me.
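
    The duty-cycle idea reduces to simple arithmetic; a small Python sketch, with illustrative PWM frequencies (the 100 Hz and 20 kHz values are examples, not figures from the post):

        # Brightness is set by the fraction of each PWM period the LED is on.
        def pwm_on_off_times(brightness, pwm_hz):
            """Return (on_seconds, off_seconds) per cycle for 0.0..1.0 brightness."""
            period = 1.0 / pwm_hz
            return brightness * period, (1.0 - brightness) * period

        print(pwm_on_off_times(0.3, 100))     # ~3 ms on, 7 ms off: slow enough to notice
        print(pwm_on_off_times(0.3, 20000))   # 15 us on, 35 us off: far above visible rates
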

  • by Zeussy ( 868062 ) on Wednesday January 06, 2010 @06:38PM (#30676104) Homepage
    New Scientist ran a really good article on this, a couple of months ago. It can be found here: http://www.newscientist.com/article/mg20427311.300-timewarp-how-your-brain-creates-the-fourth-dimension.html [newscientist.com]
    A 2006 experiment put the rate of human vision at about 13 fps. People can see the wagon wheel effect in real life, without the aid of strobing lights, television, etc. After I read this article I did manage to observe this effect outside in sunlight, while travelling parallel to a car moving at about 50km/h. Very surreal.
  • by McGiraf ( 196030 ) on Wednesday January 06, 2010 @07:52PM (#30676918)

    "Movies (at 24 fps) look like they stutter to me, especially any sweeping pan motion with the camera."

    Yes, I'm not the only one! I find this very annoying.

  • by twidarkling ( 1537077 ) on Wednesday January 06, 2010 @11:14PM (#30678514)

    Having actually run projectors, both ones 2 decades old, and 5 years old, I have to say, you're full of shit. Film projectors in movie theatres do NOT show each frame twice. How do I know this? I've hand-cranked them to ensure they were threaded correctly. Frame is shown while shutter is open, frame moves while shutter closes. This allows it to not be a smear across the screen while the film moves. Showing the same frame twice in a row wouldn't do shit but *decrease* the frame rate, since you'd be showing 12 frames in the space of what should have been 24.

    Now, if you mean "each frame is duplicated on a reel, making it twice as long as it would have been had each frame only been present once," again, my time splicing reels together to run on those same projectors proves you an idiot, not to mention naive at best. Film is expensive to make, and costly to transport (and needs to be transported securely). You really think they'd have designed a system that takes up twice as many resources as they could otherwise get away with?

  • Correction (Score:2, Informative)

    by Estanislao Martínez ( 203477 ) on Thursday January 07, 2010 @01:15AM (#30679260) Homepage

    I suspect that by the criteria you're using, most stills cameras don't have "shutter speeds" either.

    Um, I'm certainly wrong about the "most" part there. Most stills cameras don't have focal plane shutters. Most interchangeable lens still cameras do, though.

  • by Dutch Gun ( 899105 ) on Thursday January 07, 2010 @04:00PM (#30686698)

    Well, I was a projectionist at a 5-plex (about 20 years ago). No, the frames were not duplicated. As you pointed out, that would be ridiculous. The films are already huge and cumbersome to transport and maintain. And yes, I'm well familiar with the star-cam and shutter mechanism in projectors. On the projectors I ran, the shutter opened twice on the same frame for each full revolution of the cam.

    Here's a link to a patent that describes a particular star-cam mechanism [freepatentsonline.com].

    A quote of interest from that patent (emphasis mine):

    The reason that the shutter must close during pull down is that the projected movie image would be degraded if the moving film were projected onto the screen. Therefore, the projected movie image necessarily "flickers" as the shutter opens and closes. It has been found that a flicker rate of 24 Hz produces a noticeable flicker and is objectionable to the audience. This problem is much less noticeable at a flicker rate of 48 Hz. For this reason, it is common to use a shutter which closes again while the film frame is motionless in the projection gate. From the standpoint of flicker, this results in a good quality movie projection.

    Another important aspect of movie projection quality is screen brightness. While closing the shutter twice per frame is good from the standpoint of flicker, it is bad from the standpoint of screen brightness. To achieve high screen brightness while still having a shutter rate of 48 Hz, the duration of the time the shutter is closed in comparison to the time that it is open should be as short as possible. But the length of time the shutter is closed is determined by the time required for film pull down. So screen brightness can be improved by reducing the film pull down time.

    I know it's fun to jump on someone you think is wrong, but at the very least, please make sure you're actually correct before you do so.
