
Framerates Matter

An anonymous reader writes "As more and more games move away from 60fps, the myth that the human eye can only detect 30fps keeps popping up. What's more, most people don't seem to realize the numerous advantages of a high framerate, and there are plenty of them."
  • by sopssa ( 1498795 ) * <sopssa@email.com> on Wednesday January 06, 2010 @01:26PM (#30671832) Journal

    The article mentions motion blurring, and links to NVidia's page about its technology [nvidia.com]. The last figure [nvidia.com] shows a terrain with a full-screen motion blur effect, which in my opinion is pretty important in games for creating that feeling of speed. People usually object to this and to bloom effects and just want a sharp picture, and maybe some games have taken these effects too far. It's important nonetheless, even if the picture isn't perfectly sharp, because what your eye sees isn't perfectly sharp either; you experience the same blur in real life.

  • by spun ( 1352 ) <loverevolutionary&yahoo,com> on Wednesday January 06, 2010 @01:35PM (#30671970) Journal

    Just a guess, but perhaps because the frame rate isn't high enough for your eye to generate the blur? That is to say, if the scene were real, the frame rate would be well-nigh infinite, and your eye, capable of only a certain frame rate, would blur together all the frames. With discrete frames, you need to put in the blur the eye would generate from the frames in-between.

    Or something like that.
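
    A toy sketch of that idea in C (a made-up 1D "screen", not any engine's actual code): render several subframe samples for each displayed frame and average them, approximating the blur the eye would have built up from the in-between frames.

      #include <stdio.h>

      #define WIDTH     64
      #define SUBFRAMES 8                    /* samples averaged into one frame */

      /* Mark where a dot that crosses the screen in 0.1 s is at time t. */
      static void render_subframe(double t, double frame[WIDTH]) {
          int x = (int)(t / 0.1 * WIDTH) % WIDTH;
          frame[x] += 1.0;
      }

      int main(void) {
          const int fps = 30;
          for (int f = 0; f < 3; f++) {              /* three displayed frames */
              double accum[WIDTH] = {0};
              for (int s = 0; s < SUBFRAMES; s++) {  /* sample within the frame */
                  double t = (f + (double)s / SUBFRAMES) / fps;
                  render_subframe(t, accum);
              }
              for (int x = 0; x < WIDTH; x++)        /* '#' marks the smeared dot */
                  putchar(accum[x] > 0.0 ? '#' : '.');
              putchar('\n');
          }
          return 0;
      }

    Each printed frame shows a streak rather than a single dot, which is exactly the blur a film camera's open shutter records for free.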

  • Doom 1? (Score:3, Interesting)

    by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Wednesday January 06, 2010 @01:37PM (#30672014) Homepage Journal
    By "Doom" do you mean Doom (1993) or Doom 3? If the former, I never saw this effect while playing the game on MS-DOS (vanilla version), Mac (Ultimate Doom), or GBA.
  • by Anonymous Coward on Wednesday January 06, 2010 @01:38PM (#30672026)

    Avatar had a lot of flickering because of the frame rate. The flicker gets more obvious with 3D and IMAX. Apparently there is talk of going to 60 frames per second for projected movies, but I wouldn't hold my breath, since theaters are already squealing about switching to digital projection and 3D. The technology is becoming available, but I'll be surprised if they try to deploy it before the 2020s. Too bad, because it would make a massive difference for action films, especially in 3D. With talking-head pictures you'd never notice the difference.

  • by Monkeedude1212 ( 1560403 ) on Wednesday January 06, 2010 @01:41PM (#30672070) Journal

    You can tell the difference between 30 FPS and 60 FPS.

    The way I tested this: I made a two-second video in Flash, a circle moving from the left side of the screen to the right. 60 frames, run at 30 FPS.

    Then I made a second two-second video with the exact same positions: 120 frames, run at 60 FPS. Then I asked myself and all of my surrounding classmates (about 24 students, IIRC) which looked smoother.

    100% of us noticed a visible difference in the smoothness. Whether our eyes were making out each individual frame perfectly or blurring some together to create a smoother effect was irrelevant, since there WAS a noticeable difference. I was going to slowly bump the 30 and 60 FPS up higher and higher to see at what point the difference stops being distinguishable, but I got lazy (high school student at the time).

    The point I think most gamers would agree on is that more frames per second are nice - but 30 frames per second are necessary. You can occasionally dip down to 24 and be alright (24 is supposedly the speed most movie theatres play at) - but when you get down around 20 or so it really does take away from the experience.
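
    For anyone who wants to repeat that experiment today, here is a rough sketch in C. It assumes SDL2 (any similar library would do); a square sweeps across the window in two seconds at whatever frame rate you pass on the command line, so you can compare runs at 30 and 60 side by side.

      /* Build (assuming SDL2 is installed): gcc fpstest.c -o fpstest -lSDL2 */
      #include <SDL2/SDL.h>
      #include <stdlib.h>

      int main(int argc, char **argv) {
          int fps = (argc > 1) ? atoi(argv[1]) : 30;   /* e.g. ./fpstest 60 */
          SDL_Init(SDL_INIT_VIDEO);
          SDL_Window *win = SDL_CreateWindow("fps test", SDL_WINDOWPOS_CENTERED,
                                             SDL_WINDOWPOS_CENTERED, 640, 240, 0);
          SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

          int frames = 2 * fps;                        /* two-second sweep */
          for (int f = 0; f < frames; f++) {
              SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
              SDL_RenderClear(ren);
              SDL_Rect r = { f * (640 - 40) / (frames - 1), 100, 40, 40 };
              SDL_SetRenderDrawColor(ren, 255, 255, 255, 255);
              SDL_RenderFillRect(ren, &r);
              SDL_RenderPresent(ren);
              SDL_Delay(1000 / fps);                   /* crude frame pacing */
          }
          SDL_Quit();
          return 0;
      }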

  • by maxume ( 22995 ) on Wednesday January 06, 2010 @01:44PM (#30672122)

    That isn't always the case. I recall a game in the past where gravity had less effect on players who had faster hardware. Or something like that. Anyway, the logic was mixed in with the rendering, so frame rate had an impact on what the player could do.
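
    The usual cause of that class of bug is applying a per-frame constant where a per-second quantity scaled by the timestep belongs. A hypothetical sketch (made-up numbers, not the actual game):

      /* Gravity applied per FRAME instead of per second: jump physics
       * now change with the frame rate. */
      #include <stdio.h>

      static float jump_peak(int fps) {
          float y = 0.0f, vy = 5.0f;       /* takeoff velocity, m/s */
          float dt = 1.0f / fps, peak = 0.0f;
          while (vy > 0.0f || y > 0.0f) {
              vy -= 0.2f;                  /* BUG: 0.2 m/s per frame, not per second */
              y  += vy * dt;
              if (y > peak) peak = y;
              if (y < 0.0f) break;
          }
          return peak;
      }

      int main(void) {
          printf("peak at 30 fps: %.2f m\n", jump_peak(30));   /* ~2.0 m */
          printf("peak at 60 fps: %.2f m\n", jump_peak(60));   /* ~1.0 m */
          return 0;
      }

    Double the frame rate and the jump peaks at half the height; scale the decrement by dt instead and the results agree again.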

  • by Speare ( 84249 ) on Wednesday January 06, 2010 @01:52PM (#30672230) Homepage Journal
    In many embedded apps, like coin-op arcade games, the "model" is indeed tied to the frame rate. The main loop assumes a fixed dt, and pipelines the input, update, render tasks. Often this is done without threading, just while (!dead) { do_input(); do_update(); do_render(); } in the main function. Even with threads or co-processors, they often tie the rates 1:1:1. Some have no room for adjustment, and some will at least update their dt if the render took too long.
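
    For contrast, here's a minimal sketch of the usual way to decouple the two: accumulate real elapsed time, step the model at a fixed dt, and render with whatever time is left. The do_* hooks are empty placeholders standing in for a real game, and the 120 Hz step is an arbitrary choice.

      #include <stdbool.h>
      #include <stdio.h>
      #include <time.h>

      static bool dead = false;
      static long steps = 0;

      static void do_input(void)       { /* poll devices */ }
      static void do_update(double dt) { (void)dt; if (++steps >= 1200) dead = true; }
      static void do_render(void)      { /* draw current state */ }

      static double now_seconds(void) {
          struct timespec ts;
          clock_gettime(CLOCK_MONOTONIC, &ts);
          return ts.tv_sec + ts.tv_nsec / 1e9;
      }

      int main(void) {
          const double DT = 1.0 / 120.0;     /* fixed simulation step: 120 Hz */
          double prev = now_seconds(), acc = 0.0;
          while (!dead) {                    /* stub ends after ~10 s of sim time */
              double now = now_seconds();
              acc += now - prev;             /* real time since last iteration */
              prev = now;
              do_input();
              while (acc >= DT) {            /* catch the model up in fixed steps */
                  do_update(DT);
                  acc -= DT;
              }
              do_render();                   /* render rate no longer affects the model */
          }
          printf("simulated %ld fixed steps\n", steps);
          return 0;
      }
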
  • by LordKazan ( 558383 ) on Wednesday January 06, 2010 @01:55PM (#30672278) Homepage Journal

    Why can't the game/computer/monitor produce fast enough frame rates that it's my eyes creating the blur, not the post-rendering effects?

    Physics: monitors cannot change fast enough, or in the right way, to do this. They simply don't work that way.

    Speaking of physics: a game's physics engine behaves like a Riemann sum where n = fps, so the higher your FPS, the more accurate your physics simulation, even if your monitor cannot discretely display all those frames.

    [note: this only applies in games where physics ticks per second are tied to the framerate... which is almost all games]
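
    That Riemann-sum behavior is easy to demonstrate: integrate two seconds of free fall with explicit Euler, using a step size tied to the frame rate, and compare against the closed-form answer (a toy calculation, not any particular engine):

      #include <stdio.h>

      /* Explicit Euler free fall with dt = 1/fps; exact answer is 0.5*g*t^2. */
      static double fall_distance(int fps, double seconds) {
          double g = 9.81, y = 0.0, v = 0.0, dt = 1.0 / fps;
          for (int i = 0; i < (int)(seconds * fps); i++) {
              y += v * dt;                 /* position first: classic explicit Euler */
              v += g * dt;
          }
          return y;
      }

      int main(void) {
          printf("exact  : %.4f m\n", 0.5 * 9.81 * 2.0 * 2.0);
          printf("30 fps : %.4f m\n", fall_distance(30, 2.0));
          printf("60 fps : %.4f m\n", fall_distance(60, 2.0));
          printf("120 fps: %.4f m\n", fall_distance(120, 2.0));
          return 0;
      }

    The error shrinks roughly in proportion to the frame time, which is exactly the n in the Riemann sum growing.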

  • by Anonymous Coward on Wednesday January 06, 2010 @02:03PM (#30672406)

    The human eye does not work with frames. It is a continuous, dynamic system.

    The only way to achieve the blurring effect without artificially rendering it would be to make the object move on the screen at the same velocity it would in real life. Needless to say, making a car move at 300 km/h in the virtual world and showing it at that real speed on the screen would simply make the game unplayable, even setting aside the practical difficulty of driving at that speed, because today's screens simply don't have the necessary response times.

  • by sopssa ( 1498795 ) * <sopssa@email.com> on Wednesday January 06, 2010 @02:04PM (#30672420) Journal

    I don't think we will get to the point where the framerate is fast enough. 3D monitors only generate up to 120fps, and there are still plenty of hardware limits on reaching framerates above that with current games at good resolutions. And there is no framerate in the real world; you're taking in images in real time. Some argue (as in the battle between 30fps and 60fps) that the human eye can't process more than a certain number of "frames" per second. The natural motion-blurring effect, and its absence in video games, shows perfectly well that it can. While you see smooth movement, you're still missing extra things like the blur the brain would generate.

  • by jeffmeden ( 135043 ) on Wednesday January 06, 2010 @02:04PM (#30672434) Homepage Journal

    You can occasionally dip down to 24 and be alright (24 is supposedly the speed most movie theatres play at) - but when you get down around 20 or so it really does take away from the experience.

    If by 'supposedly' you mean 'definitely', and if by 'most movie theaters' you mean 'all theaters, and even all motion picture production processes in recent years', then yes. The difference is lost on most people, but the reason 24fps is acceptable in movies is that the frame you see isn't what happened at a single instant; it's everything that happened over the last 1/24th of a second, since it's recorded on film that was exposed for that 24th of a second. When a computer renders a frame, it only cares about what is happening at one exact instant, so the difference between a series of exact frames of motion and a series of frames that include the blur of what happens between frames is HUGE.

    However, this nuance is lost on pretty much everyone who fires up a computer game, notes the FPS indicator, and goes "OMG I CAN TOTALLY TELL ITS ONLY 30FPSZZZZ!!!! HOW INFERIOR!!!". Whine about framerates all you want, but they are only a small part of the experience.
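
    A quick back-of-the-envelope on why those sharp, instantaneous frames read as stutter: without blur to bridge them, a moving object jumps a visible distance between consecutive frames (the speed below is made up for illustration):

      #include <stdio.h>

      int main(void) {
          double speed = 1920.0;               /* object speed in pixels/second */
          int rates[] = { 24, 30, 60, 120 };
          for (int i = 0; i < 4; i++)          /* gap between consecutive frames */
              printf("%3d fps: %5.1f px jump per frame\n", rates[i], speed / rates[i]);
          return 0;
      }

    At 24 fps an object crossing the screen in one second jumps 80 pixels per frame; film hides that jump inside the exposure's blur, while a sharp render leaves it visible.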

  • 30 Fps myth (Score:2, Interesting)

    by ggendel ( 1061214 ) on Wednesday January 06, 2010 @02:07PM (#30672462)

    There were a lot of studies done a long time ago, and there are some very accurate psycho-visual computer models of the human visual system. I had the pleasure of working with the Jeff Lubin model, which won an Emmy Award back in 2000, when I worked at Sarnoff Corp.

    The 30 fps requirement is not a fixed point; it depends on a lot of other factors, including viewing distance, field of view, and lighting conditions. The reason film operates at 24 fps is that it is expected to be viewed in a darkened room. When film is transcoded for TV, the gamma has to be modified for a normally lit viewing area or it will look bad. NTSC TVs are interlaced, displaying 60 fields per second, even though the frame rate is 30 frames per second.

    Bottom line: the article should state the environmental factors under which its point was made.

  • by DeskLazer ( 699263 ) on Wednesday January 06, 2010 @02:08PM (#30672478) Homepage
    15 FPS vs 30 FPS vs 60 FPS [boallen.com] is a visual comparison. There are points made, however, that when you watch a movie, the image is "softened" and runs at a lower framerate [something like 24 or 25 FPS?] because your brain helps "fill in the gaps", or something of that sort. Pretty interesting stuff.
  • by DavidTC ( 10147 ) <slas45dxsvadiv.v ... m ['box' in gap]> on Wednesday January 06, 2010 @02:16PM (#30672588) Homepage

    More to the point, the eye does not work with frames. The eye itself has no framerate.

    Rods and cones individually update about 15 times a second, but each individual one is entirely asynchronous from all the others. One updates, another updates, another updates, and so on. Your entire eye is not read 15 times a second; rather, each individual light sensor 'trips' 15 times a second, semi-randomly, and sends the current light level. (1)

    While each rod and cone sends only one signal, and then nothing until it resets and sends another, our brain seems to assume that the light and color levels have remained the same in between.

    Hence we get a 'blur' as objects move: our brain assumes the object is still in the old position until all the rods and cones have updated.

    1) And even that's not entirely right. Each rod and cone actually sends a sort of average of the light it received since its last update. You don't have to receive a photon at the exact moment it updates.
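
    Here's a toy simulation of that model, using only the comment's own numbers (an illustration of the argument, not actual physiology): a row of sensors, each reporting about 15 times a second at a random phase, sends the average light it saw since its last report while a dot sweeps past. What the "brain" is left holding is a smear rather than a single dot.

      #include <stdio.h>
      #include <stdlib.h>

      #define N    64
      #define RATE 15.0                        /* reports per second, per sensor */

      int main(void) {
          double next[N], sum[N] = {0}, held[N] = {0};
          int count[N] = {0};
          srand(1);
          for (int i = 0; i < N; i++)          /* random phase for each sensor */
              next[i] = (rand() / (double)RAND_MAX) / RATE;

          for (int step = 0; step < 100; step++) {   /* 0.1 s simulated at 1 kHz */
              double t = step / 1000.0;
              int dot = (int)(t / 0.1 * N);          /* dot sweeps positions 0..63 */
              for (int i = 0; i < N; i++) {
                  sum[i] += (i == dot);              /* light falling on sensor i */
                  count[i]++;
                  if (t >= next[i]) {                /* sensor i reports now */
                      held[i] = sum[i] / count[i];   /* brain keeps this stale average */
                      sum[i] = 0; count[i] = 0;
                      next[i] += 1.0 / RATE;
                  }
              }
          }
          for (int i = 0; i < N; i++)          /* the smear the brain is left with */
              putchar(held[i] > 0.0 ? '#' : '.');
          putchar('\n');
          return 0;
      }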

  • by Artifex33 ( 932236 ) on Wednesday January 06, 2010 @02:36PM (#30672874)

    The real problem with low framerates is controller lag. I had a copy of Unreal Tournament 3 for my PS3, which had the amazing distinction of letting you use a compatible keyboard and mouse combo instead of the regular Sixaxis controller. As a die-hard FPS gamer who had been resisting an expensive PC upgrade, I welcomed this.

    Unreal Tournament 3 for the PS3 is pegged at 30 FPS. The result with a kb+mouse was horrible controller lag. It was as if the view angle attached to the mouse was on a rubber band that would stretch during a quick mouse move and then snap back into position.

    When I tried the Sixaxis, the controller lag wasn't noticeable at all. My best guess is that this is because the joystick-controlled view has finite acceleration, rather than because of any hardware lag; the keyboard, mouse, and Sixaxis were all connected over Bluetooth, and using the same mouse on a PC playing Quakelive showed no signs of lag. The Sixaxis just isn't capable of the whiplash movements a mouse is, so it couldn't expose the same responsiveness issue.

    The kb+mouse combo was still an advantage, but for a PC gamer, adjusting to the laggy feel was crippling.

    I'll have to try out some of the PC games that end up in the sub-30 FPS range to see if I can reproduce the same feel.
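
    For the lag side of this, a back-of-the-envelope (the three-stage pipeline depth is an assumption for illustration, not a measurement of UT3): if input sampling, simulation/render, and display scanout each cost one frame time, input-to-photon latency scales directly with the frame time.

      #include <stdio.h>

      int main(void) {
          int stages = 3;                      /* assumed pipeline depth, in frames */
          int rates[] = { 30, 60, 120 };
          for (int i = 0; i < 3; i++)
              printf("%3d fps: ~%.0f ms input-to-photon lag\n",
                     rates[i], stages * 1000.0 / rates[i]);
          return 0;
      }

    The same mouse flick that lags ~100 ms behind your hand at 30 fps lags ~50 ms at 60, which matches the rubber-band feel described above.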

  • Showscan (Score:2, Interesting)

    by davidjohnburrowes ( 884536 ) on Wednesday January 06, 2010 @02:44PM (#30673000)
    FWIW, the reports I read from folks who watched Showscan movies ( http://en.wikipedia.org/wiki/Showscan [wikipedia.org] ) 20+ years ago overwhelmingly said that the higher framerate gave the films a level of realism they'd never seen in films before.
  • 24 fps (Score:3, Interesting)

    by Sir Holo ( 531007 ) on Wednesday January 06, 2010 @03:55PM (#30673922)
    Movies are 24 fps because film is expensive.
  • by Anonymous Coward on Thursday January 07, 2010 @02:32AM (#30679612)

    Tell that to the auto manufacturers. Some crappy cars out there have their tail lights set to blink in the tens of hertz. It looks like there are a dozen cars every time you scan your eyes around, which is no problem for the Chatty Cathies who aren't looking at the road in the first place. For anyone who does actually put effort into driving, though, it's a dangerous mirage that should never have been made street legal.
