Framerates Matter
An anonymous reader writes "As more and more games move away from 60fps, the myth of the human eye only being able to detect 30fps keeps popping up. What's more, most people don't seem to realize the numerous advantages of a high framerate, and there's plenty of those."
Motion blur and bloom effects (Score:4, Interesting)
The article mentions motion blurring and links to NVidia's page about its technology [nvidia.com]. The last figure [nvidia.com] shows a terrain with a full-screen motion blur effect, which in my opinion is pretty important in games for creating that feeling of speed. People usually object to this and to bloom effects and just want a sharp picture, and maybe some games have taken it too far. It's important nonetheless, because what your eye sees isn't perfectly sharp either; you experience the same blur in real life.
Re: (Score:2)
Doom 1? (Score:3, Interesting)
Re:Doom 1? (Score:5, Insightful)
I've worked with the Doom source code recently, and can confirm that there was no motion blur at all. In fact, blur of any kind couldn't really be implemented, because Doom's graphics were 8-bit indexed colour.
Also, there were no engine changes at all between Doom 1 and 2.
Perhaps the GP is referring to the bobbing effect that occurs when the Doom guy runs. That moves the camera nearer to and farther from the ground, changing the appearance of the texture.
Artificial blurring is overrated (Score:4, Insightful)
Why? Because people aren't staring at the same spot on the screen all the time. And nowadays screens are getting bigger.
Say in real life, you're in a room where there are two moving objects that are moving around at fast but eye-trackable speeds in different directions.
If you are staring at something else, both objects are blurry.
But if you start to look at one, that particular object becomes _sharp_, the other object becomes blurry.
You look at the other, it becomes sharp and the other becomes blurry.
When a game or movie blurs moving objects, it makes the thing you are looking at appear out of focus even when it is moving at a speed your eye could track. You can't focus on it even though in real life you could!
With motion blur, I often experience eye strain when I try to track moving objects/backgrounds that have been blurred.
Then there are the artificial "out of focus" shots in static scenes. These effects should also be restricted to scenes where it is important to the story that only a few items are in focus.
In Avatar (2D), my eyes were often trying to focus on blurry images and it wasn't pleasant - initially I was wondering what was wrong with my eyes - felt like I had difficulty focusing on stuff.
When I watched it in 3D, I realized that a lot of stuff was actually blurry and it wasn't my eyes. In some fairly static scenes the focal range was low - only a few objects were in focus. Then in some scenes the moving objects were blurry. Whereas in other scenes most stuff was in focus. In Avatar 3D it was easier to figure out where I "should" be looking and avoid the eyestrain bits
If you ask me, I prefer as much of each frame to be sharp and in focus as possible, and then let the limitations of my eyes blur it.
Artificial blurring (motion or defocus) is like listening to artificially degraded music/audio. While there are some cases that call for it (distance effect) it's just silly if you use it a lot.
Re: (Score:2)
Doom 2 was the same engine, just with new levels. If the engine was changed at all, I doubt it was to put in a poor man's motion blur.
Re: (Score:2)
Perhaps you're thinking of mipmapping [wikipedia.org], which was implemented at least as early as Quake 1.
Re: (Score:3, Insightful)
It's important nonetheless, because what your eye sees isn't perfectly sharp either; you experience the same blur in real life.
If my eye creates the blur, why do I need artificial motion blur?
Re: (Score:2)
Because you're still looking at a single object, your monitor, and the picture and movement in it are artificially created. If you look at real objects moving, or move or shake your head, you'll notice there's a huge motion blur effect. If you do it in a game that has no motion blur effect, you notice how the view instantly jumps to wherever you want to look.
Re: (Score:3, Informative)
That just means we should strive for a higher framerate until our eyes blur things on their own. Reality is not inherently blurry (unless you need glasses...); our eyes and brain do that internally.
Making movement in a game inherently blurry when your head is already going to blur it for you internally is just a shortcut to motion sickness for a whole lot of people.
Re: (Score:2)
What are you talking about?
Ever waved your hand back and forth so fast that it creates a blur? Ever seen those little things that sweep back and forth fast enough to display an image?
Reality is indeed inherently blurry. It's just hard to accurately portray blur when you're staring at something that's not moving.
Re:Motion blur and bloom effects (Score:5, Informative)
But if you follow the hand with your eyes, your hand will appear sharp. You'd be surprised how quickly and how stably the eyes can track moving objects.
The BBC has been experimenting with fast frame rate TV, running at 300 frames per second. Moving objects appear much sharper in such a broadcast compared to the standard 50 frames per second (not fields). They showed a side-by-side example, both in 1080 progressive scan. Great for sports broadcasting.
Silicon Graphics (when they were still called that) also did tests with fighter pilots when designing flight simulators. Motion sickness is a problem with those simulators, compared to an actual jet plane. When they achieved a constant frame rate above 80 frames per second (160 frames per second when doing stereo imaging), the motion sickness was greatly reduced. They solved the processing power problem by being able to reduce the rendering resolution on each frame.
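The resolution trick described above amounts to dynamic resolution scaling. A minimal sketch of that idea in Python, with made-up thresholds and step sizes rather than anything SGI actually shipped:

TARGET_FRAME_TIME = 1.0 / 80.0   # the >80 fps threshold mentioned above

def adjust_resolution(scale, last_frame_time,
                      min_scale=0.5, max_scale=1.0, step=0.05):
    """Return a new render-resolution scale factor based on the last frame's cost."""
    if last_frame_time > TARGET_FRAME_TIME:          # too slow: render fewer pixels
        scale = max(min_scale, scale - step)
    elif last_frame_time < 0.8 * TARGET_FRAME_TIME:  # plenty of headroom: sharpen back up
        scale = min(max_scale, scale + step)
    return scale

# Example: a 15 ms frame misses the 12.5 ms budget, so the scale drops a notch.
print(adjust_resolution(1.0, 0.015))   # -> 0.95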
Re:Motion blur and bloom effects (Score:5, Insightful)
Because the framerate is high.
There, I've taken it full circle.
Re:Motion blur and bloom effects (Score:4, Informative)
At very short exposure times, the length of the blur due to motion becomes smaller than the circle of confusion of the reproduced image, eventually falling beneath even the circle of confusion of the image capture medium. Generally, though, if you increase the magnification enough, you still see blur.
For reference, when examining negatives under a microscope Ansel Adams could no longer detect a difference between a handheld shot and a tripod shot of the same scene at exposures shorter than 1/500 of a second with a 50mm lens. The motion blur from his hands at that speed was smaller than his film and lens could resolve.
However, with a 300mm lens, he'd have had to shoot much faster to achieve the same equivalence, due to the higher lens magnification.
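The focal-length point can be made with a rough back-of-the-envelope calculation: image-plane blur is roughly angular shake rate times focal length times exposure time, and it only matters once it exceeds what the film and lens can resolve. A small sketch in Python, with an assumed shake rate and resolution limit (illustrative values, not Adams's actual figures):

import math

# Image-plane blur from hand shake ~ (angular shake rate) x (focal length) x (exposure time).
# The shake rate and resolution limit are assumed, illustrative values.
SHAKE_DEG_PER_S = 3.0          # assumed hand tremor while holding "steady"
RESOLUTION_LIMIT_MM = 0.025    # assumed film + lens resolving limit (~25 microns)

def blur_mm(focal_length_mm, exposure_s, shake_deg_s=SHAKE_DEG_PER_S):
    return math.radians(shake_deg_s) * focal_length_mm * exposure_s

print(blur_mm(50, 1 / 500))    # ~0.005 mm: below the assumed limit, indistinguishable from a tripod
print(blur_mm(300, 1 / 500))   # ~0.031 mm: six times larger, now above the limit, so shoot faster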
Re: (Score:3, Informative)
Smaller lenses make for higher magnification.
For example, the macro mode on my 62mm lens is much less powerful than the macro on my 50mm lens. The 50mm lens also has a longer zoom range.
Re: (Score:3, Interesting)
I don't think we will get to a point where the framerate would be fast enough. The 3D monitors only generate up to 120fps too, and there are still plenty of hardware limits on generating framerates above that with current games at high resolutions. And there is no framerate in the real world; you're taking in images in real time. Some argue (like in the battle between 30fps and 60fps) that the human eye can't process more than a certain number of "frames" per second. The natural motion blurring effect and its absence with video g
Re: (Score:3, Insightful)
I agree with you...
Some argue (like the battle between 30fps vs 60fps) that human eye can't process more than certain amount of "frames" per second.
Isn't the reason movies use 24 fps (and similarly TV uses ~30fps) because of historical technical limitations? That is right around the minimum rate where your eyes and brain can smooth out the annoying flicker. 30fps isn't the upper limit that the eye can process, but rather a lower limit that makes the image sequence appear as motion without causing stutter, headaches, or otherwise detracting from the visual experience. It's a compromise to allow movies to fit on reasonably sized rolls of
Re: (Score:3, Informative)
"Movies (at 24 fps) look like they stutter to me, especially any sweeping pan motion with the camera."
Yes, I'm not the only one! I find this very annoying.
Re: (Score:3, Informative)
Having actually run projectors, both ones 2 decades old, and 5 years old, I have to say, you're full of shit. Film projectors in movie theatres do NOT show each frame twice. How do I know this? I've hand-cranked them to ensure they were threaded correctly. Frame is shown while shutter is open, frame moves while shutter closes. This allows it to not be a smear across the screen while the film moves. Showing the same frame twice in a row wouldn't do shit but *decrease* the frame rate, since you'd be showing 1
Re: (Score:3, Informative)
Well, I was a projectionist at a 5-plex (about 20 years ago). No, the frames were not duplicated. As you pointed out, that would be ridiculous. The films are already huge and cumbersome to transport and maintain. And yes, I'm well familiar with the star-cam and shutter mechanism in projectors. On the projectors I ran, the shutter opened twice on the same frame for each full revolution of the cam.
Here's a link to a patent that describes a particular star-cam mechanism [freepatentsonline.com].
A quote of interest from that artic
Re: (Score:3, Interesting)
Just a guess, but perhaps because the frame rate isn't high enough for your eye to generate the blur? That is to say, if the scene were real, the frame rate would be well-nigh infinite, and your eye, capable of only a certain frame rate, would blur together all the frames. With discrete frames, you need to put in the blur the eye would generate from the frames in-between.
Or something like that.
Re: (Score:3, Insightful)
Not quite.
The eye blur happens for two reasons. The first is the fact that the human eye is "assembling" an analog reading of the light taken over a specific time, very similar to how a camera exposure works. We aren't "digital" beings, in the sense that there is allowance forward and back in our visual processing, but we DO assemble "frames" for the rest of our brain to analyze.
The second is focusing. A fast-moving object moves into, and out of, the focused field quite quickly. Either we keep tracking it (
Re:Motion blur and bloom effects (Score:5, Interesting)
More to the point, the eye does not work with frames. The eye itself has no framerate.
Rods and cones individually update at about 15 times a second, but each individual one is entirely asynchronous from all the others. One update, another update, another update, etc. Your entire eye is not read 15 times a second; each individual light sensor 'trips' 15 times a second, semi-randomly, and sends the current light level. (1)
While each rod and cone only sends one signal, and then nothing, until it resets and sends another, our brain seems to assume that the light and color levels have remained the same.
Hence we get a 'blur', as objects move, and our brain assumes that said object is also in the old position until all rods and cones have updated.
1) And even that's not entirely right. Each rod and cone is actually sending a sort of average of the light it received since the last update. You don't have to receive a photon exactly as it updates.
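A toy simulation of that model (independent receptors, each holding its last sample) does produce a trailing smear behind a moving spot. This sketch only illustrates the commenter's description, not actual retinal physiology:

import random

# Toy model: a row of "receptors" that each resample the scene at ~15 Hz on their
# own random phase, while a 10-unit-wide bright spot sweeps across them. The
# "perceived" image is whatever each receptor last reported, so stale values
# trail behind the spot.
random.seed(1)
N, RATE, DT, T_END = 100, 15.0, 0.001, 0.18
next_sample = [random.uniform(0, 1 / RATE) for _ in range(N)]
perceived = [0.0] * N

def scene(t):
    pos = int(t / 0.2 * N)                    # leading edge sweeps the row in 0.2 s
    return [1.0 if pos <= i < pos + 10 else 0.0 for i in range(N)]

t = 0.0
while t < T_END:
    frame = scene(t)
    for i in range(N):
        if t >= next_sample[i]:               # this receptor "trips" now
            perceived[i] = frame[i]
            next_sample[i] += 1 / RATE
    t += DT

lit = [i for i, v in enumerate(perceived) if v > 0]
print(min(lit), max(lit))   # stale values extend the reported spot well behind the true spot (~89-98 here)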
Re:Motion blur and bloom effects (Score:5, Informative)
You didn't even read your own link. So for the benefit of people who may stumble upon your misinformed post let me say that the wagon wheel effect is visible with the naked eye under continuous illumination, which happens to be mentioned in your own link.
Re: (Score:3)
Sigh, you didn't read it either, so for your benefit let me do the work for you...
"In 2008, Kline and Eagleman demonstrated that illusory reversals of two spatially overlapping motions could be perceived separately, providing further evidence that illusory motion reversal is not caused by temporal sampling [9]. They also showed that illusory motion reversal occurs with non-uniform and non-periodic stimuli (for example, a spinning belt of sandpaper), which also cannot be compatible with discrete sampling. Kl
Re:Motion blur and bloom effects (Score:4, Informative)
A 2006 experiment put the rate of human vision at about 13 fps. People can see the wagon wheel effect in real life, without the aid of strobing lights, television, etc. After I read this article I did manage to observe this effect outside in sunlight, while travelling parallel to a car moving at about 50km/h. Very surreal.
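For the strobe/camera version of the wagon wheel effect, the mechanism is plain temporal aliasing: the spoke-passing frequency gets folded back into the sampler's Nyquist band. A small sketch with made-up numbers:

# Temporal aliasing behind the camera/strobe wagon-wheel effect: with S identical
# spokes the pattern repeats S times per revolution, and a sampler running at
# `fps` sees that frequency folded into the band [-fps/2, +fps/2).
def apparent_spoke_freq(wheel_rev_per_s, spokes, fps):
    f = wheel_rev_per_s * spokes                 # true pattern frequency in Hz
    return ((f + fps / 2) % fps) - fps / 2       # negative => appears to rotate backwards

# Illustrative numbers: a 10-spoke wheel at 2.2 rev/s sampled at 24 fps.
print(apparent_spoke_freq(2.2, 10, 24))          # -> -2.0 Hz: apparent backward rotation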
Re:Motion blur and bloom effects (Score:5, Funny)
Maybe the sun is just blinking really fast?
Re:Motion blur and bloom effects (Score:5, Informative)
In the case of video games, however, it is not so clear that rendering effective artificial motion blur saves much processing time compared to simply rendering more frames. Then again, there is a limit to how fast your monitor can update its image, so rendering more frames is no longer an option past that point.
Re: (Score:2)
Essentially people want these effects to be done by their eyes, though, not by the game. Why can't the game/computer/monitor produce frame rates fast enough that it's my eyes creating the blur, not the post-rendering effects?
Don't get me wrong, I like the realism that these effects give, but some people see them as kind of fake and it draws away from their experience. Perhaps some people's eyes can perceive frame rates slightly faster than others and thus don't actually see as much blur when moving fast
Re: (Score:3, Interesting)
Physics. Monitors cannot change fast enough and in the right way to do this; they simply don't work that way.
Speaking of physics: a game's physics engine behaves like a Riemann sum where n = fps, so the higher your FPS, the more accurate your physics simulation, even if your monitor cannot discretely display all those frames.
[note: only
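The Riemann-sum point can be seen with a one-function example: simple per-frame Euler integration of free fall (roughly what many engines of that era did) gives a different answer depending on the frame rate. A sketch, assuming plain Euler steps:

# Euler-integrate 1 second of free fall, stepping once per "frame".
# The exact answer is 0.5 * g * t^2 = 4.905 m; coarser steps drift further from it.
g = 9.81

def fall_distance(fps, seconds=1.0):
    dt, velocity, distance = 1.0 / fps, 0.0, 0.0
    for _ in range(int(seconds * fps)):
        velocity += g * dt
        distance += velocity * dt
    return distance

for fps in (15, 30, 60, 120):
    print(fps, round(fall_distance(fps), 3))
# 15 fps gives ~5.232 m, 120 fps gives ~4.946 m: same code, different physics outcome.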
Decoupling physics and rendering (Score:3, Informative)
Most physics engines simulate best when the timestep is the same every update - larger timesteps result in less accuracy of the simulation, to name just one issue. Rendering time varies every frame depending on the number of polys rendered, etc. So it is standard practice to decouple the physics engine from rendering, which allows the physics engine to run at whatever fixed timestep is desired. Multiple physics updates can occur for a single rendered frame and vice versa. Interpolation of position is used
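A minimal sketch of that decoupling pattern, the fixed-timestep-with-accumulator loop; simulate(), render() and interpolate() are placeholders rather than any particular engine's API:

import time

FIXED_DT = 1.0 / 120.0   # physics always steps at 120 Hz, regardless of render speed

def game_loop(simulate, render, interpolate, running):
    """Run physics at a fixed timestep while rendering as fast as the hardware allows."""
    accumulator, previous = 0.0, time.perf_counter()
    while running():
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= FIXED_DT:      # possibly several physics steps per rendered frame
            simulate(FIXED_DT)
            accumulator -= FIXED_DT
        alpha = accumulator / FIXED_DT      # fraction of the way to the next physics state
        render(interpolate(alpha))          # draw a blend of the last two physics states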
Re: (Score:2)
Bloom [imageshack.us] and lighting effects [imageshack.us] still have to be done in the game, because they depend on the game world (they can hide objects behind that bloom or leave other objects dark), and because the monitor just shows whatever color data you give it.
Really? (Score:5, Informative)
Re: (Score:3, Insightful)
Graphics are sold by screenshots and by box shots. YouTube and so on might make a difference, but ultimately you'll get more players to swoon with half the framerate and twice the geometry, than vice versa.
Where it matters most. (Score:2)
There are books for Tekken and the like that have frame data for every move.
Add any lag into the equation and what might be safe on block might not be, costing you the game.
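The arithmetic behind "safe on block" is just frame counting, which is why even a frame or two of lag matters. A sketch with invented frame data (not real Tekken numbers):

# Frame-data style safety check: after a blocked move the attacker recovers for
# `recovery` frames while the defender sits in blockstun for `blockstun` frames.
# Display/input lag eats into the defender's window one frame at a time.
def punish_window(blockstun, recovery, lag_frames=0):
    """Frames the defender has to punish. Positive: punishable. <= 0: effectively safe."""
    return recovery - blockstun - lag_frames

print(punish_window(blockstun=14, recovery=16))               # 2-frame punish window
print(punish_window(blockstun=14, recovery=16, lag_frames=3)) # -1: lag just made the move "safe"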
Re: (Score:2)
I think what you're getting at is that consistency matters more than maximum frame rate. For different reasons than the one you state, I'd rather play a game at a constant 20 Hz than at 30 (or even 60) Hz most of the time but dropping down to 15 during the most intense moments. It's the large changes in framerate that are noticeable; your brain can fill in the missing pieces if the framerate is constant.
Re:Where it matters most. (Score:4, Informative)
The human eye can detect FAR MORE than 30FPS.
And here's a simple way to prove it - find yourself some 60Hz fluorescent lighting. Look up into the light and wave your hand in front of it. Note the strobe effect, and if you're good enough you can count the distinct hand images and do some math to figure out your eyes' average response time/FPS. Do the same thing in front of an incandescent light bulb and notice that you don't get the strobe effect.
The average calculated human response is approximately 72 FPS.
You also 'predict the future' as it takes about 1/10 of a second for the signal from your eyes to be processed by the brain. When you play baseball and make a swing, your brain is automatically doing lots of lag compensation so you can actually hit such a fast moving object.
Re: (Score:2)
You don't understand how frame rate works, do you? The pictures drawn on the screen aren't the real model the game uses. Adding frames in between other frames won't generate lag (if the processing speed is high enough). So, if activating a block at a given frame works at 30fps, it will work at 15fps, 60fps, or 300fps. The frames aren't the 'real thing'; the game's unseen internal model is the real thing. The frames are drawn as often as possible, given the hardware, and are drawn to comply with the cu
Re: (Score:2)
Re: (Score:2)
How so? What lag are you talking about? What, in your theory, does 'lag' mean?
Heck, could you rephrase the sentence, "Input will lag behind with any other lag," so it actually makes sense?
Re: (Score:3, Interesting)
That isn't always the case. I recall a game in the past where gravity had less effect on players who had faster hardware. Or something like that. Anyway, the logic was mixed in with the rendering, so frame rate had an impact on what the player could do.
Re: (Score:2)
Yes, that's entirely possible if it's programmed so that you fall x meters for each frame rendered. What should be done is to say you fall x meters per second, taking into account how long it's been since you last calculated the value.
(I'm simplifying the effect of acceleration above--many games could get along without it and produce a decent result, though it's not hard to factor in if you want.)
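A minimal illustration of the difference between the two approaches, with arbitrary numbers:

# Frame-locked vs. time-based falling, simulated for one real second at two framerates.
FALL_PER_FRAME = 0.1    # "x meters per frame" (framerate-dependent: wrong)
FALL_PER_SECOND = 6.0   # "x meters per second" (framerate-independent: right)

for fps in (30, 60):
    dt = 1.0 / fps
    per_frame = sum(FALL_PER_FRAME for _ in range(fps))         # 0.1 m every frame
    per_second = sum(FALL_PER_SECOND * dt for _ in range(fps))  # scaled by elapsed time
    print(fps, round(per_frame, 3), round(per_second, 3))
# 30 fps: 3.0 m vs 6.0 m; 60 fps: 6.0 m vs 6.0 m. Only the dt version agrees across framerates.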
Re: (Score:2)
Which is what the parent was getting at: a lot of fighting games go by frames, not by seconds. Sounds ridiculous, but it makes for easier programming and it's a lot less resource-intensive.
Re:Where it matters most. (Score:5, Insightful)
It's way, way way more than that.
In the old HL engine -- at least in Natural Selection, but most likely any game on that engine -- your framerate didn't just affect your gravity (which made it so that at certain framerates you could literally jump further, which meant BHopping was sicker)..
it also changed the DPS of weapons. Yep. Weapon firing rate was tied to FPS in a very, very odd way. Some dudes did too much testing. Insane.
And you can, visually, tell a difference between 100fps and 50fps and 25fps. Very easily. Takes a few minutes of playing, but there's a clear difference and anybody saying otherwise eats paint chips.
Graphics don't make games good. Graphics can cripple good games. Graphics never make bad games good.
Re: (Score:3, Interesting)
Re: (Score:2)
Ah, so faster hardware will actually update the model more quickly. But does this change the way the model acts? In some physics models, I guess it would. More frames would, in fact, be more accurate. But in most simple models, would calculating more time-slices actually change anything? I kind of doubt it, so even though you are right, and visual frame rate (in a non-threaded game) is tied to model frame rate, more frames would not change the outcome.
Basically, the original poster was making it sound as if
Re: (Score:2)
You don't understand the games he's talking about.
For something like Street Fighter at EVO, they take extra steps to make sure that the framerate is consistent across all play: times when the players are just standing there, and times when players are attempting to break blocks for their Hypercombofinishes.
Like many Flash games, there is code that is actually executed ON THE FRAME. It is done as the frame is being rendered. When you get intensive moments with people putting in a lot of input, lots o
Re: (Score:3, Informative)
The pictures drawn on the screen aren't the real model the game uses.
That's not necessarily true. There's a long history of games relying on the graphics processor to determine when two objects overlap or otherwise meet specific conditions relative to each other. Goes all the way back to the 8-bit days when the graphics processor could tell you whether the non-transparent parts of two sprites overlapped.
Re: (Score:2)
Real fightans are a song and dance of multiple moves and mind games against another person.
CPUs are too stupid to play with your head; they typically follow a pattern.
Re: (Score:2)
Re: (Score:2)
Smash Brothers gets no respect on the fightan scene.
This is probably because of all the casuals/button mashers who play it as well.
There are skilled players in any game that would whoop my butt, even good old Pacman!
Re: (Score:2)
Re:Where it matters most. (Score:5, Funny)
I thought I was super badass at street fighter 2 in middle school, then I went to the arcade and saw older kids getting the insane combos on killer instinct. First thing I thought was... wow, you really have to study this stuff to know what you're doing. If only there was some sort of global information network where I could quickly and easily find out what all those moves are.
Re: (Score:2)
When I was 13 (1992) I manually compiled moves learned while playing and discussing Mortal Kombat and sold them for $3-$5 a piece. Who needs the internet when an enterprising little kid is destroying you in MK and then offers to sell you the list of moves he knows?
Counter-Strike... (Score:3, Informative)
I myself used to play Counter-Strike (classic), and I can tell you both FPS and Ping made a HUGE difference in that game to the point that my score would increase as I connected to servers closer to home and used OpenGL instead of DirectX (since OpenGL almost doubled the FPS at the time).
Now, I wasn't an expert, but I did play a whole lot. I think if you ask most serious players, they would agree on the impact of both...
Re: (Score:2)
I wonder how much that had to do with the engine design. As in having the render engine and the game logic joined at the hip, so that higher fps meant more repeats of the game logic per second.
Re: (Score:2)
Agreed. Most players would notice a few milliseconds of network latency more than a dozen frames per second, but it's undeniable that extra frames per second give you a distinct advantage.
If I see you and you see me, and you're running at twice my frames per second, you will have a smoother "turn and shoot" motion than me, which means you'll either notice your reticle over my head a bit faster than I do, or you won't make the mistake of over- or under-compensating your aim, since your moti
Apparently web servers also matter (Score:3, Funny)
Cached Version (Score:5, Informative)
Re: (Score:2)
Than zero.
The human eye can detect 30 (Score:5, Insightful)
HOWEVER
The human mind is evolutionarily designed to make instant assumptions. Cat in mid air facing us = DANGER. No "Is it dead and being thrown at us?" No "Is it a picture?" As such, video games can quite easily take advantage of these evolutionary assumptions and trick the MIND, if not the brain, into thinking something is real.
So while a higher frame rate will increase the quality of the game, it is not essential. It's like getting gold-plated controls on your car's dashboard. Yes, it is a real increase in quality, but most people would rather spend the money on a GPS device, real leather, or a plug-in hybrid engine before you get around to putting gold in the car.
Re:The human eye can detect 30 (Score:4, Insightful)
> The human mind is evolutionarily designed to make instant assumptions. Cat in mid air facing us = DANGER. No "Is it dead
> and being thrown at us?" No "Is it a picture?" As such, video games can quite easily take advantage of these evolutionary
> assumptions and trick the MIND, if not the brain, into thinking something is real.
Sort of. It's actually less "cat in mid air" and more "this sets off a trigger based on something that happened before and hurt me".
Most adults, if you chuck a rock at their face, will toss up their arms to block, or move their head/body to dodge. This is completely learned. Do the same trick with a young child who has never played "catch" before, and your rock is going to bean him right off his skull.
From my own experience, my first motorcycle accident, I was on the ground so fast, I had to think afterwards about what happened. First two spills actually.
The one after those.... whole different story. The adrenalin hit as soon as I felt the bike start to turn sideways, by the time the bike was fully 90 degrees to my momentum vector, and the wheels were sliding out from under me, I was already calmly kicking my legs backwards and positioning myself for the impact. I hit the ground and slid 150 feet while watching my bike spark and slide away. I thought "shit I am in traffic" jumped to my feet and ran to the bike, picked it up and pushed it into a parking lot.
All I am saying is, it's more complicated than that. The memory of such things and the whole "fight or flight" response is an evolving, learning response. It's more than just visual; it encompasses all the senses. I doubt "cat facing us in mid air" is going to trigger much beyond anything in mid air moving towards us.
Re:The human eye can detect 30 (Score:4, Funny)
Re: (Score:2)
I would say it stops at 120fps, but I haven't been able to run tests past that point so far.
Re: (Score:2)
Re:The human eye can detect 30 (Score:5, Insightful)
Congratulations. That -is- incredibly nitpicky. I'm amazed.
He is not a scientist and this is not a paper he is writing for publication. He is using the word 'designed' as the unwashed masses do all the time, and as such, he is not incorrect in his statement. Everyone knew exactly what he meant and nobody had to stop and try to figure it out. He accomplished his task without getting excessively wordy or having to explain himself three times. As far as communication goes, he scored perfectly.
Absolutely (Score:5, Funny)
I couldn't agree more. That Internal Server Error looks way better at 120 Hz on my 45" HD display.
Headroom... (Score:2, Insightful)
The biggest reason to go for the highest frame rate possible is headroom. If your framerate is 30 at best, it'll dip down to 10 sometimes. If it's 120 at optimal, it can dip down to 30 and still be playable.
Any animator knows... (Score:5, Interesting)
You can tell the difference between 30 FPS and 60 FPS.
The way I tested this was I made a 2 second video in flash, a circle moving from the left side of the screen to the right side. 60 frames. Run it at 30 FPS.
Then I made a second 2 second video, same exact positions. 12 Frames. Ran it at 60 FPS. Asked me, and all of my surrounding classmates, which was about 24 students IIRC.
100% of us noticed a visible difference in the smoothness. Whether our eyes were making out each individual frame perfectly or blurring some together to create a smoother effect, it was irrelevant, since there WAS a noticeable difference. I was going to slowly bump the 30 and 60 FPS up higher and higher to see at what point the difference is no longer distinguishable, but I got lazy (high school student at the time.)
The point I think most gamers would agree on is that more frames per second are nice - but that 30 frames per second are necessary. You can occasionally dip down to 24 and be alright (24 is supposedly the speed that most movie theatres play at) - but when you get around 20 or so it really does take away from the experience.
Re: (Score:2)
120 Frames*, I mean. Sheesh. Not proofreading even though there's a preview button.
Re: (Score:2)
Time Splitters was the first game I played which was locked at 60fps: it was quite a remarkable transition, even from games which were locked at 30fps, never mind games that fluctuated (I'll take 30fps and locked over 30-60fps any day). Gran Turismo had a "Hi-Spec" mode which doubled the resolution and framerate too, albeit at an obvious graphical cost, and it looked like The Future.
On the subject of movie theatres, 24fps was chosen because it's pretty much as low as you can go before people notice problems
Re:Any animator knows... (Score:5, Interesting)
You can occasionally dip down to 24 and be alright (24 is supposedly the speed that most movie theatres play at) - but when you get around 20 or so it really does take away from the experience.
If by 'supposedly' you mean 'definitely' and if by 'most movie theaters' you mean 'all theaters and even all motion picture production processes in recent years', then yes. The difference is lost on most people, but the reason 24fps is acceptable in movies is that the frame you see isn't what happened at that instant in time when it's displayed; it's everything that happened in the last 1/24th of a second, since it's recorded on film that was exposed for that 24th of a second to derive the image. When a computer does it, it only cares about what is happening at that exact 24th of a second, so the difference between a series of exact frames of motion and a series of frames that include the blur of what happens between frames is HUGE.
However, this nuance is lost on pretty much everyone who fires up a computer game, notes the FPS indicator, and goes "OMG I CAN TOTALLY TELL ITS ONLY 30FPSZZZZ!!!! HOW INFERIOR!!!". Whine about framerates all you want, but they are only a small part of the experience.
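A renderer can approximate that film-style blur by averaging several sub-frames spread over the frame's time slice (an accumulation-buffer style approach). A rough sketch, where render_at() is a placeholder for rendering the scene at a given instant:

import numpy as np

# Accumulation-style motion blur: instead of one sample at the frame's instant,
# average several renders spread across the open-shutter part of the frame,
# approximating what a film exposure integrates over.
def motion_blurred_frame(render_at, frame_start, fps=24, subframes=8, shutter=0.5):
    """shutter=0.5 mimics a 180-degree film shutter (open for half the frame)."""
    exposure = shutter / fps
    times = [frame_start + exposure * k / subframes for k in range(subframes)]
    return np.mean([render_at(t) for t in times], axis=0)

# Toy scene: a single bright pixel moving one pixel every 1/240 s across a 1x16 image.
def render_at(t):
    img = np.zeros(16)
    img[int(t * 240) % 16] = 1.0
    return img

print(motion_blurred_frame(render_at, frame_start=0.0))   # the pixel's energy is smeared over several pixels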
Re: (Score:3, Informative)
If by 'supposedly' you mean 'definitely' and if by 'most movie theaters' you mean 'all theaters and even all motion picture production processes in recent years', then yes.
I'm sorry but that's not quite correct. I worked as a movie projectionist for several years, so I know this from experience. While 24fps works, and is what used to be used in cinemas, it is noticeably flickery. As with most advancements in cinema technology, they came up with a bit of a hack. While there are still only 24 frames of film per second, the projector shows each frame twice, giving an effective frame rate of 48fps.
Most film cameras don't have a 'shutter speed'. (Score:5, Informative)
more accurately - most film cameras don't have a notion of a shutter 'speed'.
The film roll still goes by at 24fps, but the actual shutter is a wheel. That wheel can have gaps of various sizes (to increase/decrease exposure *time*) and shapes (to produce specific motion blur effects; e.g. an object leading its own motion blur path requires a small shutter opening at first, ending in a large shutter opening). Use fairly sensitive film and a small shutter gap, and you'll get nearly motion-blur-less shots like those in Saving Private Ryan (watch the explosions in that film: every speck of dirt that gets thrown about appears almost razor-sharp; some find this objectionable). Heck, you can even expose twice per frame if you want to get all experimental and stuff.
That said, you can't - short of electronic shutters - expose for longer than one frame period. A bit under 1/24th of a second is the most you'll get (that 'bit' being required to transport the film to the next frame).
Anyway.. wiki: http://en.wikipedia.org/wiki/Rotary_disc_shutter [wikipedia.org]
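The exposure time follows directly from the gap angle and the frame rate; a quick worked example:

# Exposure time per frame for a rotary disc shutter: the gate is open for
# (gap angle / 360) of each frame's duration.
def exposure_time(shutter_angle_deg, fps=24):
    return (shutter_angle_deg / 360.0) / fps

print(exposure_time(180))   # ~0.0208 s = 1/48: the classic film look
print(exposure_time(90))    # ~0.0104 s = 1/96: crisper, Saving Private Ryan style frames
print(exposure_time(360))   # ~0.0417 s = 1/24: the upper bound; in practice a bit less, as noted above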
Re: (Score:3, Informative)
Actually, the lower limit depends on the person. My eyes are apparently a bit wonky and my lower limit is 15 fps, which would drive most people insane in a video game. Below that it drives me insane too. As for telling the difference between 30 and 60... I can do it... barely. Compare 60fps and anything higher and it's absolutely pointless for me to try. However, I've met people who can definitely tell the difference between 100fps and 60fps.
The motion blur picture (Score:2)
Re: (Score:2)
Whoa, the motion blur image with the birds and the mountain is nice, what game is that screenshot from??!!1
Reality 1.0, very popular MMO with billions of users. Excellent graphics but the developers have been very unresponsive to bug reports.
Friend could see 57 vs 60. (Score:2)
I had a friend who was bothered by anything less than 60fps.
The screen looked "stuttery". He would take a lower resolution to maintain 60fps.
We could verify this in one game with a built in frame rate command.
This is like the "myth of the g spot" post a few days ago. sheesh.
Re: (Score:2)
Many people can see the difference,
maybe not that small of an increment.
There are a few people, like your friend, who can probably see even more than just a 3 FPS difference.
Movies at only 24/25 FPS are horrible (Score:2)
Personally I get annoyed by the fact that although they've invented HD (woohoo) they're still shoving it out at only 24 or 25 FPS. To me, this looks really jittery! I wish they'd go up to 50FPS for HD.
Watching Avatar in 3D seemed to accentuate that problem. I'm not sure how they do the whole left/right thing in terms of projection, but it seemed to me that the left/right image was shown alternately and at nothing like a high enough speed for me to perceive it as fluid motion. Did anyone else notice this?
Age-old confusion. (Score:2)
I could tell in a glance the difference between 72fps and 100fps (both common refresh rates that translate to the max fps when v-sync is on) in Counter-Strike just by briefly moving the mouse to pan the scene.
This site has had the definitive explanation
Re:Age-old confusion. (Score:5, Insightful)
The 30-fps-is-all-you-can-see myth was probably born of the notion that the illusion of continuous movement starts to set in around 25-30fps (in film for example). Therefore actually 30fps is the minimum you need rather than the maximum you can perceive.
I think it's more likely born of the notion that film gives a completely convincing illusion of motion that is not greatly improved by higher frame rates, because the process by which it is created automatically includes motion blur: it records continuous data, just broken up into 24 fps. Computer games display discrete moments in time, not many moments blurred together into one picture. That's why film looks smoother than computer games with three times the framerate.
Nevertheless, the illusion of continuous movement is apparent at much lower framerates than even film, even in a computer game. Quake's models were animated at 10 fps and they gave a convincing illusion of movement, and you can probably make do with a lot less since the brain fills in so much. But it's not a completely convincing illusion, and neither is 30, 60, or even 100 when using static instants in time.
But the basic myth comes from the fact that film is so convincing and thus you don't "need" more... as long as each frame is a blurred representation of the full period of time it is displayed for.
Re: (Score:3, Informative)
But the basic myth comes from the fact that film is so convincing and thus you don't "need" more... as long as each frame is a blurred representation of the full period of time it is displayed for.
Not quite. Film cameras, because of the way they work, typically expose each frame for only about half of the frame's duration (180 degree shutter [tylerginter.com]). 24fps is usually shot at 1/48 second exposure time per frame. Exposing for the full frame time (a 360 degree shutter) would be far too blurry.
Plenty numerous all right (Score:3)
Brought to you by the Department of Redundancy Department.
Thanks... (Score:2)
Now I can justify another $1000 worth of hardware to my wife, to play the same game I can get on a $300 console.
She gets it. I'm the Computer Guy. I know how it works. I know what is needed. I know how to make sure she can still play Farmtown. Or is it Fishville? Hard to keep up with the Facebook privacy violations/games.
Ya gotta have priorities.
The 30fps myth (Score:2)
The 30fps myth is simply an over simplification. The eye+brain starts to naturally perceive movement at around 10fps, usually a little lower. Motion usually starts to appear smooth somewhere between 15 and 25fps though it depends on many factors other than just the framerate (smoothness of the frame rate, relative change velocities of objects (or parts thereof) in the image, absolute colour and tone, colour and tone contrasts within the image, the existence or not of dropped frames and other inconsistencies
I never understood why... (Score:2)
Nearly everyone these days uses LCD monitors that max out at a pathetic 60Hz at HD resolutions (I think because of DVI spec/bandwidth limitations; whatever moron invented DVI needs to be shot for that).
I still have an analog CRT monitor that supports much higher frame rates at HD resolutions which gives a very noticeable edge when playing twitch-games like Unreal Tournament.
I never understood why people claim framerates above 60hz are better when their monitor is only capable of displayi
Brightness (Score:2)
30 Fps myth (Score:2, Interesting)
There were a lot of studies done a long time ago, and there are some very accurate psycho-visual computer models of the human visual system. I had the pleasure of working with the Jeff Lubin model when I worked at Sarnoff Corp, which won an Emmy Award back in 2000.
The 30 fps requirement is not a fixed point, but depends upon a lot of other factors, including viewing distance, field of view, and lighting conditions. The reason that film operates at 24 fps is because it is expected to be viewed in a darkene
The difference in framerate (Score:5, Interesting)
Sorry, you lost me (Score:4, Insightful)
Hmm... I don't accept that premise, either on the PC (where midrange graphics cards can easily pull 60fps with any game on the market now) or on the consoles (where framerates are only going up as PS3 and 360 development matures).
I think that this article (or at least the summary) is a bit of a strawman. Most of the gamers I know recognize that good framerates are important.
Same with audio... (Score:5, Insightful)
The conclusion may be right, but... (Score:3, Insightful)
It may be true that high framerates are a good thing, but the linked article is rubbish - the author's arguments are really very stupid.
Lots of evidence for higher frame rates (Score:5, Informative)
I am a visual neuroscientist (IAAVNS). The standard idea of refresh rate comes from CRT based monitors where the image is drawn by a scanning electron beam. If you use an instrument to measure the instantaneous brightness at a given point on the screen it will rapidly peak as the beam swings by, and then decay as the phosphor continues to release absorbed energy in the form of photons. Different monitors have different decay rates, and, typically, CRTs that were designed for television use have pretty slow decay rates. CRTs that were designed for computer monitors typically have faster decay rates. If the decay rate were very very fast, then the hypothetical point on the screen would be dark most of the time and only occasionally very bright as the beam sweeps by on each frame.
As you can imagine, this highly impulsive temporal profile is hard to smooth out into something closer to the constant brightness of the world around us. The human retina has an inherent dynamic response rate, but it's actually quite fast, and there have been studies showing clear responses in higher-order visual areas of the brain up to 135 Hz. But the standard phosphors used in CRTs have a somewhat smoother response, and so at more-or-less 80 Hz the brain stops seeing the flicker (at 60 Hz most people see flicker on a computer monitor). The exact refresh rate where perceptual blurring happens (so the flickering goes away) varies widely between individuals, and with the exact details of the environment and what is being shown on the screen. More-or-less at 100 Hz refresh, no one sees the flicker anymore (although the brain can be shown to be still responding).
Contemporary screens, however, are LCD based (I'm going to ignore plasma screens since the field is still working out how they interact with the visual system). Performing the same measurement as above, the temporal profile of brightness at a given spot on the screen looks more like a staircase, holding a value until the next frame gets drawn. This is a far, far smoother stimulus for the visual system, so a 60 Hz frame rate produces a perceptually far more flicker-free experience. That's why most CRTs at 60 Hz make your eyes bleed, while LCDs at 60 Hz are just fine.
Except that newer LCDs have LED backlighting which is no longer constant, but flashed (WHY? WHY? WHY? Just to save some power? Please, computer manufacturers, let *me* make that decision!), so the experience is somewhat more like a CRT.
So that's one part of the equation: flicker.
The other part of the equation is update rate, which still applies even when there is no flicker at all. Here, we have the evidence that the brain is responding at up to 135 Hz. In measurements made in my lab, I've found some responses up to 160 Hz. But the brain is super good at interpolating static images and deducing the motion. This is called "apparent motion" and is why strings of lights illuminated in sequence seem to move around a theater marquee. The brain is really good at that. Which is why even a 24 Hz movie (with 48 Hz frame doubling) in a movie theater is perceptually acceptable, but a 200 Hz movie would look much more like a window into reality. On TV you can see the difference between shows that have been shot on film (at 24 Hz) versus on video (at 30 or 60 Hz). Video seems clearer, less movie-like.
For games, 60 Hz means 16 ms between frame updates -- and that can be a significant delay for twitch response. Further, modern LCD monitors have an inherent two or three frame processing delay, adding to the latency. As we know, long latency leads to poor gameplay. Faster updates means, potentially shorter latency, since it is a frame-by-frame issue.
So, just as with audio equipment where inexpensive low-fidelity equipment can produce an acceptable experience, while a more expensive setup can create the illusion of being at a concert, so too inexpensive video equipment (from camera to video board to monitor) can produce an acceptable experience, while a more expensive setup can create the illusion of visual reality.
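The latency arithmetic above is worth spelling out. A rough sketch, using the poster's "two or three frame" display-processing estimate and ignoring input polling and game/GPU pipelining:

# Rough input-to-photon budget: one frame of render time plus the display's
# internal processing delay (the "two or three frame" figure above).
def display_latency_ms(fps, display_processing_frames=2):
    frame_ms = 1000.0 / fps
    return frame_ms * (1 + display_processing_frames)

print(round(display_latency_ms(60), 1))    # ~50 ms at 60 Hz with two frames of processing
print(round(display_latency_ms(120), 1))   # ~25 ms at 120 Hz: the per-frame cost halves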
Re:Lots of evidence for higher frame rates (Score:5, Informative)
With THAT said, I have absolutely zero idea why any sane LED dimming driver would run anywhere near frequencies that any human could see. LEDs can turn on and off in nanoseconds, so a reasonable dimming signal should be in the kilohertz range at least, not the 100Hz range. It's *possible* to put a 100Hz dimming signal on an LED driver, but it seems really dumb to me.
Outside Looking In (Score:5, Informative)
I'm a neuroscientist who covers sensation and perception and their bidirectional interaction with cognition, particularly attention. I've got comments and questions and very few answers after reading this. I'm seeing a lot of things stated as facts that I've never heard of before. Some of them make sense, and some don't. Some of them are correct, some not, and more than the rest combined are things I have no experience in and can't judge. Those seem to be well supported, or at least well known, particularly among those who've obviously done their homework. I can find references to these among the publications (like ACM) that are most applicable to the field in question, but I can find precious little in my customary pubs and books. That's not to say the stuff in the technically oriented pubs is wrong, just that some of it may not be covered much (i.e. 'not of interest') in my field. My field is very cautious about experimental evidence, but I suspect that in gaming's perception area there are common-knowledge kinds of things that came from hearsay (we have many of those in rocketry too). It might do well for both fields to compare works.
What catches my eye at first is this "myth". As stated it's overly simplistic. Which humans' eye? Some have different reaction times. Those who could probably detect 30 fps discontinuity are those who see the TV screen jiggle and waver when they chew something crunchy while watching (you know who you are, here's a place to own up to it). What part of the visual field, central or peripheral? They operate differently. Jittering or blurring of objects attended to or not? Betcha it happens more to those not attended to, but that's not noticed for the same reason (hypnosis can bring that out right nicely). And how is it frame rates matter when the visual system evolved as a constant flow analog system? If a phenomenon that shouldn't make a difference does, and that frame rate is strictly due to technical considerations, how do we know that a variable frame rate might not give even better results? Since the visual system does not have full-field frames that refresh, why should artificial presentations? Why not present faster moving objects at a high change rate, slower moving at a slower rate, more or less a timing equivalent to some video compression techniques? Some of this makes good sense from my perspective, some appears goofy but may not be, and some clearly is whack according to well supported experimental evidence from my side, not sure about yours.
Here's an interesting one, apparent motion from blurring, occurring at the retina, ostensibly due to 'reaction time' of light receptor cells (rods and cones). I can see how this might occur. But if it's a time lag that causes blurring, everything should be blurred, because the layers of cells of different types in the retina between the receptors and those firing down the optic nerve operate strictly by slow potentials -- there's not a 'firing' neuron among them. Or, if their processing, though slow, accounts for motion and compensates, preventing adding to the blurring, how can that be used to increase apparent motion?
A last point which I'm fairly certain isn't covered in gaming and graphics presentation because very few know much about it and we don't understand it well: 10% of the optic nerve is feed-forward, top down control or tuning of the retina and its processing. Motion perception can be primed, can suffer from habituation, and has variance in efficacy according to several factors. What cognitive factors have an influence on this, and how can that be used to improve motion perception and/or produce motion perception that's as adequate as what's being used now but requiring less external computational effort because internal computation is being stimulated.
It's probable that both fields have things of interest and use to the other, including things the other isn't aware of. I've said much the same following another article on a different subject. From this one I can see it's probable there's a few peoples' careers worth o
24 fps (Score:3, Interesting)
What he (she?) said (Score:2)
It bugs me that 10 years ago I could play serious fps's and hit 100fps and actually see that on my monitor. It made a huge difference for the kind of competitive precision I was hitting.
My fancy new(ish) wuxga monitor has plenty of pixels, but 60fps feels real choppy to me.
I see some of these future 3d lcd's claiming 480Hz... is there a good inexpensive desktop monitor that can do 120Hz?
Re: (Score:2)
There are only 2 as of now that can do true 120hz...
http://www.newegg.com/Product/Product.aspx?Item=N82E16824116402&cm_re=120hz-_-24-116-402-_-Product [newegg.com]
http://www.newegg.com/Product/Product.aspx?Item=N82E16824001311&cm_re=Samsung_2233RZ-_-24-001-311-_-Product [newegg.com]
Re: (Score:2)
...and some high-end TVs have a 'game mode' that amongst other things switches the interpolation off to avoid the delay you speak of. Specifically, I think some Samsung models have this feature.
There is a related point though which is the fact that a number of TVs/LCD Displays claim to be 100Hz or even 120Hz but can't actually accept a 100/120Hz input. Supposedly the coming generation of '3D ready' displays will rectify this since for a comfortable 3D viewing experience 60 FPS to each eye is required.