Debunking The Need For 200FPS 263
Digital_Fusion writes: "If you follow the gaming sites at all you will hear about people that have tweaked their gaming rigs to give 200fps in games like Quake3. Do we really need this kind of framerate? Well no, according to this paper. It seems that anything over 72fps is just wasted as our visual pipeline is saturated past that point." On the other hand, I'm glad that companies make it possible to show crazy-fast framerates, for the trickle-down effect of cheaper "normal" cards.
Still can't go any faster than your display ... (Score:2)
Re:fics (Score:2)
I'm sure they're also responsible for my miserable Karma.
Re:No point exceeding monitor refresh rate (Score:1)
Bogus (Score:1)
Btw, this whole debate was held about a year and a half ago, only it was 30 vs 60 fps, because film only plays at 24 fps. 3DFX even released a 30 vs. 60 demo.
First, no matter what, your monitor is a strobe light. It is not a constant light stream. Unless you have perfectly synced frame generation with your refresh rate, there are going to be syncing issues: some frames skipped, some drawn twice, etc. Those discrepancies are noticed.
Second of all, even if every refresh of your monitor drew one and only one frame of the world, it would still be a strobe light that is not synced up with your vision. Any person who sits at a computer all day with his/her refresh set to 100Hz can easily tell when the refresh is dropped down to 72Hz, just as dropping down to 60Hz and opening up a solid white window is noticeable to people in 75Hz land.
This was just a silly rant, but anyone who plays at 100+ fps can tell the difference between the two. There are just way too many factors going into your visualization for any of us - or the Canadian and his snowflake story - to put a hard number on it.
Re:Sounds right to me.... (Score:3)
And I don't know many monitors that would even handle that.
Since the only way to render at framerates above the monitor's vertical refresh rate is (obviously) to disable vertical sync (which pauses rendering until the screen is updated), you'll get tearing effects: part of the screen is drawn from data rendered in one frame, and the next part of the refresh uses the next rendered frame.
In fact, this shows that your data's being wasted; at, say, 200fps on a 100Hz monitor, only half the data from each frame is actually drawn.
At high frame rates, the tearing effect probably causes the 'blurring' you describe.
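The arithmetic in this comment can be sketched with a toy model (purely illustrative - real scanout and flip timing depend on the card and driver):

```python
# Toy model of what happens with vsync off: during one monitor refresh,
# different scanlines are copied from different rendered frames,
# producing a visible tear. Illustrative only.
def scanline_sources(render_fps, refresh_hz, lines=1080):
    """Return, for one refresh, the index of the rendered frame each
    scanline comes from, assuming both proceed at constant rates."""
    refresh_time = 1.0 / refresh_hz
    frame_time = 1.0 / render_fps
    sources = []
    for line in range(lines):
        t = (line / lines) * refresh_time    # moment this line is scanned out
        sources.append(int(t / frame_time))  # newest frame finished by then
    return sources
```

At 200fps on a 100Hz monitor the single refresh spans two rendered frames (one tear line, and only half of each frame's data ever shown); with the framerate equal to the refresh rate, every scanline comes from the same frame.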
Mouse sampling rate. (Score:1)
Re:worst experimental procedure ever (Score:1)
I'm not sure what the eye's "maximum refresh rate" might be, but I know this is a stupid way to measure it. They should at least do it outside in full sunshine.
Not precisely, full darkness turns out to be better, which makes sense if you think about it.
But your point about interference with 60 Hz (US) rates is dead-on correct. Working in the interactive TV biz I often have to deal with PAL monitors in the US, and there's a noticeable worsening of quality on PAL TVs (50 Hz refresh) when they're illuminated by fluorescent lighting (I'm in the US, where the lighting flickers at 60 Hz). So I turn the lights off and work from the daylight from the window. (In my case, even then there may also be electrical interference yanking on the signal; I haven't figured out how to prove whether that contributes much to my perception of flicker on PAL TVs in the US - yet.)
--j
Beyond the limits of human perception (Score:1)
While you may not be able to visually process 200fps, you can most certainly distinguish a qualitative difference between 60fps and 200fps. The sheep who patronize Blockbuster don't know that the 220 lines they're seeing don't approach broadcast quality, let alone DVD/LaserDisc/Satellite - they're still able to boggle when they see a nice quadrupled high-res image at or above 1600 lines.
More data gives the brain more to work with. Whether you can process every individual frame or not, subliminally you're going to pick up MUCH more information, and the result is a more intense experience.
Of course, not everyone runs their games on a 90" screen, either.
Absolutely Correct (Score:2)
Captain_Frisk
Re:Frame Rendering and Game Cycles (Score:2)
Re:Sounds right to me.... (Score:1)
I still play Quake and some addons for Quake2. I bought Quake3 for cheap on boxing day but didn't really like it.
Why would I want a 128mb agp4x card that pushes 200fps on some new Direct3D game that I will never play?
No more than 25 to 30 fps (Score:1)
What's important is image and colour resolution. This discussion already took place when the PAL standard was being defined: NTSC produces 30 fps, PAL only 25, but with better resolution.
And for graphics boards, the number of polygons per second is more important, which strongly depends on the transfer rate between RAM and VRAM (see the PS2 threads...). I would also say that, given that textures add a lot of realism with "little effort", it's more important to support lots of textures than to push more than 60-72 frames per second (yes, yes, it sounds like Murphy's law, but it also serves to avoid screen refresh artifacts).
--ricardo
Higher Framerates - Motion Blur (Score:2)
proper motion blurring, by adding the images to each other.
Of course... a lot of games emulate motion blur anyway.... (who builds models for the bullets streaking towards you when you can just do a streak in the air?)
There are motion blurring algorithms, I wonder when hardware motion blurring will become required for the next generation graphics cards?
What monitor are you using? (Score:1)
Re:Max is 78, min is 55 (Score:2)
Since computer graphics are instantaneous snapshots of the scene, it accentuates the frame rate a great deal.
Movie cameras show a 'blur' of the movement over the duration of the frame... i.e. temporal antialiasing.
Graphics cards are supporting spatial antialiasing, which gives the impression of a higher resolution and smooths those nasty jaggies at the edges. Temporal antialiasing could be the 'next big thing' (although the methods using accumulation buffers are years old, the hardware has to catch up). 3dfx have their T-buffer, which could do such a thing, and aren't ATI producing a card with accumulation buffer support?
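A toy sketch of what an accumulation-buffer approach to temporal antialiasing does - render several sub-frame instants and average them, as OpenGL's accumulation buffer (glAccum) would. The 1-D "scene" and function names here are invented for illustration:

```python
# Temporal antialiasing sketch: average several renders taken at
# instants spread across one frame interval, smearing a moving object
# along its path. Hypothetical 1-D "renderer" for illustration only.
def render(position, width=8):
    """Render a 1-pixel-wide object at an integer position into a 1-D frame."""
    frame = [0.0] * width
    frame[position % width] = 1.0
    return frame

def motion_blurred_frame(start, end, samples=4, width=8):
    """Accumulate `samples` sub-frame renders of an object moving from
    `start` to `end`, each weighted 1/samples (the glAccum pattern)."""
    acc = [0.0] * width
    for i in range(samples):
        pos = start + round((end - start) * i / samples)
        sub = render(pos, width)
        acc = [a + s / samples for a, s in zip(acc, sub)]
    return acc
```

The result is a streak of quarter-intensity pixels along the motion path instead of a single bright dot teleporting between positions - the "temporal jaggies" get smoothed just as spatial antialiasing smooths edges.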
wrong... (Score:1)
Indeed, the major fallacy is that we live in an analog world. No we don't! We live in a quantum world, and light is quantized. That in itself should set up the rest of the system (retina/brain) to behave more like a digital system. This is (partly) why we perceive moderate flicker at the movies; 24 fps doesn't cut it. Nor does 60Hz (30Hz really, taking the interlace into account). Flicker stops when some part of the retina/brain system is fully saturated. I don't know if there is such a point; our bodies seem to have very large bandwidth.
Just a side note, I can see flicker on my 21 inch monitor at 70 and 75 Hz. I can still make it out at 80Hz, and at 100Hz it seems solid. Thank god for my video card...
Kawaldeep
This is bogus! (Score:2)
There was nothing scientific about how the 72+fps limit was calculated! As far as I can tell, the author judged how much detail we can perceive by how much flicker he could see in the refresh of his monitor.
That's crap.
The refresh rate will only tell us about our persistence of vision effects. A refresh below a certain threshold does not trigger the POV to kick in, so that we can see the flashes of the monitor, whereas a refresh above that rate means our POV will start to blend the frames together.
The 72fps argument doesn't stop the human visual system from seeing a fast-moving object; if something traveling at a certain speed gets drawn on screen twice at 72fps, it will get drawn 4 times at 140fps, and with a decent monitor at the right resolution, those four frames should be seen on screen. The real argument, then, is whether the human reflex is fast enough to react to those 4 images (whether the visual system is fast enough to see all four frames, or just blurs them together into one image, is irrelevant, I think). Can a person dodge a railgun?
Well, at least that's my 2cents
The nick is a joke! Really!
Re:Framerates and the Vertical Refresh Rate (Score:1)
Perhaps useless for you, but not for your cat (Score:1)
Perhaps this is a worthy cause for pursuing high framerates. Maybe someday you'll keep your pet occupied at home (watching TV) while you're out, or have your pet participate in your video games
Re:Frame Rendering and Game Cycles (Score:1)
There would be an Int08 (timer) handler - possibly reprogrammed to a higher frequency than the meagre 18.2Hz, as in DOOM - which would simply DRAW the CURRENT view scene; that's it, nothing else, except for a safety check to avoid re-entering the same handler if rendering was slow for any reason.
The Int09 (keyboard) handler would handle movement, which would only be something like changing the player actor's X to X-10, and the Int08 handler would redraw it next time. The same would go for the mouse interrupt, Int33, if my memory serves me well (a callback function).
So, there was no such thing as the loop you described in all more or less well-built games (id's). I actually did this thing like 6-7 years ago; no one in his sane mind would do it differently these days - it's no longer a state-of-the-art approach.
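The pattern this poster describes can be sketched in Python rather than x86 interrupt handlers (names and structure are invented for illustration, not id's code): input handlers only mutate game state, while a periodic timer tick draws whatever the current state is, guarded against re-entry when a draw runs long.

```python
# Sketch of the interrupt-driven pattern: state mutation and drawing
# are decoupled, the way an Int09 keyboard handler and a reprogrammed
# Int08 timer handler would be. Hypothetical names throughout.
class Game:
    def __init__(self):
        self.player_x = 0
        self.drawing = False       # the re-entry guard flag
        self.frames_drawn = 0

    def on_key(self, key):
        """Keyboard handler (the Int09 role): only updates state."""
        if key == "left":
            self.player_x -= 10
        elif key == "right":
            self.player_x += 10

    def on_timer_tick(self):
        """Timer handler (the Int08 role): draw the current scene,
        skipping this tick if the previous draw hasn't finished."""
        if self.drawing:
            return False           # rendering was too slow; don't re-enter
        self.drawing = True
        try:
            self.frames_drawn += 1  # stand-in for actual rendering
        finally:
            self.drawing = False
        return True
```

The key property is that input is never blocked waiting on a render: a keypress lands in the state immediately and the next tick draws it.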
Will we eventually start noticing? (Score:2)
It got me thinking: maybe we can't now, but if we start getting used to incredible sound quality, would we then listen to our current 44.1 kHz audio and be confused at how we used to listen to it every day?
I started thinking this because if you listen to a lousy stereo system all the time, you get used to it, and it starts sounding decent. Then you travel to the nearest audio store and listen to the newer, better stereos; when you go back to the previous one, you suddenly realize how horrible yours sounds.
So is it the same with video? Sure, right now we (supposedly) can't see above 72 frames per second, but if our eyes got used to 200 frames per second, would our eyes... adjust, so to say?
What is ALWAYS forgotten (Score:1)
Re:30 FPS .. not even the issue (Score:1)
Better res, not fps (Score:1)
I mean if you can do 200fps at 640x480 you should be able to bump it up to 1024x768 and still maintain a decent frame rate.
Benno
Latency not throughput (Score:1)
So the reason you get that millisecond jump on someone running at a lower framerate is that you're effectively seeing a few tens of milliseconds into his future.
Re:You're missing the point. (Score:1)
rosie_bhjp
Re:Monitor refresh rate (Score:1)
what about resolution? (Score:1)
No they dont! (Score:1)
Re:Not true for Quake 3, at least (Score:1)
Sounds right to me.... (Score:1)
I do know that sometimes when playing games with an uber-high frame rate, I get a 'blurring' effect that I don't get at lower frame rates. Plus, if you don't have a top-of-the-line graphics card, the lower frame rate will allow you to process things faster. Plus, who really needs 200 fps??
Eric Gearman
--
Also related to Quake 3 and FPS, but off topic (Score:1)
Quake3World Messageboard Post [quake3world.com]
Hardcore people get a GeForce2 and play at 640*480*16 to maintain a constant high framerate, because physics rounding errors are greatest at the higher framerates. Then they can make crazy jumps like the rail jump in q3dm6 and the megahealth in q3dm13. The 1.25 patch is supposed to fix this though.
And on topic: visually I can't tell the difference over 70fps in any game, so it's ridiculous to play at 200fps when you could bump up the resolution or turn on other options instead.
Re:Monitor refresh rate (Score:1)
Monitors' refresh rates are in Hz, not MHz, you freak.
The need for high frame rates (Score:2)
Re:Dumb analysis (Score:1)
One thing to note is that this does depend on the ambient light level. If the room is bright, and the monitor is turned to maximum brightness, you notice flicker much more than if the room is more dimly lit, and the monitor brightness is turned down a little.
Personally I really like LCDs, especially when I'm working for long periods - no flicker at all!
- Fzz
Re:Motion Picture frame rate is only 24 fps! (Score:2)
>multiple times" thing is some weird urban
>legend...
Sorry, but speculation does not make fact. If you bother to go to the rec.arts.movie.tech FAQ, they refer to how the use of a double-bladed shutter does in fact cause each frame to be shown twice.
The heart of the matter is how the human eye perceives light. By cutting the display time in half, even though it is the same image, the eye perceives change, thereby creating an optical illusion that cuts down on the perception of the actual jerkiness of the changing images.
As I am not disciplined in this field, I did not retain the information on this particularly well. However I have come across a large number of sites backing this up. (Mostly stumbled across when looking into HDTV and progressive scan video technology).
Matt
Re:Frame Rendering and Game Cycles (Score:1)
"I'm surprised to read that games like Q3 don't do this. (Physics depending on your refresh rate is just nutty.)"
Quake does do this, so the physics engine is not dependent on the frame rate. *However* there is a *bug* in Quake3Arena that makes the physics engine slightly different at different frame rates, but it is a bug, and doesn't have to do with the game loop design. The physics engine runs in a .qvm module, which runs on a virtual machine, and apparently there is a floating point rounding error in the virtual machine implementation itself that causes the bug. The newest patch is supposed to fix it.
As far as LionKimbro's post goes, I don't think there's too much difference between how you've mentioned it, and parallelizing it. Parallelizing it makes things quite a bit more complex - it sounds pretty beautiful on the surface, run your model/view each in their own threads and they can update at their own rates. But somewhere along the line these two threads must exchange data - and this happens often - every time something moves, every time something new joins, or geometry changes etc etc (all the time, in other words) - you can't just update that while your rendering thread is rendering. This isn't an easy problem to solve, it requires very careful design and thought into how the threads will communicate. We currently do our stuff the parallelization way, but one of the main reasons we do it is so that the application main thread cannot "accidentally" stop the simulation - e.g. if a modal dialog box pops up on the server applications, or the user presses 'alt' by mistake and the window enters the menu loop, then you don't want everything on the network to suddenly stop updating. So we run the simulation stuff in another thread.
Re:Frame Rendering and Game Cycles (Score:1)
"or the user presses 'alt' by mistake and the window enters the menu loop, then you don't want everything on the network to suddenly stop updating"
Forgot to mention .. an interesting example of this .. grab your mouse down on the scroll bar of a Quake3 dedicated server dialog box while people are playing .. :)
Re:Not true for Quake 3, at least (Score:1)
Re:Motion Picture frame rate is only 24 fps! (Score:1)
---
What?? (Score:1)
What? I have *never* heard this one. I'm currently a med student and have been more or less "in the neuro field" for some time before that. Perhaps I'm wrong, but I'd love to see where that came from. A reference please?
Re:Yes, there is a need, penis size (Score:1)
---
FPS != Hz - Animation vs. Flicker (Score:1)
I can't say I agree with the author's proof that we can perceive 60+ fps. His simple experiment involved human perception of flicker, NOT animation. Yes, it is true that most people can perceive flicker up to about 72Hz. But that is a cycle between two completely different states: the monitor screen is blank for 1/120th of a second, then on for 1/120th of a second (OK, this is oversimplifying, but you get the point).
This is completely different from animation. Animation is a gradual change from one scene to the next. It is much more difficult to distinguish subtle changes from one scene to the next than it is to tell if something is on or off.
Imagine looking at a painting for a moment, looking away, then looking back again. Would you notice a subtle change in the scene? Probably not (we are talking fractions of a second here). Now imagine you look at the painting, look away, look back, and it is completely gone. Then I am certain you would notice. The two examples are COMPLETELY different situations.
What is the human threshold of perception for fps? I really don't know, but I would say it is well below 72fps or even 60fps - I would estimate it to be somewhere between 40 and 60 fps. Anything more is a waste of CPU cycles.
Personally I would gladly trade 60+ fps for better image quality or resolution.
(Note: many posters have also pointed out the difference between average and peak fps, so I feel no need for further comment on that here)
Re:Also related to Quake 3 and FPS, but off topic (Score:2)
---
Re:Not necessarily true (Score:2)
A good program that fixes this is PowerStrip - not free but nagware. All you need to do once (and then any time you reinstall) is ask it to detect the best rates for your monitor, then store those in the registry, so that you can pick and choose the refresh rate to use for each particular resolution. This works with nearly all video cards. It's also got various tweaks, but it's best to go with card-specific tweaking programs for those.
Re:Framerates and the Verticle Refresh Rate (Score:2)
Even in that situation, is there any point in actually rendering to the card? You're not going to see that frame, since your monitor can't keep up. Instead, they could do event handling and then wait the amount of time it would've taken to render the frame, or perhaps even do additional event-handling cycles...
Of course, doing event handling synchronously with rendering is a bad idea from the start.
Even the editor didn't agree with the article (Score:4)
Also note that if you really could see 60, fluorescent bulbs would seem to strobe for you. They don't for me, but ask around and you'll be surprised. (It works best with 1 direct bulb. More bulbs, especially on different circuits, can be at different parts of the cycle and meld together.)
But you CAN see a much shorter flash than 1/60th of a second. You don't see in strobe, you see the average of all light in the slice - the "shutter" is open the entire duration. Which is why you see a blur: it's the average of all the images from 1/whatever of a second. This averaging is why the sleepy hollow cardinal trick (and many others) work.
I'm not sure what good 200 fps does when your monitor rate (for a regular monitor, admittedly) tops out in the 80s. I think there are two reasons:
One, mentioned above, is extra capacity: a 200 fps average might equal 60 fps during a fight scene.
But another reason is that even if you're displaying only 60 Hz (monitor limit), to have maximum smoothness you need a frame refresh every 1/60th of a second, not just an average of 1/60th. And if frames take varying amounts of time to process, which they do, you could be unlucky and have 2 frame refreshes in 1 monitor refresh and then none in another... it would look like 30 Hz, because every other monitor refresh would be an unmodified repeat. This can happen even with BOTH the monitor and the fps at 60 Hz, if the frame time varies (sinusoidally, in my example) and the two are not in sync.
Frame times are not regular, and the reality is that fps is a measure of speed, NOT a reliable timing device. 200 fps != 1/200th of a second per frame... the first frame after you turn is going to have to make many more changes, so it's going to take a lot longer, whereas many things will be reused in the next one (this assumes nothing has to move to the card over the bus, which I won't go into). So if that frame takes 4x longer than average, you'd be down to 50Hz for that frame. THEN you have to use an integer number of monitor refreshes, so it's going to be 30 Hz as viewed. Too much math, perhaps.
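The pacing argument in this post can be made concrete with a toy simulation (a sketch under simplifying assumptions - constant refresh cadence, no triple buffering; real flip scheduling is more involved):

```python
# Toy simulation: frames whose *average* rate matches the refresh rate
# can still leave some monitor refreshes showing a repeat of an old
# frame, which reads as a lower effective framerate.
def refreshes_with_new_frame(frame_times, refresh_interval):
    """Count how many monitor refreshes get a frame they haven't shown yet."""
    done, t = [], 0.0
    for ft in frame_times:           # completion time of each rendered frame
        t += ft
        done.append(t)
    fresh, last_shown = 0, -1
    refresh_t = refresh_interval
    while refresh_t <= t + 1e-9:     # step through monitor refreshes
        # index of the newest frame completed by this refresh
        newest = max((i for i, d in enumerate(done) if d <= refresh_t + 1e-9),
                     default=-1)
        if newest > last_shown:      # this refresh shows something new
            fresh += 1
            last_shown = newest
        refresh_t += refresh_interval
    return fresh
```

Six uniform 1/60s frames on a 60Hz monitor make every refresh fresh; alternate 1.5/60s and 0.5/60s frames (the same 60fps average) and only half the refreshes show a new image - exactly the "looks like 30 Hz" effect described above.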
I predict that eventually (probably 1 more generation) many of the objects will be dynamically generated in sync with the monitor refresh. The framerate will be fixed at the (variable) monitor refresh rate. For each frame, one class of objects will be redrawn each time, no more and no less. The problem is that that class has to redraw asynchronously with any other kind of redraw, and that can be bad. But it's good for many kinds of animations... and depending on the architecture it should be no worse in any case.
you heard it here first.
70hz looks like flickering to me (Score:2)
It drives me insane. I just got a new monitor that will do 87hz at acceptable resolution, but I haven't gotten around to adjusting it yet (linux).
But there's a big difference between refresh rate and frames per second. I'm guessing if you got tricky enough with simulating motion blur, you could drop the frame rate down to around 20-30fps (film is 18-24) and still get acceptable quality.
Article displays embarrassing lack of knowledge (Score:2)
It's hard to say what the maximum frame rate a human eye can perceive directly is. It depends on the viewing conditions and the observer. In daylight, I can easily watch the progress of the video beam on a 50Hz TV as it makes its way from the top to the bottom in each field of each frame.
If 'frameless rendering' can be used (an option if real-time raytracing is feasible), then the natural smearing and removal of temporal aliasing in a quickly changing scene will lessen the need for a very high frame rate. Try searching for 'Frameless rendering'. I'm looking forward to Quake XXIV Bitchfight implementing it.
For pretty pictures and interesting reading, see http://gwis2.circ.gwu.edu
- I mean to win the wimbledon!
Your errors... (Score:2)
And monitors have a vertical refresh rate, usually around 60 - 85 Hz (and higher). This is how often the beam traces from the top to the bottom of the screen (how many screens/second you get).
The horizontal sync signal is usually measured in kHz, which might be what you are thinking of, but this is only used to move the beam back to the left (or right, whatever...).
Dumb analysis (Score:3)
"The visual cortex is where all the information is put together. Humans only have so much room in the brain, so there are some tricks it uses to give us the most information possible in the smallest, most efficient structure. One of these tricks is the property of motion blur."
Some tricks that produce motion blur... However, he does not explain any details of what these tricks are. How the human brain compresses information is still an open question, but this guy doesn't even touch on it - only "tricks of the trade". Sorry people, but he is very superficial. I am no expert on these things, but I have seen books, and I know people who would explain them more clearly for the layman. Scientific American once published an excellent book dedicated exclusively to this problem; I think it would be worth searching for.
On what concerns 72fps: is he nuts? I can clearly discern a 60-70 fps picture from a 110 fps one! At that level you can still easily see how things hiccup.
And on what concerns monitors: for me and several other people, 60Hz is deadly painful! Sit at a 60Hz monitor for the whole day and you will surely get some serious headaches (especially at the temples and behind the eyes). It looks like someone furiously turning the lights on and off. At 72-75Hz the flickering is still visible. The minimum frequency for aliens/mutants like me is no less than 85 Hz, and frankly one still gets tired working on such a monitor. My good level is 100Hz - yes, there I can work without feeling any stress. Btw, when working I spend more than 12 hours a day in front of the bright head of the computer. In fact, my work frequently turns into 36-hour shifts (like today - I'm in the 17th hour). So guys, maybe I mutated too strongly... >:E
Well, I don't know where this guy took his theories but my everyday work tells me he's nuts. So much for the theory.
Re:Bah (Score:2)
Re:But that is only the average FPS that is over 2 (Score:2)
So, why is the common practice to quote the maximum framerate and not the minimum framerate?
Why are "timedemo" tests usually lightweight compared to actual gameplay?
You're right on, but I don't think the dicksizing motivations should be ruled out either.
Re:worst experimental procedure ever (Score:2)
Many people can notice flicker, even in a completely dark room.
I usually work in a dark room, and even at 75hz, I notice flicker on my screen. 80 is tolerable; 85 is just fine.
Interesting note about the tool shop, btw..
Re:Max is 78, min is 55 (Score:2)
I've found even a very simple scene, such as watching a wheeled vehicle move, in an Imax movie to be disorienting. The part of the view that seemed to cause me particular trouble was the wheels turning.
I think it also depends on the "velocity" of an object in motion on the screen. Very fast moving objects require faster update than slower moving objects. The test (which is admittedly imperfect) that I use is to turn my head rapidly from side to side. Even at 80 Hz refresh rate I can see the discrete frames. Again, this particular test has a lot of problems, and I may be testing the wrong thing, but I suspect that the reality lies somewhere between the 72-80'ish fps of the article and the 200 fps that some people are trying for.
Re:Motion Picture frame rate is only 24 fps! (Score:2)
Re:worst experimental procedure ever (Score:2)
Rick
fics (Score:4)
One thing he forgot to mention (Score:2)
As far as I know there is always a gap between screen frequencies and fps. On shutter-glasses systems this is quite visible. To get 50 fps you need a 100Hz monitor as a minimum; to get higher rates you need a monitor running at nearly 2 times the fps rate. So it is quite logical to try to achieve 200fps, as the frames also have to be divided between the eyes on glasses systems. But then monitors would need to reach a cool 200-300Hz to give your eyes a chance.
I have never seen a glasses system, but some friends around here tell me that presently it is the same as burning your eyes for good. So let's wait for the 200's.
Re:Average frame rate isn't the issue (Score:5)
In the midst of battle with body parts and rockets flying everywhere (clarification: my body parts; someone else's rockets), my rate easily drops down to 90fps. Very rarely, I'll catch it plummeting as low as 70fps or 60fps. I can't really tell any difference between 70fps and 150fps, but anything below about 60fps is noticeable to varying degrees.
As long as you can still aim and shoot fluidly, you're fine. Anyone who is still moving fluidly at the heaviest point of graphic intensity shouldn't worry about tweaking every last frame out of their system. Unless there is some revolutionary change in the industry, I don't plan to upgrade my cards for a long time to come (until we see games that drop my frame rate enough that I can notice it). I'm certainly not about to dump a few hundred more on a card just because I can achieve 200fps, when 150fps will more than do.
Besides, what is more irritating is games with a poor network framework that makes finding a fast server impossible. While Q3's code seems to be sleek (I usually find a lot of servers averaging between 12 and 30 ping), other games (Unreal Tournament, to name one) rarely have anything below 100 and only a few under 200 ping. Even the sweetest frame rate can't help poor network performance.
---
seumas.com
Re:Bah (Score:2)
And Tribes II is taking an awfully long time, though I'm definitely a customer when it comes out.
The 'render extremely large areas' should read 'render extremely large areas with very little detail'. Any map that has lots of structure on it bogs down in a hurry; rolling terrain is great. On that note, though, I agree - that's what made Tribes really cool.
Watch what you say about rendering though.. indoors or outdoors, as soon as you have lots of structure or detail, Tribes bogs down.
And tribes rocks on team fortress, I agree. Tribes is awesome.
I'm not a gamer, but... (Score:2)
Re:Average frame rate isn't the issue (Score:2)
If I can push 30 I'm perfectly happy.
Average frame rate isn't the issue (Score:4)
Control Lag: The eye is only half of the equation (Score:2)
Couple this with what other people have mentioned (primarily that MINIMUM fps is all that matters: the difference between a normal timedemo and an intense fight can be as much as 5x to 10x - try getting a 72fps minimum for that!) and the fact that the physics of all Quake-derived games are biased towards high fps, and 200 fps is actually kinda low. This is of course if you want to play "competitively"; if all you want is a relaxing frag for half an hour after work, a 50 fps average will do you fine.
The post:
Re:Sounds right to me.... (Score:2)
Max is 78, min is 55 (Score:3)
But that is only the average FPS that is over 200 (Score:5)
And I promise that I can tell the difference between a computer averaging 72 FPS and 200 FPS.
Sanchi
go speed (Score:2)
Why Quake3 needs High FPS. (Score:3)
The game is designed in such a manner that there is always a server and a client, even in single-player mode. Quake has this little oddity (which hardcore Quake players use a lot) that allows them to achieve a bit more height with certain update frequencies. And somehow the updates are linked to certain FPS values. For instance: being able to just jump normally up onto the megahealth platform without any other aid on the Q3DM13 level has certain advantages. I can only do it with an FPS of 120 or 140.
With an FPS of 140 I can rocketjump higher, and with an FPS of 120 gravity seems to work a little less hard, and I can jump from the railgun to the rocket-launcher platform (and back) on Q3DM6. (Using a combination of circle-jumping and strafe-jumping techniques that exploit some other physics feature - these are so difficult to master that they were left in from previous bugs.)
Thus for Quake I need a sustained 120. It is possible in Quake3 to cap the framerate at a certain value, but then you must be sure you can keep it there. Besides, there are certain jerking phenomena with my mouse, which has an update frequency of 120 Hz, and my monitor, which refreshes at 120 Hz, if I cannot seem to keep 120 FPS in Quake. (Which makes railing more inaccurate.)
These things are only important in competitive playing, for which Quake3 was designed.
Secondly: Other games.
Mostly similar to reasons I stated above - Mouse jitter on certain systems, as well as sustaining the same FPS on even high difficulty scenes. Most of the FPS ratings were done with certain detail off, and was only an average. You need about 150+ on average to have >80 on worst scenario.
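A generic illustration of why per-frame physics stepping ties jump height to framerate - this is plain explicit Euler integration with Quake-flavored constants, not id's actual movement code (the real Q3 bug involves rounding in the QVM, as other posts note):

```python
# Sketch: when physics advances once per rendered frame, the discrete
# integration error depends on the timestep, so the apex of a jump
# varies with fps. Constants (270 units/s, 800 units/s^2) are chosen
# to be Quake-flavored; the scheme itself is generic explicit Euler.
def jump_apex(fps, jump_speed=270.0, gravity=800.0):
    """Simulate a jump with one Euler step per frame; return the max height."""
    dt = 1.0 / fps
    z, vz, apex = 0.0, jump_speed, 0.0
    while z >= 0.0:            # integrate until the jumper lands again
        z += vz * dt           # position advances with the pre-step velocity
        vz -= gravity * dt     # then gravity decrements the velocity
        apex = max(apex, z)
    return apex
```

The closed-form apex is v^2/2g = 45.56 units, but the Euler apex overshoots by roughly v*dt/2, so a 60fps player and a 125fps player reach measurably different heights from the identical jump input.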
Err... no (Score:4)
------
Do something besides optimize (Score:2)
You are right it's not real (Score:2)
TMOICBW.
Re:FPS does make a difference (quantum effects) (Score:2)
I remember dealing with this in a moon lander program for the Radio Shack Model I. The original version did a non-realtime cycle of 1 "frame" per second. I found it annoying that you could be 3 feet up dropping at 50 ft/s, and then, after a heavy burn, be 10 feet up climbing at 80 ft/s. I resolved the problem by calculating the lower bound of the curve to see if you touched the ground. If you touched the ground, I'd calculate your speed at touchdown time to decide whether or not you cratered.
Later on, I did a realtime version -- peek commands for keyboard scan codes and input editing routines. I think I got it up to about 5 FPS. At the time, that was considered pretty hot.
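The touchdown fix described here might look something like the following (a reconstruction of the idea, not the original TRS-80 code; function names and units are made up): instead of sampling altitude only at whole frames, solve for whether the ballistic arc inside the step crosses the ground, and if so evaluate the speed at that instant.

```python
import math

# Sketch of sub-frame touchdown detection: altitude within the step is
# a(t) = alt + vel*t + 0.5*accel*t^2, so solve the quadratic for the
# first t in (0, dt] where a(t) = 0 and report the impact speed there.
def step_with_touchdown(alt, vel, accel, dt=1.0):
    """Advance one frame (alt in feet, vel in ft/s, positive = climbing,
    accel = thrust minus gravity). Returns (alt, vel, impact_speed);
    impact_speed is None if the lander never touched down this step."""
    a, b, c = 0.5 * accel, vel, alt
    roots = []
    if abs(a) < 1e-12:                       # no net acceleration: linear case
        if abs(b) > 1e-12:
            roots = [-c / b]
    else:
        disc = b * b - 4 * a * c
        if disc >= 0:
            r = math.sqrt(disc)
            roots = [(-b - r) / (2 * a), (-b + r) / (2 * a)]
    hits = [t for t in roots if 0 < t <= dt]
    if hits:
        t = min(hits)                        # first ground contact in the step
        return 0.0, 0.0, abs(vel + accel * t)
    return alt + vel * dt + 0.5 * accel * dt * dt, vel + accel * dt, None
```

With the "3 feet up, dropping 50 ft/s, heavy burn" scenario, the whole-frame result would show the lander climbing again, but the sub-frame check finds the impact at ~45 ft/s partway through the second - a crater.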
`ø,,ø`ø,,ø!
Re:But that is only the average FPS that is over 2 (Score:4)
72 fps decaffeinated maybe (Score:2)
That's not the point! (Score:2)
Uhm... (Score:3)
The article doesn't debunk it. It supports it.
Read up before you post, Tim.
Re:But that is only the average FPS that is over 2 (Score:2)
Because it's nearly impossible to measure the minimum framerate in many games, we assume that a card or system that has a higher average FPS will have a higher minimum FPS in gameplay conditions. This point is debatable, but not illogical.
Why are "timedemo" tests usually lightweight compared to actual gameplay?
We can't compare actual gameplay FPS figures taken from different sources, so a standard demo is used to provide a consistent basis for comparison. As above, it isn't likely to truly represent gameplay, but the relative difference between two cards or systems is likely to hold.
In a busy situation, both will likely take the same performance hit, and the one with the higher average FPS will probably still be on the top.
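To make the average-versus-minimum distinction concrete, here's a small sketch (the numbers are hypothetical) of how a healthy average can hide an ugly worst frame:

```python
def fps_stats(frame_times_ms):
    """Summarize a capture of per-frame render times (milliseconds).
    The average FPS can look fine while the single worst frame is far
    slower, and the worst frame is what you feel in a busy scene."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    worst_fps = 1000.0 / max(frame_times_ms)   # slowest single frame
    return avg_fps, worst_fps

# Mostly 5 ms frames (200 fps) with one 50 ms spike during a firefight:
avg, worst = fps_stats([5.0] * 99 + [50.0])
# avg is still over 180 fps, but the spike frame ran at only 20 fps.
```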
--
Re:Not true for Quake 3, at least (Score:2)
Despite that, I don't think a game like Quake needs to be realistic. In fact, I like Quake because it isn't truly realistic. Do people bitch that Tetris isn't realistic? Why should Quake be any different?
I'm a little confused by the current rash of realism-based games like SOF or CS. The "realism" seems quite arbitrary and independent of its effect on gameplay. You have a gun that takes five seconds to reload, but you conveniently forget that you can't really reload a gun while dodging bullets, switching weapons, and running for cover. Real SWAT officers spend months training for fifteen seconds of action and don't particularly find it fun.
--
Not true for Quake 3, at least (Score:3)
Jumps like the one from the rail platform on DM6, or the swing jump to get the health in the middle of Tourney 4, cannot be done at lower framerates.
Granted, this has nothing to do with perception, but gameplay is also kinda important.
Framerates and the Vertical Refresh Rate (Score:3)
Most games use this to their advantage, so that when I play Half-Life, my frame rates never go above 72 FPS since my refresh rate is around 72 Hz - this is used to prevent "tearing" when one frame is rendered during the first half of the sweep to refresh the screen and another is rendered later. Going above your refresh rate will actually make your game look worse.
Even if the card is capable of 200fps, it should never actually do that - unless you have a ridiculously fast-refreshing monitor, you're just drawing frames that you won't see or that will simply tear. Plus, I believe it's been stated that the human eye cannot discern framerates above about 60fps anyway. Although it is quite nice to be able to play Half-Life at 1280x960 at a constant 72fps (again, locked to 72fps since anything higher would tear on my display).
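Capping the framerate to the refresh rate works along these lines. A rough sketch of a software limiter (real vsync blocks on the video hardware rather than sleeping, but the effect on the rate is the same):

```python
import time

def run_capped(render_frame, refresh_hz=72, frames=5):
    """Crude software frame limiter: after drawing each frame, sleep out
    the remainder of the refresh interval so we never render faster than
    the monitor can display. render_frame is any callable that draws
    one frame; refresh_hz matches the monitor's vertical refresh."""
    interval = 1.0 / refresh_hz
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                        # draw one frame
        elapsed = time.perf_counter() - start
        if elapsed < interval:                # frame finished early:
            time.sleep(interval - elapsed)    # wait out the "refresh"
```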
Not necessarily true (Score:2)
Further, as others have mentioned, average frame rates are relatively irrelevant compared to framerates while rendering complex scenarios.
Also, frame rates in *existing* games are often tuned by limiting the number of polygons on screen at a time, and dynamic T&L engines can add further strain. So even if Q3 could get 200 FPS on a highly complex scene (as opposed to an average one), the only thing that means is that it's time to build more complex scenes.
At some point we'll have photorealistic engines at 200 fps for the most complex scene imaginable, but that day is far in the future.
Re:But that is only the average FPS that is over 2 (Score:2)
Sanchi
Motion Picture frame rate is only 24 fps! (Score:2)
There is never a need to go beyond 75 fps for video, because that is the refresh rate for most monitors above 1024x768. Push more than that and you will render frames whose pixels the electron gun never draws!
Of course, what *really* matters is sustained frame rate under scenes of high complexity, but as long as you can always manage 60-75 fps you'll never see the difference.
-p.
Re: Useful FPS (Score:2)
What 200+ FPS Is Good For. (Score:2)
Of course the human eye has limits, but that's not what 200+ FPS is for. The more frames, the faster you can do fake radiosity, environment mapping, and other effects that involve multiple frames composited to form the final image.
I would like to see 307200 FPS so we can run a separate pipeline for each pixel on a 640*480 screen. Oh... and I'd like the card that does that to be so cheap that when it burns out you just run down to the drugstore and get one for $1.98. It'll happen eventually.
Frame rates (Score:2)
But above 100FPS, it just doesn't matter. Besides, you're limited by monitor sweep rate and phosphor decay rate.
Control is also a factor (Score:2)
depends on the monitor, too (Score:2)
after that, it is all sort of gravy, depending on the other bells and whistles and effects and such....
seriously, even for regular applications, some monitors look worse at "standard frame rates" compared to others....
not that it matters *that* much ...[smile]
Frame Rendering and Game Cycles (Score:5)
60/72 Hz is what we want when we play games, but you generally have to target above that for when your potentially visible set (PVS) changes dramatically - if you move *really* fast (missile cam), say, or if you just turn your head 90 degrees (and look down a completely different hallway).
A current trend in games is to separate the rendering cycle from the simulation cycle.
Historically, games have been implemented with a read-eval-print loop like this:
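A minimal sketch of such a loop (the callback names are illustrative, not from any particular engine):

```python
def game_loop(read_input, simulate, render, running):
    """The classic sequential loop: read, eval, print, repeat.
    Rendering and simulation are locked together, so one slow frame
    slows down the whole game, input response included."""
    while running():
        events = read_input()      # read:  poll keyboard/mouse/network
        world = simulate(events)   # eval:  advance the game state
        render(world)              # print: draw the current frame
```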
Now, we (FPS, 3D) seem to be moving towards parallelizing the read/eval (simulation) cycles and the print (display) cycles. That way they can be controlled independently: the display can be given just the cycles it needs to provide 60/72 Hz, and simulation lives in its own space. The display routines have their own prediction mechanisms to make sure they can keep pace.
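The decoupled pattern can be sketched like this (toy one-coordinate state and illustrative names; a list of frame durations stands in for a real clock so the example is deterministic):

```python
def decoupled_loop(frame_times, simulate, render, sim_hz=100):
    """Fixed-timestep simulation driven by variable-length render
    frames: each displayed frame consumes whole simulation steps from
    an accumulator, and the renderer interpolates between the last two
    simulation states to cover the leftover fraction.
    frame_times: seconds each rendered frame took."""
    dt = 1.0 / sim_hz
    acc = 0.0
    prev, curr = 0.0, 0.0            # last two simulation states
    steps = 0
    for frame_dt in frame_times:
        acc += frame_dt
        while acc >= dt:             # simulation runs at its own rate
            prev, curr = curr, simulate(curr, dt)
            acc -= dt
            steps += 1
        alpha = acc / dt             # leftover fraction of a sim step
        render(prev + (curr - prev) * alpha)
    return steps
```

However fast or slow the frames come in, the simulation always advances in identical fixed steps, which is what keeps physics deterministic.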
It's essential these cards can do 150-200+ fps (Score:2)
Re:Not true for Quake 3, at least (Score:2)
Re:Max is 78, min is 55 (Score:2)
But don't movie cameras introduce temporal anti-aliasing, which would help reduce the effects of a lower framerate?
The importance of framerate (Score:2)
A game that runs 72fps most of the time but drops down below that, even if it's only a little, will not feel as solid or be as playable as one that holds a steady, say, 60fps.
PC games have always been the worst in this regard, partially because developers assume that better hardware is going to come out that will make their game run faster in the future, and partially because hardware is so unpredictable that getting it running smoothly on one machine doesn't necessarily mean that it will run smoothly on the next.
It's excellent that hardware manufacturers keep pushing the level of performance of their products, but it's not so that we can achieve 200fps with Quake 3. It's so that the game won't drop *below* its peak framerate, ever, even on a complex level with lots of enemies. (First person games have a special need for a high framerate because of the speed with which your viewing angle changes.)
Re:I can distinguish 85 fps (Score:2)
Motion blur- author is wrong. (Score:2)
"The lack of motion blur with current rendering techniques is a huge setback for smooth playback. Even if you could put motion blur into games, it really is not a good idea whatsoever. We live in an analog world, and in doing so, we receive information continuously. We do not perceive the world through frames. In games, motion blur would cause the game to behave erratically. An example would be playing a game like Quake II, if there was motion blur used, there would be problems calculating the exact position of an object, so it would be really tough to hit something with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned."
This is just a failure to distinguish between a software *model* and its screen rendering or *view* (Smalltalk programmers will see this at once). It is perfectly possible to maintain a precise location for an object in the game's model of its world, while only *rendering* a motion-blurred version of the object. This would allow extremely fast-moving objects (projectiles, shrapnel, etc.) to be rendered realistically, while still keeping the game's internal world model as precise as necessary to determine hits, collisions, etc.
In this context, it should be noted that movie special effects make *extensive* use of motion blur to produce extremely realistic renderings of non-existent scenes using very low frame rates. Motion blur should really be seen as the key to realistic rendering, since frame rates will never reach the threshold necessary to freeze extremely fast moving objects. After all, in the real world, one needs a very high speed strobe to freeze a bullet. Frame rates, especially in demanding frames (lots of objects, lots of motion) are not going to hit the 1000 fps mark any time soon. If fast moving objects are to be rendered realistically, then they'll have to be done with motion blur, just as film professionals, like ILM, discovered years ago.
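The model/view split is easy to sketch: keep the exact position in the model and smear only the drawn samples. (Positions, 2D coordinates, and the sample count here are purely illustrative.)

```python
def blur_samples(pos_prev, pos_now, n=4):
    """View-side motion blur: return n sub-sample positions along the
    path an object took this frame, for the renderer to composite.
    The game *model* still stores pos_now exactly, so hit detection
    and collisions are completely unaffected by the blur."""
    (x0, y0), (x1, y1) = pos_prev, pos_now
    return [(x0 + (x1 - x0) * i / (n - 1),
             y0 + (y1 - y0) * i / (n - 1)) for i in range(n)]
```

A renderer would draw each sample at reduced opacity; only the compositing changes, never the simulated position.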
Re:But that is only the average FPS that is over 2 (Score:2)
I play with maximum graphical detail on a dual P2-450. I get about 90fps, and major gibbing doesn't kill my frame rate the way it does when I set r_smp to 0. I should point out that at 640x480, the performance bottleneck for Q3 is the CPU. Turning off all of the detail or increasing the resolution to 800x600 makes little difference. I just like to see things in glorious technicolour rather than hi-res.
I think you have put your hand on it (Score:2)
Try moving your hand between your eyes and your screen. Where did all those fingers come from?! It's a strobe, and that's what you see when you turn your head from side to side too. You can see the dark/light contrast.
The faster the refresh, the more fluid that hand motion will be, and the less your screen will seem to flash as you look rapidly from one corner of your much-too-big monitor to the other.
200 FPS may really be better.
Poster does not play games.
Re:Not true for Quake 3, at least (Score:3)
This does not happen when you use the hard compiled DLLs, obviously. Also, you do not need a high framerate in order to exploit this bug. Rather, there are framerate points where it is exploitable. 37 frames per second is one point.
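A toy sketch of how this kind of framerate dependence arises: frame time gets rounded to whole milliseconds (as the Quake 3 server does) and velocity gets snapped to an integer each step, so the apex of a jump shifts with the framerate. The constants are Quake-like but illustrative, and this is not id's actual code.

```python
import math

def jump_apex(fps, jump_v=270.0, gravity=800.0):
    """Simulate a jump with whole-millisecond frame times and per-frame
    integer snapping of velocity. The peak height you reach ends up a
    function of fps, which is how framerate-locked trick jumps happen."""
    msec = int(1000 / fps)                    # frame time in whole ms
    dt = msec / 1000.0
    v, y, apex = jump_v, 0.0, 0.0
    while v > 0 or y > 0:
        y = max(0.0, y + v * dt)              # move, clamp at ground
        v = float(math.floor(v - gravity * dt))  # per-frame integer snap
        apex = max(apex, y)
    return apex
```

Because the rounding error per frame differs at each framerate, certain fps values land on "sweet spots", which is why the exploit appears only at specific rates.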
Michael Labbe