Initial Half-Life 2 Benchmarks Released
dfj225 writes "According to an article on ExtremeTech.com, it looks like ATI has the lead in Half-Life 2 graphics card performance. Valve benchmarked their new game using the top cards from both ATI and nVidia. Results show the ATI Radeon 9800 Pro drawing around 60 FPS while the nVidia GeForce FX 5900 Ultra only draws around 30 in Half-Life 2's DX9 full precision tests. Read the article to see results on other tests that Valve ran." Update: 09/11 13:06 GMT by M : Another article about the presentation.
Well well (Score:4, Interesting)
Re:Well well (Score:4, Informative)
Re:Well well (Score:5, Informative)
Also, the Planet Half-Life Screenshot Gallery [planethalflife.com], a page with a huge number of interviews with Valve staff and previews of the game [planethalflife.com], and Videos [planethalflife.com]. The huge one [fileplanet.com] is awesome.
September 30th! I can't wait!
Re:Well well (Score:4, Insightful)
Speaking of, any word of OpenGL support?
Re:Well well (Score:4, Insightful)
Re:Well well then you're freakin' stupid (Score:3, Informative)
That is just plain wrong. You used to be able to notice the difference between 1600x1200 and 1024x768 easily. Now that AA is around, the difference has blurred somewhat.
I run all of my games at 1600x1200 if I can get at least decent performance. Everything scales for the screen, looking the same size as everything on 1024x768, only much smoother. Higher resolutions also will allow for higher amounts of detail, if care has been given in that direction. You've got more pixels to play with, so you could
Re:Well well (Score:2)
This is surprising how? (Score:3, Insightful)
Re:This is surprising how? (Score:5, Insightful)
Believe it, those who will, but I would certainly question the integrity of the test, and I won't buy an ATI card over an nVidia one based on this just yet.
Re:This is surprising how? (Score:2)
Research a few independent benchmarks.
Re:This is surprising how? (Score:4, Informative)
The quotes from that second link are particularly damning -- and they're from a variety of companies, including id Software, not just Valve.
I've never owned an ATI card. My last 5 or 6 cards in all my computers (and my wife's) have been nVidia. My next card is almost certainly going to be ATI though because they're currently the performance leaders. I have some reservations about drivers still -- not with performance or stability but with long term support since ATI has still failed to deliver a unified driver architecture -- but I'm unwilling to sacrifice that much performance while still paying a higher price.
Frankly, at this point anyone who is still wondering about the validity of the benchmarks is deserving of the title "nVidia fanboy".
Re:This is surprising how? (Score:5, Informative)
And, yes, OpenGL is inferior to DX at the moment. OpenGL 2.0 fixes most of the issues (particularly in the shader department), but it's far less mature than DX9 is.
And while DX isn't immune to vendor-specific code (see the discussion by Gabe Newell on this and NV3X in HL2, or the shader issues that occurred in DX8), MS is making efforts to reduce or eliminate those occurrences. I suspect we'll see some pop up as DX9 becomes more mature, but they'll be resolved in DX10 just as the DX8 issues were resolved in DX9.
I'm not a MS fanboy, but the reality is that you can get a hell of a lot more support if you develop for DX than for OpenGL. That matters to a lot of developers. The downside is that you inherently limit your platform choices... but the reality is that there are 3.5 gaming platforms out there right now -- PC/Xbox (1.5), PS2, and GameCube. Porting anything between them is a virtual rewrite of the graphics engine anyway, so portability isn't a huge concern. The Mac and Linux markets are essentially non-existent.
Re:This is surprising how? (Score:5, Informative)
You may be able to access more advanced features, but that also ties you down to writing specific code for each card you want to support. That's a freaking nightmare. APIs are supposed to help you avoid doing that. As I said, both OpenGL and DX have had issues regarding this, but OpenGL's issues are far more prevalent and pervasive than DX's are at the moment. OpenGL 2.0 will fix a good bit of this, but it's not out yet (no.. it's not... all the pieces are in place but it hasn't been ratified yet).
For instance, Carmack has talked about how he is better able to access some of the advanced shader features on Nvidia cards through the extensions OpenGL exposes than through MS's DX9 interface, which was co-authored with ATI.
He's also commented on how miserably slow the nVidia cards are with the higher shader functions, even after dropping the precision back to 12 or 16-bit (compared to the 24-bit full precision DX9 calls for, which ATI supports fully).
Hell, read the TechReport's discussion on HL2 and nVidia -- spending 5x more time optimizing the NV3X codepath than the generic DX9 codepath and still not even reaching the generic path's performance is not a good way to spend your time. If I were a game developer (I'm not) I sure as hell wouldn't do that for most cards. The only reason Valve or id did so for nVidia is because they are such a huge market segment. Do you think they'll be looking at any optimizations for S3 or Matrox? Doubt it.
Until ATI stops writing crappy drivers and prematurely killing still-sold hardware I won't be supporting them.
Same. Which is why my next card is probably going to be ATI -- they've ceased doing either of the above. I'd still like to see a unified driver architecture from them, but their drivers and support have been very good for the past couple years. Which also happens to coincide with them firing their entire driver team. Which also occurred at the same time as the utter lack of driver support you reference. The new team seems much better about actually doing their jobs.
Re:This is surprising how? (Score:2)
Imagine Microsoft discussing how well Windows runs on AMD's chips over Intel's. Microsoft is in the business of selling Windows, just as Valve is in the business of selling games. Valve is just calling a spade a spade and you just plain don't like it.
It's in their best interest to make their products work as well as they can on the respective hardware.
later,
epic
Re:This is surprising how? (Score:2)
Re:This is surprising how? (Score:4, Interesting)
1) 3dfx is king of 3D
2) nVidia comes along with interesting products, 3dfx still king
3) nVidia improves (TNT, GeForce), 3dfx struggles, both run neck-and-neck
4) 32-bit becomes important, nVidia takes the lead
5) 3dfx struggles, plays catch-up (Voodoo4, 5), yet becomes irrelevant
Then we have:
1) nVidia is king of 3D
2) ATI comes along with interesting products, nVidia still king
3) ATI improves (Rage, Radeon), nVidia struggles, both run neck-and-neck
4) DX9 becomes important, ATI takes the lead
5) nVidia struggles, plays catch-up (FX series), yet...
It's not a hard cycle to visualize. A lot of other similarities are there, as well: "fan-boys", aggressive advertising, benchmark scandals, developers' opinions, etc. It's actually pretty cool for us, as we get great advancements in 3D.
Benchmarking even shadier? (Score:5, Interesting)
Shouldn't it be this way? (Score:5, Insightful)
Re:Shouldn't it be this way? (Score:3, Insightful)
Re:Shouldn't it be this way? (Score:2)
Not always (Score:2)
That's easy to test (Score:5, Insightful)
No, it should never be that way (Score:2, Insightful)
Re:Benchmarking even shadier? (Score:2, Insightful)
Go, ATI! (Score:5, Funny)
Oh well, at least communication between hardware and game developers has improved to the point that I won't need to specify to the game whether I have a Hercules, Tandy, or Trident chipset...
Re:Go, ATI! (Score:5, Insightful)
I always rooted for Rendition, but I suppose they died when Micron bought them.
If anything, nVidia was the real underdog in the 3D wars... they were the only company with nothing going for them, and they managed to turn that around. I still hope ATI wins in the end, though. I like their technology quite a bit better than nVidia's... and you can't beat the 2d/3d quality with anything but a Matrox.
Re:Go, ATI! (Score:5, Informative)
Nothing going for them? Uh... do you know anything about nVidia's history?
nVidia was formed from disgruntled SGI employees. You know, the same SGI that created OpenGL and pioneered 3D graphics on computers? Yeah, that one. Why were they disgruntled? Because they had gone to the powers that be at SGI and said "you know, we could make a buttload of money off our technology -- we can make cards that do a large subset of the OpenGL calls and sell it to the PC market for cheap!" SGI management was all about profit margin though, and there's a lot more margin (although not as much profit) in selling a few cards for $50-100k than there is in selling hundreds of thousands or millions of cards for $150-450.
So a bunch of the top SGI graphics engineers left and went off to make their own company. The first few cards released by nVidia were actually OEM'd cards from another company. IIRC, the TNT was the first silicon and code from the ex-SGI engineers, and it was not "butt ugly with a handful of problems" by any means. There were initial problems with running 3Dfx only games (as in, it couldn't...), but Quake and OpenGL remedied that issue. The GeForce completely blew away 3Dfx and they never recovered.
Oh yeah... that little bit about them being ex-SGI engineers? Well, it came back to bite them. SGI sued the hell out of nVidia and it wound up being settled out of court. SGI retains options on advanced features in the silicon and drivers. One of the many reasons that the drivers can't be open sourced.
It seems that nVidia is now suffering from the same problem that plagues a lot of hot tech companies -- many of the primaries have made millions of dollars and decided they don't have the need/desire to work there anymore. So they retire, cash in their stock options, and then go pursue other interests, which robs the company of not only its top engineers but also its visionaries and leaders. The last couple generations of cards from nVidia appear to be due to this. They may come back still, and they're still better off than 3Dfx was, but they've certainly fallen from the lofty heights they used to occupy.
Re:Go, ATI! (Score:3, Insightful)
Dude, it's called "trickle-down." Same
Yawn... (Score:3, Interesting)
Anyhow, just who runs Half-Life or anything with all the eye candy maxed up? No serious gamers that I know of, that's for sure. At the settings that hardcore FPS addicts play at, the frame rate delivered by any card currently shipping from either ATi or nVidia will be sufficient (assuming that the rest of the system isn't subpar).
Once again, for those of us without money to burn the smart buy is that $100-$200 card that cost $600 a few months ago, not the one that costs $600 now (and which will be down to $100-$200 just as fast).
Re:Yawn... (Score:3, Informative)
Re:Yawn... (Score:3, Insightful)
So go on, spend your money on useless things, play Quake 3 at 300fps and marvel at how you see things before they happen. Fact is, you cannot see th
Re:Yawn... (Score:4, Interesting)
You're not entirely correct.
If you have 3 frames of movement displayed but your eye only registers one during that time, then you get the 3 frames overlaid on each other, giving a motion blur effect, which your brain uses to augment its motion tracking.
It's how cheap (in render time) motion blur is sometimes achieved in 3D. For 1 frame of a clip, you render 5 (for example) subframes and composite them together (optionally Gaussian-blurring them slightly to meld the edges).
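Roughly, the compositing step described above is just a per-pixel average of the subframes. A minimal sketch in C (8-bit grayscale pixels and the function name are purely illustrative, not from any particular renderer):

    /* Sketch: approximate motion blur by box-averaging N temporal subframes
     * into one output frame. Pixels are 8-bit grayscale for brevity; a real
     * renderer would do this per RGB channel (and maybe Gaussian-blur the
     * result slightly, as suggested above). */
    #include <stddef.h>

    void composite_motion_blur(const unsigned char *subframes, /* n_subframes * n_pixels values */
                               size_t n_subframes,
                               size_t n_pixels,
                               unsigned char *out)
    {
        for (size_t p = 0; p < n_pixels; ++p) {
            unsigned int sum = 0;
            for (size_t s = 0; s < n_subframes; ++s)
                sum += subframes[s * n_pixels + p]; /* same pixel, successive instants */
            out[p] = (unsigned char)(sum / n_subframes);
        }
    }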
Another reason for high framerates in certain games (most notably Quake 3) is that the netcode is tied to the framerate. The optimal framerate for online Quake 3 is 125fps. This allows you to jump very slightly higher, enabling you to reach ledges that you otherwise couldn't.
Re:Yawn... (Score:3, Interesting)
This has many benefits such as being able to jump higher and farther, fall more slowly, and you do more da
Re:Yawn... (Score:2)
Why bother playing Quake at 300fps when your monitor refresh rate is only 85Hz? The monitor is only giving you 85 FPS, so who cares what the video card is generating.
Re:Yawn... (Score:2)
Re:Yawn... (Score:5, Funny)
Audiophiles are idiots and musicians are often tone deaf.
Audiophiles can supposedly hear artifacts produced by gravity waves passing through solid-gold oxygen-free "ribbon" cables. Stop paying attention to their ramblings: it only encourages them.
Actually I think you made that one up. Every movie buff knows that film frames are double shuttered to play at 48fps. Films played with single shutter are noticeably flickery. True movie buffs also know that the director can't pan a shot too fast or he'll get stutter, so they'd be aware that the human eye sees rates in excess of 24 fps.
I claim shenanigans. I don't think anybody claimed that "the human eye can only see 24 fps". You just made it up because you didn't have an argument.
Re:Yawn... (Score:4, Funny)
Re:Yawn... (Score:5, Interesting)
Well, I was pleased to see the showing the GeForce 4 Ti4600 put up in those tests. I think those can be had fairly cheaply these days (I paid $249 several months ago).
I'm running it in this Athlon 2600+ system (RH 9, fully accelerated NVIDIA drivers). I've been doing some OpenGL development lately, and it's been great on Linux! I have nothing but good things to say about NVIDIA's drivers and OpenGL implementation. Could anyone comment on the quality of ATI's OpenGL support with the 9800 Pro class cards under Linux? (I'd like to hear from the perspective of a developer, but gameplayers would be interesting too).
On the other hand, I do know one way to get great (or at a minimum good) OpenGL drivers for the Radeon 9800 Pro - buy a PowerMac G5. :-) (Yes, I know you could use Windows also...but let's keep our perspective here.)
ATI drivers on Linux (Score:3, Informative)
I've got a 9700 Pro, and the ATI drivers have given me *a lot* of grief as a developer. There are many times when they are so blatantly non-compliant with the OpenGL standards, it's not funny.
For example, the driver claims to support OpenGL 1.3. With 1.3, ARB_multitexture has been promoted into the core, so the driver _should_ export glActiveTexture & friends without the ARB suffix. Well guess what? It do
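For what it's worth, a defensive workaround sketch (Linux/GLX assumed, with a current context already created; whether a driver that claims 1.3 actually exports the core name is exactly the complaint above):

    /* Sketch: prefer the OpenGL 1.3 core entry point, fall back to the
     * ARB_multitexture name if only the suffixed symbol is exported.
     * Caveat: glXGetProcAddressARB may return a non-NULL stub for unknown
     * names on some implementations, so the GL_VERSION / GL_EXTENSIONS
     * strings remain the authoritative check. */
    #include <stdio.h>
    #include <GL/gl.h>
    #include <GL/glx.h>

    typedef void (*ActiveTextureFn)(GLenum texture);

    static ActiveTextureFn lookup_active_texture(void)
    {
        ActiveTextureFn fn = (ActiveTextureFn)
            glXGetProcAddressARB((const GLubyte *)"glActiveTexture");
        if (fn == NULL) {
            fprintf(stderr, "core glActiveTexture not exported, trying ARB name\n");
            fn = (ActiveTextureFn)
                glXGetProcAddressARB((const GLubyte *)"glActiveTextureARB");
        }
        return fn; /* NULL if neither name is available */
    }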
Re:ATI drivers on Linux (Score:3, Informative)
If you're having any kind of problems like that, email devrel@ati.com, someone from ATi will most likely respond rather quickly. I've emailed them before and they've been quite helpful.
Re:Yawn... (Score:3, Insightful)
Excuse me, but when I drop $300-500 on a video card I want my screen to fucking blow me away! I didn't pay for the new technology just to see it wasted. FPS are important, but getting your money's worth and enjoying what the artists put together for the game is far more interesting than simply looking to get a bigge
Well that was a waste. (Score:4, Informative)
Re:Well that was a waste. (Score:2)
Application-specific "optimizations" (Score:5, Insightful)
The article fails to mention whether they actually detect the application and run the driver through a different code path, or if they've made general driver-wide optimizations that happen to also help Half-Life. Knowing the behavior of these video card companies in the past, I would suspect they have huge chunks of code in there devoted solely to Half-Life.
So, now instead of having to hack around and catch companies cheating on drivers, we just have to read as they admit it openly? This is standard operating procedure now???
When I download the latest Detonator drivers for my nVidia card, I want to download a generic D3D/OpenGL driver, not a Half-Life driver. The amount of time they spend "optimizing" for the popular games is time they could have been spending making sure the performance and quality is adequate for ALL games and modeling apps.
Re:Application-specific "optimizations" (Score:2)
Re:Application-specific "optimizations" (Score:2, Flamebait)
It doesn't matter...
Coming is the day where the driver will optimize itself for all popular games, on-the-fly. What is wrong with that? A race car team is allowed to optimize their car for the particular track on which they are competing. This is the same thing.
Who cares? Soon, we'll have drivers that constantly monitor via internet for "best optimizations" versus the installed software. Ultimately, if the card
Re:Application-specific "optimizations" (Score:5, Insightful)
>Coming is the day where the driver will optimize itself for all popular games, on-the-fly. What is wrong with that?
Um, because it's a crock of shit? It's not optimisation, it's trading quality for frame rate, without giving you a choice in the matter. If I click the boxes for Full Scene Auntie Alienating and Dodecahedral Filtering, I damn well expect the driver to do that, regardless of whether a given game runs at 2fps or not. If I want a higher frame rate, I can turn those options off myself.
I don't want the driver second guessing me, because it's not being done for my benefit, it's being done to scam gullible reviewers and sell more cards.
Re:Application-specific "optimizations" (Score:3, Insightful)
I have *no* problem with optimizations that increase speed *without* quality loss. These optimizations, however, do not do this.
I *do* have a problem with increasing speed *with* quality loss, unless I have a checkbox that specifically says "Do not enable speed optimizations that negatively impact visual quality" or somesuch.
If they're going to optimize, then make it *known* that they do, and make it a user-configurable option to do so.
Re:Application-specific "optimizations" (Score:2)
Re:Application-specific "optimizations" (Score:3, Informative)
On top of this, these "optimizations" are degrading visual effects pretty seriously according to the article by taking away important effects.
Unreliable benchmarks - on a beta, anyway. (Score:3, Insightful)
Re:Unreliable benchmarks - on a beta, anyway. (Score:2)
Also, you do not drop features from a beta. Features get fixed at the end of the alpha. Beta is for optimisations and bug squashing, whatever the product.
Re:Unreliable benchmarks - on a beta, anyway. (Score:2)
Well, it sounds like it's the developers running these tests. And it sounds to me like this is a show-stopper performance problem - they'll alienate the huge NVidia-owning fraction of the market. Unless they're in bed with ATI (or they haven't got on with NVidia developer relation
I've only got a Voodoo 3 card ... (Score:3, Funny)
... you insensitive clod!
Seriously though, are they allowing for people with older cards? (UT 2003 ran fine on my Voodoo3 and still looked pretty darn good, even w/o transparency, anti-aliasing, or any of the other modern GFX buzzwords)
Re:I've only got a Voodoo 3 card ... (Score:2)
Probably accurate (Score:2, Interesting)
Re:Probably accurate (Score:2, Informative)
Older Hardware (Score:3, Insightful)
I prefer to save my pennies and upgrade my graphics card to the one just behind the current generation.
Oh boy here we go again. (Score:5, Insightful)
nVidia fanboy: blah blah blah nVidia has better support... blah blah blah!!!
I'm not sure what is going to end first, the Israeli-Palestinian situation or the ATI vs. nVidia argument.
The fact is both regularly cheat on performance and quality benchmarks, and if you think you can actually say one is better than the other you are a biased fanboy.
Just buy the one on sale, please.
Re:Oh boy here we go again. (Score:5, Funny)
Re:Oh boy here we go again. (Score:4, Funny)
Yeah, but the GeForce FX works better with emacs! Let's see how many flamewars we can run in parallel.
Re:Oh boy here we go again. (Score:2)
Re:Oh boy here we go again. (Score:4, Interesting)
Oh Good Lord, what kind of Trolling is that.
I'll note a few things here:
Firstly, nVidia has reigned supreme in the DirectX 8 and prior arena. Their GeForce cards are awesome.
But DX9 is all about pixel shaders. They are the future, and ATI realized that. They built their R300 core (Radeon 9500/9700) around the DX9 spec, and it shows. The newest games, such as HL2, which rely heavily on DX9 features, run better on ATI hardware than nVidia's because developers have to use hacks to get DX9 features such as pixel shaders working properly on the GeForce line. nVidia doesn't have that support built into the hardware, and the gamers who own those cards will suffer because of it.
John Carmack has had to write special code in Doom 3 to compensate for the NV30 core that doesn't like DX9 as much as it should. Go read some of his
Look up your facts, and try to stay away from troll-like generalizing until you know what you're talking about.
Unfortunately (Score:2, Insightful)
60 fps ??? (Score:3, Insightful)
I guess my GeForce4 Ti4600, which is just over a year old, will only get 30fps or so! Which means I'll be a sitting duck in netgames.
If these are indeed optimized benchmarks, I doubt we'll see HL2 on the market soon. They'll have to wait at least until the R9800 or the 5900 Ultra become mainstream. (read: at console-level prices)
Re:60 fps ??? (Score:2)
Re:60 fps ??? (Score:5, Informative)
DX (Score:2, Insightful)
Re:DX (Score:3, Informative)
Re:DX (Score:3, Informative)
This is simply not true. While it may not have OpenGL support (I'm not sure on this), it is NOT DX9 only. Valve has confirmed that the game will run on hardware that supports at least DX6.
Re:DX (Score:3, Informative)
Re:DX (Score:3, Informative)
Re:DX (Score:2)
And Carmack is pro-OpenGL for his own political and egomaniacal reasons, not necessarily quality reasons.
Re:DX (Score:2)
I have a 64 meg GeForce 4. In DirectX mode, everything looks really pretty but when I hit escape to go back to the menu I get a black screen.
In OpenGL mode, transparency is shot to pieces and where there should be text, I get square blocks.
I wouldn't call myself a self-respecting HL player - but I have to stick with software rendering.
Re:DX (Score:2)
IL-2 Sturmovik is available both in OpenGL and in DirectX, but on my Radeon the OpenGL mode was much better than the DirectX mode.
Re:DX (Score:2, Informative)
That may be true for self-respecting Half-Life players but what about the self-respecting Half-Life 2 players? You know, the ones that will be playing the new game that will be running on a new engine? What do they have to say about this issue?
Nothing. Because they don't exist yet - the game needs to be released before there can be tribal knowledge about the optimal hardware configuration.
let's remember (Score:4, Insightful)
Let's also remember that once ATi was much bigger than nvidia in graphics, and charged exorbitant prices for crappy chips, with shocking driver support.
Let's also remember nvidia have much better performance so far in the more important (and independent) Doom 3 benchmarks (where 16-bit floating point precision is used on nvidia cards, instead of the 24-bit ATI uses and the 32-bit nvidia would otherwise use, as DirectX 9 was originally going to specify before nvidia and Microsoft fell out).
Also remember that nvidia's cards offer better performance in most 3D rendering apps (where both cards use 32-bit FP and almost all of ATI's advantages evaporate), so driver tweaking on nvidia's part in games does not necessarily mean they have a lesser part.
Finally, Linux support is a no-brainer: nvidia have been doing it well for years (with support as far back as the TNT), while ATi's recent attempt is not user-friendly and doesn't even support all Radeon chipsets, let alone the Rage 128.
ATi are onto a good thing right now, with the current DirectX 9 spec giving them an advantage in games that stick to the spec rather than targeting the optimum end-user experience on each card. That is about all they have going for them, though. This battle has far from swung the other way; it's merely gotten closer than it used to be.
DX8 vs. DX9 visual differences (Score:2)
I have a GeForce Ti4200 (DX8) and it looks like it will be possible to get up to 30FPS at 1024x768, no FSAA. Alternatively, I can get a new Radeon 9600 Pro for $150 and get the same or better performance at the same resolution, but with DX9 eye-candy and FSAA enabled.
Is it worth it? I assume that the cool
Links Galore (Score:3, Informative)
FiringSquad [gamers.com]
Tech Report [tech-report.com]
Gamers Depot [gamersdepot.com]
Beyond 3d [beyond3d.com]
3DMark03 (Score:5, Insightful)
ATI runs in 24-bit, NVIDIA in 32-bit (Score:5, Interesting)
This is because ATI cards have implemented a 24-bit floating point pipeline while NVIDIA cards implement a 32-bit pipeline. It is reasonable to expect the ATI card to outperform the NVIDIA card at the expense of some round-off errors. 32 vs. 24 bits on a color pixel is probably no big deal (although some color banding might arise), but when those results apply to vertex positions you could begin to see cracks in objects and shadows.
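To get a rough feel for that gap, here is a small, purely illustrative sketch that rounds a value to different mantissa widths; the 16-bit mantissa assumed for the 24-bit format and the 10 bits for half precision are my assumptions about the layouts, not figures from the article:

    /* Sketch: quantize a value to a reduced mantissa width to compare roughly
     * how fine a step each pipeline can represent. Exponent range and the
     * accumulation of error across shader instructions are ignored. */
    #include <math.h>
    #include <stdio.h>

    static double quantize_mantissa(double x, int mantissa_bits)
    {
        int exp;
        double m = frexp(x, &exp);                 /* x = m * 2^exp, 0.5 <= m < 1 */
        double scale = ldexp(1.0, mantissa_bits + 1);
        return ldexp(round(m * scale) / scale, exp);
    }

    int main(void)
    {
        const double x = 1.000123456789;
        const int widths[] = { 23, 16, 10 };       /* FP32, assumed FP24, FP16 */
        for (int i = 0; i < 3; ++i) {
            double q = quantize_mantissa(x, widths[i]);
            printf("%2d mantissa bits: %.9f (error %.2e)\n",
                   widths[i], q, fabs(q - x));
        }
        return 0;
    }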
Note that the ATI card is still faster for Half-Life 2 even against the NVIDIA card's 16-bit mode, so it is probably the faster card overall for that game. There are so many ways to achieve similar looking effects on modern graphics cards that even as a graphics expert, I can't tell which card is actually faster.
I've been working with both the GeForce FX and Radeon 9800 for some time and both are amazing cards. They have different capabilities under the hood, and can perform different operations at different speeds. Furthermore, under DirectX both cards are restricted to a common API but on OpenGL they have totally different capabilities. I don't think a consumer would go home unhappy with either card, except for the price.
-m
Sad day ... (Score:3, Insightful)
zerg (Score:3, Funny)
Go Matrox!
That's not the way I read it (Score:3, Informative)
Also, RTFA, Nvidia is a little shy about "optimized" drivers for benchmarking certain applications. They specifically requested that the optimized drivers not be used. No indication that ATI did the same.
I doubt there will be a Linux version of HL2 either, because this new 3D engine appears to only support DirectX.
That's a shame, because the world didn't end when the America's Army developers ported AA:O to Linux. As a matter of fact, it runs quite well, and it didn't take them 5 years to produce nothing but vaporware.
They used invalid drivers in the benchmark? (Score:3)
This is NVIDIA's Official statement: "The Optimizations for Half-Life 2 shaders are in the 50 series of drivers which we made available to reviewers on Monday [Sept. 8, 2003]. Any Half-Life 2 comparison based on the 45 series driver are invalid. NVIDIA 50 series of drivers will be available well before the release of Half-Life 2".
While I, like everyone else, don't like trading off quality for framerate, blah blah blah. Who knows what ATI's quality is like? Maybe they optimized their DX9 drivers for the fastest possible/crappiest quality right off the bat. I'm going to wait for the reviews of the Det 50 drivers and get some reviews of what the quality looks like on each card before I'll be making any purchases.
I was actually all set to buy an nVidia 5600 Ultra until this came out. Think I'm gonna wait for them to duke it out a little bit and get to the bottom of things before I decide...
Re:Hmm. (Score:5, Funny)
Re:Hmm. (Score:2, Interesting)
Re:Hmm. (Score:2)
NWN is OpenGL (Score:2, Informative)
Re:Hmm. (Score:2, Informative)
Re:Not at all.... (Score:2, Insightful)
ATI are still ahead in the implementation of DX9 features.
Re:Actually they've done.... (Score:2, Informative)
Re:all this boils down to: (Score:3, Interesting)
DX9: Bad
OpenGL: Good
All Valve is doing is making it harder for other OS's to get their games. So I think I speak for all the *nix users when I say they can go fornicate themselves with an iron rod.
Re:This is interesting why? (Score:5, Insightful)
Re:Why more thatn 25fps? (Score:2)
Also, getting over 200 FPS is the best way to brag about your awesome hardware and gigantic penis.
Re:Why more thatn 25fps? (Score:4, Insightful)
I often hear people say "after 30 fps you can't tell the difference", or something to that effect. That might be true if you were playing back the frames evenly spaced. However, your monitor runs at a fixed 60 Hz refresh rate (or 70 or 80, but let's just say 60), so a "fps" of 50 will have you showing 5 frames, showing the last frame again, and then showing 5 more, which can produce a noticeable stutter even though the "fps" is 50. So that is one reason why you might want a "fps" of at least 60 (or 70, or 80). Also, the really meaningful value is "minimum fps", because that is what you're going to get when you're fighting the boss and all these guys are coming at you and all these things are happening at once. Usually, a higher average fps (say, 120) indicates that the minimum fps will also be higher. So, a high fps score can still be good even if your monitor can't display 120 frames every second.
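A toy model of that interaction, assuming a 60 Hz display with vsync and a game that finishes a frame every 20 ms (rendering running ahead of scanout, as with triple buffering); the numbers are illustrative only:

    /* Sketch: frames are only swapped onto the screen at 60 Hz vblank
     * boundaries, so a steady 50 fps render rate leaves most frames on
     * screen for one refresh (16.7 ms) and every fifth frame on screen for
     * two (33.3 ms) -- the uneven spacing described above. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double refresh = 1.0 / 60.0;   /* seconds per scanout */
        const double render  = 1.0 / 50.0;   /* game's average frame time */
        double ready = 0.0, prev_shown = 0.0;

        for (int frame = 0; frame < 11; ++frame) {
            ready += render;                 /* frame finishes rendering */
            /* displayed at the first vblank at or after it is ready;
             * the small epsilon guards against floating-point round-off */
            double shown = ceil(ready / refresh - 1e-9) * refresh;
            if (frame > 0)
                printf("frame %2d on screen for %4.1f ms\n",
                       frame - 1, (shown - prev_shown) * 1000.0);
            prev_shown = shown;
        }
        return 0;
    }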
Re:Why more thatn 25fps? (Score:4, Interesting)
Re:Why more thatn 25fps? (Score:4, Interesting)
1) The fps number is an average. If you average 25fps, then when things get busy on screen the rate can drop to 15 or something, which is very visible and ugly. If you run at 60, that doesn't happen.
2) 25fps looks bad for rapid movement and panning (ie, most games). Next time you watch a film, look at how blurry everything looks when the camera pans rapidly.
Re:Wow, the ATI's dominance is quite amazing (Score:2, Informative)
Synthetic benchmarks have been showing that the GeForce FX has poor DirectX 9 performance;
everyone just dismissed them because they were synthetic benchmarks.
Re:Why so high? (Score:3, Interesting)
And I'd guess you'd need more than 2. So, if TV looks nice at 30fps, you probably need something like 60-120fps to look as smooth.
Not to mention that unlike TV with its never-changing 30fps framerate, the numbers you see for games are an average. A