Quake 4 Graphics Performance Compared
Timmus writes "nVidia's huge lead in OpenGL performance is apparently gone. According to FiringSquad, ATI's latest hotfix driver brings major performance improvements to ATI's RADEON X1800 cards in OpenGL games like Doom 3 and Quake 4. The X1800 XT is now faster than the GeForce 7800 GTX, while the X1800 XL is faster than the GeForce 7800 GT in most cases. The article also includes GeForce 6800 Ultra/GT scores, including both in SLI. It's a pretty interesting read if you like graphical benchmarks." From the article: "A little over a week ago, rumors began spreading that ATI was working on a new tool that delivered substantially improved performance to their recently launched X1000 cards in OpenGL titles such as DOOM 3, Quake 4, and many others. Some reports claimed ATI's performance improved by up to 35% in these titles in 4xAA mode. Then, posts on Beyond3D's forums and sites like Guru3D confirmed these rumors. So how did ATI pull this off?"
Re:low end performance (Score:2)
Re:low end performance (Score:3, Interesting)
How did ATI... (Score:4, Funny)
Re:How did ATI... (Score:4, Interesting)
Re:How did ATI... (Score:2, Informative)
I still stick to Nvidia for the time being, but ATI is nowhere near as bad as they used to be (except for Linux support where they still suck).
Re:How did ATI... (Score:2, Interesting)
Re:How did ATI... (Score:3, Informative)
The X1000 series features programmable memory controllers. For Quake 4 (And Doom 3, so this may be a general OpenGL optimization) they have put together some new code for the memory controller that provides the large benefits.
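For illustration only, here's a rough sketch of how a driver might pick an alternate memory-controller program per application. Every name, structure, and value below is hypothetical; none of it comes from ATI's actual driver.

    /* Hypothetical sketch: a driver-side table mapping known OpenGL titles to an
     * alternate memory-controller program. Nothing here is real ATI code. */
    #include <stddef.h>
    #include <string.h>

    typedef struct {
        const char     *exe_name;    /* process name that triggers the profile */
        const unsigned *mc_program;  /* settings block uploaded to the memory controller */
        size_t          length;      /* number of entries in mc_program */
    } mc_profile_t;

    /* Placeholder values standing in for tuned controller settings. */
    static const unsigned opengl_heavy_program[] = { 0x0 };

    static const mc_profile_t profiles[] = {
        { "Quake4.exe", opengl_heavy_program, 1 },
        { "DOOM3.exe",  opengl_heavy_program, 1 },
    };

    /* Called when a 3D application starts; NULL means keep the default program. */
    static const mc_profile_t *select_mc_profile(const char *exe_name)
    {
        for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++) {
            if (strcmp(exe_name, profiles[i].exe_name) == 0)
                return &profiles[i];
        }
        return NULL;
    }

Once a lookup like that lives in the driver, shipping new controller settings is just a driver update, which matches how this hotfix has been described.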
X800XL (Score:3, Informative)
Re:How did ATI... (Score:3, Insightful)
Or maybe they tweaked their drivers for Doom3 and Quake4? Haven't read the article, so don't flame me if this is mentioned, but it wasn't all that long ago that it came out that ATI had been doing this for other popular games/benchmarks. It was easy to do (tweaking for specific cases instead of improving performance in the general case) and it made them look like the best on paper--win/win for ATI.
Re:How did ATI... (Score:2)
At various times BOTH manufacturers have been guilty of "internal tweaks" to degrade quality for a performance boost in a popular benchmark. I hope it's not happening again, and that the performance increase is genuine (I have a vested interest, as I have an X800), but it could be another fraud.
"Wait and see" is what I'll do.
Re:How did ATI... (Score:2)
Anybody
That's right... nVidia. It's funny, isn't it, that when I hear "unscrupulous reporting" and "smear politics" I think of two things: the Bush Administration and the graphics hardware market. I'll take it with a grain of salt, but not because of some campaign two generations ago that was entirely orchestrated by the competition.
Re:How did ATI... (Score:2)
Re:How did ATI... (Score:2)
Re:How did ATI... (Score:2, Informative)
Re:How did ATI... (Score:2)
Re:How did ATI... (Score:2)
Re:How did ATI... (Score:2)
Re:How did ATI... (Score:2)
The thing is that it is hard to tell without somebody who has both the beta driver and one of the new cards actually firing up some other OpenGL games. The difference in that case may be academic, since there are few (if any) major games that use OpenGL other than Q4/D3.
The way I see it, they've had performance issues with OpenGL games, and it would seem, as far as most people are concerned, that they've fixed the problem.
Re:How did ATI... (Score:3, Informative)
You mean like NVidia did? (Score:2)
Re:Bought the game for an ATI 9600 xt, useless (Score:1)
Re:Bought the game for an ATI 9600 xt, useless (Score:1)
Re:Bought the game for an ATI 9600 xt, useless (Score:2, Informative)
Re:Bought the game for an ATI 9600 xt, useless (Score:1)
Re:Bought the game for an ATI 9600 xt, useless (Score:2)
Re:Bought the game for an ATI 9600 xt, useless (Score:1)
Re:Bought the game for an ATI 9600 xt, useless (Score:2)
Re:Bought the game for an ATI 9600 xt, useless (Score:2)
Besides, I was playing at 800x600 on medium quality. Had I decided to play at low quality at 640x480, I am certain that the game would have had much higher framerates and still looked good. Let us not forget Carmack's statements about how the Doom 3 engine was designed to look good at 640x480. I'm just a sucker for eye candy.
Re:Bought the game for an ATI 9600 xt, useless (Score:2)
Your sig (Score:1)
Get an external cd-rom drive, or a housing to turn an old internal drive into an external drive.
Sig Help... Way off topic (Score:2)
FeatherLinux http://featherlinux.berlios.de/ [berlios.de]
Slax http://slax.linux-live.org/ [linux-live.org]
FlashPuppy http://www.goosee.com/puppy/flash-puppy.htm
Good Luck
Same old, same old... (Score:3, Insightful)
if (currentGame == DOOM3) {   /* hypothetical per-game check */
    employGraphicsShortcuts();
}
As always, the graphics card makers profile the leading game's usage of the API and take shortcuts as needed in order to improve performance. Now that Doom 3 has been released, they can release these driver shortcuts as well. These same shortcuts wouldn't necessarily work under another program, and may cause unintended artifacts, crashes, etc.
The only question is why nVidia hasn't released their tweaks yet.
This would only be news once they've both optimized their drivers for this game and one clearly has the advantage.
-Adam
Re:Same old, same old... (Score:3, Interesting)
What ATI is doing this time is tweaking the programmable memory controller in their new cards, not really tweaking the drivers. As I said, both ATI and nVidia have already tweaked their drivers for Doom 3. So unless nVidia has some similar tweak up their sleeves (Which they may or may not have), the situation won't change with waiting. I think Doom 3 has been out long enough that both companies have grabbed whatever easy gains were available.
So Close, Yet So Far Away.... (Score:4, Informative)
This is a step in the right direction. However, this is not the OpenGL driver fix that everyone has been waiting for. It is a manipulation of ATI's new programmable memory controller.
Re:So Close, Yet So Far Away.... (Score:4, Informative)
1) It does not only work at 4xAA; that is just where the gains are most impressive. With or without AA, they were behind before and they're ahead now.
2) It is not just the X1800 XT. The review was a roundup of high-end cards, and as such only included the X1800 XT and X1800 XL (Not just the XT like you suggest). The optimizations should affect ATI's entire product line from the X1300 on up.
3) There are no other major games, to my knowledge, that still use OpenGL. As such, this can be considered a general fix for OpenGL performance. General in the sense that it fixes the problem (Poor OpenGL performance) as far as the vast majority of gamers are concerned.
OpenGL is much more than Doom3 (Score:2)
There are a LARGE NUMBER of professional OpenGL applications that push current graphics hardware and drivers to their limits. Also, Linux game environments such as Cedega translate DirectX to OpenGL calls, and would benefit from any general OpenGL improvements.
Re:OpenGL is much more than Doom3 (Score:3, Informative)
And note that I said major games. Cedega is a niche product at best, and doesn't really matter in the grand scheme of things. ATI has a dedicated Linux driver team anyhow, so it is up to that team to put the effort into porting these optimizations to the Linux drivers.
I should point out that I missed one
Re:OpenGL is much more than Doom3 (Score:3, Interesting)
Of course there are also a ton of games using the various iterations of the Unreal Engine [wikipedia.org]. There is one game [wikipedia.org] with its own engine that will get me to upgrade though.
Re:OpenGL is much more than Doom3 (Score:1)
You've got a good point with the Q3 engine; it was really widely used in major games, and D3 probably will be too.
And as far as Oblivion goes, great choice ;-) Morrowind is my all-time favorite game, and I upgraded to a 6800GT just for Oblivion (since it'll use lots of SM3.0 optimizations).
Re:OpenGL is much more than Doom3 (Score:2)
It is important to keep in mind, though, that Unreal Engine is a Direct3D engine. It contains an unsupported beta OpenGL renderer, but considering that few people, if any, use it, it's not worth mentioning. Unless you mean the Linux port that relies on that renderer, but I've already put forth my opinion on why Linux gaming shouldn't enter into the discussion (Yes, it would be nice to get better support for it, but it is unimportant to the vast majority of gamers).
Re:OpenGL is much more than Doom3 (Score:2, Insightful)
I believed for a long time, too, that HL1 was capped at 100fps, but not so. If you enable developer mode (either "developer 1" in the console, or -developer added to the shortcut, I don't remember exactly), it uncaps it.
In addition to D3/Q4, Starbreeze's Chronicles of Riddick uses OpenGL (and has all the features of the Doom 3 engine).
Re:OpenGL is much more than Doom3 (Score:2)
I did forget it, though, yes, as I pointed out in another one of my replies. However since it is of the era of DX6 capable cards, no optimization made to modern graphics cards is going to mean anything. Th
Re:OpenGL is much more than Doom3 (Score:1)
I'm sure you're right about HL1 defaulting to software, I just remembered having to manually change to OpenGL, so I assumed it was D3D it defaulted to.
I had heard that CoR did what D3 did (real-time lighting, normal mapping) so I guess I took it literally to mean everything D3 did. Apparently not so, but I'm sure we can both agree that CoR and D3 both look amazing :-)
And we can also agree that
Re:OpenGL is much more than Doom3 (Score:2)
This is largely a myth. A significant portion of OpenGL-based professional applications work just as well on "consumer" cards as they do on a "professional" line. I help develop one at the OpenGL level and I've done the benchmarks personally. A significant segment of the userbase uses it on "consumer" cards.
That takes smart people (Score:1)
Maybe they have an aspie?
Or its the new NVIDIA drivers (Score:3, Informative)
Not sure if that's related, but if NVIDIA is accidentally breaking support for OpenGL in apps (perhaps deprecated API calls? I dunno) that could have something to do with it.
Re:Or its the new NVIDIA drivers (Score:2)
Re:Or its the new NVIDIA drivers (Score:2)
Yes but (Score:3, Insightful)
*ducks*
Seriously, have they made the same improvements in the linux native drivers?
Re:Yes but (Score:2)
great line from article (Score:4, Insightful)
Of course it's a comparison between two companies, so they were either going from last to last or from last to first; there wasn't any other possibility.
Memory controller features arbitration logic.... (Score:1)
One cool feature ATI has added to their X1000 family is the ability to make changes to the memory controller's arbitration logic and/or its algorithms via software. This allows ATI to make adjustments with a simple driver update.
This is what ATI has done with their new hotfix driver: they've simply optimized memory access inside their memory controller to better handle OpenGL titles. This is all invisible to the end user; once an OpenGL title starts, the tuned settings are applied automatically.
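To make "arbitration logic adjustable via software" a bit more concrete, here's a hedged sketch that assumes the controller exposes per-client bandwidth weights a driver update can rewrite. The register indices, client names, and numbers are all invented for illustration; the real X1000 programming interface isn't public.

    /* Invented register layout for a programmable memory arbiter -- illustration only. */
    enum mc_client { MC_TEXTURE, MC_COLOR, MC_DEPTH, MC_VERTEX, MC_CLIENT_COUNT };

    struct mc_arbitration {
        unsigned weight[MC_CLIENT_COUNT]; /* relative share of memory bandwidth per client */
        unsigned burst_len;               /* transfer granularity, in 32-byte units */
    };

    /* A default profile versus a hypothetical profile biased toward the
     * depth/stencil-heavy access pattern of a title like Quake 4. */
    static const struct mc_arbitration mc_default = { { 4, 4, 4, 4 }, 2 };
    static const struct mc_arbitration mc_opengl  = { { 3, 3, 6, 4 }, 4 };

    /* Stand-in for whatever MMIO write path the driver actually uses. */
    static unsigned fake_registers[64];
    static void write_mc_register(unsigned index, unsigned value)
    {
        fake_registers[index] = value;
    }

    /* Applying a profile is just a handful of register writes, which is why a
     * driver hotfix can change this behavior without touching the hardware. */
    static void apply_mc_profile(const struct mc_arbitration *p)
    {
        for (unsigned i = 0; i < MC_CLIENT_COUNT; i++)
            write_mc_register(i, p->weight[i]);
        write_mc_register(MC_CLIENT_COUNT, p->burst_len);
    }

Under that assumption, the "hotfix" is nothing more than shipping a set of mc_opengl-style values that happen to suit the way Quake 4 and Doom 3 hit memory.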
Isn't the bigger question... (Score:4, Funny)
However (Score:3, Funny)
1) Launch Product
2) Benchmark Onslaught
3) Release better drivers
4) Benchmark Onslaught that beats nVidia
5) Marketing and sales blitz
6) Design product
7) Produce product
8) Announce product availability schedules
9) Look for factory to start making cards
10) ??
11) Profit!
Not that I'm suggesting ATI has severe production issues - if nVidia can kill paper launches, surely ATI could at least try to keep up.
As the end of the review says... (Score:1)
If your card is the fastest, but I can't go into any old computer store and buy it, why do I care?
How about some Linux benchmarks now? (Score:2)
Fuck ATI.
AT who? (Score:2)
Decent Performance... (Score:1)
Q4 OpenGL? (Score:2)
Quake4's installer told me it *required* directx 9c. How does one make it go with OpenGL instead?
Re:Q4 OpenGL? (Score:2)