ATI Drivers Geared For Quake 3? 511
alrz1 writes: "HardOCP has posted an article wherein they accuse ATI of writing drivers that are optimized for Quake 3, just Quake 3, and only Quake 3. Apparently, using a program called quackifier, which modifies the Quake3 executable by changing every "Quake" reference to "Quack" and then creating a new executable called "Quack3", they have demonstrated to some extent that the Quack3.exe benchmarks are around 15% slower than with the original Quake3.exe (same box, OS, drivers, etc.). The slant seems to be that there is something inherently wrong about writing game-specific optimizations into drivers, if in fact this is what ATI has done. I think this is perfectly acceptable: Quake 3 is the biggest game out there on Windows, and if ATI has invested a little extra time into pumping a few extra (meaningless) frames out of your Radeon 8500, is this really an act of treachery?"
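For readers wondering what a rename like this actually involves, it amounts to a byte-for-byte search and replace over the executable; since "Quack" is the same length as "Quake", nothing else in the binary has to move. Below is a minimal sketch of such a tool in C -- the file names are placeholders, and this is an illustration of the idea, not the actual quackifier.

```c
/*
 * Minimal sketch of what a "quackifier"-style tool might do: copy the
 * Quake 3 executable, replacing every "Quake"/"quake" with "Quack"/"quack".
 * The replacement string is the same length, so no offsets in the binary move.
 * File names here are placeholders, not those of the actual tool.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    FILE *in  = fopen("quake3.exe", "rb");
    FILE *out = fopen("quack3.exe", "wb");
    if (!in || !out) { perror("fopen"); return 1; }

    /* read the whole executable into memory */
    fseek(in, 0, SEEK_END);
    long size = ftell(in);
    fseek(in, 0, SEEK_SET);
    unsigned char *buf = malloc(size);
    if (!buf || fread(buf, 1, size, in) != (size_t)size) return 1;

    /* patch every occurrence of the name, upper- and lowercase */
    for (long i = 0; i + 5 <= size; i++) {
        if (memcmp(buf + i, "Quake", 5) == 0) memcpy(buf + i, "Quack", 5);
        if (memcmp(buf + i, "quake", 5) == 0) memcpy(buf + i, "quack", 5);
    }

    fwrite(buf, 1, size, out);
    fclose(in);
    fclose(out);
    free(buf);
    return 0;
}
```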
Wha?? (Score:5, Interesting)
Re:Wha?? (Score:2)
Probably because the engine is old and doesn't strain today's crop of computers, it isn't worth bothering with. My old P200 with a Voodoo2 card ran the Half-Life mods passably, if not exactly smoothly (15-40 fps @ 640x480).
Most reviews don't use Counter-Strike for benchmarks (Score:3, Insightful)
ATI knows that just about every review compares cards primarily based upon Quake 3 (look at any of a large number of sites to see this), often under the premise that it's totally relevant because so many current and up-and-coming games are based upon the Quake 3 engine.
Maybe they'll 'optimise' for HL/CS in the next driver (Score:2)
Everyone knows their current driver is horribly immature & the new Radeon 'II' drivers are due out (officially or otherwise) anytime now.
Re:Wha?? (Score:2)
Re:Wha?? (Score:5, Informative)
"how do we know that these optimizations don't indeed effect other games as well"
If you actually read the article, you'd know the answers to these questions. I suggest reading the HardOCP article... it's a good article.
I highly disagree with the original poster's assertion that "The slant seems to be that there is something inherently wrong about writing game-specific optimizations into drivers"... I think that HardOCP is completely NEUTRAL about the issue; they simply want to know the truth.
Remember, they run a LOT of benchmarks on video cards. Q3 is a common benchmark program... lots of people buy cards based in part or in whole on Q3 performance, under the assumption that Q3 performance is fairly representative of the card's performance in other games. So if ATI is skewing results only for Q3... well, that's not "wrong", but testers and buyers NEED TO KNOW THIS so that they can interpret Q3 benchmarks accordingly. I applaud HardOCP for raising this important issue.
Re:Wha?? (Score:2)
Then again, I could be speaking out of my ass again.
- Ed.
A parallel question (Score:5, Insightful)
No, but... (Score:5, Informative)
It is unethical to then use that software for a competitive benchmark, without telling anyone you've done the optimizing.
The first is an example of giving your customers what they want. The second is an example of manipulating independent reviews to give misleading data.
Re:No, but... (Score:2)
Sounds to me like ATI's behavior is a response to that, and so is unethical.
Re:A parallel question (Score:2)
I don't see how using publicly available knowledge like SSE2 is cheating. It's only "at the expense of AMD" in that Intel made a deal with Adobe to make the mods and AMD didn't get any special optimization.
I'd have to say yes... (Score:4, Insightful)
This is nothing new, and I don't think the fact that they're catering to a real program rather than an artificial benchmark makes it any less reprehensible.
Re:I'd have to say yes... (Score:5, Interesting)
ATI repeats itself (Score:3, Insightful)
Re:No way (Score:3, Insightful)
READ THE ARTICLE BEFORE POSTING.
Image quality is worse in Q3 due to the optimizations. If you do the Quack3 rename, the image quality gets better and the frame rate gets worse.
You're telling me that you're okay with a graphics card manufacturer deliberately reducing the image quality of Q3A in order to get better frame rates, when it just-so-happens that Q3A framerates are an important benchmark? And not giving you any indication (other than reduced quality) that this is happening, nor any way to change it?
I stand by my original view.
Since I never Run Quake (Score:5, Insightful)
Yes it is. It's writing for the benchmark rather than writing for the user.
I'm reminded of a Richard Feynman quote: "For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled."
Re:Since I never Run Quake (Score:3, Insightful)
Re:Since I never Run Quake (Score:2)
This sounds like a marketing payoff, where the publisher of Quake is paying ATI to slow down competing games. I can't think of any other rational explanation for the drivers to care about the name of the executables.
Alternate suggestion: The driver surely has performance tradeoffs that need to be tuned, as many complex, performance-critical drivers do. Perhaps when ATI got the driver tuned well for general-purpose applications, they found it worked badly for Quake. Rather than make everything else work less well so Quake could be happy, they watch for quake.exe and give Quake its own custom tuning.
I'm not saying this is the case; it's merely another rational explanation, in my opinion.
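To make that alternative concrete, a driver could key a small table of tuning profiles off the name of the running executable and fall back to generic defaults for anything it doesn't recognize. The sketch below is purely hypothetical: the structure names and values are invented for illustration and are not ATI's code; only the "uake3" substring check comes from the article.

```c
/*
 * Hypothetical sketch of per-application tuning in a driver. None of these
 * names or values come from ATI's driver; they only illustrate keying the
 * tuning choice off the executable name.
 */
#include <string.h>

struct app_profile {
    const char *match;        /* substring to look for in the executable name */
    int         texture_bits; /* e.g. force 16-bit instead of 32-bit textures */
    int         mip_bias;     /* coarser mipmaps mean fewer texels per pixel  */
};

static const struct app_profile profiles[] = {
    { "uake3", 16, 1 },       /* game-specific tuning for a recognized title */
};

static const struct app_profile default_profile = { "", 32, 0 };

const struct app_profile *select_profile(const char *exe_name)
{
    for (size_t i = 0; i < sizeof(profiles) / sizeof(profiles[0]); i++)
        if (strstr(exe_name, profiles[i].match))
            return &profiles[i];
    return &default_profile;  /* unrecognized programs get generic defaults */
}
```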
Re:Since I never Run Quake (Score:2)
It's writing for the benchmark rather than writing for the user.
Completely irrelevant. Quake 3 may be a benchmark but it's a benchmark which users use on a daily basis - because it's a game first and a benchmark second. Your argument is equivalent to stating that Seagate engineers for the benchmark instead of the user when they make hard drives with higher throughput. Seagate optimizing for Microsoft Disk Performance BusinessBenchmark 2001 would be cheating; optimizing for latency and throughput is not (nor would optimizing for opening a large file in Photoshop... but I don't see that happening in a disk drive).
Where ATI did screw up is in their shoddy engineering. The developers should be ashamed of themselves. If there's a 15% (!) performance gain to be had on the same hardware between the "default" behavior of the drivers and some optimized behavior, they should really find a better way than looking at the name of the freaking program being run. Maybe that means having intelligent drivers that benchmark themselves during the first few minutes of gameplay, seeing which set of code produces the best results, and using that afterwards; but I'm skeptical that this couldn't be done in a more generic way.
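A rough sketch of that self-benchmarking idea, under the assumption that the driver exposes two interchangeable render paths it can time per frame -- every name and number here is invented for illustration:

```c
/*
 * Hypothetical sketch of the "let the driver measure itself" idea: time a
 * fixed number of frames under each candidate render path, then stick with
 * whichever averaged faster. All names and thresholds are invented.
 */
enum render_path { PATH_GENERIC, PATH_ALTERNATE, NUM_PATHS };

static double total_time[NUM_PATHS];
static int    frames[NUM_PATHS];
static int    current = PATH_GENERIC;
static int    decided = 0;

/* call once per frame with how long the last frame took to render */
int pick_render_path(double last_frame_seconds)
{
    if (decided)
        return current;

    total_time[current] += last_frame_seconds;
    frames[current]++;

    if (frames[current] >= 500) {              /* enough samples for this path */
        if (current + 1 < NUM_PATHS) {
            current++;                         /* switch to measuring the next path */
        } else {
            /* all paths measured: keep the one with the lowest average time */
            double avg_generic   = total_time[PATH_GENERIC]   / frames[PATH_GENERIC];
            double avg_alternate = total_time[PATH_ALTERNATE] / frames[PATH_ALTERNATE];
            current = (avg_generic <= avg_alternate) ? PATH_GENERIC : PATH_ALTERNATE;
            decided = 1;
        }
    }
    return current;
}
```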
Re:Since I never Run Quake (Score:2)
Image quality DECREASES in quake3 to INCREASE the frame rate.
In case you still didn't comprehend:
in quack3 the IMAGE QUALITY IS BETTER, but the FRAME RATE IS WORSE.
Informed minds will deduce that the ATI driver will select a lower image quality in Quake3 _ONLY_ (which coincidentally happens to be one of the most widely used benchmarks in the industry). This method uses up less time per pixel, thus achieving a better frame rate.
Now rethink your position.
Comment removed (Score:3, Interesting)
Re:The real question is . . . (Score:2)
I'm beginning to think this is some sort of collusion, although the benefit seems lost to me...
Re:The real answer is... (Score:2)
I mean, if Quake III looked really bad, the card wouldn't sell; so there are checks and balances here at least.
But it would have been much better if they had given the user some control of the video quality.
It IS wrong... (Score:5, Informative)
Re:It IS wrong... (Score:2)
Re:It IS wrong... (Score:3, Informative)
Sounds a bit cheeky to me. The kind of frame rate you get with these cards is already very high; dropping the frame rate for better resolution would be better for most people, I suspect. If all this is right, the company has basically screwed their customers for a better benchmark, to sell more cards or to push the price up on the cards they sell (IMHO).
Still, if you pay more for a graphics card for 10% extra performance when the performance is as high as this anyway, you are practically begging for them to trick you I suppose. Doesn't make it right though.
Given that, is it really wrong? (Score:4, Insightful)
They always have to make trade-offs between frame rate and image quality, what makes tweaking this trade-off for certain games necessarily some kind of trickery?
Id's games have always tended to be a bit freakish, based on unusual, privately researched approaches. Maybe the standard approach isn't perceived as being as playable for Quake 3.
Ideally, they could tweak the system for every individual game, but maybe it's just a case of focusing such efforts on a particularly popular title. Others have pointed out that there are more popular high-performance games, but it would make sense that the default would be optimized for the most popular games, and exceptions coded only for those nearly as popular but different enough for the default settings to be sub-optimal.
Re:Given that, is it really wrong? (Score:2)
The point is that users can already lower the settings to get better framerates. Why don't they let the users decide? Because that wouldn't help their scores in one of the most popular benchmarks, of course. The main problem with the claim that they are just tweaking the driver to help out with a certain game is that they didn't disclose this little tidbit to reviewers! If this is such a noble effort to help the gamers get more performance, why haven't they said anything about it?
Re:Given that, is it really wrong? (Score:2)
But can they do it in the same way?
Just how complicated should user configuration be? If it's a matter of juggling a dozen different variables to come up with an optimal view, then it's better not to give that control to the user, though it would be impolitic for them to say so. Regardless of how complex the problem is, most gamers (especially the more vocal ones) would believe they could come up with a better configuration, so they'd fiddle with the settings, then blame any problems this causes on the video card and be bitchy over the time they "had to waste to get it to work half-decently."
The problem doesn't likely reduce to a single slider-control that runs from high-quality/low speed to low-quality/high speed.
Re:Given that, is it really wrong? (Score:2)
It's quite easy to change the detail levels and such in Quake3. Anyone could do it without even knowing a thing about graphics. Of course it's possible to get even finer control by juggling a dozen different variables, but it's by no means necessary. And ATI isn't giving them that fine control with this hack either. It does the same thing that the simple Q3 options let anyone do (i.e. increase framerate at the expense of image quality), except that ATI doesn't let the user choose for himself.
Re:Given that, is it really wrong? (Score:2)
That doesn't mean that they aren't exercising that fine control. My whole point was that it would be counterproductive to offer that complex, fine control to the end-user.
It does the same thing that the simple Q3 options let anyone do
There is no evidence to support this claim. There are many aspects of image quality ("more things in heaven and earth than are dreamt of in your philosophy"), and nobody has presented evidence that the exact performance of the Quake exe can be duplicated in the Quack exe merely by changing settings available to the user.
I don't accept that the Quake images are simply of worse quality, either. I've looked at the images, and while there are distinct differences, it's hard to say which is better. In the zipped TGAs provided, there is more detail in the textures of the Quack sample (especially noticeable on the teeth), but some of the complex relief shading (particularly on the armor) seems indistinct by comparison. Also, the jaggies seem worse in the Quack version. It's even harder to say which looks better when you're playing just by looking at stills.
Besides, maybe Quake 3 has a few scenes that cause it to really bog down with the default settings: you go around, happily accepting your 15% FPS hit for the slightly better image quality, then you go around a corner and suddenly it's an 80% FPS hit in a crucial scene, and practically unplayable. It's better not to allow users to set it that way if it's not sustainable, because people get really annoyed at that kind of inconsistent performance.
The fact of the matter is that we don't know why they did it, and until you try it for yourself, you don't even know what the real effect is. Of course it's possible that this was an underhanded trick, but we don't know that yet. I'm sure we'll hear more as the testers dig into it and ATI responds, so keep an open mind instead of jumping to conclusions.
Re:Given that, is it really wrong? (Score:2)
This IMHO is deception, because when I see benchmarks for Q3A High Detail I'm expecting a representation of how the card performs when rendering textures at the highest possible detail, not at detail levels lowered to boost benchmark scores. This would be like ATI advertising their cards' Medium Detail scores compared to nVidia's High Detail scores, and justifying it by saying that when I'm playing Q3A I probably won't notice the difference.
Re:It IS wrong... (Score:2)
If tweaked benchmarking takes off- we might see reviews such as the following:
nVidia GeForce 4 Extreme Titanium Edition Plus
240 fps-- looks like crap, though. nVidia needs to stop using 12-bit textures.
ATI Radeon III 65536
210 fps-- slightly slower speed, but sharp images.
nVidia GeForce 4 (with firmware hacked to use "Pro" drivers)
200 fps-- absolutely beautiful.
Re:It IS wrong... (Score:2)
Re:It IS wrong... (Score:2)
Yes, that is true, but that misses the point. When I read a review/comparison of video cards, I am assuming that the benchmarks are based on fairly real-world situations. That's what makes Quake 3 such a great benchmark - almost every gamer has played it, and is at least generally familiar with how torturous it is to the standard video card. Everyone also knows that you can improve the FPS by lowering resolution, using 16-bit textures, etc.
When I look at a comparison of Quake benchmark scores on, say, Tom's hardware, they are meaningful because they are a comparison of video cards, all else being equal. If ATI is specifically lowering quality on Quake3 (not just the Quake3 engine, but the game itself) to increase FPS rate, clearly they are doing this to stand out in such comparisons, and clearly this is - if not outright fraudulent - misleading and manipulative.
Re:It IS wrong... (Score:2)
People who buy fast videocards usually do so because they want high FPS in Id games -- Quake 3 more recently (god only knows why). So they're giving the people what they want. If only the music industry worked like this...
This also affects gameplay in quake (Score:2)
To appreciate the importance of textures and structured surfaces for depth perception, try Q3A with different texture settings and see what a difference that makes in a map like q3dm19. In Quake 1 you could even switch off all textures; then, staring at a plain wall, you couldn't tell if it was 10cm, 1m or 10m away.
Re:It IS wrong... (Score:2)
But all people seem to care about is 3D speed. Can you really blame a company for catering to what the people who pay the bills want?
I certainly can, when they are deceptive about it. They are automatically dropping the image quality. The user could do this himself if he wanted to, the options are there. So, helping the users is obviously not why they are doing this. What does that leave? Helping themselves in benchmarks, of course. By not disclosing this tactic, they have committed fraud IMO.
Fraud?? (Score:2)
Fraud?? I guarantee that there is no claim about image quality, nor even one about frame rates. Furthermore, I'll bet the EULA for the drivers makes it clear that they are not making any particular claims and are not liable if it doesn't somehow live up to your expectations.
Personally, I'm glad to see application-specific enhancements. This whole thing is wildly overblown.
Re:It IS wrong... (Score:2)
Your comment that, "Nvidia has been doing this for years," isn't really fair, anyway. nVidia has always been largely upfront about what rendering techniques are being used. ATi's gaffe here is that they're sacrificing quality for speed and not telling anybody about it. 3D benchmarks try to compare apples to apples as much as possible, but ATi is trying to sneak in an orange.
Re:It IS wrong... (Score:2)
Well, what happened according to the German article they linked to was that they dropped their textures from 32-bit to 16-bit when the drivers detected that they were running Quake. This would not happen in the spoofed Quack executable. Doesn't this strike you as a bit shady? People running benchmarks might not notice the degraded quality, but would certainly notice the higher framerates.
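If the German article is right about the mechanism, the downgrade itself could be as simple as substituting a 16-bit internal texture format for the 32-bit one the game asks for at texture-upload time. A hypothetical illustration: GL_RGBA8 and GL_RGBA4 are real OpenGL internal formats, but the function and flag below are invented, not ATI's code.

```c
/*
 * Hypothetical illustration of how a "drop 32-bit textures to 16-bit"
 * tweak could be applied when a recognized executable is detected.
 * GL_RGBA8 and GL_RGBA4 are standard OpenGL internal formats; the
 * function name and the force_16bit_textures flag are invented.
 */
#include <GL/gl.h>

GLenum choose_internal_format(GLenum requested, int force_16bit_textures)
{
    if (force_16bit_textures && requested == GL_RGBA8)
        return GL_RGBA4;   /* 16 bits per texel instead of 32 */
    return requested;      /* unrecognized apps keep what they asked for */
}
```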
On a loosely related note, where'd you pick up a GF3 Ti 500? I just finally got one ordered yesterday because nobody online had any stock of them before then. At least for the Leadtek one I was looking for.
Actually, I found the Visiontek one staring me in the face at Best Buy of all places. Yeah, you can do better than $350, but not by more than 10-15% or so. A bit of an impulse buy, but I decided that being able to get it right then and be able to take it back myself if there was a problem was worth the extra money.
pick your battles, slashdot (Score:2, Interesting)
The update or improvement of such software is probably intended, first, for the new Quake buyers. It's a company that occasionally serves a fan base; it's not enslaved to the fan base that owns all the previous versions.
It's like a new model of a car with a beautiful v-8 engine that previous models have always used. If the new model is configured to optimize engine performance, it's not discrimination against collectors of previous models.
Re:pick your battles, slashdot (Score:2)
What ATI's doing is more like this: A beautiful V8 engine that makes 300 bhp on most roads, but can detect when it's on I-80. There, it makes 320 bhp, but knocks like an over-caffeinated Jehovah's Witness. Oh, and you have to change all the signs along I-80 to read "J-80" to get the engine to quit knocking and behave right.
If ATI could have found a few more frames per second without making the textures all fuzzy, more power to 'em. But I can get the same effect from another card by turning down the texture settings myself. This stunt served only ATI's PR flaks, up to the point where they got caught.
Re:pick your battles, slashdot (Score:2)
When I asked how these tires would work on my uber-blown V8 Pinto wagon for those straightlines down the quarter-mile, he told me that they wouldn't do much good unless I was trying for the best ET down at the duck marshes.
Can you believe it? The tire manufacturers make tires that perform exceptionally well in specific situations. There ought to be a law against it, that's for sure.
Re:pick your battles, slashdot (Score:2)
I believe the phrase you are looking for is truth in advertising.
If ATI had said upfront that the drivers were enhanced for Quake III, there wouldn't be nearly as much fuss (except maybe from us Unreal Tournament fans).
hey guys (Score:4, Funny)
Cmon editors (Score:2)
Quake 3 is the biggest thing... for people who haven't found that everything else is better. HA! Now let the flamewar begin!
Probably benchmarking (Score:2)
Not if they:
a) didn't sacrifice performance for other games ONLY to get more out of quake3 (probably not the case)
b) admit that its true, if it is true
I suppose the ulterior motive isn't better quake3 frame rates for ATI owners, but rather more impressive benchmarks, seeing as quake3 is such a standard graphics card benchmark. So if they are claiming that quake3 didn't get any special attention, but they DID give it special attention for benchmarks, well, that's a little misleading. Otherwise, I don't see anything inherently wrong with adding some post-design juice for the benefit of all the quake3 players out there.
Personally, I think they did it for better visibility in benchmarking.
Just a guess, but... (Score:3, Insightful)
It's also possible that the Quack-quake transfer screwed something up in Quake- I'd be interested to see how a quackified exe performed on NVidia's chipset.
See this statement at the end of the text:
John B. Challinor II APR - Director, Public Relations at ATI Technologies Inc. "ATI optimizes its drivers on many different levels, including the application level, the game engine level, the API level, and the operating system level. That is, some optimizations work only on specific games, while others work only on specific game engines or only on specific operating systems. In the case of Quake III and Quake III Arena, we were able to achieve certain optimizations specifically for that game, as we do for other popular games. "
Bah, I don't even see where the "Optimizing for Quake 3 only" comes in. The Quake series has been and still is the benchmark of 1st person, 3D FPS graphics.
That being said, it would be convenient to have a checkbox in their control panel "(X) Enable unstable 3D support. May speed up certain apps, may cause problems. Use at own risk."
Try reading the article (Score:2)
Well, then why don't you read the article...
In closing we would like to say that all the same testing was run on the latest set of NVIDIA DetonatorXP drivers without any of the same issues.
Read the article, and look at the screenshots. (Score:2)
First off, the only thing the "quackified" executable screwed up was the string Quake.
Second, if you look at the screenshots taken with Quake vs. Quack, you'll see that the Quake screenshots are of far worse quality. Their "optimization" was to detect that quake was running and reduce quality to get higher benchmark scores.
There is always the chance that their honestly intended game-specific optimizations somehow had reduced quality in Quake as a side effect, and simply stop working under any other executable name, such that these results were produced. But Occam's Razor cuts that to pieces, since at that point you're still not doing anything but turning down a detail slider.
Re:Read the article, and look at the screenshots. (Score:2)
You can write drivers that are unstable, or drivers that are slow, but you don't mistakenly add in a tweak that speeds you up and also destroys your image quality. I mean, that'd only make sense if the guy really was drunk, and I doubt ATI would let their workers come in wasted.
This is not simply optimizing for the game. (Score:5, Insightful)
The problem here is that it appears ATI has gained performance by reducing the image quality -- forcing a reduced texture resolution specifically in Quake 3. Compare the screenshots shown on the site. This means comparing their benchmark scores on Quake 3 with other cards is meaningless -- their card isn't performing the same task. This was a bad decision on ATI's part.
Alex Mohr
Somewhat unavoidable... (Score:2)
When a developer is making a game they end up doing tricks to get the best performance out of the most common cards. So what happens is the more prominent developers make contacts with the driver developers at the video card companies, who make specific changes to accommodate some feature or design that the game developer needs. This often works both ways, with the driver developers guiding the game developer on how to get better performance, etc.
The point is that 3d graphics are complicated enough to not make it as simple as having an API that performs the same on multiple games and cards. Both the game developers and the video card manufacturers are doing this stuff. I doubt you will see this changing in the near future. But I don't think it's a conspiracy.
Beta ATI "Quackified" Drivers Released (Score:4, Funny)
With the release of ATI's newest Radeon 8500 and 7500 graphics cards, hardware review sites have purportedly been using ATI drivers that have been specifically optimized for Quake III.
Various ATI fan sites are now reporting new "Quackified" drivers, originally authored by Kyle Bennett of [H]ard|OCP. Rumors are flying about this unofficial driver's unfair optimization of games such as "Duck Hunter 5: More Buckshot" and "Donald Duck's Red-Light District Exploration".
"Wow, the animated ducks are faster, and die better," one anonymous gamer said on a forum. "And Donald gets so better action with these new drivers!"
ATI spokeduck, Rob Erduckie, denies any involvement in these modifications. "The claims are just false," said Rob. "We do not believe in unfairly offering advantages to one side or another."
Rob also made reference to cheating, "We also vehemently oppose offering cheat options, such as Asus's 'See-Through Duck' modification. We're totally about fair game play."
Environmentalists have been picketing federal facilities today in protest of the unfair portrayal of their favorite bird, carrying picket signs with slogans such as "Free the Ducks!," "No luck for Ducks," and "Ducks Need Rights Too!"
Department of Fish & Game officials were unavailable to comment.
The Linux penguin released a brief statement: "I understand the pain that ducks are going through right now. Did you read what Linus said about me? 'A happily drunk penguin who just got some'? Sheesh!"
What I'd like to see... (Score:2)
Question: Is this just a benchmark-boosting hack or does it actually improve the frame rate while playing the game?
Observation: With frame rates of 80+ at even the highest resolution on the HardOCP test box, it's difficult to see if there is any ACTUAL BENEFIT resulting from using ATI's drivers.
Suggestion: Repeat their tests with the original and with the quackified executables on a less powerful box so that the actual framerates are more like 10-15 fps.
Result: If the drivers actually help the game play, at that low frame rate, it should be readily apparent. If there's NO difference in the game play, then it's just a hack to boost the benchmark scores.
ATI good, NVidia bad (Score:2)
Did I miss something? (Score:2)
It's quite possible that rendering the different letters could account for the different frame rates. I'd be surprised if it were 15% but I think that if Nvidia dropped as well with the modified text, then that would show that the text simply took longer to render.
Wrong again (Score:4, Funny)
Quake III is NOT the biggest game on the PC (Score:2)
The Sims
Quake isn't anywhere near the biggest game on the PC. The Sims is a $100 Mil industry unto itself at this point.
The argument could be made that The Sims isn't a game. But, it gets charted with other PC entertainment sales, so for this argument it must be treated as a PC game product.
Unfair to competition (Score:2)
On the other hand, this benchmark seems sketchy to me. There are a lot of variables that go into large applications such as Quake, and an example might be (although this is purely hypothetical) that there are resource files tied to "Quake.exe" first, with alternate, slower methods being accessed otherwise. When the name is changed from Quake to Quack, the slower methods have to be used. That's just a made-up example, but it's the type of thing that needs to be taken into consideration. However, like I said before, if these people actually did make optimizations for Quake and only for Quake, I think what they did was unfair and harmful to computer users.
Umm... (Score:2)
From a business point of view it's not the wisest thing to do. PC gamers have a tendency to be an extremely rabid bunch, buying mobos, processors, graphics cards and anything else that lets them milk that last bit of performance out of games. They do this frequently, by keeping up on all the latest hardware and its associated benchmarks and purchasing accordingly. They will even go to silly lengths to make sure what they are buying is the best, such as doing a grep for Quake3 and changing it to Quack3, then seeing if the performance is the same. Even without such lengths, a gamer would want to be sure that more than Q3 was fast on their hardware, so that when the next rage comes along, they can buy the game and expect it to run fast. So, ATI is shooting themselves in the foot by focusing on one game's performance rather than going for general performance, and as such gamers won't buy their cards....
Comparing the effects of the quake3 "optimization" (Score:2)
Good. (Score:2)
Everyone and their mom can do a review with Quake 3 and write a report claiming to know what they're talking about. Reviewers will now have to come up with their own benchmarking tools to convince end-users of the validity of their benchmarks.
Even though it is underhanded of ATI, it'll all work out in the end. Sort of a "can't fool all of the people all of the time."
I'm curious about one thing (Score:2)
What exactly had them poking around in the first place, looking for evidence of this? Not that I think they're being disingenuous or have anything to hide, but it's not like we all just get the idea into our heads to run strings on drivers and come up with ways to "quackify" binaries :)
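For what it's worth, the poking around itself is fairly mundane: dump the printable strings in the driver binary and notice a game's name sitting in there. Below is a rough stand-in for running strings on the file and grepping for "uake" -- the driver file name is just a placeholder.

```c
/*
 * Rough stand-in for running `strings` on a driver binary and grepping for
 * "uake": scan the file for printable ASCII runs and print any that match.
 * The file name below is a placeholder, not necessarily the right driver file.
 */
#include <ctype.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *f = fopen("ati_driver.dll", "rb");   /* placeholder name */
    if (!f) { perror("fopen"); return 1; }

    char run[256];
    size_t len = 0;
    int c;

    while ((c = fgetc(f)) != EOF) {
        if (isprint(c) && len < sizeof(run) - 1) {
            run[len++] = (char)c;              /* extend the current printable run */
        } else {
            run[len] = '\0';
            if (len >= 4 && strstr(run, "uake"))
                printf("%s\n", run);           /* interesting string found */
            len = 0;
        }
    }
    fclose(f);
    return 0;
}
```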
Didn't they do this before? (Score:2)
Quake 3 is the biggest game out there on Windows (Score:2)
The only place I know of where Quake3 is the biggest game out there is for benchmarks. That's why it's unethical. ATI is trying to manipulate benchmarks to make their product seem better than it really is.
A more complete article... (Score:2)
Optimizing drivers (Score:3, Insightful)
Which gets us back to the issue at hand. I don't know anything about the inner workings of the Radeon driver, but there are probably a number of similar tradeoffs involved in its design. The most reasonable interpretation is *not* that the Radeon driver is optimized for Quake 3 at the expense of other programs. If that were true, it would run at the same rate whatever it were named. The better explanation is that when the driver knows what program is being run (such as Quake 3) it optimizes itself to the known characteristics of that program, and when a program the driver knows nothing about (such as "Quack") is run, it uses default settings.
Thus, it's not necessarily favoring Quake 3 over other applications, but is instead using optimizations for known programs which are not available for unknown ones. There's nothing in this article to indicate that similar optimizations haven't been made for Counter-Strike, Half-Life, or any other popular 3D programs.
remember Dhrystone? (Score:5, Informative)
So fucking what? (Score:2)
This story really seems to be nothing more than turd-stirring, especially since they haven't looked for any other improvements. Poor journalism brought to us by a bunch of hacks.
My simple test case... (Score:2, Insightful)
So. If ATI didn't think they were doing anything wrong, there'd be something like a sort of freaky stepchild of an id/ATI agreement where ATI would plaster "Quake 3 optimized!" all over their boxes and take underhanded swipes at Nvidia et al. in their press releases about it being an exclusive.
They didn't. So it's clear, to me, that regardless of what the Slashdot/HardOCP/etc. community thinks, ATI thought it was scummy enough to keep it under wraps, AND make a non-statement regarding it once they'd been caught.
Pretty damning, I'd say. I can't wait to hear what Tom has to say about this. (Or has he spoken up already?)
Conclusions and delusions: (Score:2)
The unmitigated gall of some companies. What is next? A sequel to Moon Over Orion being called Moo3.exe? When will the horror ever stop?
Ok, my sarcasm and stupidity stops now.
In essence ATI is trying to make their hardware look better. Compared to the current GF3's I understand they stack up very well, but in "classic ATI fashion" their drivers blow goats/ducks/chunks (insert colorful phrase).
ATI has always been a mass hardware producer and is now trying to break into the high-end gaming market... they have the visual quality, that is a given, but their speed has always been lacking until recently.
But, their lack of quality drivers has been dogging them and they have always tried to duck the issue...so maybe there is a subliminal message to this "quack.exe" thing.
( i could not resist, sorry, that was too good to pass up).
I just sincerely hope we do not end up with an ATI-only version of D3D or GL, or GL-ATI-ide.
They will just be painting themselves into a corner like 3dfx did to some extent.
Yes, it is. (Score:3, Insightful)
Quake ]|[ is THE standard for PC game benchmarks. John Carmack's engines are generally regarded as the best and fastest in the industry, and they test overall performance of a system without getting bogged down on the CPU like other engines do. The Quake X engines also tend to support just about every performance-enhancing feature they can (even if the games themselves may not take advantage of it). Quake X engines also tend to be the most OpenGL-compliant engines around - something that figures greatly into why ATI would do this.
By focusing driver development on Quake ]|[, ATI is able to produce a card that will perform very well on the standard PC benchmark. Honestly, I would rather have a card that performs well on any system out there. ATI has always had horrible problems with OpenGL performance caused by weak drivers, and this has long been one of the biggest criticisms of their cards. By rewriting the driver to show a great amount of Quake ]|[ performance, ATI is able to convince potential buyers that they have been fixing the OpenGL code, which, if Kyle's speculation is correct, is probably one of the sleaziest things in the history of computer hardware.
I will be keeping a close eye on this one in the next few days. If this is true, I will be changing my plans to buy a new Radeon to buying a new nVidia card - because nVidia has never given me such a reason to distrust them. On top of that, nVidia drivers are custom hacked for specific cards by other vendors, so if nVidia did try this, people would leak the truth.
This has the potential to really harm ATI. If ATI loses the faith of gamers, OEMs will continue to abandon ATI for nVidia. At a time when the global economy is already faltering, ATI does not need any lost sales, and if they look weak they could lose the support of companies like Dell and Apple that are already moving to nVidia.
Carmack Troll (Score:2)
Re:Carmack Troll (Score:2)
Why the heck is this string-based? (Score:2)
Why would they _not_ implement this speed/quality cheat for everything they could? If they were worried about benchmark programs noticing quality problems, the logical thing to do would be to special-case WinBench, not Quake.
Re:Why the heck is this string-based? (Score:2)
Game optimizations aren't new people (Score:2, Interesting)
What about Tribes 2? (Score:2)
Quake3 not the biggest game. (Score:2, Redundant)
According to GameSpy [gamespy.com], Half-Life is a little over 10 times as popular as Quake3, and Unreal Tournament is slightly more popular.
Hercules (Score:2)
This is not tuning for a program, this is cutting corners. If the simple change of "quake" strings to "quack" causes a 15% drop in speed (if the hack really doesn't change anything else - would need to try other cards as well or something like that), then what they're doing is skipping routines.
I remember that Hercules once did this a good few years ago. Their drivers watched for repetitive procedures and then skipped some of the repetitions, giving falsely high results. They got a pounding in the news because of it.
Re:Uh. Something isnt right here (Score:2)
The drivers check to see if the program calling them has the string "uake3" in the name. If it does, they use a certain set of internal quality settings. If it doesn't, they use a different set of internal quality settings.
What they are doing is having the video cards cut corners just for Q3 to make the benchmarks run better.
Re:Uh. Something isnt right here (Score:2)
Look at the screenshots! Look! (Score:4, Informative)
The screenshot from Quake is clearly of a lower quality than the one from Quack -- it's especially obvious on the texturing of the teeth of the "mouth". From this I can only conclude that they are getting the extra boost by sacrificing image quality for a specific game used in benchmarks.
As to why they don't have a checkbox - because anyone who actually wants higher framerates at the expense of quality can do so within a game's settings menu. What compromise you want to make between quality and speed will vary from game to game. This checkbox would be system-wide, and not satisfactory.
Plus, no benchmarker would have run with the "15% faster" option, as that would violate the premise of benchmarks run under "highest quality". So if they did that, their little hack wouldn't have helped their Quake scores.
Re:Uh. Something isnt right here (Score:2)
(grin)
Re:Uh. Something isnt right here (Score:2)
Re:Uh. Something isnt right here (Score:2)
Re:Uh. Something isnt right here (Score:2)
Re:Uh. Something isnt right here (Score:2)
Re:Uh. Something isnt right here (Score:2)
Re:Not only quake 3 then (Score:2)
Re:The right way? (Score:3, Interesting)
Because that would require actual work.
Knowing the coding community as we do, which is more likely: this was written by work-obsessed coders who want to make the best drivers possible, or written by a handful of people who are pissed at management and just want to make it -look- faster so they can get more money for the least amount of effort, go home, and be with their families?
Re:The right way? (Score:2, Informative)
IMHO, what probably happened is a developer actually implemented a speedup / namecheck and forgot to disable it before checking it in. Or management has gone insane. You decide.
Re:The right way? (Score:2)
Are we forgetting this is ATI? Your situation isn't even remotely plausible... remember this is the ATI that has had significant difficulty releasing a stable Windows 2000 driver. And remind me, how long has Windows 2000 been released (let alone how long the developers have had it)?
I specifically chose to buy an NVidia product because ATI's Windows 2000 drivers were/are terrible. I don't care if I get
Re:When you change references is that all you are (Score:2)
Possible, but it would be unlikely that a string replace would result in a lower framerate - if the affected code branch was executed, an immediate crash would be much more likely. If it wasn't executed, then no differences would be evident, whether performance-related or crashes.
In summary, I doubt that they have code in their driver saying "if quake3 then overclock else underclock" or something. That would make no sense.
On the page, they mention that the string "uake" in fact shows up in the ATI drivers. It actually seems to be that they degrade image quality in favor of framerate for Quake. As for making no sense, it is very common for drivers to be optimized for benchmarks at the expense of general use.
Re:The nerve of ATi! (Score:2)
You are rushing to judgement. It's not "optimised"; it's visually crippled to gain scores in benchmarks. At the same quality, the 8500 runs SLOWER than a $99 Ti 200 in Q3.
That wacky ATI... (Score:2, Interesting)
If it wasn't for that goddamn ad ATI ran in Computer Shopper this month (S&M sells, bay-bee!) I would be very enthusiastically defending them. Right now, I am more likely to spend the extra bucks and grab a Matrox because the idea of using an image of violence against women to sell video cards is repugnant to me.
Oh yeah, I buy my DVDs used, too.
Re:ATI Replies (Score:2)
Re:These Quake 3 optimizations arent new (Score:2)
Nonsense. If that was what their users wanted, their users could have selected something other than 'High Quality' in the Quake3 video configuration screen.
ATI did this because it was what ATI wanted, a high framerate number on benchmarks.