ATI Drivers Geared For Quake 3?

alrz1 writes: "HardOCP has posted an article wherein they accuse ATI of writing drivers that are optimized for Quake 3, just Quake 3, and only Quake 3. Apparently, using a program called quackifier, which modifies the Quake3 executable by changing every "Quake" reference to "Quack" and then creating a new executable called "Quack3", they have demonstrated to some extent that the Quack3.exe benchmarks are around 15% slower than with the original Quake3.exe (same box, os, drivers, etc). The slant seems to be that there is something inherently wrong about writing game-specific optimizations into drivers, if in fact this is what ATI has done. I think this is perfectly acceptable: Quake 3 is the biggest game out there on Windows, and if ATI has invested a little extra time into pumping a few extra (meaningless) frames out of your Radeon 8500, is this really an act of treachery?"
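A minimal sketch of that renaming trick (a hypothetical reimplementation for illustration; HardOCP's actual quackifier tool may differ, e.g. in how it handles case variants): copy the executable byte for byte, replacing every occurrence of "Quake" with the same-length string "Quack", so file offsets and size are untouched and the program still runs.

    /* quackify.c -- hypothetical sketch of the renaming trick; not
     * HardOCP's actual quackifier tool. Same-length replacement keeps
     * every file offset valid, so the patched executable still runs.
     *
     * Usage: quackify quake3.exe quack3.exe
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s <in.exe> <out.exe>\n", argv[0]);
            return 1;
        }

        FILE *in = fopen(argv[1], "rb");
        if (!in) { perror(argv[1]); return 1; }

        /* Slurp the whole file into memory. */
        fseek(in, 0, SEEK_END);
        long size = ftell(in);
        fseek(in, 0, SEEK_SET);
        unsigned char *buf = malloc((size_t)size);
        if (!buf || fread(buf, 1, (size_t)size, in) != (size_t)size) {
            fprintf(stderr, "read failed\n");
            return 1;
        }
        fclose(in);

        /* Replace every "Quake" with the equally long "Quack". */
        const char *from = "Quake", *to = "Quack";
        size_t n = strlen(from);
        long hits = 0;
        for (long i = 0; i + (long)n <= size; i++) {
            if (memcmp(buf + i, from, n) == 0) {
                memcpy(buf + i, to, n);
                hits++;
            }
        }

        FILE *out = fopen(argv[2], "wb");
        if (!out) { perror(argv[2]); return 1; }
        fwrite(buf, 1, (size_t)size, out);
        fclose(out);
        free(buf);

        printf("patched %ld occurrence(s)\n", hits);
        return 0;
    }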
  • It IS wrong... (Score:5, Informative)

    by levendis ( 67993 ) on Wednesday October 24, 2001 @02:47PM (#2473644) Homepage
    Take a look at this article [3dcenter.de]; it's in German, but the pictures are worth 1000 (English) words. Mouse over the ATI pics to see the "cheat" version versus the normal ATI version. Clearly they are sacrificing image quality for speed.
  • Pretty lame cheat (Score:1, Informative)

    by Anonymous Coward on Wednesday October 24, 2001 @02:55PM (#2473715)
    It is perfectly acceptable to write optimizations for specific games, applications, etc. It is not acceptable to make those optimizations at the expense of quality in order to achieve better benchmark results. And it is not acceptable to represent your results (quality of rendering) as equal in quality to the competition's when in fact they are not (they don't tell anyone they are switching to a lesser-quality mode).

    Let me explain:

    It would appear from the images posted on HardOCP that the drivers are running in a reduced color depth/dithered mode. There is obvious banding in the images in shallow gradient areas (shallow color ramps) such as the on-screen text.
    http://www.hardocp.com/files/cool_stuff/quackcompare.zip

    If ATI is really doing this (and I think they are), and they are quietly not telling the rest of the world that they switch to this mode when they detect Quake 3, then I totally agree with HardOCP: it's a cheat, not an optimization. If they provided the user with the ability to choose the mode themselves, then it would be OK, but they don't, and it's a sneaky little cheat in order to gain an extra few percentage points on Quake 3 benchmarks.
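    To make the alleged down-conversion concrete, here is a minimal sketch (an illustration of the general technique, not ATI's actual driver code) of what dropping 32-bit textures to a 16-bit RGB565 format does to a smooth gradient: truncating each 8-bit channel to 5 or 6 bits leaves runs of consecutive input values mapping to one output value, which is exactly the stair-step banding visible in the screenshots.

        /* banding.c -- toy illustration, not ATI's driver code. */
        #include <stdio.h>
        #include <stdint.h>

        /* Pack 8-bit R,G,B into a 16-bit 5:6:5 texel, discarding low bits. */
        static uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
        {
            return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
        }

        /* Naive expansion of the red channel back to 8 bits for display. */
        static uint8_t red565_to_888(uint16_t t)
        {
            return (uint8_t)(((t >> 11) & 0x1F) << 3);
        }

        int main(void)
        {
            /* A smooth gray ramp: every run of 8 inputs collapses to one
             * output -- the steps that appear as banding on screen. */
            for (int v = 0; v < 32; v++) {
                uint16_t texel = rgb888_to_rgb565((uint8_t)v, (uint8_t)v, (uint8_t)v);
                printf("in %2d -> out %2d\n", v, red565_to_888(texel));
            }
            return 0;
        }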

  • Re:It IS wrong... (Score:3, Informative)

    by WolfWithoutAClause ( 162946 ) on Wednesday October 24, 2001 @03:25PM (#2473908) Homepage
    I ran the page through Babelfish; the Germans suspect that the drivers run the display at 32-bit color but drop the textures down to 16-bit.

    Sounds a bit cheeky to me. The kind of frame rate you get with these cards is already very high; I suspect most people would rather give up some frame rate and keep the image quality. If all this is right, the company has basically screwed their customers for a better benchmark, to sell more cards or to push the price up on the cards they sell (IMHO).

    Still, if you pay more for a graphics card for 10% extra performance when the performance is as high as this anyway, you are practically begging for them to trick you, I suppose. Doesn't make it right, though.
  • Re:The right way? (Score:2, Informative)

    by Galahad ( 24997 ) on Wednesday October 24, 2001 @03:32PM (#2473952) Homepage
    As a former slav^H^H^H^H employee of ATI (Rage Pro D3D drivers), I can add a little bit here. When I worked there, we used games (many titles, to boot!) and WinBench as test platforms. Many times we would find a way to speed up routines used in these games, only to find it broke some stupid little D3D app that had to be 'perfect' or Microsoft would not pass the driver through WHQL (so the driver could not be 'certified' and would not ship on the Windows CDs -- and it was very important to be there). These apps -- Rock'em Robots, Twist, etc. -- came with the DirectX SDKs and had to run, and run well. We'd try to massage the optimization so that we'd sacrifice some of the speed gain in favor of the test apps. Sometimes that wasn't possible. Back then, we discussed checking for application names but never implemented the checks, because the PR would be too bad.

    IMHO, what probably happened is that a developer actually implemented a speedup/name check and forgot to disable it before checking it in. Or management has gone insane. You decide.
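    For illustration, a name check of this kind could be as simple as the following sketch (hypothetical code, not ATI's driver -- a real driver runs in a very different environment, but a user-mode component such as an OpenGL ICD could do roughly this): ask Windows for the path of the executable that loaded it and compare against a hard-coded string.

        /* namecheck.c -- hypothetical sketch of an application-name
         * check; not ATI's actual driver code. */
        #include <windows.h>
        #include <stdio.h>
        #include <string.h>
        #include <ctype.h>

        static int host_process_is(const char *name)
        {
            char path[MAX_PATH];
            /* NULL module handle = the executable of the current process. */
            if (GetModuleFileNameA(NULL, path, sizeof(path)) == 0)
                return 0;
            /* Lower-case in place for a case-insensitive substring match. */
            for (char *p = path; *p; p++)
                *p = (char)tolower((unsigned char)*p);
            return strstr(path, name) != NULL;
        }

        int main(void)
        {
            /* A driver would branch to game-specific settings here. */
            if (host_process_is("quake3"))
                puts("quake3 detected: taking the game-specific path");
            else
                puts("taking the generic path");
            return 0;
        }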
  • by Chris Burke ( 6130 ) on Wednesday October 24, 2001 @03:40PM (#2474001) Homepage
    In the HardOCP article there is a link to a .zip with two uncompressed screenshots -- one from their run of Quake, and the other from "Quack".

    The screenshot from Quake is clearly of a lower quality than the one from Quack -- it's especially obvious on the texturing of the teeth of the "mouth". From this I can only conclude that they are getting the extra boost by sacrificing image quality for a specific game used in benchmarks.

    As to why they don't have a checkbox: anyone who actually wanted to trade quality for higher frame rates would do it within a game's settings menu. The compromise you want between quality and speed varies from game to game; a driver checkbox would be system-wide, and not satisfactory.

    Plus, no benchmarker would have run with a "15% faster" option, as that would violate the convention of benchmarking at "highest quality" settings. So if they had done that, their little hack wouldn't have helped their Quake scores.

  • Re:Wha?? (Score:5, Informative)

    by John_Booty ( 149925 ) <johnbooty@NOSPaM.bootyproject.org> on Wednesday October 24, 2001 @03:57PM (#2474088) Homepage
    "Have they tested other games as well to see?"
    "how do we know that these optimizations don't indeed effect other games as well"

    If you actually read the article, you'd know the answers to these questions. I suggest reading the HardOCP piece... it's a good one.

    I strongly disagree with the original poster's assertion that "The slant seems to be that there is something inherently wrong about writing game-specific optimizations into drivers"... I think that HardOCP is completely NEUTRAL on the issue; they simply want to know the truth.

    Remember, they run a LOT of benchmarks on video cards. Q3 is a common benchmark program... lots of people buy cards based in part or in whole on Q3 performance, under the assumption that Q3 performance is fairly representative of a card's performance in other games. So if ATI is skewing results only for Q3... well, that's not "wrong", but testers and buyers NEED TO KNOW this so that they can interpret Q3 benchmarks accordingly. I applaud HardOCP for raising this important issue.
  • No, but... (Score:5, Informative)

    by mblase ( 200735 ) on Wednesday October 24, 2001 @04:19PM (#2474173)
    It's not unethical to optimize your hardware for a particular piece of software.

    It is unethical to then use that software for a competitive benchmark, without telling anyone you've done the optimizing.

    The first is an example of giving your customers what they want. The second is an example of manipulating independent reviews to give misleading data.
  • remember Dhrystone? (Score:5, Informative)

    by jejones ( 115979 ) on Wednesday October 24, 2001 @04:19PM (#2474174) Journal
    Remember the compiler whose authors hacked it so that it would recognize the Dhrystone benchmark and perform optimizations that happened to work for Dhrystone but couldn't be applied in general? (It's mentioned in Hennessy and Patterson, if memory serves.) This is the same sort of thing: doing something special for the benchmark that can't be done in general, which makes the benchmark figures misleading for their supposed purpose. Based on other messages already posted, this case is in fact worse than the compiler hack, because the compiler hack still produced a program that generated the expected output; the driver hack, according to the pages referenced in other posts, degrades display quality to get its speed. If I had bought that graphics card, heck yes, I'd be upset.
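    To illustrate the shape of that kind of hack, here is a made-up toy (not the actual compiler's code): a string-copy routine that recognizes one of Dhrystone's fixed 30-character literals and skips the usual byte-by-byte scan with a fixed-size copy -- a shortcut that is only safe under an assumption the benchmark happens to guarantee.

        /* sneaky.c -- made-up toy, not the actual Dhrystone compiler hack. */
        #include <stdio.h>
        #include <string.h>

        static void sneaky_strcpy(char *dst, const char *src)
        {
            /* Dhrystone copies 30-character literals like
             * "DHRYSTONE PROGRAM, SOME STRING" over and over. Recognizing
             * the prefix lets us copy a fixed 31 bytes (30 chars + NUL)
             * without scanning for the terminator -- valid only when the
             * length assumption holds, i.e., inside the benchmark. */
            if (strncmp(src, "DHRYSTONE PROGRAM", 17) == 0) {
                memcpy(dst, src, 31);
                return;
            }
            strcpy(dst, src);   /* honest general-purpose path */
        }

        int main(void)
        {
            char buf[64];
            sneaky_strcpy(buf, "DHRYSTONE PROGRAM, SOME STRING");
            puts(buf);
            sneaky_strcpy(buf, "an ordinary string");
            puts(buf);
            return 0;
        }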
