Quake 4 Graphics Performance Compared

Timmus writes "nVidia's huge lead in OpenGL performance is apparently gone. According to FiringSquad, ATI's latest hotfix driver brings major performance improvements to ATI's RADEON X1800 cards in OpenGL games like Doom 3 and Quake 4. The X1800 XT is now faster than the GeForce 7800 GTX, while the X1800 XL is faster than the GeForce 7800 GT in most cases. The article also includes GeForce 6800 Ultra/GT scores, including both in SLI. It's a pretty interesting read if you like graphical benchmarks." From the article: "A little over a week ago, rumors began spreading that ATI was working on a new tool that delivered substantially improved performance to their recently launched X1000 cards in OpenGL titles such as DOOM 3, Quake 4, and many others. Some reports claimed ATI's performance improved by up to 35% in these titles in 4xAA mode. Then, posts on Beyond3D's forums and sites like Guru3D confirmed these rumors. So how did ATI pull this off?"


  • by Bin_jammin ( 684517 ) <Binjammin@gmail.com> on Tuesday October 25, 2005 @01:40PM (#13873781)
    pull this off? Money, hard work, and development. Did you think it came from aliens?
    • Re:How did ATI... (Score:4, Interesting)

      by Enti ( 726249 ) on Tuesday October 25, 2005 @01:55PM (#13873921)
      ATI's drivers tend to be a bit shabby from the get-go (based on my personal experience over the past five-ish years). Assuming that the initial driver support for the X1000 series was horrible, such a large performance boost is understandable. It used to be that ATI cards were on par with Nvidia's and the poorly written drivers held them back. Can't say when exactly, but ATI started turning things around a bit before the 9800Pro was released.
      • Re:How did ATI... (Score:2, Informative)

        by LLuthor ( 909583 )
        Actually the last few months have seen nothing but great drivers from ATI. I have an X800 in one of my machines and every release from ATI is better than the last. I haven't seen any crashes for a long time, and although I am not a big gamer, I do play games frequently and they have been running great.

        I still stick to Nvidia for the time being, but ATI is nowhere near as bad as they used to be (except for Linux support where they still suck).
        • Re:How did ATI... (Score:2, Interesting)

          by Enti ( 726249 )
          I'm with you on that, having been on the ATI bandwagon with an X800 ever since my TI4400 exploded on me. I was just recalling days when some of the more popular games would never render quite right when running on an ATI card.
      • Re:How did ATI... (Score:3, Informative)

        by Guspaz ( 556486 )
        This isn't a question of ATI having poor drivers, it's a question of taking time to do optimizations.

        The X1000 series features programmable memory controllers. For Quake 4 (and Doom 3, so this may be a general OpenGL optimization) they have put together some new code for the memory controller that provides the large benefits.
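
        To make that concrete, here is a minimal, purely hypothetical sketch of the idea (the struct, field names, profile values, and SelectProfile are all invented for illustration; ATI hasn't published its driver internals): the driver carries pre-tuned memory controller profiles keyed by executable name and falls back to generic defaults otherwise.

        #include <cstring>
        #include <cstddef>

        // Hypothetical arbitration knobs exposed by a programmable memory controller.
        struct MemControllerProfile {
            int readBurstLength;      // requests batched per client before re-arbitrating
            int colorBufferPriority;  // arbitration weight for color/Z traffic
            int texturePriority;      // arbitration weight for texture fetches
        };

        // Per-game profiles shipped inside the driver, keyed by executable name.
        static const struct { const char* exe; MemControllerProfile profile; } kProfiles[] = {
            { "Quake4.exe", { 8, 3, 2 } },
            { "Doom3.exe",  { 8, 3, 2 } },
        };

        // Fall back to generic defaults when no per-game profile exists.
        MemControllerProfile SelectProfile(const char* exeName) {
            for (std::size_t i = 0; i < sizeof(kProfiles) / sizeof(kProfiles[0]); ++i) {
                if (std::strcmp(kProfiles[i].exe, exeName) == 0)
                    return kProfiles[i].profile;
            }
            MemControllerProfile defaults = { 4, 2, 2 };
            return defaults;
        }

        The point is only that the tuning lives in the driver, so it can change with a download instead of a new board revision.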
      • X800XL (Score:3, Informative)

        by alexo ( 9335 )
        Unfortunately it seems that the previous generation (X800XL) was hurt by the driver upgrade.
    • Re:How did ATI... (Score:3, Insightful)

      by Sancho ( 17056 )
      Maybe.
      Or maybe they tweaked their drivers for Doom3 and Quake4? Haven't read the article, so don't flame me if this is mentioned, but it wasn't all that long ago that it came out that ATI had been doing this for other popular games/benchmarks. It was easy to do (tweaking for specific cases instead of improving performance in the general case) and it made them look like the best on paper--win/win for ATI.
      • Ya, I'll be taking this with a grain of salt until there's a lot more independent confirmation. Everybody remember the quack vs quake debacle?

        At various times BOTH manufacturers have been guilty of "internal tweaks" to degrade quality for a performance boost in a popular benchmark. I hope it's not happening again, and that the performance increase is genuine (I have a vested interest, as I have an X800), but it could be another fraud.

        "Wait and see" is what I'll do.
        • Yeah, and who shopped the "quack" story around the hardware news outlets until someone (*cough* HardOCP) traded scruples for site hits and bought it?

          Anybody?

          That's right... nVidia. It's funny, isn't it, that when I hear "unscrupulous reporting" and "smear politics" I think of two things: the Bush Administration and the graphics hardware market. I'll take it with a grain of salt, but not because of some campaign two generations ago that was entirely orchestrated by the competition.
          • Ok, maybe it wouldn't have come to light without nVidia and maybe nVidia has done it too; so what: the Quack story wasn't made up and I am glad I heard about it. Through that story I learned a lot of the dirty stuff that both manufacturers do to artificially boost their numbers.
        • Programmable memory controller in new cards. Tweaks to it for these games. No quality degradation, though it is still directed at the two specific games.
        • Re:How did ATI... (Score:2, Informative)

          by DudemanX ( 44606 )
          You may have an X800 but if you read the article you'd see that this performance boost doesn't apply to you. It only works for X1800 (and maybe X1600 and X1300) cards that have the programmable memory controller. The X800 actually loses one or two FPS using the new driver, sorry :(

          • I noticed that, but I hope they address that by the time they make this the "official" release, so that either we can ALL enjoy an increase in performance, or at the least they don't screw something up that's already working fine.
      • They have and they haven't. The tweak is targeted at Doom 3 and Quake 4, and it is indicated that there is some game-detection going on, but this isn't a quality-sacrificing optimization. The X1000 series has a programmable memory controller, and it is obvious that some games benefit from certain memory controller settings. Since D3 and Quake 4 are popular games, they obviously deemed that it was worth the effort to try to find more optimal settings for the memory controller for these two games. They did, a
        • I never said there was quality-sacrificing going on--that was someone who replied to me. And my point stands--popular games might run better on this card if ATI decides to make a memory profile for them, but it tells us nothing about the general case.
          • It boils down to whether the tweak is applied only to Quake 4/Doom 3, or to OpenGL apps in general.

            The thing is that it is hard to tell without somebody who has both the beta driver and one of the new cards who can actually fire up some other OpenGL games. The difference in that case may be academic, since there are few (if any) major games that use OpenGL other than Q4/D3.

            The way I see it, they've had performance issues with OpenGL games, and it would seem, as far as most people are concerned, that they've fixed t
      • Re:How did ATI... (Score:3, Informative)

        by theantipop ( 803016 )
        Actually one of ATi's lead developers explained that they are simply taking advantage of some of the properties of the new memory controller in the X1000 series. They have optimizations (I would guess) specific to some types of memory calls, and it seems that they just now had time to perfect them in the driver. As I understand it, you won't see these performance gains with older Radeons.
      • Don't forget NVidia made "optimizations" back in May '03 to detect 3DMark03 and improve performance by as much as 24% in the benchmark. That would have no impact on any game and could only have been used to mislead the public as to the performance of their cards. Extremetech found it using a BETA of 3DMark that didn't follow the standard benchmark path; it let you roam freely around the scenes. When flying around they saw things that didn't render correctly at all, or objects that were missing entirely. Now that is low...
  • by stienman ( 51024 ) <.adavis. .at. .ubasics.com.> on Tuesday October 25, 2005 @01:56PM (#13873933) Homepage Journal
    if (Window.Title == "DOOM") {
        employGraphicsShortcuts();
    }

    As always, the graphics card makers quantify the leading game's usage of the API and take shortcuts as needed in order to improve gameplay. Now that Doom 3 has been released, they can release these driver shortcuts as well. These same shortcuts wouldn't necessarily work under another program, and may cause unintended artifacts, crashes, etc.

    The only question is why hasn't nVidia released their tweaks yet?

    This would only be news once they've both optimized their drivers for this game and one clearly has the advantage.

    -Adam
    • nVidia already HAS optimized their drivers for it. They did that back when Doom 3 was news. ATI did the same.

      What ATI is doing this time is tweaking the programmable memory controller in their new cards, not really tweaking the drivers. As I said, both ATI and nVidia have already tweaked their drivers for Doom 3. So unless nVidia has some similar tweak up their sleeves (which they may or may not have), the situation won't change with waiting. I think Doom 3 has been out enough that both companies have grabbed
  • by DeadBugs ( 546475 ) on Tuesday October 25, 2005 @02:12PM (#13874083) Homepage
    Apparently it only works at 4XAA and only on the X1800XT. There are also performance differences when playing multi-player versus running time demos.

    This is a step in the right direction. However, this is not the OpenGL driver fix that everyone has been waiting for. It is a manipulation of ATI's new programmable memory controller.
    • by Guspaz ( 556486 ) on Tuesday October 25, 2005 @02:59PM (#13874592)
      Apparently you should RTFA.

      1) It does not only work at 4xAA; that is just where the gains are most impressive. With or without AA they were behind before, and now they're ahead.

      2) It is not just the X1800XT. The review was a roundup of high-end cards, and as such only included the X1800 XT and X1800 XL (not just the XT like you suggest). The optimizations should affect ATI's entire product line from the X1300 on up.

      3) There are no other major games, to my knowledge, that still use OpenGL. As such, this can be considered a general fix for OpenGL performance. General in the sense that it fixes the problem (Poor OpenGL performance) as far as the vast majority of gamers are concerned.
      • 3) There are no other major games, to my knowledge, that still use OpenGL. As such, this can be considered a general fix for OpenGL performance. General in the sense that it fixes the problem (Poor OpenGL performance) as far as the vast majority of gamers are concerned.

        There are a LARGE NUMBER of professional OpenGL applications that push current graphics hardware and drivers to their limits. Also, Linux game environments such as Cedega translate DirectX to OpenGL calls, and would benefit from any general O
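
        For what it's worth, the translation concept itself is simple to sketch. This toy example (TranslateClear and the flag names are invented; it is not Cedega's actual code) shows a Direct3D-style clear request being re-expressed as ordinary OpenGL calls, which is why any general OpenGL driver improvement would help such a layer:

        #include <GL/gl.h>
        #include <cstdint>

        // Toy stand-ins for Direct3D-style clear flags (loosely mirroring
        // D3DCLEAR_TARGET and D3DCLEAR_ZBUFFER).
        enum ClearFlags : std::uint32_t { kClearTarget = 0x1, kClearZBuffer = 0x2 };

        void TranslateClear(std::uint32_t flags, std::uint32_t argbColor, float depth) {
            GLbitfield mask = 0;
            if (flags & kClearTarget) {
                // Unpack the packed ARGB color into the floats OpenGL expects.
                glClearColor(((argbColor >> 16) & 0xFF) / 255.0f,
                             ((argbColor >> 8)  & 0xFF) / 255.0f,
                             ( argbColor        & 0xFF) / 255.0f,
                             ((argbColor >> 24) & 0xFF) / 255.0f);
                mask |= GL_COLOR_BUFFER_BIT;
            }
            if (flags & kClearZBuffer) {
                glClearDepth(depth);
                mask |= GL_DEPTH_BUFFER_BIT;
            }
            glClear(mask);  // one GL call stands in for the Direct3D clear
        }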

        • Professional applications are best suited to professional graphics cards, not gaming graphics cards. Professional cards use different drivers for good reason. So they are totally irrelevant in this discussion.

          And note that I said major games. Cedega is a niche product at best, and doesn't really matter in the grand scheme of things. ATI has a dedicated Linux driver team anyhow, so it is up to that team to put the effort into porting these optimizations to the Linux drivers.

          I should point out that I missed one
          • Wasn't Half-Life 1 a Quake engine game? Which brings me to a point... look at the number of games that used the Q3/TA engine; these include the MoH series, CoD series, JK2 series, and RTCW. You can bet a good number will probably be using the Q4 engine.

            Of course there are also a ton of games using the various iterations of the Unreal Engine [wikipedia.org]. There is one game [wikipedia.org] with its own engine that will get me to upgrade though.
            • Yes, HL1's engine was based on the Quake engine, but basically only for really low-level rendering stuff (I think the figure Valve gave was something like 10% of the final engine was Quake code).

              You've got a good point with the Q3 engine, it was really widely used in major games and D3 probably will be too.

              And as far as Oblivion goes, great choice ;-) Morrowind is my all-time favorite game, and I upgraded to a 6800GT just for Oblivion (since it'll use lots of SM3.0 optimizations).

            • Yeah, HL1 was OpenGL and based on Quake 1. See my other comments for that and Quake 3 (and similar).

              It is important to keep in mind, though, that Unreal Engine is a Direct3D engine. It contains an unsupported beta OpenGL renderer, but considering few people if any use it, it's not worth mentioning. Unless you mean the Linux port that relies on that renderer, but I've already put forth my opinion on why Linux gaming shouldn't enter into the discussion (Yes, it would be nice to get better support for it, but it is uni
          • HL1 was OpenGL, Direct3D, and Software. When it came out, D3D was the default setting, although now with Steam OpenGL is (and it always ran way better in OpenGL on both ATI and NVidia).

            I believed for a long time, too, that the fps in HL1 was capped at 100fps, but not so. If you enable developer mode (either "developer 1" in console, or -developer added to the shortcut, I don't remember exactly), it uncaps it.

            In addition to D3/Q4, Starbreeze's Chronicles of Riddick uses OpenGL (and has all the features of th

            • Actually HL1 never defaulted to Direct3D. When it came out, software mode was the default, and OpenGL was the suggested 3D accelerated mode. Direct3D was only included because OpenGL support wasn't as good back then as it is now; it was a fallback, albeit one that ran much slower and with stuff missing.

              I did forget it, though, yes, as I pointed out in another one of my replies. However since it is of the era of DX6 capable cards, no optimization made to modern graphics cards is going to mean anything. Th
              • Not sloppy of you, since you're completely right that optimizations won't help Q3 or HL1 at all on current hardware.

                I'm sure you're right about HL1 defaulting to software, I just remembered having to manually change to OpenGL, so I assumed it was D3D it defaulted to.

                I had heard that CoR did what D3 did (real-time lighting, normal mapping) so I guess I took it literally to mean everything D3 did. Apparently not so, but I'm sure we can both agree that CoR and D3 both look amazing :-)

                And we can also agree that

          • Professional applications are best suited to professional graphics cards, not gaming graphics cards. Professional cards use different drivers for good reason. So they are totally irrelevant in this discussion.

            This is largely a myth. A significant portion of OpenGL-based professional applications work just as well on "consumer" cards as they do on a "professional" line. I help develop one at the OpenGL level and I've done the benchmarks personally. A significant segment of the userbase uses it on "consumer

  • So how did ATI pull this off?

    Maybe they have an aspie?

  • by shoptroll ( 544006 ) on Tuesday October 25, 2005 @02:16PM (#13874125)
    I don't know what NVIDIA did with the drivers (81.85) released about a week ago, but they broke OpenGL support in Doomsday 1.8.6 (a 3D source port of DOOM). According to the changelog, the new driver adds OpenGL 2.0 support.

    Not sure if that's related, but if NVIDIA is accidentally breaking support for OpenGL in apps (perhaps deprecated API calls? I dunno) that could have something to do with it.
    • Yes, I also noticed problems with the 81.85 drivers. I am no longer able to play Unreal Tournament on my GeForce 6200 TC card like I was before. Granted, it's not the fastest card - but UT played fine before. The new drivers did drastically speed up UT2004 for me, but absolutely killed the original UT. I was surprised by this.
    • The 81.85 drivers cause all sorts of havoc such as breaking dual screen support on 6800 cards. They rushed out the drivers as an answer to ATI without properly testing them. I recommend staying away from that particular upgrade until they release a fixed version.
  • Yes but (Score:3, Insightful)

    by Xarius ( 691264 ) on Tuesday October 25, 2005 @02:51PM (#13874505) Homepage
    does it run on linux?

    *ducks*

    Seriously, have they made the same improvements in the linux native drivers?
    • I can say without doubt that, no, they have not made the same improvements. Why bother with such a minor tweak when their Linux drivers are so far behind the Windows drivers? UT2004, for example, is too choppy to play under Linux with settings that are fine under Windows.
  • by Surt ( 22457 ) on Tuesday October 25, 2005 @02:53PM (#13874523) Homepage Journal
    As shown by our testing, with one simple driver update, ATI's gone from last to first place in Quake 4 performance. There's a wealth of data you can glean from these benchmarks.


    Of course it's a comparison between two companies, so they were either going from last to last or from last to first; there wasn't any other possibility.
  • From http://www.firingsquad.com/hardware/quake_4_high-end_performance/ [firingsquad.com]

    One cool feature ATI has added to their X1000 family is the ability to make changes to the memory controller's arbitration logic and/or its algorithms via software. This allows ATI to make adjustments with a simple driver update.

    This is what ATI has done with their new hotfix driver: they've simply optimized memory access inside their memory controller to better handle OpenGL titles. This is all invisible to the end user: once an O
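
    In the same speculative vein as the sketch earlier in the thread (the register offsets, values, and WriteMcRegister below are invented; the real arbitration interface isn't public), applying such a tweak could amount to little more than a handful of register writes performed by the driver when it sets up a context:

    #include <cstdint>
    #include <cstdio>

    // Invented register offsets standing in for the undocumented arbitration block.
    const std::uint32_t REG_ARB_READ_BURST   = 0x00;
    const std::uint32_t REG_ARB_COLOR_WEIGHT = 0x04;
    const std::uint32_t REG_ARB_TEX_WEIGHT   = 0x08;

    // Stub: a real driver would perform an MMIO write to the hardware here.
    static void WriteMcRegister(std::uint32_t offset, std::uint32_t value) {
        std::printf("MC[0x%02x] <- %u\n", (unsigned)offset, (unsigned)value);
    }

    // Because the arbitration logic is programmable, "shipping an optimization"
    // is just shipping new values for these writes in a driver update; no new
    // silicon and no application patch are needed.
    void ApplyOpenGLProfile() {
        WriteMcRegister(REG_ARB_READ_BURST,   8);
        WriteMcRegister(REG_ARB_COLOR_WEIGHT, 3);
        WriteMcRegister(REG_ARB_TEX_WEIGHT,   2);
    }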
  • by bconway ( 63464 ) * on Tuesday October 25, 2005 @03:25PM (#13874894) Homepage
    How did the ATi cards fare when running quack4.exe?
  • However (Score:3, Funny)

    by obeythefist ( 719316 ) on Tuesday October 25, 2005 @09:13PM (#13877559) Journal
    Let's look at ATI's business process

    1) Launch Product
    2) Benchmark Onslaught
    3) Release better drivers
    4) Benchmark Onslaught that beats nVidia
    5) Marketing and sales blitz
    6) Design product
    7) Produce product
    8) Announce product availability schedules
    9) Look for factory to start making cards
    10) ??
    11) Profit!

    Not that I'm suggesting ATI has severe production issues - if nVidia can kill paper launches, surely ATI could at least try to keep up.
  • Having the faster card means nothing until they can get adequate volume to retail.

    If your card is the fastest, but I can't go into any old computer store and buy it, why do I care?
  • I can't imagine ATI faring as well against nVidia under Linux.

    Fuck ATI.
  • I won't even consider an ATI card given the state of their linux support.
  • I run Doom 3 on my system with few issues. My specs (custom-built PC): SIS K7 mobo, AMD Athlon XP 2000+ 1.6GHz, 512MB DDR-RAM, 8x AGP nVidia GeForce FX 5600 w/256MB DDR-RAM, 80GB UDMA IDE hard drive, DirectX 9.0c, nVidia Forceware 78.01 drivers, Creative SoundBlaster LIVE! 5.1. Doom 3 settings: Resolution 640x480, Shadows On, High-Res Special Effects Off, Bump-Mapping On, VSync Off. I'll have to try Quake 4. Doom 3 restored some of my confidence in Id Software. Quake 3: Arena was a very poorly designed game
  • "...OpenGL titles such as DOOM 3, Quake 4,..."

    Quake4's installer told me it *required* directx 9c. How does one make it go with OpenGL instead?
    • I bet Quake4 on Windows uses DirectX for input (mouse/keyboard handling, etc.), although they say they're using SDL on Linux, and I don't see why they can't use it on Windows too. I'm pretty sure all the rendering is done in OpenGL.
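
      That split is easy to picture with a generic SDL-1.2-era skeleton (a minimal illustrative sketch, not id's code): the windowing/input library owns the window and the event loop, while every draw call goes through OpenGL. On Windows the input half could just as easily be DirectInput without the renderer ever touching Direct3D.

      #include <SDL.h>
      #include <SDL_opengl.h>

      int main(int argc, char** argv) {
          // SDL (or DirectInput on Windows) handles the window, keyboard and mouse...
          SDL_Init(SDL_INIT_VIDEO);
          SDL_SetVideoMode(640, 480, 32, SDL_OPENGL);

          bool running = true;
          while (running) {
              SDL_Event event;
              while (SDL_PollEvent(&event)) {
                  if (event.type == SDL_QUIT ||
                      (event.type == SDL_KEYDOWN && event.key.keysym.sym == SDLK_ESCAPE))
                      running = false;
              }

              // ...while all of the actual rendering is plain OpenGL.
              glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
              SDL_GL_SwapBuffers();
          }

          SDL_Quit();
          return 0;
      }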

"Hello again, Peabody here..." -- Mister Peabody

Working...