
Initial Half-Life 2 Benchmarks Released

dfj225 writes "According to an article on ExtremeTech.com, it looks like ATI has the lead in Half-Life 2 graphics card performance. Valve benchmarked their new game using the top cards from both ATI and nVidia. Results show the ATI Radeon 9800 Pro drawing around 60 FPS while the nVidia GeForce FX 5900 Ultra only draws around 30 in Half-Life 2's DX9 full precision tests. Read the article to see results on other tests that Valve ran." Update: 09/11 13:06 GMT by M : Another article about the presentation.
  • Well well (Score:4, Interesting)

    by Snaller ( 147050 ) on Thursday September 11, 2003 @06:36AM (#6929236) Journal
    I take it you guys have seen the ingame movies? Looks very nice, and seems to take game physics to a whole new level, but at the same time it looks as if you need a Pentium 5 to get it to run properly!
    • Re:Well well (Score:4, Informative)

      by Spy Hunter ( 317220 ) on Thursday September 11, 2003 @06:44AM (#6929274) Journal
      OTOH, it *sounds* as if you need "An 800 MHz P-III and a DX6 level hardware accelerator (e.g. TNT)." -- Gabe Newell, Valve Software General Program Manager
  • by mmol_6453 ( 231450 ) * <<short.circuit> <at> <mail.grnet.com>> on Thursday September 11, 2003 @06:36AM (#6929238) Homepage Journal
    How is the objectivity of this study any different from, say, a study by Microsoft promoting Windows?
    • by MoonFog ( 586818 ) on Thursday September 11, 2003 @06:40AM (#6929258)
      From the article: Valve's General Program Manager Gabe Newell gave the presentation at an analyst conference being held by ATI in Seattle. And while the circumstances may seem slightly suspect given the event and the Valve/ATI OEM deal, Newell was quick to dispel any such conflict of interest.

      Believe it, those who will, but I would certainly question the integrity of the test, and I won't choose an ATI card over nVidia based on this just yet.
      • Several sites have already called Nvidia's DX9 performance into question, including www.gamersdepot.com

        Research a few independent benchmarks.
      • by Zathrus ( 232140 ) on Thursday September 11, 2003 @08:55AM (#6930082) Homepage
        Which is why, I'm sure, every single real DX9 [beyond3d.com] benchmark [gamersdepot.com] has shown nVidia falling far, far behind ATI.

        The quotes from that second link are particularly damning -- and they're from a variety of companies, including id Software, not just Valve.

        I've never owned an ATI card. My last 5 or 6 cards in all my computers (and my wife's) have been nVidia. My next card is almost certainly going to be ATI though because they're currently the performance leaders. I have some reservations about drivers still -- not with performance or stability but with long term support since ATI has still failed to deliver a unified driver architecture -- but I'm unwilling to sacrifice that much performance while still paying a higher price.

        Frankly, at this point anyone who is still wondering about the validity of the benchmarks is deserving of the title "nVidia fanboy".
    • Actually, the analogy you're looking for is:
      Microsoft discussing how well Windows runs on AMD's chips over Intel's. Microsoft is in the business of selling Windows, just as Valve is in the business of selling games. Valve is just calling a spade a spade and you just plain don't like it. :)

      It's in their best interest to make their products work as well as they can on the respective hardware.

      later,
      epic
      • Sure, whatever you want to believe. Nvidia and ATI cards have been running neck and neck for a couple of years now and all of a sudden at an ATI event the ATI card has twice the framerates? Color me skeptical.
        • by cK-Gunslinger ( 443452 ) on Thursday September 11, 2003 @09:31AM (#6930399) Journal
          Take a look at history:

          1) 3dfx is king of 3D
          2) nVidia comes along with interesting products, 3dfx still king
          3) nVidia improves (TNT, GeForce), 3dfx struggles, both run neck-and-neck
          4) 32-bit becomes important, nVidia takes the lead
          5) 3dfx struggles, plays catch-up (Voodoo4, 5), yet becomes irrelevant

          Then we have:

          1) nVidia is king of 3D
          2) ATI comes along with interesting products, nVidia still king
          3) ATI improves (Rage, Radeon), nVidia struggles, both run neck-and-neck
          4) DX9 becomes important, ATI takes the lead
          5) nVidia struggles, plays catch-up (FX series), yet ...

          It's not a hard cycle to visualize. A lot of other similarities are there, as well: "fan-boys", aggressive advertising, benchmark scandals, developers' opinions, etc. It's actually pretty cool for us, as we get great advancements in 3D.
  • by dmayle ( 200765 ) on Thursday September 11, 2003 @06:38AM (#6929243) Homepage Journal
    Forget ExtremeTech's article, and go check out the one at The Tech Report. [techreport.com] According to Gabe Newell of Valve, one of the graphics card companies was trying to detect when a screen shot was being made, so that it could output a higher resolution frame, hiding the quality trade-offs made by the driver. From the article: "He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game."
    • by Absurd Being ( 632190 ) on Thursday September 11, 2003 @06:47AM (#6929289) Journal
      If a user wants to take a screenshot, shouldn't it be at the highest available resolution? If they can do it with a low overhead, they should. It's the lying on the benchmarks that's the problem here.
      • They call it a screenshot for a reason. If your screen can't show it, it ain't a screenshot.
      • Say that now. How will you feel when you try to take a picture of the bug that popped up in Windows, and for some reason the screenshot shows the system working fine? When you take a screenshot, you want it to be a picture of "what you see now".
      • The screenshots are used by many reviewers to do a detailed comparison of image quality. If a screenshot shows a better quality image than you see in-game, they are no longer useful tools for determining image quality and as such may cause reviewers to draw the wrong conclusions. I agree that for the purpose of chronicling your own adventures, the highest-quality screenshots are nice, but for review purposes they should represent exact in-game quality. Moreover, the difference should be clear to everyone.
      • by roystgnr ( 4015 ) <roy AT stogners DOT org> on Thursday September 11, 2003 @07:43AM (#6929576) Homepage
        If you take a screenshot while running at 1280x1024, and it outputs a 1600x1200 picture, then it's providing "the highest available resolution". If you take a screenshot while running at 1280x1024, and it gives you the same size image but with all the ugly "trade visual quality for speed without the user's request" hacks turned off, then it's just lying.
      • a "screenshot" should capture what is on the "screen" and save that to an image file. That's what a screenshot has been historically, and it's what people expect from similarly named features. What you're talking about would be an entirely different feature. You're talking about "Render this scene to a file", in which case you might want to increase the resolution or quality settings. What would be a more valuable feature in certain games would be something like "Render this entire map to a file." The
    • That is awful. I'd much rather driver authors spent their time actually improving the drivers, rather than coming up with ways of fooling people into thinking they are improved.
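
For context on what a "screenshot" traditionally means at the API level, it is simply a readback of the frame that was actually presented. Below is a minimal OpenGL sketch of that readback; it is illustrative only, not Valve's code and not any vendor's driver code, and the function name and PPM output are invented for this example.

```cpp
// Minimal sketch of a classic framebuffer screenshot: read back exactly what
// was presented on screen. Illustrative only -- not Valve's code and not any
// vendor's driver code; the function name and PPM output are invented here.
#include <GL/gl.h>
#include <cstdio>
#include <vector>

// Assumes a current OpenGL context and a window of the given size.
void save_screenshot_ppm(const char* path, int width, int height) {
    std::vector<unsigned char> pixels(static_cast<size_t>(width) * height * 3);

    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels.data());

    FILE* f = std::fopen(path, "wb");
    if (!f) return;
    std::fprintf(f, "P6\n%d %d\n255\n", width, height);
    // GL's origin is bottom-left, so write rows top-down for an upright image.
    for (int y = height - 1; y >= 0; --y)
        std::fwrite(&pixels[static_cast<size_t>(y) * width * 3], 1,
                    static_cast<size_t>(width) * 3, f);
    std::fclose(f);
}
```

A driver that quietly re-renders the frame at higher quality before this readback is doing render-to-file, not a screenshot, which is exactly the distinction the posters above are drawing.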
  • Go, ATI! (Score:5, Funny)

    by Faust7 ( 314817 ) on Thursday September 11, 2003 @06:39AM (#6929248) Homepage
    Is it just me, or is ATI pulling a real turnaround? They used to be the underdog for so long -- their drivers weren't the greatest, their marketshare was second-fiddle, and they initially missed out on the Xbox contract. I never thought I'd see the day where nVidia, which is practically the industry standard for gaming, might be challenged on such a thing as actual performance.

    Oh well, at least communication between hardware and game developers has improved to the point that I won't need to specify to the game whether I have a Hercules, Tandy, or Trident chipset... ;-)
    • Re:Go, ATI! (Score:5, Insightful)

      by ProppaT ( 557551 ) on Thursday September 11, 2003 @07:42AM (#6929569) Homepage
      Forget about ATI, I never thought I'd see the day when nVidia was the standard. Back in the old days of the 3D wars, 3DFX was fast, Rendition was pretty, and nVidia was just butt ugly with a handful of problems.

      I always rooted for Rendition, but I suppose they died when Micron bought them.

      If anything, nVidia was the real underdog in the 3D wars...they were the only company with nothing going from them, and they managed to turn that around. I still hope ATI wins in the end, though. I like their technology quite a bit better than nVidia's....and you can't beat the 2d/3d quality with anything but a Matrox.
      • Re:Go, ATI! (Score:5, Informative)

        by Zathrus ( 232140 ) on Thursday September 11, 2003 @09:07AM (#6930186) Homepage
        If anything, nVidia was the real underdog in the 3D wars...they were the only company with nothing going from them

        Nothing going for them? Uh... do you know anything about nVidia's history?

        nVidia was formed from disgruntled SGI employees. You know, the same SGI that created OpenGL and pioneered 3D graphics on computers? Yeah, that one. Why were they disgruntled? Because they had gone to the powers that be at SGI and said "you know, we could make a buttload of money off our technology -- we can make cards that do a large subset of the OpenGL calls and sell it to the PC market for cheap!" SGI management was all about profit margin though, and there's a lot more margin (although not as much profit) in selling a few cards for $50-100k than there is in selling hundreds of thousands or millions of cards for $150-450.

        So a bunch of the top SGI graphics engineers left and went off to make their own company. The first few cards released by nVidia were actually OEM'd cards from another company. IIRC, the TNT was the first silicon and code from the ex-SGI engineers, and it was not "butt ugly with a handful of problems" by any means. There were initial problems with running 3Dfx only games (as in, it couldn't...), but Quake and OpenGL remedied that issue. The GeForce completely blew away 3Dfx and they never recovered.

        Oh yeah... that little bit about them being ex-SGI engineers? Well, it came back to bite them. SGI sued the hell out of nVidia and it wound up being settled out of court. SGI retains options on advanced features in the silicon and drivers. One of the many reasons that the drivers can't be open sourced.

        It seems that nVidia is now suffering from the same problem that plagues a lot of hot tech companies -- many of the primaries have made millions of dollars and decided they don't have the need/desire to work there anymore. So they retire, cash in their stock options, and then go pursue other interests, which robs the company of not only its top engineers but also its visionaries and leaders. The last couple generations of cards from nVidia appear to be due to this. They may come back still, and they're still better off than 3Dfx was, but they've certainly fallen from the lofty heights they used to occupy.
  • Yawn... (Score:3, Interesting)

    by WIAKywbfatw ( 307557 ) on Thursday September 11, 2003 @06:41AM (#6929259) Journal
    And just how long will it be before someone finds out that one or both of those video card manufacturers has been "tweaking" their benchmarks to improve the achieved frame rate?

    Anyhow, just who runs Half-Life or anything with all the eye candy maxed out? No serious gamers that I know of, that's for sure. At the settings that hardcore FPS addicts play at, the frame rate delivered by any card currently shipping from either ATi or nVidia will be sufficient (assuming that the rest of the system isn't subpar).

    Once again, for those of us without money to burn the smart buy is that $100-$200 card that cost $600 a few months ago, not the one that costs $600 now (and which will be down to $100-$200 just as fast).
    • Re:Yawn... (Score:3, Informative)

      by SilentSheep ( 705509 )
      60 fps is more than enough for a first-person shooter. I doubt you can tell the difference against higher frame rates; I know I can't.
    • Re:Yawn... (Score:4, Funny)

      by adrianbaugh ( 696007 ) on Thursday September 11, 2003 @07:20AM (#6929441) Homepage Journal
      It took me ages to realise that "FPS games" were "first-person shooter games" - owing to all the hardcore gamers posting on /., losing sleep over small video card performance enhancements, I always thought it meant "frames per second games"...
    • Re:Yawn... (Score:5, Interesting)

      by Glock27 ( 446276 ) on Thursday September 11, 2003 @07:55AM (#6929639)
      Once again, for those of us without money to burn the smart buy is that $100-$200 card that cost $600 a few months ago, not the one that costs $600 now (and which will be down to $100-$200 just as fast).

      Well, I was pleased to see the showing the GeForce 4 Ti4600 put up in those tests. I think those can be had fairly cheaply these days (I paid $249 several months ago).

      I'm running it in this Athlon 2600+ system (RH 9, fully accelerated NVIDIA drivers). I've been doing some OpenGL development lately, and it's been great on Linux! I have nothing but good things to say about NVIDIA's drivers and OpenGL implementation. Could anyone comment on the quality of ATI's OpenGL support with the 9800 Pro class cards under Linux? (I'd like to hear from the perspective of a developer, but gameplayers would be interesting too).

      On the other hand, I do know one way to get great (or at a minimum good) OpenGL drivers for the Radeon 9800 Pro - buy a PowerMac G5. :-) (Yes, I know you could use Windows also...but let's keep our perspective here.)

      • (mind you, there's just been a new driver release from ATI, and I haven't installed that one yet)

        I've got a 9700 Pro, and the ATI drivers have given me *a lot* of grief as a developer. There are many times when they are so blatantly non-compliant with the OpenGL standards, it's not funny.

        For example, the driver claims to support OpenGL 1.3. With 1.3, ARB_multitexture has been promoted into the core, so the driver _should_ export glActiveTexture & friends without the ARB suffix. Well guess what? It do
        • I own a Radeon 9800 Pro, and I've found just the opposite, at least with the Windows drivers, and their latest Linux drivers (3.2.0 or newer).

          If you're having any kind of problems like that, email devrel@ati.com, someone from ATi will most likely respond rather quickly. I've emailed them before and they've been quite helpful.
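
The non-compliance described a few comments up can be checked directly: a driver reporting OpenGL 1.3 or later should export the core glActiveTexture entry point, not only the ARB_multitexture-suffixed one. Here is a hypothetical test, assuming Windows/wgl and a current GL context; it is not code from any shipped game or driver.

```cpp
// Hypothetical check for the non-compliance described above: a driver that
// reports OpenGL 1.3+ should export the core glActiveTexture entry point, not
// only the ARB_multitexture-suffixed one. Windows/wgl assumed; requires a
// current GL context. Not code from any shipped game or driver.
#include <windows.h>
#include <GL/gl.h>
#include <cstdio>

typedef void (APIENTRY* PFNGLACTIVETEXTUREPROC)(GLenum texture);

void report_multitexture_entry_points() {
    const char* version = reinterpret_cast<const char*>(glGetString(GL_VERSION));
    std::printf("GL_VERSION:         %s\n", version ? version : "(null)");

    // Entry points beyond GL 1.1 must be resolved at runtime on Windows.
    PFNGLACTIVETEXTUREPROC core =
        (PFNGLACTIVETEXTUREPROC)wglGetProcAddress("glActiveTexture");
    PFNGLACTIVETEXTUREPROC arb =
        (PFNGLACTIVETEXTUREPROC)wglGetProcAddress("glActiveTextureARB");

    std::printf("glActiveTexture:    %s\n", core ? "exported" : "MISSING");
    std::printf("glActiveTextureARB: %s\n", arb ? "exported" : "MISSING");
    // A driver advertising 1.3 or later that exports only the ARB name is the
    // kind of spec violation the poster above is complaining about.
}
```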
    • Re:Yawn... (Score:3, Insightful)

      I've got lots of friends who do exactly what you're talking about: buying top-of-the-line games and hardware then cranking their visual settings down to pong-esque mode just for faster gameplay.

      Excuse me, but when I drop $300-500 on a video card I want my screen to fucking blow me away! I didn't pay for the new technology just to see it wasted. FPS are important, but getting your money's worth and enjoying what the artists put together for the game is far more interesting than simply looking to get a bigge
  • by INMCM ( 209310 ) on Thursday September 11, 2003 @06:42AM (#6929265) Homepage
    What a terrible article. It didn't even say what resolution all that was happening at.
    • You are right. I had to look through the net this morning to find the resolution. Unfortunately, the tests were run only at 1024x768, no AA, no AF. So, there is no card on earth that can run the game at 1600x1200...
  • by Stiletto ( 12066 ) on Thursday September 11, 2003 @06:45AM (#6929275)
    nVidia has been circulating its Det50 driver to analysts in hopes that we would use it for our Half-Life 2 benchmarking. The driver contains application-specific optimizations

    The article fails to mention whether they actually detect the application and run the driver through a different code path, or if they've made general driver-wide optimizations that happen to also help Half-Life. Knowing the behavior of these video card companies in the past, I would suspect they have huge chunks of code in there devoted solely to Half-Life.

    So, now instead of having to hack around and catch companies cheating on drivers, we just have to read as they admit it openly? This is standard operating procedure now???

    When I download the latest Detonator drivers for my nVidia card, I want to download a generic D3D/OpenGL driver, not a Half-Life driver. The amount of time they spend "optimizing" for the popular games is time they could have been spending making sure the performance and quality is adequate for ALL games and modeling apps.

    • They'd have to be awfully fast at coding to have a ton of half-life 2 specific optimizations already. It was just announced not that long ago, and I doubt ATI immediately got a snapshot of the very secret "Source" engine to start tweaking with.
    • or if they've made general driver-wide optimizations that happen to also help Half-Life.

      It doesn't matter...

      Coming is the day where the driver will optimize itself for all popular games, on-the-fly. What is wrong with that? A race car team is allowed to optimize their car for the particular track on which they are competing. This is the same thing.

      Who cares? Soon, we'll have drivers that constantly monitor via internet for "best optimizations" versus the installed software. Ultimately, if the card
      • by Rogerborg ( 306625 ) on Thursday September 11, 2003 @07:37AM (#6929534) Homepage

        >Coming is the day where the driver will optimize itself for all popular games, on-the-fly. What is wrong with that?

        Um, because it's a crock of shit? It's not optimisation, it's trading quality for frame rate, without giving you a choice in the matter. If I click the boxes for Full Scene Auntie Alienating and Dodecahedral Filtering, I damn well expect the driver to do that, regardless of whether a given game runs at 2fps or not. If I want a higher frame rate, I can turn those options off myself.

        I don't want the driver second guessing me, because it's not being done for my benefit, it's being done to scam gullible reviewers and sell more cards.

        • /agree with parent.

          I have *no* problem with optimizations that increase speed *without* quality loss. These optimizations, however, do not do this.

          I *do* have a problem with increasing speed *with* quality loss, unless I have a checkbox that specifically says "Do not enable speed optimizations that negatively impact visual quality" or somesuch.

          If they're going to optimize, then make it *known* that they do, and make it a user-configurable option to do so.
    • What's wrong with this? There are no cards out there that are perfect in every respect. Not all games work the same. If a vendor chooses to include optimizations that increase the performance of a game and simultaneously don't degrade visual appearance, I'm all for it. Especially if I don't need to do anything and it is all auto-detected. I don't consider this 'cheating'; it's called adapting their product to meet the needs of their buyers. The only people going out and buying the super high end cards are
      • Except that these optimizations are choosing FOR you to drop detail settings. Yeah, the game says you have X, Y, and Z special options, but your driver is doing otherwise.

        On top of this, these "optimizations" are degrading visual effects pretty seriously according to the article by taking away important effects.
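
For readers wondering what "detecting the application" would even look like inside a driver, here is a purely hypothetical sketch of the mechanism this thread is debating. The executable name and path names are invented; this is not taken from any real driver.

```cpp
// Purely hypothetical sketch of what "detecting the application" inside a
// driver could look like: key a hand-tuned code path off the host executable
// name. The names are invented; this is not taken from any real driver.
#include <windows.h>
#include <algorithm>
#include <cctype>
#include <string>

enum ShaderPath { PATH_GENERIC, PATH_APP_SPECIFIC };

ShaderPath choose_shader_path() {
    char exe_path[MAX_PATH] = {0};
    GetModuleFileNameA(nullptr, exe_path, MAX_PATH);

    std::string name(exe_path);
    std::transform(name.begin(), name.end(), name.begin(),
                   [](unsigned char c) { return static_cast<char>(std::tolower(c)); });

    // If the host process looks like a known game or benchmark, switch to a
    // hand-tuned (and possibly lower-precision) path instead of the generic one.
    if (name.find("hl2.exe") != std::string::npos)
        return PATH_APP_SPECIFIC;
    return PATH_GENERIC;
}
```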
  • by Channard ( 693317 ) on Thursday September 11, 2003 @06:45AM (#6929278) Journal
    I wouldn't value these benchmarks too much, given they're from a game that hasn't yet gone gold. Features could be dropped from the graphics engine that will affect the way each card deals with the graphics.
    • Half-Life 2 is set to ship on the 30th, which means it should go gold round about now, or the 15th at the latest. Assuming the shipping date does not change (and, other than editorials, there is no indication it will), the version they used is most probably the final one the publishers have, giving them a couple of days to OK the product for duplication.

      Also, you do not drop features from a beta. Features get fixed at the end of the alpha. Beta is for optimisations and bug squashing, whatever the product.

      • Assuming the shipping date does not change (and, other than editorials there is no indication it will), the version they used is most probably the final one the publishers have, giving them a couple of days to ok the product for duplication.

        Well, it sounds like it's the developers running these tests. And it sounds to me like this is a show-stopper performance problem - they'll alienate the huge NVidia-owning fraction of the market. Unless they're in bed with ATI (or they haven't got on with NVidia developer relation
  • by BabyDave ( 575083 ) on Thursday September 11, 2003 @06:47AM (#6929286)

    ... you insensitive clod!

    Seriously though, are they allowing for people with older cards? (UT 2003 ran fine on my Voodoo3 and still looked pretty darn good, even w/o transparency, anti-aliasing, or any of the other modern GFX buzzwords)

  • Probably accurate (Score:2, Interesting)

    by BinBoy ( 164798 )
    GeForces just don't work right on some systems. I upgraded from a Voodoo3 to a GeForce 3 a couple years ago on a 700Mhz Athlon and went from being pegged at 70fps in Team Fortress and Counter-Strike to dropping as low as 30fps in the same game on the same computer. Now I have a faster computer with a 9800Pro and I'm at 70fps or higher in every game so far. Ready for HL2 and Deus Ex 2. Whoohoo!

  • Older Hardware (Score:3, Insightful)

    by Nighttime ( 231023 ) on Thursday September 11, 2003 @06:48AM (#6929292) Homepage Journal
    It's all very nice seeing how the latest and greatest cards perform, but how about some test results for older cards?

    I prefer to save my pennies and upgrade my graphics card to the one just behind the current generation.
  • by GregoryD ( 646395 ) on Thursday September 11, 2003 @06:50AM (#6929305)
    ATI fanboy: blah blah blah blah nvidia cheated blah blah blah ATI ROCKORZ!!!

    NVIDIA fanboy: blah blah blah nvidia has better support... blah blah blah!!!

    I'm not sure what is going to end first, the Israeli-Palestinian situation or the ATI vs. NVIDIA argument.

    The fact is both regularly cheat on performance and quality benchmarks, and if you think you can actually say one is better than the other you are a biased fanboy.

    Just buy the one on sale, please.

    • by glwtta ( 532858 ) on Thursday September 11, 2003 @07:01AM (#6929361) Homepage
      All I know is that my Radeon 9800 makes my vi look damn good!
    • Yeah, I reckon my 3dfx Voodoo 2 would still kick both their asses if I could get updated drivers for it. Glide fucking ruled.
    • by Obiwan Kenobi ( 32807 ) * <evan@@@misterorange...com> on Thursday September 11, 2003 @08:05AM (#6929695) Homepage
      The fact is both regularly cheat on performance and quality benchmarks, and if you think you can actually say one is better than the other you are a biased fanboy.

      Oh, good Lord, what kind of trolling is that?

      I'll note a few things here:

      Firstly, NVidia has reigned supreme in the DirectX 8 and prior arena. Their GeForce cards are awesome.

      But DX9 is all about pixel shaders. They are the future, and ATI realized that. They built their R300 core (Radeon 9600/9700) based on the DX9 spec, and it shows. The newest games, such as HL2, which rely heavily on DX9 extensions, run better on ATI hardware than NVidia's stuff because they have to use hacks to get DX9 extensions, such as pixel shaders, to work properly with the GeForce line. NVidia doesn't have it built into the hardware, and the gamers who have them will suffer because of it.

      John Carmack has had to write special code in Doom 3 to compensate for the NV30 core that doesn't like DX9 as much as it should. Go read some of his .plan files for proof.

      Look up your facts, and try to stay away from troll-like generalizing until you know what you're talking about.
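
The vendor-specific code paths mentioned above boil down to the engine picking a renderer at startup based on what the driver reports. Here is a hypothetical OpenGL-flavored sketch of that selection; the path names and the exact detection logic are invented, not taken from Doom 3 or the Source engine.

```cpp
// Hypothetical sketch of vendor-specific render-path selection of the kind
// described above (e.g. an NV30-oriented partial-precision path alongside a
// generic full-precision path). Path names and logic are illustrative only,
// not taken from Doom 3 or the Source engine. Requires a current GL context.
#include <GL/gl.h>
#include <cstdio>
#include <cstring>

enum RenderPath { PATH_GENERIC_FULL_PRECISION, PATH_NV30_PARTIAL_PRECISION };

RenderPath pick_render_path() {
    const char* vendor     = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
    const char* extensions = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));

    // On NV30-class hardware the standard full-precision fragment path is slow,
    // so an engine may drop to a vendor extension that allows partial precision.
    if (vendor && std::strstr(vendor, "NVIDIA") &&
        extensions && std::strstr(extensions, "GL_NV_fragment_program")) {
        std::printf("Using NV30-style partial-precision path\n");
        return PATH_NV30_PARTIAL_PRECISION;
    }
    std::printf("Using generic full-precision path\n");
    return PATH_GENERIC_FULL_PRECISION;
}
```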
  • Unfortunately (Score:2, Insightful)

    by SirLestat ( 452396 )
    I want a good framerate and I don't have an ATI Radeon 9800 Pro ... Did they realize that that card was $750 over here? I got two 10k hard drives, a RAID card, and 512 megs of RAM for that price!
  • 60 fps ??? (Score:3, Insightful)

    by selderrr ( 523988 ) on Thursday September 11, 2003 @06:54AM (#6929325) Journal
    on the fastest cards on the market ?

    I guess my GeForce4 Ti4600, which is just over a year old, will only get 30 fps or so! Which means I'll be a sitting duck in netgames.

    If these are indeed optimized benchmarks, I doubt we'll see HL2 on the market soon. They'll have to wait at least until the R9800 or 5900 Ultra become mainstream (read: at console-level prices).
    • They have said forever that it is coming out on September 30th 2003. I would be very surprised if it didn't come out on September 30th since they have been making a big deal out of this (ie the fact that it will ship when they said it would ship) forever.
    • Re:60 fps ??? (Score:5, Informative)

      by Rogerborg ( 306625 ) on Thursday September 11, 2003 @07:43AM (#6929571) Homepage
      Actually, if you bother to read the god damn article, you'll find that your 4600 (and my NV28 4800) beat the NV30 cards when the DX9 gubbins is turned off. Given that Valve are saying that it'll run on a DX6 or later card, it looks like this'll be a viable option for us poor bastards with 6 month old hardware.
  • DX (Score:2, Insightful)

    by Apreche ( 239272 )
    The thing wrong with these benchmarks is they only cover DirectX9. Any self respecting Half-Life player always keeps it in OpenGL mode, especially if it's in the land of NVidia. I can't think of a single game that lets me choose between DirectX and OpenGL where I have chosen OpenGL over the dx. Carmack likes opengl, and he knows more about it than anyone I know.
    • Re:DX (Score:3, Informative)

      by Dot.Com.CEO ( 624226 ) *
      Half Life 2 is DX 9 only.
      • Re:DX (Score:3, Informative)

        by entrager ( 567758 )
        Half Life 2 is DX 9 only.

        This is simply not true. While it may not have OpenGL support (I'm not sure on this), it is NOT DX9 only. Valve has confirmed that the game will run on hardware that supports at least DX6.
        • Re:DX (Score:3, Informative)

          by Dot.Com.CEO ( 624226 ) *
          DirectX works that way. You can write the game in DX9 and it will use hardware functions for whatever it can and software for all the rest. However, the game itself will be DX9-only in that you will have to have DX9 installed on your PC to run it, even if your graphics card is only DX6.
        • Re:DX (Score:3, Informative)

          by Zathrus ( 232140 )
          He should've said "Half Life 2 is DX only" -- yes, there are code paths for DX6 up to DX9. There is no OpenGL support in Source, and Gabe Newell has said repeatedly that there never will be any.
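
A rough sketch of the distinction being made here: the DirectX 9 runtime is required to create the device at all, but the hardware code path is chosen from the reported device caps. This is illustrative only, not Valve's Source engine code, and the path labels are invented.

```cpp
// Sketch of the point made above: requiring the DirectX 9 runtime while still
// selecting a DX6/DX7/DX8-class hardware path from the device caps.
// Illustrative only -- not Valve's Source engine code; path labels invented.
#include <d3d9.h>
#include <cstdio>

const char* pick_hardware_path(IDirect3D9* d3d) {
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return "software rendering";

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) return "DX9-class (ps 2.0)";
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) return "DX8-class (ps 1.1)";
    if (caps.MaxSimultaneousTextures >= 2)              return "DX7-class multitexture";
    return "DX6-class fixed function";
}

int main() {
    // Direct3DCreate9 needs the DX9 runtime installed even if the card itself
    // is only DX6/DX7 class -- the sense in which the game is "DX9 only".
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    std::printf("Selected path: %s\n", pick_hardware_path(d3d));
    d3d->Release();
    return 0;
}
```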
    • Don't knock it 'til you try it. Have you tried DX9 on a DX9 card? Valve wouldn't spend so much money on DX9 if it didn't make things look better.

      And Carmack is pro-OpenGL for his own political and egomaniacal reasons, not necessarily quality reasons.

    • Any self respecting Half-Life player always keeps it in OpenGL mode, especially if it's in the land of NVidia

      I have a 64 meg GeForce 4. In DirectX mode, everything looks really pretty but when I hit escape to go back to the menu I get a black screen.

      In OpenGL mode, transparency is shot to pieces and where there should be text, I get square blocks.

      I wouldn't call myself a self-respecting HL player - but I have to stick with software rendering.

    • by renoX ( 11677 )
      > I can't think of a single game that lets me choose between DirectX and OpenGL where I have chosen OpenGL over the dx.

      IL-2 Sturmovik is available both in OpenGL and in DirectX, but on my Radeon the OpenGL mode was much better than the DirectX mode.
    • Re:DX (Score:2, Informative)

      by Snowmit ( 704081 )
      Any self respecting Half-Life player always keeps it in OpenGL mode, especially if it's in the land of NVidia.

      That may be true for self-respecting Half-Life players but what about the self-respecting Half-Life 2 players? You know, the ones that will be playing the new game that will be running on a new engine? What do they have to say about this issue?

      Nothing. Because they don't exist yet - the game needs to be released before there can be tribal knowledge about the optimal hardware configuration.
  • let's remember (Score:4, Insightful)

    by Anonymous Coward on Thursday September 11, 2003 @07:10AM (#6929406)
    ATi is bundling HL2 with their cards soon, so anything in this article at best gives you an idea as to HL2's performance.

    Let's also remember that once ATi was much bigger than nvidia in graphics, and charged exorbitant prices for crappy chips, with shocking driver support.

    Let's also remember nVidia have much better performance so far in the more important (and independent) Doom 3 benchmarks (where 16-bit floating-point precision is used for nVidia cards, instead of the 24-bit ATI uses or the 32-bit nVidia uses elsewhere, as DirectX 9 was originally going to specify before nVidia and Microsoft fell out).

    Also remember that nVidia's cards offer better performance in most 3D rendering apps (where both cards use 32-bit fp and almost all of ATI's advantages evaporate), so driver tweaking on nV's part in games does not necessarily mean they have the lesser part.

    Finally, Linux support is a no-brainer: nVidia have been doing it well for years (with support as far back as the TNT), while ATi have made a recent attempt that is not user friendly and doesn't even support all Radeon chipsets, let alone the Rage 128.

    ATi are onto a good thing right now, with the current DirectX 9 spec giving them an advantage in games that stick to the spec instead of the optimum end-user experience. That is about all they have going for them, though. This battle has far from swung the other way; it's merely gotten closer than it used to be.
  • Has anybody seen a comparison of DirectX 8 and DirectX 9 visual quality? It doesn't have to be Half-Life 2 (although that would be preferable), just something to see what value DirectX 9 brings for the player.

    I have a GeForce Ti4200 (DX8) and it looks like it will be possible to get up to 30 FPS at 1024x768, no FSAA. Alternatively, I can get a new Radeon 9600 Pro for $150 and get the same or better performance at the same resolution, but with DX9 eye candy and FSAA enabled.

    Is it worth it? I assume that the cool
  • Links Galore (Score:3, Informative)

    by shione ( 666388 ) on Thursday September 11, 2003 @07:31AM (#6929505) Journal
    Toms Hardware [tomshardware.com]
    FiringSquad [gamers.com]
    Tech Report [tech-report.com]
    Gamers Depot [gamersdepot.com]
    Beyond 3d [beyond3d.com]

  • 3DMark03 (Score:5, Insightful)

    by 10Ghz ( 453478 ) on Thursday September 11, 2003 @07:43AM (#6929574)
    These results mirror the 3DMark03 results perfectly. It seems that NV's DX9 support is horribly broken. Why else would their cards need a separate codepath (in HL2 and in D3; although D3 is an OpenGL game, it uses many of the same features) whereas ATI cards do not? Carmack has said that if D3 does not use the NV-specific codepath, NV cards will have poor performance.
  • by magic ( 19621 ) on Thursday September 11, 2003 @08:10AM (#6929721) Homepage
    I'm not 100% certain about the specific cards tested, but for several of the highest end NVIDIA and ATI cards a head-to-head comparison for performance doesn't tell the whole story.

    This is because ATI cards have implemented a 24-bit floating point pipeline while NVIDIA cards implement a 32-bit pipeline. It is reasonable to expect the ATI card to outperform the NVIDIA card at the expense of some round-off errors. 32 vs. 24 bits on a color pixel is probably no big deal (although some color banding might arise), but when those results apply to vertex positions you could begin to see cracks in objects and shadows.

    Note that the ATI card is still faster for Half-Life 2 in 16-bit mode, so it is probably a faster card overall for that game. There are so many ways to achieve similar looking effects on modern graphics cards that even as a graphics expert, I can't tell which card is actually faster.

    I've been working with both the GeForceFX and Radeon9800 for some time and both are amazing cards. They have different capabilities under the hood, and can perform different operations at different speeds. Furthermore, under DirectX both cards are restricted to a common API but on OpenGL they have totally different capabilities. I don't think a consumer would go home unhappy with either card, except for the price.

    -m
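
For a feel of the numbers behind the 16/24/32-bit debate above, relative precision is roughly 2 to the power of minus the mantissa width: about 10 bits for fp16 partial precision, 16 bits for ATI's fp24 pipeline, and 23 bits for fp32. The small sketch below just prints the implied error magnitudes; the mantissa widths are the standard ones for those formats, and the arithmetic is illustrative only.

```cpp
// Rough arithmetic behind the precision debate above. Relative precision of a
// floating-point format is about 2^-(mantissa bits); assumed widths: fp16
// (partial precision) = 10 bits, fp24 (ATI's pixel pipeline) = 16 bits,
// fp32 (full precision) = 23 bits. Illustrative numbers only.
#include <cmath>
#include <cstdio>

int main() {
    struct Format { const char* name; int mantissa_bits; };
    const Format formats[] = { {"fp16", 10}, {"fp24", 16}, {"fp32", 23} };

    for (const Format& f : formats) {
        double eps = std::ldexp(1.0, -f.mantissa_bits);  // 2^-bits
        // Absolute error near a coordinate of ~1000 units, the scale at which
        // cracks in geometry or shadows could start to become visible.
        std::printf("%s: relative precision ~%.2e, absolute error near 1000 ~%.5f\n",
                    f.name, eps, 1000.0 * eps);
    }
    return 0;
}
```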
  • Sad day ... (Score:3, Insightful)

    by vlad_petric ( 94134 ) on Thursday September 11, 2003 @09:15AM (#6930245) Homepage
    Technology and FPS aside, nVidia's support for Linux shines in comparison to ATI's offering. I'd really hate it if they followed 3dfx's path.
  • zerg (Score:3, Funny)

    by Lord Omlette ( 124579 ) on Thursday September 11, 2003 @10:03AM (#6930766) Homepage
    Matrox! Matrox! Matrox!

    Go Matrox! ... What?? It could happen!
  • by ThisIsFred ( 705426 ) on Thursday September 11, 2003 @11:07AM (#6931780) Journal
    About the only thing this is illustrating is that the performance problems with D3D are pretty severe now. DX couldn't correctly render fog or water in the original Half-Done(tm) engine, and going to OpenGL drivers would not only boost the frame rate by as much as 66%, but would also correctly render those effects.

    Also, RTFA, Nvidia is a little shy about "optimized" drivers for benchmarking certain applications. They specifically requested that the optimized drivers not be used. No indication that ATI did the same.

    I doubt there will be a Linux version of HL2 either, because this new 3D engine appears to only support DirectX.

    That's a shame, because the world didn't end when the America's Army developers ported AA:O to Linux. As a matter of fact, it runs quite well, and it didn't take them 5 years to produce nothing but vaporware.
  • by gid ( 5195 ) on Thursday September 11, 2003 @02:25PM (#6934955) Homepage
    From an article on firingsquad:

    This is NVIDIA's Official statement: "The Optimizations for Half-Life 2 shaders are in the 50 series of drivers which we made available to reviewers on Monday [Sept. 8, 2003]. Any Half-Life 2 comparison based on the 45 series driver are invalid. NVIDIA 50 series of drivers will be available well before the release of Half-Life 2".

    While I, like everyone else, don't like trading off quality for framerate, blah blah blah. Who knows what ATI's quality is like? Maybe they optimized their DX9 drivers for the fastest possible/crappiest quality off the bat. I'm going to wait for reviews of the Det 50 drivers and some reviews of what the quality looks like on each card before I make any purchases.

    I was actually all set to buy an nVidia 5600 Ultra until this came out. Think I'm gonna wait for them to duke it out a little bit and get to the bottom of things before I decide...
