Half-Life 2, ATI, NVIDIA, and a Sack of Cash

Latent IT writes "If you're into games, then unless you've been living under a rock for the past few days, you've heard a bit of a rumble from Valve on the relative quality of ATI vs. NVIDIA cards. Starting with articles like this one (previously reported), Valve told the world that the ATI 9800 Pro was in some cases nearly three times faster than the formerly competitive NVIDIA offering, the 5900 Ultra. Curiously, this happened at an ATI-sponsored event, "Shader Day". But the story hasn't stopped there. NVIDIA released this response, essentially claiming that their new drivers, which were available to Valve at the time of the press conference, would bring vast, legitimate performance improvements. An interview with Massive, the creators of the Aquamark3 benchmark, seems to support this view: the NV3x chipset wasn't designed closely around any particular API, and drivers are critical to getting good performance out of it. Anandtech writes here about the restrictions Valve placed on what benchmarks could be run. However, the key to this whole story may be this article, which I haven't seen get much coverage and which seems to make everything a little clearer. Valve stated that their OEM bundling deal with ATI came from the fact that ATI's cards were so superior, and that ATI were "performance enthusiasts". However, if the Inquirer is to be believed, the bundling deal was the result of an outright auction for what will probably be the most popular game of the year. Which year that might be is another issue altogether. Whatever happened to just making hardware, and making games?"
  • by Loie ( 603717 ) on Saturday September 13, 2003 @10:25PM (#6954570)
    "Whatever happened to just making hardware, and making games?" unfortunately..where there's a multi-billion dollar industry, there's shady business deals.
    • by The Clockwork Troll ( 655321 ) on Saturday September 13, 2003 @10:45PM (#6954669) Journal
      You mean shader business deals; many in the pipeline, in fact.
    • by Apiakun ( 589521 ) <tikora AT gmail DOT com> on Saturday September 13, 2003 @10:47PM (#6954681)
      I think it's infinitely easier to write and optimize a program around a specific hardware architecture than it is to try to write for everything at once, thereby bringing the quality of your software down to the LCA (lowest common API).
      • They're written for some type of graphics API, DirectX and/or OpenGL. The days of writing to bare hardware were over more than a decade ago.
        • You're both right (Score:2, Insightful)

          by BlueA ( 676985 )
          They're written for some type of graphics API, DirectX and/or OpenGL. The days of writing to bare hardware were over more than a decade ago.

          Most developers will find at some point that they need to optimize their graphics API code for specific chip sets.

          Oh, and mentioning DirectX before OpenGL in the same breath is what Microsoft WANTS you to do.

          • by CaptCanuk ( 245649 ) on Sunday September 14, 2003 @01:36AM (#6955333) Journal
            With specific reference to OpenGL, games are written with multiple render paths based on the acceleration available on various graphics cards, exposed through vendor-specific and ARB-approved extensions to GL (a rough sketch of that kind of path selection follows below). Driver optimizations are written both to speed up GL calls and all sorts of other common calculations, and to speed up games by cutting corners. The corners cut often depend on what assumptions a particular game makes. If a game or game engine makes an assumption such as a static camera, a lot of variable dependencies can be chucked out the window (PTP: pardon the pun) and an "optimization" is born. I would find it hard to believe that a GeForce FX 5800 Ultra would ship with anything less than 75% of the optimal general driver (i.e. nothing game-specific or context-specific), so methinks the new Detonator 50 has some nice "halflife2.exe" code :P

            Oh, and mentioning DirectX before OpenGL in the same breath might mean you like serializing items in a list in alphabetical order... oh no!
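
For readers who haven't written this kind of code, here is a minimal C++ sketch of the extension-based path selection the comment above describes. The GL query (glGetString(GL_EXTENSIONS)) and the extension names are real, era-appropriate OpenGL; the helper function, enum, and the order of preference are made up for illustration, not taken from any engine.

```cpp
// Minimal sketch of extension-based render-path selection (C++/OpenGL).
// glGetString(GL_EXTENSIONS) is the real era-appropriate query; the enum,
// helper name, and path preferences are illustrative only.
#include <GL/gl.h>
#include <cstring>

enum RenderPath { PATH_NV_FRAGMENT, PATH_ATI_FRAGMENT, PATH_ARB_GENERIC, PATH_FIXED_FUNCTION };

static bool has_extension(const char* name) {
    // Needs a current GL context. Note: real code should match whole
    // space-separated tokens rather than raw substrings.
    const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return exts != nullptr && std::strstr(exts, name) != nullptr;
}

RenderPath choose_render_path() {
    // Prefer a vendor-specific fragment path where one exists, since the
    // generic path may run well below that hardware's potential.
    if (has_extension("GL_NV_fragment_program"))  return PATH_NV_FRAGMENT;
    if (has_extension("GL_ATI_fragment_shader"))  return PATH_ATI_FRAGMENT;
    if (has_extension("GL_ARB_fragment_program")) return PATH_ARB_GENERIC;
    return PATH_FIXED_FUNCTION;
}
```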

    • Bullshit (Score:3, Insightful)

      by ShieldW0lf ( 601553 )
      I'd hardly call "bundling rights" shady business deals. Unless there are facts missing from the article, this is a bullshit take on an otherwise innocent business deal.

      That said, if I were a game development company, I would be putting the boots to nVidia any way I could right now. Today it's "We'll get around to making your game work with our drivers when it's popular," but tomorrow it could be "You want your game to work well with our drivers? That will be $3,000,000, please." The shit that nVidia are
    • by Kashif Shaikh ( 575991 ) on Sunday September 14, 2003 @01:37AM (#6955338)
      And you know when a game developer has gone big-time? When the phrase "gaming community" is replaced with "our customers", "installed base" with "market share", and "we love to do" with "our interests". Not that it's bad or anything, but it has a cold touch of "the guys in suits". And that's how Gabe sounded on Shader Day. Times have changed.

      OT, one thing I like about Id software is that they are down-to-earth and very objective about the strengths and weaknesses of vid cards.

  • by gargantunad ( 697514 ) on Saturday September 13, 2003 @10:27PM (#6954573)
    Nvidia didn't create a card that far behind the curve - it has to be drivers.
    • by SHEENmaster ( 581283 ) <travis@uUUUtk.edu minus threevowels> on Saturday September 13, 2003 @10:36PM (#6954631) Homepage Journal
      And let us take a crack at them. Suddenly you'll have NetBSD running directly on the card, twice the framerate on Linux as on Windows, and (worst of all) both companies' products will be advanced, eliminating the advantage that comes from tossing more money at the problem than one's competitor does.

      Betterment serves no profitable purpose unless it is unattainable by one's competitor. If someone can show how they'll make more money by making a better product while also aiding their competitor in the same endeavor, they might help us out a bit more.
    • Re:cant be that bad (Score:5, Interesting)

      by Anonymous Coward on Saturday September 13, 2003 @10:37PM (#6954632)
      The problem lies in the way the FX deals with Pixel Shader 2.0 instructions. AFAIK, the ATI card follows DirectX standards pretty well and the Microsoft DirectX compiler will produce code that the 9800 will process quickly. ATI's drivers can rearrange the pixel shader commands a little bit to improve performance.

      The GeForce FX processes PS2.0 instructions in a completely different way. Using Microsoft's compiler produces slow code when using PS2.0. Nvidia still doesn't have a JIT compiler in their drivers to reorder the PS2.0 instructions for maximum performance (a toy sketch of what that kind of reordering looks like follows below). The Detonator 50 series drivers are supposed to fix this. How well it's fixed is still up in the air.
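
To make the reordering idea concrete, here is a toy C++ sketch of what such a driver-side recompiler might do: take a straight-line stream of ps_2_0-style instructions (all texture fetches first, then all arithmetic, roughly what a generic compiler might emit) and interleave the two kinds so math can overlap fetch latency. It is only a caricature; a real recompiler would build a dependency graph and follow the hardware's pairing rules, and the instruction strings here are invented for the example.

```cpp
// Toy illustration of driver-side shader instruction reordering (not real
// driver code). The input order mimics a generic compiler (fetches first,
// then math); the output interleaves them so arithmetic can overlap texture
// latency. This naive pass assumes the instructions it interleaves are
// independent enough to reorder; a real one would check data dependencies.
#include <iostream>
#include <string>
#include <vector>

struct Instr {
    std::string text;
    bool is_texture_fetch;
};

std::vector<Instr> interleave(const std::vector<Instr>& in) {
    std::vector<Instr> tex, alu, out;
    for (const Instr& i : in) (i.is_texture_fetch ? tex : alu).push_back(i);
    for (size_t t = 0, a = 0; t < tex.size() || a < alu.size(); ) {
        if (t < tex.size()) out.push_back(tex[t++]);
        if (a < alu.size()) out.push_back(alu[a++]);
    }
    return out;
}

int main() {
    const std::vector<Instr> shader = {
        {"texld r0, t0, s0", true},       // fetch base texture
        {"texld r1, t1, s1", true},       // fetch detail texture
        {"mul   r2, r0, c0", false},      // uses only r0, so it can move up
        {"mad   r3, r1, c1, r2", false},  // uses r1 and r2, stays last
    };
    for (const Instr& i : interleave(shader)) std::cout << i.text << '\n';
}
```
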
      • by mcbridematt ( 544099 ) on Saturday September 13, 2003 @11:42PM (#6954904) Homepage Journal
        But then.. ATI hasn't always given a shit about OpenGL, while NVIDIA has.

        "And thats why I'm with NVIDIA"
        • by Nogami_Saeko ( 466595 ) on Sunday September 14, 2003 @12:42AM (#6955115)
          OpenGL was great when there wasn't an alternative (except perhaps 3DFX's Glide).

          Times have changed, however, and DirectX development has leapt forward in a way that would be nearly impossible for OpenGL to match as quickly. Maintaining platform compatibility is great, but it does severely limit how fast the API can develop when it comes to new features that developers need. With DirectX, there's a single codebase for all developers that's updated fairly frequently, with new features available to everyone.

          I'm not bashing OpenGL; it's a great API that is well suited to jobs where cross-platform compatibility is of paramount importance: industrial graphics applications, 3D, etc. That said, most of those applications now support DirectX as well, but retain OpenGL for compatibility reasons.

          OpenGL is just not all that valuable for games anymore, with DirectX being a better alternative for Windows games where porting to other platforms isn't a concern.

          N.
          • Re:cant be that bad (Score:5, Informative)

            by be-fan ( 61476 ) on Sunday September 14, 2003 @01:52AM (#6955380)
            Shows how much you know. OpenGL was relatively static for a long time; in fact, I wrote an article about this on OSOpinion a while ago. But the ARB seems to have gotten its ass in gear, and as a result OpenGL is managing to keep pace with DirectX in the programmable hardware department. When OpenGL 2.0 comes out, OpenGL will take a leap forward. Also, if you actually take a look at the APIs, you'll see that there are very few platform compatibility issues. Both APIs are pretty much self-contained, so OpenGL's progress really isn't affected by its cross-platform nature.
      • Re:cant be that bad (Score:2, Interesting)

        by Anonymous Coward
        False. The real problem lies with nVidia's chips having half the fillrate of ATI's at the same clock when using fully compliant precision. A little comparison:

        R350:
        8 pipelines
        8 FPUs

        NV35:
        4 pipelines
        4 FPUs

        The NV35 will always be at most half as fast as the R350 per clock when using fully compliant precision (4 pipelines per clock vs. 8). All the Det 50 drivers will do is introduce more cheats and even lower image quality (driverheaven.net previewed 51.75 in AquaMark3, a DX9 program, and found IQ to be significantly worse than with 44.03 and 45.23).
  • by richman555 ( 675100 ) on Saturday September 13, 2003 @10:27PM (#6954575)
    What, my Rendition Verite card is old now? Come on, guys, is the difference really that big?
  • Cash (Score:4, Funny)

    by Lawbeefaroni ( 246892 ) on Saturday September 13, 2003 @10:27PM (#6954579) Homepage
    Hmm, cash and industry. How does it pan out? If "Shader Day" wasn't enough for you, keep having fun trashing the chipset you chose.
  • GOD (Score:2, Interesting)

    by Anonymous Coward
    Whatever happened to the good old days? Back when you just went out and bought a console, either Nintendo or Sega; if you picked Sega, of course, you were a loser; if Nintendo, then you had the best games in the world... sigh

    Just give me FFVI or give me..well Metal Gear Solid.
    mac
  • Yes but (Score:2, Interesting)

    by Anonymous Coward
    are nVidia & ATI really ethically different from each other either way?

    What concerns me is whether the practice of producing games that work with _nothing_ other than recent nVidia and ATI cards continues. Game after game comes out which simply does not work on other brands' video cards.
    • id Software seems to be interested in just making games that work. They're friendly to older cards, work fine on both ATI and nVidia (that's why they're used as a benchmark so often), and they actually care about the Linux and Apple markets. The only games I've bought or played in recent memory have been id games, or at least based on an id engine.

      If Half-Life 2 isn't going to work on nVidia cards, or if they decide to completely ignore everyone but the Windows market, I'll just wait for Doom 3. I can be
  • by Nogami_Saeko ( 466595 ) on Saturday September 13, 2003 @10:31PM (#6954592)
    About all the article in the inquirer says is that Valve put the bundling rights for HL2 up for grabs. Makes sense.

    I don't think that article says anything about one hardware platform being better than the other, and I don't doubt that had NVidia won the bundling deal, they would've had a "NVidia Shader Day" event, regardless of the performance of the product.

    I still find the most interesting point to be that Valve says they had to put in a lot more time and effort making the gaming experience good on NVidia cards than on ATI cards, to the point of developing a separate graphics path for NVidia chips.

    If the solution to the performance issues was a simple driver update from NVidia (WITHOUT degrading quality in any way), then surely Valve would've left it to Nvidia to handle and proceeded to spend their time working on the game itself...

    N.
    • I don't think that article says anything about one hardware platform being better than the other, and I don't doubt that had NVidia won the bundling deal, they would've had a "NVidia Shader Day" event, regardless of the performance of the product.

      Err, I think that was sort of the point of the article...

      If the solution to the performance issues was a simple driver update from NVidia (WITHOUT degrading quality in any way), then surely Valve would've left it to Nvidia to handle and proceeded to spend their
      • That was the point of the article that was linked to; however, the original submitter seemed to imply that the benchmark results are inaccurate in some way, or biased towards ATI because they got the exclusive bundling deal for HL2.

        At least that's the way I read it.

        N.
    • As a side note, on various articles...

      Valve has said that they do not have said drivers. To wit, Valve has stated that, really, until drivers are public it's not appropriate to bench with them. NVidia says they do have them, but by the above argument that's irrelevant.

      So Valve has spent a *LOT* of time optimising code, since until it's actually in a release it really isn't a usable driver. And as history has shown, a "benchmark driver" and a public official driver are often very different performance-wise.

      If I wa
  • that gaming hardware is a billion-dollar industry. In circumstances like these, combined with the collapse of the dot-com "everything should be free" mindset, there is little chance of specs being made available to open-source developers.
  • by nemaispuke ( 624303 ) on Saturday September 13, 2003 @10:33PM (#6954600)
    Maybe I'm just dumb, but it doesn't seem to make much sense to release new hardware without drivers optimized to take full advantage of it. If you (or a hardware site) have to wait for a new driver to get the performance the vendor specifies for the hardware, I would be really leery of buying hardware from them. From what I saw of the ATI/NVIDIA test, the NVIDIA card was trounced, so maybe NVIDIA should hold off on releasing new cards until their drivers catch up to the hardware.
  • by swdunlop ( 103066 ) <swdunlop AT gmail DOT com> on Saturday September 13, 2003 @10:33PM (#6954604) Homepage
    I'm starting to wonder if HL2's numbers are going to be quite as good as HL1's, considering the aggressive marketing, shady practices, tie-ins with the less-friendly-than-advertised Steam, and a lot of other publisher-related snafus. Sierra and Valve seem to be regarding Half-Life 2 as such a massive potential success that they can get away with pretty much any customer abuse they want.
    • I'm looking forward to Doom3 much more than HL2. But will they be coming out in the same year?
      • by @madeus ( 24818 ) <slashdot_24818@mac.com> on Saturday September 13, 2003 @11:53PM (#6954945)
        Doom 3 this year looks doubtful; Activision certainly don't expect to ship it till 2004, though they have said the timing is in the hands of id Software.

        It would make business sense not to have them clash and get released at the same time, so I expect Doom 3 won't ship this year, but in the first quarter of next (unless they aim for Christmas, though I can't see it being much of a 'Christmas title', what with the evil scary hell-spawned zombies that make you want to turn all the lights on and hide under the sink with a big kitchen knife).

        As impressive as HL2's physics/environment engine (and use of DX9) clearly is, Doom 3 is still going to have the edge in rendering jaw-dropping indoor environments with stupid amounts of eye candy, so at least it won't look 'aged' or suffer from the later release date.
  • bla bla bla bla (Score:5, Insightful)

    by jimius ( 628132 ) on Saturday September 13, 2003 @10:34PM (#6954608)
    The 45.xx Detonator drivers were used for the Nvidia cards because those are the last working drivers Nvidia released. The 50.xx drivers, which NVidia says should have been used, don't show fog, which they call a bug and which just so happens to produce better results. Also, the 50.xx drivers were still beta last time I heard. So Valve chose a stable driver over a "bugged" one. Not to mention that NVidia's earlier actions surrounding "driver enhancements" don't exactly make them look trustworthy.
    • Re:bla bla bla bla (Score:5, Interesting)

      by DataPath ( 1111 ) on Saturday September 13, 2003 @11:00PM (#6954739)
      nVidia is rewriting the ENTIRE shader engine with dynamic re-ordering for the 50 series drivers. I'm not sure you understand - this is NOT a trivial task. The shader problem has been that you either optimize for ATi's shaders, or you optimize for nVidia's. The 50 series drivers with the dynamic re-ordering are supposed to help alleviate this - the driver will optimize at run time what the developers may not have done at compile time.

      The 50 series drivers were incomplete during HL2 development. The driver samples that nVidia was providing to Valve were milestone drivers - incomplete featurewise, but each completed feature was "complete" (written to specs and considered stable). The fact that fog was not rendering is likely not a speed hack, but an as-yet incomplete (as in not even started in that driver release) feature.

      Trust is a hard thing to earn, and easy to lose. I'm withholding judgement until nVidia's promised 50 series drivers come out.
      • Re:bla bla bla bla (Score:5, Insightful)

        by Badaro ( 594482 ) on Sunday September 14, 2003 @01:51AM (#6955378) Homepage

        The 50 series drivers were incomplete during HL2 development. The driver samples that nVidia was providing to Valve were milestone drivers - incomplete featurewise, but each completed feature was "complete" (written to specs and considered stable). The fact that fog was not rendering is likely not a speed hack, but an as-yet incomplete (as in not even started in that driver release) feature.

        Even if this is a driver bug and not a speed hack, if there are missing graphical elements in Half-Life 2 with the 50.xx drivers then Valve certainly did the right thing when they asked reviewers not to use them for benchmarking.

        []s Badaro

    • Not to mention that NVidia's earlier actions surrounding "driver enhancements" don't exactly make them look trustworthy.

      Don't single out nVidia for this. You must have a short or selective memory. Remember the Quack 3 [tech-report.com] fiasco?

      Bottom line is, any company's going to do whatever they think they can get away with to sell more cards. Doesn't make nVidia any more evil than ATI.
  • Conspiracy Theorists (Score:5, Interesting)

    by Anonymous Coward on Saturday September 13, 2003 @10:34PM (#6954610)
    Don't accuse Valve of any foul play. Even Carmack has said that unless you use Nvidia-specific extensions for pixel shaders, the performance will not be very good, due to the FX series of cards using 32-bit precision by default.
    • ...that unless you use Nvidia-specific extensions for pixel shaders, the performance will not be very good...

      Doesn't that defeat the purpose of having a generic DirectX API? One size fits all, and all that.
  • by Dark Lord Seth ( 584963 ) on Saturday September 13, 2003 @10:34PM (#6954616) Journal

    Let's see how Half Life 2 will run on my 3DFX Voodoo 1 & S3 Virge!

  • ati and nvidia dx9 (Score:3, Interesting)

    by dpw2atox ( 627453 ) on Saturday September 13, 2003 @10:36PM (#6954623) Homepage Journal
    Personally, I couldn't care less which card has better DX9 support than the other..... I'm just worried about their Linux drivers, and Nvidia has definitely got ATI beat there.
    • by dinivin ( 444905 ) on Saturday September 13, 2003 @10:51PM (#6954698)
      I'm just worried about their Linux drivers, and Nvidia has definitely got ATI beat there.


      Hardly... Having used the most recent versions of both the Linux ATI drivers and the Linux nVidia drivers, I can honestly say that ATI's drivers are much more stable, and perform just as well as nVidia's drivers. In my opinion, each release from nVidia (in the last year or so) has gotten much less stable, while ATI's drivers keep improving.


      Dinivin

      • Having not used ATI's cards under Linux -- I don't own one -- I can't say exactly what their support or performance is like. However, I can say that I strongly disagree, and feel that nVidia's offering in the Linux arena has improved tremendously in the past year. They are now releasing Windows-equivalent versions, offering an easy-to-use installer, and running a massive forum for users. Can ATI say the same? All that said, however, platform support is only a marginal issue next to the real question of performance. An im
    • by Plug ( 14127 ) on Saturday September 13, 2003 @11:18PM (#6954822) Homepage
      ATI has just released official XFree86 4.3 drivers [ati.com].

      The driver even handled an upgrade to Kernel 2.6 without flinching. NVidia AGPGART support doesn't have to be hacked in any more either, it would seem.

      No more mucking around with the FireGL drivers from the German branch of ATI.
  • by zulux ( 112259 ) on Saturday September 13, 2003 @10:36PM (#6954626) Homepage Journal


    BitBoys will come back I tell you!

  • it's going to be difficult not to find things like this happening what with only two real companies.
    here's hoping that the same thing doesn't happen in the future with doom.
    however, as long as the games work regardless of which card you choose, it doesn't matter in the end. i think this might be one case where microsoft is helping rather than hurting - were it not for directx, i think we'd be in a really confusing situation. i sure don't miss dos games.

    i can't believe i said that about microsoft. ah well.
  • What about OpenGL? (Score:5, Insightful)

    by BillKaos ( 657870 ) on Saturday September 13, 2003 @10:38PM (#6954639) Homepage
    What scares me is people doing those benchmarks in DirectX and, more to the point, people writing games using DirectX. Nvidia certainly didn't make its card to perform well with DirectX's new API, and I don't see a problem with that.

    What about OpenGL? I only purchase OpenGL games, because I can usually make them run in Linux, and WineX is only an ugly workaround for running games in a non-native environment. If I had a game company, I'd take care of potential Linux customers.
    • except that most game companies look at what happened to loki games and put developing for linux right below developing for the c64 on their list of priorities.
      • except what happened to loki had little to do with the size of the linux-using, game-playing market (per wired). Sounds like it was mostly mismanagement to an almost illegal level (i.e. not paying employees and shifty accounting practices).
    • by Lord_Dweomer ( 648696 ) on Sunday September 14, 2003 @12:02AM (#6954984) Homepage
      " If I'd a game company, I'd take care of potential Linux customers."

      Aside from the comments about how Linux users might be more likely to pirate the games instead of buying them.....I'd like to point out the fact that the Linux userbase is literally NOTHING compared to the Windows userbase. I'm sorry, Linux is nice...but you Linux advocates have to realize that while your system might be superior in many ways....it still just lacks the pure numbers of Windows, or even Mac.

      So of course you'll get karma for making a pro-Linux comment, but you'd never get modded down here for the fact that your idea of taking care of Linux users is just a BAD BUSINESS IDEA (at the moment). It's a waste of money on support and development when you could make a lot more money for a lot less by developing for and supporting the Windows market.

      So in summary, wishful thinking never hurt anyone... but your idea is not a good business move. No hard feelings.

  • Clarity (Score:2, Interesting)

    Silly technical politics like this show why consoles always manage to trump the PC games industry. What good is an open system if nobody can agree on what works?

    Is a powerful system with no cohesive graphics standard really that much better than a consistent, albeit more primitive piece of hardware?
  • Both sides (Score:5, Funny)

    by SD-VI ( 688382 ) on Saturday September 13, 2003 @10:41PM (#6954653)
    The view of nVidia fanboys is this: Valve and ATi are in bed together and have been for a while, and Valve sabotaged Half-Life 2 so it wouldn't run on NV3x properly in return for a whole bundle of money from ATi. Never mind that this wouldn't make any business sense-- you see, Majestic 12 are the REAL ones behind this, and we can't possibly know what they have in store for the world.

    The view of ATi fanboys is this: Anyone who bought a GeForce FX is an idiot, as they obviously should have had a stolen timedemo of Half-Life 2 on hand to benchmark with. If they didn't break into Valve's offices and steal the code, that's their own fault. Also, nVidia is clearly exactly like 3dfx, because they slipped up, JUST LIKE 3DFX! Dun dun dunnn!(The Quake/Quack scandal involving ATi never existed, of course.)

    The view of most sane, rational human beings is that this is just another stage of the highly competitive video card market, and that anyone who spends time arguing over which company is better needs to be tranquilized, preferably with something meant for very large animals.
  • by C3ntaur ( 642283 ) <centaur@@@netmagic...net> on Saturday September 13, 2003 @10:42PM (#6954659) Journal
    Let's face it, both vendors have top-end products that are screaming fast. They'll put up more polygons per second than anything that came before, and just about any game that's currently out there is going to look fantastic on either brand. Provided you run Windows...

    Which I don't. So when it came time to upgrade my system (about 2 weeks ago), Nvidia won hands-down -- and it was because they are Linux friendly, not because some rigged benchmark somewhere said they are a few frames per second faster than the other guy. Nvidia has been providing quality Linux drivers for their products for a long time, and I hope they'll continue to do so.

    I've been playing a lot of Neverwinter Nights on my 5900 and it looks beautiful. I'm planning to purchase more Linux games as soon as my budget permits. Yes, there are people out there running Linux who appreciate high-end graphics cards. Probably more than the marketing types think; after all, most hacker types I know are also hardcore gamers.
  • Screw Valve and HL2 (Score:3, Interesting)

    by Anonymous Coward on Saturday September 13, 2003 @10:48PM (#6954683)
    Valve made a great game four or five years ago, and someone else made an even better game by modifying it. However, resting on their laurels all these years and then coming out with a Windows-only game, selling themselves into a hardware vendor fight, and trying to tie the game to a subscription service has me really steamed. Chances are they won't catch lightning in a bottle the second time around. As a matter of fact, I'm starting to think that Savage (www.s2games.com) might really be the next Half-Life. It's a first-time release from a small start-up that supports Linux and Windows on the same retail CD. They are also promising heavy support for modding the game, and after just a few days of playing I'm completely hooked.
    • Because we all know that what makes a game really successful is having a linux port...

  • but.... (Score:5, Interesting)

    by Cassius105 ( 623098 ) on Saturday September 13, 2003 @10:49PM (#6954684)
    if Valve did optimize HL2 for ATI,

    then how come these programs also show Nvidia shader performance as pathetic:

    Halo PC
    Tomb Raider: Angel of Darkness
    ShaderMark
    3DMark03

    and why have the Det 50 drivers, which Nvidia recommended that Valve use, been shown to reduce image quality by a substantial amount?

    is ATI really rich enough to buy off all of these companies and also manage to sabotage Nvidia's drivers and PR team? :P
    • Valve spent 5x the time optimising HL2 for (the much larger market of) nVidia hardware. It's just that nVidia optimised their hardware for Doom 3, not for DX9. This is a turnaround from their TNT days, and now it's biting them.

      BTW, where is this "proof" of reduced image quality to which you refer? All I'd heard of was some incorrect fogging (which is obviously bad, and is doubtless a bug that we can hope will be fixed before the Det50s are out of beta).

  • First, I think it's important to note that Anand was instructed to not use the Det50s in his tests because they failed to render fog in the demos, which would obviously impact performance.

    Second, check out this image quality comparison [driverheaven.net] over at DriverHeaven with Aquamark 3. It sure looks to me like nVidia is back to their old tricks again.
  • by Maul ( 83993 ) on Saturday September 13, 2003 @10:57PM (#6954729) Journal
    Both ATI and nVidia are guilty of trying to stack things in their favor dishonestly. ATI making deals with Valve to get HL2 to work better on the ATI cards by design is just the most recent example, and while it might be a major example, both sides have done this before.

    At the same time, both card makers are really putting out insane results that wouldn't have been thought of even a couple of years ago.

    My decision in graphics cards is based on my past experience and driver support. In this area nVidia still wins hands down. If ATI wants to sell me a card, they're going to need to beef up their Linux driver support big time.
    • They made a deal with Valve, eh? Then why is it that many independent sources (including the developers of AquaMark 3 and John Carmack) have noticed that NV3x has a lot of trouble with PS2.0 and that to get good performance out of it you have to program a special path? And why is it that NV3x clearly only has 16-bit and 32-bit precision when the DX9 specs call for 24-bit? Don't you think this could account for NV3x's terrible "real" DX9 performance? Don't be so quick to jump to conclusions. That having bee
      • Word.

        As Valve has said, if the driver isn't an official release, it's not appropriate to bench with it. NVidia may yet turn their drivers around, but given the industry's history, skepticism is a reasonable position. Valve knows, and has stated a number of times in a couple of recent interviews, that the majority of their customers are NVidia users; hence the great deal of time spent optimising their code.

        The auction is just good business.
    • If ATI wants to sell me a card, they're going to need to beef up their Linux driver support big time.

      Check out this comment [slashdot.org] a few posts above yours. Looks like ATI is cleaning up its act when it comes to Linux.

  • by llZENll ( 545605 ) on Saturday September 13, 2003 @11:13PM (#6954799)
    "Whatever happened to just making hardware, and making games" I'll tell you what happened, a little thing called market growth. The more the market grows the more this stuff will happen, in maybe 1-2 years the games industry will become much like the movie/music industry. With games taking 3-5 years and 20-200 people to create only big studios will be able to foot the bill and suck up the costs if the game tanks. Not to mention ad costs. This will lead to higher quality titles, but less of them and they will be even more of the same crap (just like the movie industry today). In 2-5 years the games industry will surpase the movie industry in tearms of sales and revenue, because games cost 40-80/copy and movie just can't hang with that. When that happens expect this sort of stuff to happen daily.
    • Actually... (Score:2, Informative)

      The videogame industry ALREADY generates more revenue than the movie industry (at least in the U.S.).
    • Games, like movies, will always have small companies (or individuals) producing quality content. Movies, unlike games, have a huge audience. Most people's mothers are in the movie theaters and not waiting around for the next version of The Sims. There's also the fact that it's much harder for a game company to

      Step 1. Produce crap
      Step 2. Convince enough people to buy it before they realize it's crap
      Step 3. Profit

      Now, the reason Step 2 is so difficult is that I can always download a demo, try it at the store,

  • by Namarrgon ( 105036 ) on Saturday September 13, 2003 @11:15PM (#6954807) Homepage
    After Valve blasts nVidia for having sucky hardware, and nVidia is like, "but, but, what about our new Det 50 drivers?", one might be left wondering why Valve didn't even mention the existence of drivers that would (supposedly greatly) improve the situation. Not only does Valve of course have the beta Det 50s - so did the press - but they refused to even entertain the thought of testing with the supposedly much more optimised drivers (nVidia claim that all their driver effort for the last few months has been devoted exclusively to the upcoming Det 50s).

    Why? Well, one stated reason was a policy to test only with "publicly available hardware, and publicly available software". Laudable enough, considering that non-public drivers could have any number of bugs or "optimisations" that could render the game incorrectly and thus misrepresent its performance.

    Indeed, Valve referred to an issue where fog was completely left out of an entire level, and though they didn't point any fingers, it was later revealed that yes, the beta Det 50s were the culprit.

    For further info, you should read this [amdmb.com] report on the performance of the beta Release 50 Detonators. Summary: not much difference - at least for DX8-level games. DX9 is where the focus supposedly was, and there is a 25% gain in the PS2.0 test in 3DMark03, which is something.

    However, who knows if it'll translate to a 25% gain in HalfLife 2 - probably not, in itself. And given recent 3DMark/nVidia events, even that much is uncertain, until the drivers are released for public examination. In any case, it's a long way short of the 100% gain needed for the 5900 Ultra to just draw even with the 9800 Pro.

    nVidia apparently have a strong lead in Doom 3 scores, though (admittedly with the partial-precision NV3X-specific code path), so they will no doubt be hoping that Doom 3 outsells HalfLife 2... Myself, I have a 9600 Pro in my sights, just in time for the HL2 release :-)

    BTW, regarding the release delay? According to Gabe Newell, "First I've heard of it". So there you are. Only 16 days to go...

    • by canajin56 ( 660655 ) on Sunday September 14, 2003 @12:07AM (#6955003)

      They would have to result in a 50% gain in HL2 for the FX 5900 Ultra to catch up to the Radeon 9800 Pro, not 100%: the graph with the customized nVidia code path has 40 fps vs. 60 fps, and 60/40 = 1.5, i.e. a 50% deficit. Although, of course, the nVidia path is lower quality, since the 5900 doesn't do 24-bit precision.

      Also, I wouldn't call it a CLEAR lead in Doom 3. The nVidia card scores 20% higher on medium quality, but the Radeon takes the lead on high quality. Again, nVidia calls it a driver problem.

      Myself, I will be upgrading for Christmas, when I will know for sure which one works best, and how the drivers are. This is also the time when the FX6000 Super Mega Turbo and Radeon10K Elite Pro Plus Plus(Or whatever) push the prices down on the "older" cards ;)

    • nVidia apparently have a strong lead in Doom 3 scores, though (admittedly with the partial-precision NV3X-specific code path), so they will no doubt be hoping that Doom 3 outsells HalfLife 2... Myself, I have a 9600 Pro in my sights, just in time for the HL2 release :-)

      Ironically, I bought my 9700 Pro the day (well, afternoon) that I got hold of the leaked Doom 3 alpha, so I could tinker with it at considerably better performance than on my old GeForce2 GTS.

      The fact that it turns out to have far supe
  • The newest DX9 benchmark is Aquamark3, which also uses a real game engine. The official release is 15-Sep, but here are some early benchmarks: http://www.guru3d.com/article.php?cat=article&id=76&pagenumber=9 [guru3d.com] testing both nvidia's current 45.23 Dets and the upcoming 51.75 Dets.
    • What's interesting is that some of the image quality comparisons in Aquamark for the new NVidia drivers show image quality loss.

      From driverheaven.net [driverheaven.net]:

      Now, I'm sure most of you have read Gabe's recent comments regarding the Detonator 51.75s, and Nvidia's official response, but I really do have to say, having seen this first hand, it confirms to both myself and Veridian that the new Detonators are far from a high-quality IQ set. A lot of negative publicity is currently surrounding Nvidia, and here at driverheave

  • UM (Score:3, Funny)

    by Black Hitler ( 687112 ) on Saturday September 13, 2003 @11:32PM (#6954869)
    However, if the Inquirer is to be believed
    That's where everyone should stop reading.
  • The title of this post says it all, really.
  • Seems to me Nvidia has a crap card and they have been covering it up for a while now.
    1. Bad FutureMark results. Nvidia: "We stopped participating a while ago; that's an ATI benchmark."
    2. Poor Tomb Raider performance. Nvidia: "Who cares."
    3. Poor HL2 performance. Nvidia: "You should have used our 50.xx drivers, which don't render fog and aren't out yet."
    Someone posted this picture. I think it says it all... http://myweb.cableone.net/jrose/Jeremy/HL2.jpg
    Apoptosis
  • by d3am0n ( 664505 )
    While this is a rather pathetic instance of a corporation buying their way towards being number one (I hate this sort of propaganda for products), they did have a point about a few things: the NVIDIA card needed updates before it was anywhere near competitive, and if NVIDIA had gotten their technology right the first time they wouldn't have had such an incredibly lousy showing. That still begs the question of whether or not the ATI card had its latest drivers installed, in which case this was a complete an
  • by Anonymous Coward on Sunday September 14, 2003 @12:03AM (#6954990)
    If you look at the FX architecture, it has a serious problem.

    It can't run "true" DX9 spec games worth crap.

    Why?

    Because to save die space, nVIDIA engineers decided it'd be best to use 32-bit FP units, compared to ATi's more numerous 24-bit FP units. The DX9 spec calls for 24-bit precision computations, which is ATi's native precision (and which can be mapped down to 16-bit or extended to 32-bit precision if asked for), whereas the FX, which has to operate in 32-, 16-, or 12(?)-bit modes, basically loses half its registers (or more, if you are comparing to 12-bit registers) because it must run in 32-bit mode to be compliant.

    End result? Less high speed registers on the FX part, more swapping from ram and less FP computational power to go around.

    And this is only a simple example. I believe it has been noted that Carmack alluded to many ugly optimizations, using lower-precision math or proprietary shader paths, that he had to make to the D3 engine so that the FX wouldn't suck utterly in terms of performance. It isn't really a playable DX9 part, all in all.

    If Valve says they spent serious time working on the GeForce codepath, then they probably did so; either that, or they would have mentioned nothing. (And indeed, it is quite a bit faster in hybrid mode, but now they are making it well known that it isn't running "true" DX9, which is the truth. It should also be noted that this hybrid mode is what the D3 benchmark was run in, which offered the nVIDIA part such stellar performance, as specifically noted by Carmack.)

    Drop the "it must be corporate scandal" bit. If you read some of the specs and dev notes you will note that they more or less universally have their gripes in getting DX9 performance out of the FX part.
  • Blame Microsoft (Score:3, Insightful)

    by jettoblack ( 683831 ) on Sunday September 14, 2003 @12:19AM (#6955034)

    After nVidia's falling out with Microsoft over the Xbox chipset pricing, it's likely MS changed the DX9 spec mid-development and only gave the new specs to ATI. That's why ATI's cards are perfectly designed to run DX9 while nVidia's specs are off. For example, DX9 calls for 24-bit FP, which ATI does, while nVidia only supports 16- or 32-bit, forcing developers to choose between correct rendering and improved performance.

    Also, nVidia is to blame for their driver-cheating fiasco, which makes developers especially wary of trusting beta or "optimized" drivers, and for expecting every game company to optimize for their cards just because they're the biggest.

  • "Whatever happened to just making hardware, and making games?"

    Um, capitalism, unless I missed my guess. More specifically, the relationship between gaming hardware and software is finally maturing to the point of realizing one of the more advanced techniques used in making money: networking. Both markets are now not only making more money than before, but are increasingly reliant on one another. Something like this was only a matter of time, IMO. You may have noticed it in that "Exclusive Game Demo" story (
  • Views (Score:3, Interesting)

    by SynKKnyS ( 534257 ) on Sunday September 14, 2003 @12:34AM (#6955080)
    This isn't the first time something like this has happened. Everquest (Sony is butt buddies with NVIDIA in regard to this game) runs amazingly fast on my NVIDIA GeForce2 MX 220 at 60 fps at 1280x1024 with a lot of details turned on, yet runs like garbage on my ATI Radeon 9700 Pro in a similarly configured system. Sometimes it even becomes a slideshow. I am not the only person to experience this, as many other people have complained about it. The unfortunate side of this is that most people complain about the hardware rather than the software.

    Now, this issue is quite different. There was a write-up recently on why NVIDIA hardware is so much slower than ATI hardware when using 2.0 pixel shaders. I don't remember the URL, so if anyone would be so kind as to post it, that would be great. Basically, it said that the Detonator 40 drivers needed to be rewritten to better take advantage of 2.0 pixel shaders. The Detonator 50 drivers are a lot faster and fix this problem, but they do reduce image quality [driverheaven.net] quite noticeably. This could be what swayed Valve's decision.

    The fact of the matter is, we need next generation GeForce chips.
  • "We see 'Half-Life 2' as a new benchmark for the type of amazing experiences that can be delivered on the Windows(R) platform, and DirectX 9.0 is clearly serving as the catalyst for the development of these state-of-the-art games," said Dean Lester, general manager of Windows Gaming and Graphics at Microsoft Corp. "'Half-Life 2' emphasizes the trend we are already seeing: Games for Windows now deliver the most cutting-edge technology and immersive entertainment available anywhere."

    See here [yahoo.com] for the full ad
  • by AntiGenX ( 589768 ) on Sunday September 14, 2003 @03:16AM (#6955606)

    "Whatever happened to just making hardware, and making games?"

    Whatever happened to the good ole days when people didn't believe everything they heard or read?

    I'm just skeptical of an article that says we "heard from a friend of a friend." It's all too speculative, with little evidence of any real wrongdoing. Newell expressed concerns about the drivers that Nvidia was offering. He also said it took three times as long to write the codepath for NVIDIA, implying that they had to account for a lot more problems. If you want to speculate, look at the slides from "Shader Day." [tomshardware.com]

    To quote: "During the development of that benchmark demo, Valve found a lot of issues in the current graphics card drivers of unnamed manufacturers:


    Camera path-specific occlusion culling
    Visual quality tradeoffs e.g. lowered filtering quality, disabling fog
    Screen-grab specific image rendering
    Lower rendering precision
    Algorithmic detection and replacement
    Scene-specific handling of z writes
    Benchmark-specific drivers that never ship
    App-specific and version specific optimizations that are very fragile"

    And we know that several of these have been explicitly tied to NVIDIA.

  • by Canis ( 9360 ) on Sunday September 14, 2003 @09:29AM (#6956457)
    Whatever the terms of the ATI/Valve deal, it's irrelevant: As a videogame developer (not for Valve -- not even on the same continent as them -- and with no such deal with anyone) I can assure you that NVidia's chips have serious problems.

    Here's Valve's problem: They make moddable games. That's at the core of their business. They didn't just make HalfLife as a game (although they did that, and very well) -- they made it as a platform upon which anyone was free to develop their own FPS games: CounterStrike being the most famous, but there are many others, such as Natural Selection or Day of Defeat.

    Likewise, they are not just making HalfLife 2, but a platform upon which mods will be made. But why is this relevant to the videocard debate? Here's where we get back to the drivers.

    The drivers -- the mythological r50 drivers that no one's actually gotten their hands on yet -- might well provide a speed boost to HL2 as it stands. Maybe. But if they do, it is because they have hand-tuned those drivers for HL2. See Mr Burke's quote:

    [...] we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

    What he omits is: the best experience possible for the specific subset of videocard functionality currently present in HL2, at this time. A little background for those of you who haven't kept up on recent videocard technology: modern videocards have vertex shaders and pixel shaders. These are essentially short programs, written in assembler (and now a variant of C), that the driver compiles and executes on the videocard rather than the CPU (taking load off of it) and that customise rendering in various ways. Vertex shaders typically perform lighting, animation or mesh deformation effects, while pixel shaders provide surface material effects, such as water distortion or bump mapping.

    ATI's cards appear to be able to handle any pixel shader program you throw at them. Whether this is because the cards are just so fast and general that they can cope with it, or whether the compiler in their driver cunningly optimises any GPU program you throw at it (the same way a C compiler optimises CPU code, by reordering instructions to avoid stalls, factoring out loop invariants, etc.), we don't know. Frankly, we don't care: the important thing is, we write code, and it works.

    NVidia's cards do not work this way. NVidia's cards are fast, but only if you hand-tune your assembler to precisely match their architecture. Except we don't know enough about their rules to do this (proprietary NVidia technology blah blah).

    When Valve have written shaders, found them to be fast on ATI cards and slow on NVidia's cards, NVidia have released new drivers and, lo... it's fast on NVidia's cards. NVidia go "hey, uh, our bad... driver bug... fixed now...". But make even a tiny, trivial change to the shader, and bam: it's slow again. With a little more experimentation along these lines, it's easy to come to the conclusion that there was no bug, there is no fix, NVidia simply have a lookup table of shaders they 'recognise', and when one of those comes along, they replace it with one they wrote themselves, hand-tuned for their card.

    There's a problem with this, of course. For a start, if you're not as big as Valve, NVidia aren't going to set aside an engineer to go around optimising shaders for your game or release new drivers. Secondly, if you make any changes you're back to square one and need to resubmit your shader to them and get it fixed up. Thirdly, if like Valve you care about modders, you're not going to be happy with this "solution" -- because even once your game is complete and on store shelves, and NVidia have stopped making new driver releases to 'fix' it, modders can make new shaders. And suddenly find their game runs like ass. You think NVidia are going to go chasing after modders? Bwahaha.

    I suspect this is why Valve were careful about the benchmarks they let be
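
To be clear, the "lookup table" above is the parent commenter's inference, not something anyone has shown in driver code. But the mechanism being described is simple enough to sketch, and the sketch makes the fragility obvious: fingerprint each submitted shader and substitute a hand-tuned replacement only on an exact match, so any edit at all (a modder's tweak, a new constant, even whitespace) drops back to the slow generic path. The C++ below is purely hypothetical.

```cpp
// Hypothetical sketch of per-shader substitution inside a driver; it is an
// illustration of the parent comment's claim, not code from any real driver.
#include <functional>
#include <string>
#include <unordered_map>

class ShaderSubstituter {
public:
    void add_tuned(const std::string& original, const std::string& replacement) {
        tuned_[std::hash<std::string>{}(original)] = replacement;
    }

    // Called on every shader the application submits. An exact match gets the
    // hand-tuned version; anything else (e.g. a modder's slightly edited
    // shader) goes through the generic, slower path unchanged.
    // (A real implementation would compare full source text, not just a hash.)
    const std::string& resolve(const std::string& submitted) const {
        auto it = tuned_.find(std::hash<std::string>{}(submitted));
        return it != tuned_.end() ? it->second : submitted;
    }

private:
    std::unordered_map<std::size_t, std::string> tuned_;
};
```

If a driver really did work this way, it would explain both the "tiny change makes it slow again" observation and why mod authors could never count on the same treatment.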

  • by digitalwanderer ( 49695 ) on Monday September 15, 2003 @10:53AM (#6963913) Homepage Journal
    This is no great mystery and no surprise in the graphics community; this is the bloody break everyone has been waiting for! The FX's shortcomings have been known for quite some time and have been analyzed and discussed to death within the quiet confines of such places as www.beyond3D.com and www.nvnews.net; in fact, the latter site's mods/admins are the ones who are shutting up the remaining nVidiots who seem to still think this is some big conspiracy.

    It IS a conspiracy, but entirely of nVidia's own doing and creation...their hardware simply can't do DX9 well as it was never designed to. There's many reasons for this, but it mainly comes down to nVidia tried to redefine the standards of the graphics industry and failed and now are paying the consequences for their hubris.

    The only thing surprising here is the size of Gabe Newell's balls, to come out and directly address this in such a fashion, and I truly respect and admire him for it. He HAD to; the game is going to come out, and if he didn't, customers would be blaming him and Valve for the FX's shortcomings!

    I'm terribly disappointed in the coverage I've seen of this on Slashdot. I really thought you folks would be able to appreciate the subtle (and not so subtle) aspects of a giant company that has been resting on its laurels and using PR FUD to make up for its hardware's shortcomings... it's just that now there is finally a game coming out that will highlight this, and the rest of the world seems to be noticing it.

    There is excellent coverage of this at www.beyond3d.com for in depth analysis, and www.nvnews.net has the best of the fanboys/ex-fanboys discussing it. (Our team at www.elitebastards.com is still the best at keeping up with all the latest stories though... ;) )
