NVidia Recommended Graphics Card For Doom 3

Griffon4 writes "Nvidia announced today that it has been named the maker of the recommended graphics card for id's forthcoming PC FPS Doom 3. Now I'm wondering: recall that a little over a year ago Carmack said the Nvidia card of the day was only slightly more capable than ATI's? Jump forward to today - are we getting a real recommendation based on id's own experience, or just seeing the end result of a financial deal that benefits both?" Other possible factors (apart from Nvidia simply making the better graphics card for Doom 3) include the alleged origination of some Doom 3 alpha-version leaks, unlikely to endear ATI to id, and, of course, ATI already having a major bundle deal in place with a certain other FPS.
  • Look At Origin (Score:5, Insightful)

    by Babbster ( 107076 ) <aaronbabb@NOspaM.gmail.com> on Tuesday March 02, 2004 @03:05AM (#8438340) Homepage
    Considering this is coming from Nvidia, this is clearly, purely, a financial deal between Nvidia and id. This isn't a .plan file from Carmack, or even an interview with him, which might be more interesting from a technical point of view. This is co-branding, most likely decided upon in light of ATI's association with Valve and Half-Life 2.
    • Re:Look At Origin (Score:5, Insightful)

      by thumperward ( 553422 ) <thumperward@hotmail.com> on Tuesday March 02, 2004 @04:23AM (#8438615) Homepage
      Or maybe, just maybe, the fact that OpenGL support on ATi cards has historically been rubbish hasn't endeared ATi to id?

      I'd imagine Nvidia are keener to push this, given that it's a direct advert for their products. id gain nothing from doing press releases for nVidia, except maybe the ire of those poor, poor Radeon users convinced that they've won the Gaming Wars (whatever those were).

      - Chris
      • Re:Look At Origin (Score:4, Insightful)

        by Babbster ( 107076 ) <aaronbabb@NOspaM.gmail.com> on Tuesday March 02, 2004 @04:48AM (#8438713) Homepage
        Then again, when id demoed Doom 3 for the first time, they used ATI hardware (9700 cards, as I recall), so they can't hate ATI that much.
      • those poor, poor Radeon users convinced that they've won the Gaming Wars (whatever those were).

        Aye, son, and we did win the Gaming Wars! Why, I remember the legendary battle of ought two. ATI was flounderin' and NVidia had just unleashed the GeForce4, which was poundin' away at her flanks and closin' round her backside. All seemed lost. Then, out of nowhere, ATI unveiled the fury that was the Radeon 9700 Pro! NVidia was caught completely unawares and was forced to retreat to fight another day. Those were the days...
        • Re:Look At Origin (Score:2, Insightful)

          by nelsonal ( 549144 )
          Not the war, but it was an important battle. The GeForce 2 and 3 had much better driver support than anything ATi offered, especially for the then-new Windows 2000. However, Nvidia first started having problems slightly before they were announced as the video provider for the Xbox: they missed a targeted tape-out for the NV31 or NV35 (the future GeForce 4), and ATi took the lead with better products and better driver support than they had had in several years. The war isn't over until one of the two is out of business.
      There is also the fact that id develops all their games on UNIX and then ports them to Windows. Last time I checked, the D3D support in UNIX was pretty poor *cough*.
    • For a Hexium 6 with 100 GHz bus, photonic RAM and Superconductor GPU - which may well be available before this Vaporware condenses!

      ... or more seriously ...

      I think that I will wait for some real benchmarks to be available before investing heavily in an upgrade for this one - the release date is still a little way off and I am sure it will be worth waiting to see what is available then.

  • Comment removed based on user account deletion
    • Re:Screw Carmack (Score:4, Insightful)

      by Stubtify ( 610318 ) on Tuesday March 02, 2004 @03:28AM (#8438428)
      I believe this is what he has done, as it says "recommended" graphics card. More likely than not this is just business, i.e., "here's X million dollars, recommend our graphics card." Writing directly for a single graphics-card platform would cripple the game and piss off your audience.
    • Re:Screw Carmack (Score:2, Interesting)

      by Anonymous Coward
      Thank GOD Carmack is still pushing OGL. DirectX is still a kludgey hack. Why build for a closed platform (DirectX)? There is still PLENTY of life in OGL and 2.0 should be SCHWEEEETTTT!

      W00T Carmack!

      • Re: (Score:2, Informative)

        Comment removed based on user account deletion
        • Re:Screw Carmack (Score:5, Insightful)

          by asteinberg ( 521580 ) <[ude.drofnats] [ta] [grebniets.ira]> on Tuesday March 02, 2004 @04:23AM (#8438612) Homepage
          I've played the same game in both OpenGL and DirectX (Unreal 2003 or Unreal2...I forgot which one) and they flawlessly. [sic] In fact, I can't tell which one is better.

          You can't tell the difference because, for that particular game, the two APIs are used in essentially identical ways. I would assume that if the Unreal series is going to bother supporting both APIs, they simply added a layer of abstraction to their graphics code, coded the engine against that abstraction, and then created implementations of the abstraction for both OpenGL and Direct3D that look exactly the same by design (see the sketch further down).

          This, however, implies nothing about the quality of either API; it merely suggests that the developers coded to the greatest common subset of both. Of course, that subset does seem to be good enough to make a nice-looking game, which suggests that from an end-user perspective it doesn't really matter which API the developer chooses. Therefore...

          But if you're going to program a game in an API, why not DirectX? It handles video, audio, and input. OpenGL is nice, but it only does video (that I know of).

          This is totally irrelevant: we just saw that Direct3D and OpenGL are equally capable, so why choose the one that limits you to a single platform? The fact that DirectX handles other things doesn't matter either; you can use OpenGL alongside the other components of DirectX without a problem (and this is in fact what Carmack has done in the past).

          You can also just use other libraries for those other aspects; SDL, for example, works just great. While certainly not of the scope of these commercial games, I found that SDL+OpenGL was more than adequate for my own game [stanford.edu], and I got the bonus of being able to make Linux, Mac, and even Windows versions with only about two total lines of code changed (I had to switch where the #includes pointed for each platform, though a more clever build environment probably could have handled that). Offhand, about the only thing we couldn't do with SDL+OpenGL that DirectX provides is the network coding/matchmaking stuff from DirectPlay, but AFAIK most commercial games don't use that anyway.
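          For anyone wondering what that kind of abstraction layer looks like in practice, here is a minimal sketch in C. Every name in it (Renderer, gl_renderer, render_scene) is invented for illustration and is not taken from the Unreal engine; the OpenGL implementation is stubbed out with prints so the sketch stays self-contained and runnable.

```c
#include <stdio.h>

/* The abstraction: a table of function pointers the engine calls.
   Hypothetical names, for illustration only. */
typedef struct Renderer {
    void (*begin_frame)(void);
    void (*draw_triangles)(const float *verts, int count);
    void (*end_frame)(void);
} Renderer;

/* One implementation per API; real code would call glClear(),
   glDrawArrays(), etc. here instead of printing. */
static void gl_begin(void)                 { printf("GL: clear\n"); }
static void gl_draw(const float *v, int n) { (void)v; printf("GL: draw %d tris\n", n); }
static void gl_end(void)                   { printf("GL: swap buffers\n"); }

static const Renderer gl_renderer = { gl_begin, gl_draw, gl_end };
/* A d3d_renderer struct with identical behavior would sit alongside it. */

/* Engine code is written purely against the abstraction. */
static void render_scene(const Renderer *r)
{
    float tri[9] = { 0.0f };
    r->begin_frame();
    r->draw_triangles(tri, 1);
    r->end_frame();
}

int main(void)
{
    render_scene(&gl_renderer);  /* swap in &d3d_renderer and nothing else changes */
    return 0;
}
```

          Because both back ends are written to behave identically, the player sees the same image either way, which is exactly why the parent poster couldn't tell the two API paths apart.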

          • Re:Screw Carmack (Score:5, Informative)

            by z01d ( 602442 ) on Tuesday March 02, 2004 @06:02AM (#8438942)

            about the only thing we couldn't do with SDL + OpenGL that DirectX provides is the network coding/matchmaking stuff from DirectPlay

            Here comes SDL_net [libsdl.org].

            (Not aimed at the parent poster.) In case you are curious, there are plenty of useful, commercial-quality libs out there [libsdl.org], such as SDL_image, SDL_mixer, SDL_ttf... They just rock, and some of them have been used in commercial titles (remember Loki Games?). Forget about DirectX, and screw MS.
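            For the curious, a TCP client in SDL_net is only a handful of calls. This is a minimal sketch; the hostname and port are placeholders, not a real server.

```c
#include "SDL.h"
#include "SDL_net.h"
#include <stdio.h>

int main(void)
{
    /* SDL_net sits on top of SDL's init machinery. */
    if (SDL_Init(0) < 0 || SDLNet_Init() < 0) {
        fprintf(stderr, "init failed: %s\n", SDLNet_GetError());
        return 1;
    }

    IPaddress ip;
    /* "game.example.com" and port 4000 are placeholders. */
    if (SDLNet_ResolveHost(&ip, "game.example.com", 4000) < 0) {
        fprintf(stderr, "resolve failed: %s\n", SDLNet_GetError());
        SDLNet_Quit();
        SDL_Quit();
        return 1;
    }

    TCPsocket sock = SDLNet_TCP_Open(&ip);
    if (sock) {
        char msg[] = "hello";
        SDLNet_TCP_Send(sock, msg, sizeof msg);  /* returns bytes actually sent */
        SDLNet_TCP_Close(sock);
    }

    SDLNet_Quit();
    SDL_Quit();
    return 0;
}
```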
            • Commercial quality? You must be joking. SDL is consistently laggier, slower, and buggier than DirectX in every discipline.
              Open source doesn't work for game[s][-libraries]. It never has and it never will.
              • I know, how could open source possibly work, with all those eyes looking at the code, testing it for free, and pushing it more and more as they realize it's better? I don't think SDL performance is an issue.
          • Wow... after 5 minutes, I can tell that UPA Frisbee is gonna get absurdly addicting (especially since I'm an avid player in real life). So... any chance you guys could put the source code up as well? I'd love to be able to see what all you did to pull that off.

            Sweet game... now we just need a networked multiplayer version :)

            Mark Erikson
            • Hah, glad you like it :).

              We've been planning to post the source for a while now but we wanted to clean it up a little before posting and then we all got sidetracked with other work. I think we're keeping track of who wanted to get notified when we do release the source, so if you want to get added to that list send me an email.

              Oh, and I'm with you on the multiplayer version - the AI has a few weaknesses that make the game a bit too easy once you figure them out. Multiplayer is without doubt the killer feature.

        • Re:Screw Carmack (Score:4, Interesting)

          by Yokaze ( 70883 ) on Tuesday March 02, 2004 @04:26AM (#8438627)
          > I've played the same game in both OpenGL and DirectX (Unreal 2003 or Unreal2... I forgot which one) and they ran flawlessly. In fact, I can't tell which one is better.
          > But if you're going to program a game in an API, why not DirectX?

          On that matter, let me quote an earlier post that may be relevant to your question:

          >> Why build for a closed platform (DirectX)?

          id Software not only produces the Quake- and Doom-series games on various platforms (games which are not very demanding on input devices and use only a minimum of video playback); they also license the graphics engine to other companies. IIRC, that is actually the main income of the company.
          So why should a company that makes its money mainly from graphics engines restrict itself to one platform? It's not like using OpenGL for graphics stops them from using DirectX for video, audio, and input.
          Especially when they have experts in that API, and when Quake emerged, Direct3D was nothing more than a hack.

          • Which is why Quake was software-rendered; the GLQuake you are referring to came along later.
            (And what about the Rendition Vérité-accelerated DOS binary?)
            • Well, you do bring up an interesting point, but you miss your own key observation.

              Quake was released with a software renderer that was platform non-specific. OpenGL allows Carmack the same level of platform agnosticism with the benefits of hardware acceleration. Quake III requires an OpenGL accelerator, but at the time of release, OpenGL accelerators were fairly ubiquitous.

              As for the platform-specific VQuake, this was back when cards could only perform well on optimized, native rendering code. The fa
        • Yeah, plus the fact that I can play Doom III under Linux, you insensitive clod! (At least I hope so.)

          And for sound, etc., there are other APIs that do the same stuff: OpenAL, for example.

          So why use DirectX when there are other APIs this good? ;-)
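          To back that up with something concrete, here is a rough sketch of the basic OpenAL call sequence in C. It plays half a second of silence rather than real audio, purely to show the device/buffer/source flow; it isn't taken from any actual game.

```c
#include <AL/al.h>
#include <AL/alc.h>
#include <stdio.h>

int main(void)
{
    /* Open the default output device and make a context current. */
    ALCdevice *dev = alcOpenDevice(NULL);
    if (!dev) { fprintf(stderr, "no OpenAL device\n"); return 1; }
    ALCcontext *ctx = alcCreateContext(dev, NULL);
    alcMakeContextCurrent(ctx);

    /* A real game would fill the buffer from a decoded WAV/OGG;
       here we upload 0.5 s of 16 kHz mono silence. */
    ALuint buf, src;
    alGenBuffers(1, &buf);
    static short silence[8000];  /* zero-initialized */
    alBufferData(buf, AL_FORMAT_MONO16, silence, sizeof silence, 16000);

    alGenSources(1, &src);
    alSourcei(src, AL_BUFFER, (ALint)buf);
    alSourcePlay(src);  /* non-blocking; real code would poll AL_SOURCE_STATE */

    /* Teardown. */
    alDeleteSources(1, &src);
    alDeleteBuffers(1, &buf);
    alcMakeContextCurrent(NULL);
    alcDestroyContext(ctx);
    alcCloseDevice(dev);
    return 0;
}
```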
        • Re:Screw Carmack (Score:3, Interesting)

          by eXtro ( 258933 )
          DirectX is Microsoft proprietary and Carmack has always stated that proprietary isn't a good thing. It's easier to port a game if it's built from the ground up for portability. I don't know what he's developing on now but in the past he hasn't always even done initial development on Windows.
        • In fact, Carmack himself has said that DirectX is very good, except for the 3D part (Direct3D, or whatever they call it nowadays), which, he said, should be replaced with OpenGL.

          And have you tried to program something using Direct3D? It IS a kludgey hack, the classic MS API: big, ugly, and hard to use. (Yes, you are right, OpenGL only does video; for an open source gaming API, try SDL. Maybe not as powerful as DirectX, but MUCH more usable.)
    • Re:Screw Carmack (Score:5, Informative)

      by oskillator ( 670034 ) on Tuesday March 02, 2004 @06:12AM (#8438962)
      Why couldn't Carmack just code the thing in DirectX?

      He could have, but he likes the OpenGL API more, as he documented in his plan file in 1996. [rmitz.org] This also addresses your question later in the thread as to how DirectX is a kludgey hack; in 1996 at least, the interface was really nasty. It has probably improved since.

      There's also the portability issue. If he coded it using DirectX, that locks the code to Microsoft platforms: no easy Mac, Linux, or console ports, aside from the Xbox.

      • Re:Screw Carmack (Score:2, Interesting)

        by El_Ge_Ex ( 218107 )
        ...his plan file in 1996.

        You are basing your argument on comments he made EIGHT YEARS AGO!

        Apparently there are people here that don't realize that Q2 and Q3 could use either OGL or DirectX in the Windows version.
      • It has improved, but there are still 'goto' statements in the Microsoft example code (my memory is foggy, but I think it was the mesh examples). Oh well, some things never change.
      • He could have, but he likes the OpenGL API more, as he documented in his plan file in 1996.
        You know, he could have changed his mind since then. It's eight years ago, after all.
    • I happen to agree with Carmack. Nvidia needs this to boost themselves back up in the video card industry. Nonetheless, I wouldn't buy Doom 3 if id supported Microsoft by using DirectX over OpenGL. They have always used OpenGL, making their games portable to Macs, Linux, Windows, and any other platform that runs OpenGL (with minor changes in the code, unlike using DirectX).

      My three cents.
    • What are you talking about, and why should you be pissed off? You actually want Carmack to code to a proprietary API? He writes against GL because he prefers its cross-platform nature. There's nothing DX offers that he can't get from GL, and shaders are small programs in their own right, not something whose functionality is generically exposed by DX alone.
  • by DeadboltX ( 751907 ) on Tuesday March 02, 2004 @04:19AM (#8438599)
    This is just a marketing ploy. Nvidia says, hey, I'll scratch your back if you scratch mine. Since ATI recently struck a deal with Valve, they probably don't have as many resources available to offer, so Nvidia easily made a better offer to id. I really doubt it will make a noticeable difference whether you are running a 9800 Pro or a 5800, and the only thing we have to look forward to is another spiffy Nvidia intro where a flaming skull flies a few circles around the Nvidia logo before getting shot up and exploding.
  • Well, the Alpha (Score:3, Interesting)

    by MImeKillEr ( 445828 ) on Tuesday March 02, 2004 @06:33AM (#8439018) Homepage Journal
    ..ran fine on my Verto GF4 (64MB) once I back-leveled the drivers. Since everyone was saying that the alpha wasn't streamlined and that the final would be, I'm betting my GF4 will still work.

    Sorry nVidia - I love your cards but I'm not upgrading just so I can play DooM III on an 'approved' video card.
    • Of course it will work.

      Not at 'max' settings, of course, but then again, if you build the engine so that max is obviously overkill for any current hardware (so that the engine will scale for some time into the future), what does it matter?

      As long as the game is good and looks good, who cares? Full-scene AA never made any game great.
  • endorsement sucks (Score:3, Interesting)

    by Anonymous Coward on Tuesday March 02, 2004 @07:49AM (#8439276)
    NVIDIA is pleased to announce that id Software recommends the GeForce FX family of graphics processing units for DOOM 3.

    This means nothing; it could even simply be that the box's system requirements say "GeForce FX or better" under the recommended sub-heading.

    Anyway, I hate this crap with game manufacturers officially "recommending" hardware or deliberately coding a game to be more efficient with one brand over another. I despise seeing the Nvidia logo on game intros and thought a lot less of Valve when they endorsed ATi. I'll be thinking id has come a long way down in the world if they start endorsing Nvidia, not that it would make sense: who wants to buy a game engine that is deliberately coded to run better on one specific brand's hardware?

    It's just such a cheap, shitty way to try and make people buy your product. Can't beat the competition by making a better product? Frightened your competitor just does everything better than you? Screw being competitive and trying to offer something better for your customers; pay off developers to make the competition crapper instead!

    Developers should be ashamed of themselves. They're supposed to be about making something as good as possible for every customer, not only for those using hardware from whoever they've shacked up with.
    • I despise seeing the Nvidia logo on game intros and thought a lot less of Valve when they endorsed ATi

      I'm very happy for id and Valve to run their little GPU sponsorship deals. It means they have more money for games development that doesn't have to come out of their customers' pockets.

  • XBOX Version (Score:3, Interesting)

    by vasqzr ( 619165 ) <vasqzr@@@netscape...net> on Tuesday March 02, 2004 @08:09AM (#8439354)

    id has always said the Xbox version will be equal to the PC version (even at half the CPU/graphics capability) and released simultaneously.

    Maybe this has something to do with it, since the Xbox has an NVIDIA GPU and not an ATI one.

  • Just advertising (Score:3, Insightful)

    by tprime ( 673835 ) on Tuesday March 02, 2004 @08:15AM (#8439392)
    This is just another advertising ploy on Nvidia's part, along the same lines as Gatorade being the official sports drink of the NFL. Nvidia hasn't had great press lately (console-wise) and needed some fresh good press.
  • by Lust ( 14189 ) on Tuesday March 02, 2004 @08:36AM (#8439548) Homepage
    I just bought a new ATI card. When I read that nVidia is the better choice for Doom 3, I wonder, "Hrm, maybe I should just stick with Half-Life 2 and skip Doom altogether." No way am I forking out for a new video card again; I might as well buy a console and sit by the TV. In reality, the differences between the cards may be small, but these company claims cut both ways.
    • Mod +1 (Score:1, Redundant)

      by Bishop ( 4500 )
      That is a good point.
    • by aliens ( 90441 ) on Tuesday March 02, 2004 @10:40AM (#8440690) Homepage Journal
      I'm not quite sure what you're worried about, but you realize that ATI's cards will run Doom 3 just fine, and Nvidia's will run HL2 just fine?

      This is all a lot of posturing; no way would a major studio hurt sales by making a game perform so much better on one video card as to make the other unplayable.

      And the fact that people have written very, very nice wrappers for Nvidia-only demos (Dawn) so that they run even better on ATI hardware doesn't make me worry too much.

      Although I recommend not getting a new video card until the next-generation NV40/R420 parts come out.
    • If you consider one company being branded "recommended" as the end-all solution for your gaming needs, perhaps you had better follow your own advice. Console games are far more forgiving in that respect.

      That said, I do not see how any recommendation will *drastically* affect the way the game looks or plays. Ooh, so NVidia cards are slightly more optimized. "6 more FPS! In your face ATi!"

      Seriously, WHAT does id have to gain by suggesting that the ONLY way to play is with one or the other, especially
    • That's a good question because I'm far more likely to buy both games than to buy both video cards. I'm probably going to stick with the card I have.

      So, under those conditions, this deal hasn't really changed anything.
  • Look, there really doesn't have to be any kind of secret deal going on. Carmack said that he preferred ATI last year. A lot of work and changes could have been made in a year's time. For one thing, a year ago most of the optimizations probably had not been made. So, NVidia might really have the best card for the job.
  • So Half-Life = ATI

    And Doom III = Nvidia

    Can't we all just get along? I paid good money for an ATI Radeon 9800 Pro 128MB and anticipated both games being ATI-driven. Now I feel screwed, to a degree. Sigh...
    • Note: "Geforce FX is recommended for Doom III" does not equal "Doom III will not run on Radeons". It'll be perfectly playable on both cards (it would be business suicide if it weren't), possibly/probably being a small bit faster on nVidia cards. id and nVidia probably just wanted to endorse each others products a bit, as a counter to Valve and ATi doing the same. That is all.
  • C'mon people... (Score:2, Informative)

    Just because they endorse a card doesn't mean that Doom III won't work with an ATI card, or that HL2 won't work with an Nvidia card, or that either will work less well. It seems like people are already planning not to pick up a game they would otherwise have rushed out to buy, because of endorsements. Do you need to buy a GeForce FX to play Doom 3? No. Does Nvidia get money if you buy Doom 3? Probably not.

    On another note... I actually tried the Doom 3 leak on my GeForce FX 5900 Ultra. A small blurb about it is on my website.

  • Good. The last card I bought was an ATi, and it will probably be the last ATi I ever buy. The hardware was nice, but their drivers were absolutely horrible. Good job, id, for signing with the better company.
    • My card two cards ago was a GeForce 2 MX. The shenanigans Nvidia pulled with their newer Detonator drivers breaking older cards mean I'll never buy another Nvidia card again. Happily on a Radeon 9800 Pro now.

"Being against torture ought to be sort of a multipartisan thing." -- Karl Lehenbauer, as amended by Jeff Daiell, a Libertarian

Working...