NVidia Recommended Graphics Card For Doom 3
Griffon4 writes "Nvidia
announced today that their hardware has been branded the recommended graphics card for id's forthcoming PC FPS Doom 3. Now I'm wondering:
recall that a little over a year ago Carmack said the Nvidia
card of the day was only a slight bit more worthy than the ATI?
Jump forward to today - are we getting a real recommendation based on id's own
experience or just seeing the end result of a financial deal that
benefits both?" Other possible factors (apart from NVidia simply being the better graphics card for Doom 3) include the alleged origination of some Doom 3 Alpha version leaks, unlikely to endear ATI to id, and, of course, ATI already having a major bundle deal in place with a certain other FPS.
Look At Origin (Score:5, Insightful)
Re:Look At Origin (Score:5, Insightful)
I'd imagine Nvidia are keener to push this, given that it's a direct advert for their products. id gain nothing from doing press releases for nVidia, except maybe the ire of those poor, poor Radeon users convinced that they've won the Gaming Wars (whatever those were).
- Chris
Re:Look At Origin (Score:4, Insightful)
Re:Look At Origin (Score:3, Funny)
Aye, son, and we did win the Gaming Wars! Why I remember the legendary battle of ought two. ATI was flounderin' and NVidia had just unleashed the GeForce4 which was poundin' away at her flanks and closin' round her backside. All seemed lost. Then, out of nowhere, ATI unveiled the fury that was the Radeon 9700 Pro! NVidia was caught completely unawares, and was forced to retreat to fight another day. Those were
Re:Look At Origin (Score:2, Insightful)
Re:Look At Origin (Score:2)
Re:Look At Origin (Score:2)
Re:Look At Origin (Score:2)
I should probably have used sarcasm tags instead of the *cough* I used, eh?
Re:Look At Origin (Score:2)
Re:Look At Origin (Score:1)
Is that true now? IIRC, Carmack developed Q3 on NT4.
I'm saving up ... (Score:1)
I think that I will wait for some real benchmarks to be available before investing heavily in an upgrade for this one - the release date is still a little way off and I am sure it will be worth waiting to see what is available then.
Re: (Score:1)
Re:Screw Carmack (Score:4, Insightful)
Re:Screw Carmack (Score:2, Interesting)
W00T Carmack!
Re: (Score:2, Informative)
Re:Screw Carmack (Score:5, Insightful)
You can't tell the difference because for that particular game, they are used in essentially identical ways. I would assume that if the Unreal series is going to bother supporting both APIs, they simply added a layer of abstraction to their graphics code, coded the engine using that abstraction, and then created implementations of the abstraction for both OpenGL and Direct3D that look exactly the same by design.
This, however, implies nothing about the quality of either API, it merely suggests that the developers coded to the greatest common subset of both APIs. Of course, it does seem that this subset is good enough to make a nice-looking game, which seems to suggest that it doesn't really matter from an end-user perspective which API the developer chooses. Therefore...
But if you're going to program a game in an API, why not DirectX? It handles video, audio, and input. OpenGL is nice, but it only does video (that I know of).
This is totally irrelevant - we just saw that Direct3D and OpenGL are equally capable, so why choose the one that limits you to a single platform? The fact that DirectX handles other things doesn't matter: you can use OpenGL alongside the other components of DirectX without a problem (and this is in fact what Carmack has done in the past). You can also just use other libraries for the other aspects - for example, SDL works just great. While certainly not of the scope of these commercial games, I found that SDL+OpenGL was more than adequate for my own game [stanford.edu], and I got the bonus of being able to make Linux, Mac, and even Windows versions with only about 2 total lines of code changed (I had to switch where the #includes pointed for each platform, though a more clever build environment probably could've handled that). Offhand, about the only thing we couldn't do with SDL + OpenGL that DirectX provides is the network coding/matchmaking stuff from DirectPlay, but AFAIK most commercial games don't use that anyway.
Re:Screw Carmack (Score:5, Informative)
about the only thing we couldn't do with SDL + OpenGL that DirectX provides is the network coding/matchmaking stuff from DirectPlay
Here comes the SDL_net [libsdl.org]
(Note to the parent poster:) In case you are curious, there are plenty of useful, commercial-quality libs out there [libsdl.org], such as SDL_image, SDL_mixer, SDL_ttf... They just rock; some of them have been used in commercial titles (remember Loki Games?)... Forget about DirectX, and screw MS.
Re:Screw Carmack (Score:1)
Open source doesn't work for game[s][-libraries]. It never has and it never will.
Re:Screw Carmack (Score:2)
Re:Screw Carmack (Score:1)
Sweet game... now we just need a networked multiplayer version
Mark Erikson
Re:Screw Carmack (Score:1)
We've been planning to post the source for a while now but we wanted to clean it up a little before posting and then we all got sidetracked with other work. I think we're keeping track of who wanted to get notified when we do release the source, so if you want to get added to that list send me an email.
Oh, and I'm with you on the multiplayer version - the AI has a few weaknesses that make the game a bit too easy once you figure them out. Multiplayer is without doubt the killer feature.
Re:Screw Carmack (Score:4, Interesting)
> But if you're going to program a game in an API, why not DirectX?
On that matter, I'd quote some post, which maybe has some relevance to your question:
>> Why build for a closed platform (DirectX)?
id Software doesn't just produce games in the Quake and Doom series on various platforms (games which are not very demanding on input devices and usually use only a minimum of video). They also license the graphics engine to other companies. IIRC, that is actually the company's main income.
So why should a company that makes its money mainly from graphics engines restrict itself to one platform? It's not like they can't still use DirectX for video, audio, and input when they use OpenGL for graphics.
Especially when they have experts on that API, and at the time Quake emerged, Direct3D was nothing more than a hack.
Re:Screw Carmack (Score:1)
(And what about the Rendition Verite-accelerated DOS binary?)
Re:Screw Carmack (Score:2)
Quake was released with a software renderer that was platform non-specific. OpenGL allows Carmack the same level of platform agnosticism with the benefits of hardware acceleration. Quake III requires an OpenGL accelerator, but at the time of release, OpenGL accelerators were fairly ubiquitous.
As for the platform-specific VQuake, this was back when cards could only perform well on optimized, native rendering code. The fa
Re:Screw Carmack (Score:2, Insightful)
And for sound etc., there are other APIs that do the same stuff; OpenAL, for example.
So why use DirectX when there are other APIs this good?
Re:Screw Carmack (Score:3, Interesting)
Re:Screw Carmack (Score:1)
And have you tried to program something using DirectX? It IS a kludgey hack, the classic MS API: big, ugly, and hard to use. (Yes, you are right, OpenGL only does video... for an open source gaming API, try SDL. Maybe not as powerful as DirectX, but MUCH more usable.)
Re:Screw Carmack (Score:5, Informative)
He could have, but he likes the OpenGL API more, as he documented in his plan file in 1996. [rmitz.org] This also addresses your question later in the thread as to how DirectX is a kludgey hack; in 1996 at least, the interface was really nasty. It has probably improved since.
There's also the portability issue. If he coded it using DirectX, that locks the code to Microsoft platforms. No easy Mac, Linux, or console ports aside from the Xbox.
Re:Screw Carmack (Score:2, Interesting)
You are basing your argument on comments he made EIGHT YEARS AGO!
Apparently there are people here that don't realize that Q2 and Q3 could use either OGL or DirectX in the Windows version.
Re:Screw Carmack (Score:2)
Re:Screw Carmack (Score:1)
Re:Screw Carmack (Score:2)
Re:Screw You (Score:1)
My three cents.
Re:Screw Carmack (Score:1)
Corporate Sponsorship At Work (Score:4, Funny)
Why refer to theinquirer.net... (Score:5, Informative)
When the man himself once made a post right here on
Re:Why refer to theinquirer.net... (Score:2, Insightful)
Re:Why refer to theinquirer.net... (Score:1)
Well, the Alpha (Score:3, Interesting)
Sorry nVidia - I love your cards but I'm not upgrading just so I can play DooM III on an 'approved' video card.
Re:Well, the Alpha (Score:2)
Not at the 'max', of course, but then again, if you make the engine so that max is obviously overkill for any hardware (so that the engine will scale for some time into the future), what does it matter?
As long as the game is good and looks good, who cares? Full-scene AA never made any game great.
Re:The Way It's Played (Score:2)
Re:The Way It's Played (Score:1)
What I'm saying is that us gamers need a second job...
Re:The Way It's Played (Score:2)
Bull (Score:2)
Re:Bull (Score:1)
Re:The Way It's Played (Score:1)
Besides, I doubt id would be stupid enough to anger half of their customers for a comparatively small amount of money.
endorsement sucks (Score:3, Interesting)
This means nothing; it could even be simply that the box's system requirements say "GeForce FX or better" under the recommended sub-heading.
Anyway, I hate this crap with game manufacturers officially "recommending" hardware or deliberately coding a game to be more efficient with one brand over another. I despise seeing the Nvidia logo on game intros, and I thought a lot less of Valve when they endorsed ATi. I'll think id has come a long way down in the world if they start endorsing Nvidia; not that it would make sense, since who wants to buy a game engine that is deliberately coded to run better on one specific brand's hardware?
It's just such a cheap, shitty way to try and make people buy your product. Can't beat the competition by making a better product? Frightened your competitor just does everything better than you? Screw being competitive and trying to offer something better for your customers; pay off developers to make the competition crapper instead!
Developers should be ashamed of themselves. They're supposed to be about making something as good as possible for every customer, not only for those using hardware from whoever they've shacked up with.
unless it saves me money (Score:1)
I'm very happy for id and Valve to run their little GPU sponsorship deals. It means they have more money for games development that doesn't have to come out of their customers' pockets.
XBOX Version (Score:3, Interesting)
id has always said the Xbox version will be equal to the PC version (even at half the CPU/GPU capability) and released simultaneously.
Maybe this has something to do with it, since the Xbox has an NVIDIA GPU, not an ATI one.
Just advertising (Score:3, Insightful)
Does it REALLY help both companies? (Score:3, Interesting)
Mod +1 (Score:1, Redundant)
Re:Does it REALLY help both companies? (Score:4, Insightful)
This is all a lot of posturing; no way would a major studio hurt sales by making a game perform so much better on one video card as to make the other unplayable.
And the fact that people have written very, very nice wrappers for Nvidia-only demos (Dawn) so that they run even better on ATI hardware means I don't worry too much.
Although I recommend not getting a new video card until the next-generation NV40/R420 come out.
Re:Does it REALLY help both companies? (Score:1)
That said, I do not see how any recommendation will *drastically* affect the way the game looks or plays. Ooh, so NVidia cards are slightly more optimized. "6 more FPS! In your face ATi!"
Seriously, WHAT does id have to gain by suggesting that the ONLY way to play is with one or the other, especially
Re:Does it REALLY help both companies? (Score:1)
So, under those conditions, this deal hasn't really changed anything.
No mystery here (Score:2, Interesting)
Half Life and Doom III (Score:2)
And Doom III = Nvidia
Can't we all just get along? I paid good money for an ATI Radeon 9800 Pro 128MB, and anticipated both games being ATI-driven. Now I feel screwed to a degree. Sigh...
Re:Half Life and Doom III (Score:1)
C'mon people... (Score:2, Informative)
On another note... I actually tried the Doom 3 leak on my GeForce FX 5900 Ultra. A small blurb about it is on my website
Good! (Score:1)
Re:Good! (Score:2)