Official Doom 3 Benchmarks Released
Rogerpq3 writes "Before the game goes on sale, id Software has been kind enough to release some benchmarks for DOOM 3 with the latest video cards on the market from NVIDIA & ATI. HardOCP has published the five-page article, which should help anyone trying to decide whether to upgrade their video card for DOOM 3. There's also an introductory note from John Carmack, mentioning: 'The benchmarking was conducted on-site, and the hardware vendors did not have access to the demo beforehand, so we are confident that there is no egregious cheating going on.' The HardOCP writers comment: 'As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience.'"
Might possibly upgrade... (Score:5, Interesting)
I guess an upgrade is in my future, although I'm not sure I'll get to the "cinematic" level that's possible in D3's rendering.
Re:Might possibly upgrade... (Score:2, Interesting)
And it's not like I'll lose the game or anything, I figure I'll just wait another year or two to upgrade again and by that time I'll have a system that can run
Re:Might possibly upgrade... (Score:3, Informative)
Here is a review of some cards that are actually in my price range and from the sounds of it might be in yours.
http://tech-report.com/etc/2003q3/valve/index.x
Just something to keep in mind.
Re:Might possibly upgrade... (Score:3, Interesting)
Re:Might possibly upgrade... (Score:2, Funny)
Re:Might possibly upgrade... (Score:2, Informative)
Of course... (Score:2, Insightful)
Of course Nvidia's card is going to do better. Doom3 has a specialized codepath for Nvidia hardware, while there is no equivalent path for ATI's current cards.
If a codepath were written for the X800 series of cards, I'm sure the scores would be closer to each other.
I take the superiority of one card over the other with a grain of salt.
Re:Of course... (Score:2)
It should be noted that all of the modern cards play the game very well, and benchmark scores should not be the be-all and end-all of the decision. Scores will probably improve somewhat with future driver releases, and other factors like dual-slot coolers or dual power connectors can weigh against some of the high-end cards.
Re:Of course... (Score:4, Informative)
Re:Of course... (Score:2)
Re:Of course... (Score:5, Informative)
NV10 path: GeForce4 MX.
NV20 path: GeForce3 and GeForce4 Ti.
R200 path: ATI Radeon 8500/9000.
ARB2 path: NVIDIA GeForce FX / ATI R300+.
I assume the Radeon 9800 is included under ARB2 because it uses the R350 and R360 cores.
The ARB2 path and R200 path use 1 pass, the NV20 path uses 2 passes, and the NV10 path uses 5 passes.
Also, ARB2 is the only path using vertex/fragment programs, which adds slightly to a few effects (a heat-shimmer effect was mentioned).
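For anyone curious how a renderer lands on one of these paths, the sketch below shows the general idea: probe the driver's extension string and fall back from the most capable path to the least. This is purely illustrative C++ of my own; the function and enum names are hypothetical, not id's actual code.

// Hypothetical sketch of render-path selection from advertised GL extensions.
// Assumes a current OpenGL context; names are illustrative, not id's code.
#include <cstring>
#include <GL/gl.h>

enum RenderPath { PATH_NV10, PATH_NV20, PATH_R200, PATH_ARB2 };

static bool HasExtension(const char *name) {
    // Naive substring test; real code should match whole tokens.
    const char *exts = reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS));
    return exts && std::strstr(exts, name) != nullptr;
}

RenderPath SelectRenderPath() {
    // Most capable first: ARB vertex + fragment programs
    // (GeForce FX / Radeon 9500+ class hardware).
    if (HasExtension("GL_ARB_vertex_program") &&
        HasExtension("GL_ARB_fragment_program"))
        return PATH_ARB2;
    // Radeon 8500/9000 class fragment shading.
    if (HasExtension("GL_ATI_fragment_shader"))
        return PATH_R200;
    // GeForce3/4 Ti register combiners for per-pixel dot products.
    if (HasExtension("GL_NV_register_combiners"))
        return PATH_NV20;
    // Fixed-function-era fallback (GeForce4 MX and older).
    return PATH_NV10;
}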
Re:Of course... (Score:5, Insightful)
There is no way Carmack would neglect almost half of the gamers out there. The fact is, Radeons have always had less-than-stellar performance with OpenGL; they are built for D3D.
Re:Of course... (Score:3, Insightful)
Re:Of course... (Score:2, Informative)
"The NV30 runs the ARB2 path MUCH slower than the NV30 path. Half the speed at the moment. This is unfortunate, because when you do an exact, apples-to-apples comparison using exactly the same API, the R300 looks twice as fast, but when you use the vendor-specific paths, the NV30 wins."
"The reason for this is that ATI does everything at high precision all the time, while Nvidia internally supports three
Re:Of course... (Score:4, Informative)
Re:Of course... (Score:2)
Re:Of course... (Score:3, Informative)
The reason Nvidia kicks ATI's ass in Doom3 is that Doom3 is HEAVY on stencil-buffer shadows. Nvidia's newer FX cards can render two-sided stencil shadow volumes in one pass, which is a huge speed win for stencil shadows. They also support stencil shadow volume clipping, which speeds things up even further.
The long and short of it is, any game that uses a unified lighting model like Doom3's, using s
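For the curious, here is a minimal sketch of the one-pass, two-sided stencil setup being described, using z-fail ("Carmack's reverse") operations. It assumes the GL_EXT_stencil_two_side and GL_EXT_stencil_wrap extensions are present, the glext.h declarations are available, and the glActiveStencilFaceEXT function pointer has been loaded; DrawShadowVolumes() is a hypothetical helper.

// One-pass stencil shadow volume pass (illustrative sketch only).
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);  // no color writes
glDepthMask(GL_FALSE);                                // volumes don't write depth
glDisable(GL_CULL_FACE);                              // draw both faces at once
glEnable(GL_STENCIL_TEST);
glEnable(GL_STENCIL_TEST_TWO_SIDE_EXT);

// Z-fail: back faces increment, front faces decrement, on depth-test failure.
glActiveStencilFaceEXT(GL_BACK);
glStencilFunc(GL_ALWAYS, 0, ~0u);
glStencilOp(GL_KEEP, GL_INCR_WRAP_EXT, GL_KEEP);

glActiveStencilFaceEXT(GL_FRONT);
glStencilFunc(GL_ALWAYS, 0, ~0u);
glStencilOp(GL_KEEP, GL_DECR_WRAP_EXT, GL_KEEP);

DrawShadowVolumes();  // hypothetical: issue the volume geometry once

glDisable(GL_STENCIL_TEST_TWO_SIDE_EXT);
glEnable(GL_CULL_FACE);
// Without the extension, the same volumes must be drawn twice:
// once with front-face culling and once with back-face culling.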
Re:Of course... (Score:3, Interesting)
Even if that never happens, I won't even consider purchasing any of the current GeForce 6800 series. NVidia has fallen into the trap that killed 3Dfx: forgetting that its products are a small part of a multi-purpose computer.
You can pretty much throw a 9800 or X800 series card into any machine and get a really good gaming machine. With the new cards in the GeForce series you have expensive req
Re:Of course... (Score:5, Insightful)
Massive Power Supplies: 6800 GTs are happy in shuttles with 250W PSUs
Extra Slots: The 6800 and GT are single-slot
High-end Cooling: See what's cooling your CPU, then count the transistors on each. Besides, it's much better to have a cooling solution with headroom for overclocking than something that barely makes the grade.
Dustbuster Sound: I think you're confusing the 6800 series with a certain FX card. Besides, there is nothing stopping third-party manufacturers from changing the fan, and many do.
stupid things like bringing back SLI
SLI is a really good idea: it allows those who want one to have a very fast setup without increasing the price for those who are content with a merely fast setup.
Now NVidia is positioning itself in the difficult, obtrusive ultra-high end space where 3Dfx was when it died.
Not at all. nVidia has sold zillions of FX5200s to OEMs.
Re:Of course... (Score:3, Insightful)
Massive Power Supplies: 6800 GTs are happy in shuttles with 250W PSUs
If you take out the CPU and Hard disks, yes.
Actually, a *good* 350W PSU can handle the task.
High-end Cooling: See what's cooling your CPU, then count the transistors on each.
You miss the point: the complaint was that the video card was making too much noise. You can't explain or justify that by pointing at something else.
Dustbuster Sound: I think you're confusing the 6800 series with a certain FX card. Besides, there is
Re:Of course... (Score:4, Informative)
I don't really know what you're talking about. ATI is winning? They charge $100 more for a video card that performs worse in what will be the hottest new game this year, and they're winning? NVidia is going to have support for two video cards (two insanely fast video cards) with PCI Express, and ATI is winning? Maybe you were just upset with the NVidia FX series. (I was upset too; it really killed me. I love NVidia mainly for their Linux support and OpenGL performance, but the FX was just total CRAP, and when I saw the 6800 was gonna be a monster I was a little upset and even feared it was the end for NVidia. But I was VERY surprised when I saw the final product, especially the benchmarks.) With the 6800, I see them as being back on top. You just sound like someone who read one article a long time ago when NVidia first showed off the 6800. I think you should really check out the 6 series; you'd be surprised at how well NVidia did with this new generation.
Re:Of course... (Score:5, Informative)
Over the last 2 generations of cards, nvidia has made huge leaps in terms of features, particularly in terms of shaders. Pixel shaders can now be very long. They support conditional branching, so if statements and loops are possible without unrolling.
Now the GeForce FX series, while great in terms of features, had well-documented problems with 32-bit floating-point performance. However, these problems have been completely resolved in the 6 series. The 6 series of cards are superior to ATI's offerings in every sense except possibly power consumption (and FYI, the GT doesn't require 2 slots).
OTOH, ATi has completely failed to innovate over the last 3 years. Every revision since the 9700 has been effectively just a speed increase. Their latest cards give basically nothing new in terms of features over the 9700 pro. In terms of capability, their latest cards are inferior to nvidia's FX cards.
As an owner of a 9700 and a hobbyist developer, I'm very familiar with the limitations. Shader length is highly restricted, and conditional branching can't be done, so loops have to be unrolled. For this reason, even the latest ATI cards can't fully support the OpenGL Shading Language. What can be done on an FX or a GeForce 6 in one pass could take 10 or more passes on an X800. Important features for shadow mapping, such as rendering to a depth texture and hardware linear filtering of depth textures, are missing entirely.
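To make the shadow-mapping point concrete, here is a rough sketch of my own (assuming the ARB_depth_texture and ARB_shadow extensions and their glext.h constants) of the depth-texture setup that GeForce-class hardware accelerates:

// Create a depth texture to be used as a shadow map (illustrative only).
GLuint shadowTex;
glGenTextures(1, &shadowTex);
glBindTexture(GL_TEXTURE_2D, shadowTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24,
             1024, 1024, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, nullptr);

// LINEAR filtering on a depth-compare texture gives free 2x2
// percentage-closer filtering on hardware that supports it.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE_ARB,
                GL_COMPARE_R_TO_TEXTURE_ARB);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC_ARB, GL_LEQUAL);

// After rendering the scene from the light's point of view,
// copy the depth buffer into the texture for use in the lighting pass.
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 1024, 1024);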
So it looks to me like ATi are struggling to keep up in terms of performance, and they've put so many resources into just keeping the performance acceptable that they've completely failed to innovate. And while gamers might not have noticed this before, they are starting to with Doom 3, and as developers push shader tech to its limits, they will really start to see the limitations of their cards. Hopefully ATi can fix the situation with their next generation of cards, but my next card will certainly be an nvidia.
Re:Of course... (Score:3, Insightful)
More programmability is not just a gimmick, it's where real-time graphics is heading.
Re:Of course... (Score:3, Interesting)
This
Re:Of course... (Score:3, Interesting)
NVidia apparently ended up dying because management
ATI (Score:2, Interesting)
Re:ATI (Score:3, Insightful)
Re:ATI (Score:3, Informative)
This turns out not to be the case. The 6800GT uses one Molex, one slot, is not loud, and runs just fine with a 300W PSU or thereabouts. The 6800 Ultra, however, does indeed fit your description, although I have heard no particular complaints about noise.
Re:ATI (Score:3, Informative)
Re:The word is "its". (Score:4, Funny)
Re:ATI (Score:3, Interesting)
Re:ATI (Score:5, Informative)
Funny, seems Carmack would:
Looking at the cream of the crop in video cards, it is painfully obvious that ATI is going to have to make some changes in their product line to stay competitive, at least with DOOM 3 gamers. There is no way for a $500 X800XT-PE to compete with a $400 6800GT when the GT is simply going to outperform the more expensive card by a good margin. I am sure ATI is trying their best to figure out their next move and it will certainly be interesting to see if their driver teams pull a rabbit out of their hat or not.
Re:ATI (Score:3, Interesting)
I'd like to see them pull their head out of their ass first. I still can't run KOTOR on a 9800 Pro with any stability; I have a bastard mix of Catalyst 4.2 and 4.7 drivers, and that is only marginally stable. This is with a game that was very highly rated and sold a bunch.
Their OpenGL drivers smell like crotch!
Re:ATI (Score:2)
That said, their drivers have gotten much better (I'm referring to Windows binaries only here; the Linux drivers leave a lot to be desired), but my point was that I think ATI will have to fix this problem (if it's fixable), not just wait for iD to fix it for them as the OP suggested.
Reread the article, carefully this time (Score:5, Insightful)
I don't think Doom3 will be significantly changed to help out ATI, but I'm positive ATI will change their drivers to help out Doom3's performance. As Carmack pointed out, the Nvidia drivers have already been fine tuned for Doom. My guess is that ATI, after the fiasco with releasing the Doom alpha, hasn't had as much opportunity to optimize for Doom.
On the other hand, it's no surprise to see ATI losing to a card that obviously has more horsepower. Frankly, I'm impressed that a card that's so much cooler, smaller, and quieter does so well against Nvidia's monster. But in this case, at least, we see Nvidia's power fully utilized. Hopefully ATI gets some more performance out of theirs, though.
-Dan
The Bottom Line (Score:4, Interesting)
The 6800GT continues to look like by far the best price/performance card currently available.
Re:The Bottom Line (Score:2)
On the whole, the article just read like a rah-rah advertisement for nVidia. Finally, 2 slots and a new power supply just for one game? Nuh-
Re:The Bottom Line (Score:2)
Let's assume that you do more than just game on your PC - well fair enough, I check my email and use IM/IRC and do web browsing, anywhere up to 10% of the time! So maybe I don't feel like I need to upgrade my PC for the next generation of games, although I know I will, it's my pride and joy, and I enjoy gaming with decent hardware (I still buy on the budget curve nonetheless).
So, you have these power connectors and PCI slots in your system, a
Re:The Bottom Line (Score:2, Insightful)
Re:The Bottom Line (Score:2)
Re:The Bottom Line (Score:2)
"NVIDIA has told us more than once that the 6800 series was "designed to play DOOM 3," and the truth of that statement is now glaringly obvious. "
For Doom 3 the Nvidia cards do look to be the better choice, but the cards themselves are biased toward Doom 3, no doubt because Nvidia expects many people to be waiting to upgrade for D3. HL2 performance may very well differ, as from what I recall HL2 is in ATI's camp.
Cards have biases in how they work and some do some things we
Re:The Bottom Line (Score:2)
RE: Nvidia (Score:2, Funny)
Oh, yeah, Linux is better than Windows... blah blah blah.
How about an Amiga port? (Score:5, Funny)
There's quite an obviously untapped market there for games authors: an entire community that grew up on THE games machine, clamoring for more.
Re:How about an Amiga port? (Score:2, Interesting)
Re:How about an Amiga port? (Score:5, Funny)
What would be really cool is an iPod port because iPod is awesome and it has a screen, a processor, and some kind of scroll wheel with clickable buttons.
Or an IBM XT Port (Score:4, Funny)
You are in a twisty little maze of passages all alike. There is a pink demon here.
Use rocket launcher
You died. Play again?
More or less than 1 fps (Score:5, Funny)
Re:More or less than 1 fps (Score:5, Funny)
I've got the solution to your Doom 3 problems.
Ever heard of chess by e-mail? My company has just opened a subscription-based service--Doom3ByEmail.com [doom3byemail.com].
You allocate a frame subscription of your chosen duration with any major credit card, we send you a rendered frame from your own personalized Doom 3 game, you send us an XML file containing directional commands, and we send you the resulting frame...
Who said Doom 3 wouldn't run on your PDA?
Uh, hello? (Score:5, Insightful)
How about some benchmarks for a card I actually have, like a Ti4800? ;-) Saying "surprisingly good gaming experience" on a GF4MX means nothing... are you seeing a creepy title screen and playing a pong minigame, or actually getting 30fps+?
Sorry, but dropping $500 on a video card is just not an option, this would be more useful if we had some everyday specs.
Re:Uh, hello? (Score:2, Interesting)
yea i agree with this (Score:2)
i hope they did the same kind of hardware polling that valve is doing/has done with regards to hl2, to see where their customer base actually stands in terms of hardware, so that they don't end up with a flood of game returns for shit that doesn't work.
i'm curious whether there are console versions of the game planned, which would require that it run on something set in stone and a couple of years off from the bleeding edge.
Re:yea i agree with this (Score:3, Insightful)
Keep R'ing TFA -- they test (1) nVidia GeForce 6800 Ultra (1st place with a bullet), (2) nVidia GeForce 6800GT (strong second), (3) ATI X800XT-PE (3rd and more costly than (2)), (4) and (5) nVidia GeForce FX 5950 and ATI 9800XT (pretty much a tie -- ATI is a tad faster with AF [anisotropic filtering] but no AA [anti-aliasing]; add in the AA and nVidia edges ahead.)
That's five, and at least two of them are what I'd call "widely in
Re:yea i agree with this (Score:2)
Radeon
Radeon 7500
Radeon 9200
GeForce3
GeForce4
GeForce4MX
Those are 'widely in use'.
Essentially anything released before this year, and spanning the past three years since Doom3 was announced!
Re:Uh, hello? (Score:3, Informative)
Yes, the ATI high end and the amazingly-high-performance nVidia 6800 Ultra are $500ish, but the nVidia 6800GT trounced the $500 ATI card and it's $100 less. That's three choices $400 and under, two under $200!
worst they benchmark is a 9800XT... idiots (Score:2)
I would like to see benchmarks from a Radeon 9600 (or worse) on up. That might actually help.
Re:Uh, hello? (Score:2)
Re:Uh, hello? (Score:4, Insightful)
All this stuff about buying new cards is mostly a pissing competition. I have seen nothing in the reported hardware requirements, nor in the benchmarks, that would imply you couldn't get a very satisfactory game out of a GeForce 4.
Jedidiah
Re:Uh, hello? (Score:2)
Re:Uh, hello? (Score:2, Informative)
Oh yeah? Try playing Quake3 on a minimally configured machine and see what the gameplay is like. 320x200 with software rendering can get good framerates, but you can't see a blooming thing!
Quake3 doesn't even support software rendering. You don't know what you're talking about, do you?
Define "remarkably good" (Score:3, Insightful)
More importantly, the boxes they did the benchmarking on were maxed out, with specs like 2GB of DDR400 and an Athlon 64 or comparable processor. Unless you've got all the other specs to match the test box, y
No minimum framerates? (Score:5, Interesting)
Second, they did not run these benchmarks themselves; they were done at the iD offices: "Today we are sharing with you framerate data that was collected at the id Software offices in Mesquite, Texas. Both ATI and NVIDIA were present for the testing and brought their latest driver sets." It sounds as though HardOCP was not even present for the tests.
Their review of the BFG 6800GT OC convinced me to get that card. This article, however, does not convince me of...much of anything. I do have certain questions about their journalism, but it's best saved for a more appropriate time.
Hmmm.... (Score:2)
By the way, if people are still playing Doom 3* twenty-five years after it comes out, *then* we can start talking about the benefits of emphasizing gameplay over gee-whiz special effects that won't be gee-whiz in 6 months. Until then, call me elitist, call me old-fashioned, but don't call me bored!
*or any game based on its engine
Re:Hmmm.... (Score:2)
Still being "updated" (Score:2)
see Doom Legacy [newdoom.com], ZDoom [zdoom.org]
Now, Doom3 is not really original anymore in terms of theme, so it might not do as well. But it could very well become one of those "old classics" several years from now.
Another big hotspot is the Doom3 engine, as we'll probably see several later games developed by companies that have licensed the engine for their own products.
Re:Hmmm.... (Score:2)
I expect this to be one of the first games to be massively ripped from XBox to PC if this is true.
Maybe Doom3 is too *conservative* on hardware!? (Score:5, Interesting)
Re:Maybe Doom3 is too *conservative* on hardware!? (Score:3, Funny)
I don't think that the current batch of cards is going to "handle" very hig
So does this mean... (Score:2, Funny)
This is why i love iD (Score:2)
But when Quake 3 came out I could run it on a P233 (with MMX!), a Voodoo 2 12MB, and 128MB of RAM. iD engines scale all the way.
I will be interested in seeing how low people can get Doom 3 running.
Re:This is why i love iD (Score:2)
Re:This is why i love iD (Score:3, Insightful)
Re:This is why i love iD (Score:2)
I ran Quake 3 fine on a K6-2 300MHz with 64MB of RAM and a 4MB nVidia Riva 128.
Your words are very true.
Wireframe only ? Capture a whole new market ! (Score:2)
C'mon John, I'm sure you can meet the technical challenge! Take pity on all those people with 486s!
Surprising? (Score:2)
Are they saying they were surprised it worked well, or surprised it was an enjoyable game?
The P4 is compensating (Score:2)
A new MB (if you can't support 8X AGP already), a Barton or P4 (unless you've got a 1.5GHz+ CPU and 8X AGP), new memory if you aren't already using DDR, and the graphics card will together run you under $500. You can pick up a GeForce FX 5200 for around $100. If you had to buy everything listed you'd come in under $500 if yo
Re:Don't waste your money... (Score:2)
older hardware? (Score:4, Funny)
Re:older hardware? (Score:2)
Pure crap as always... (Score:2)
"X800XT-PE may not be worthy of being included on the same graph"
Later...
If you would have told me a year ago that I could play DOOM 3 on a GeForce 3 64MB video card and 1.8GHz AthlonXP and have a good gaming experience, I would have called you crazy, but that is exactly what we are seeing.
Translation:
Save your money, DOOM 3 has the most insane graphics, and still plays just fine on the ~$150 cards. Which means most other games are totally fine. (I play Lineage 2 on a Rage fury pro with 32MB
That's reassuring. (Score:2, Interesting)
It's comforting to know that said vendors are so honest and reliable that, if you make it physically impossible (or at least extremely improbable), they will not "egregiously cheat" on published benchmarks.
This review tells us nothing (Score:5, Insightful)
WELL NO SHIT! What did you expect? The game to only run acceptably on hardware that doesn't exist yet? Geez..
As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience
Why no benchmarks of this? IMO much more useful than a benchmark of a P4 3.6GHz system with 4GB of RAM and a 6800 Ultra..
Re:This review tells us nothing (Score:2)
Next week, before you can purchase DOOM 3, our goal is to publish the DOOM 3 [H]ardware Guide in order to give you an official resource to help you know what to expect out of your current hardware or to help make the right hardware buying decision should you be ready for an upgrade.
This week they're publishing the high end graphics card benchmarks. They are putting together the data for those other boxes, and they'll be publishing that as a more complete guide next week. If you can
Reminds me of ATI/Half-Life2 (Score:2)
Is there some kind of under-the-table manipulation going on here? Is ATi trying to leverage HL2 to sell more cards? Is nVidia doing the same with Doom3?
Or are both companies going to release new drivers soon and even the whole thing out?
I'm just going to wait and see. And upgrade after HL2 has been out for six months. THEN I'll play these games. I usually buy a game after it has earned a reputation. Then I'll kno
Re:Reminds me of ATI/Half-Life2 (Score:2)
Were you referring to these? [extremetech.com]
Those benches are quite old (Sept 2003!!!) and you'll note that different generations of cards were used. Also, the HL2 benches were run under DX9 and DX8; AFAIK the Doom3 benches were run under OpenGL.
So no... there is no direct comparison. Different card gens and different rendering tech was used in the benches. Though it does look like nVidia is back on the ball after get
Re:Reminds me of ATI/Half-Life2 (Score:2)
Of course, Valve have been absolutely shocking in terms of their professional conduct compared to iD.
DooM3 alpha leaks. iD: Oh wow, this is pretty sucky, we're going to look into it, see what we can find out, sack someone or something.
HL-2 code leaks. Valve: OMG we got haxored by terrorists patriot act save us DMCA! FBI help help help now we're no
How cute! (Score:4, Insightful)
Nvidia Cheating [slashdot.org]
ATI Cheating [slashdot.org]
Quadro?? (Score:2)
I have a brand new Quadro FX 1000 in my laptop and a year-or-so-old Quadro4 in my desktop (both with 128MB). I wonder how well they'll run Doom3?
They're fairly optimized for opengl - so I remain hopeful!
Friedmud
Just another FPS... (Score:2)
*GASP* (Score:2)
Cards (Score:2)
Presumably because they were able to play a hand of poker while waiting for each frame to be rendered.
By surprisingly good they mean something subtle (Score:2, Funny)
I believe I speak.... (Score:2)
Well at least we have HL2 to look forward to
G4 Powerbook? (Score:2)
Is Doom 3 just a sponsored demo? (Score:2)
Sweeeet! (Score:5, Funny)
Hey, just realized while typing this that JC's initials are JC. It all makes sense...
Re:Look at the hardware they use to run it (Score:2)
Re:What about Half-Life 2? (Score:2)
There are no coincidences.
Re:Summary . . . (Score:2, Interesting)
Well, I guess that depends on what you think "a lot" means. At $661 CDN for a 6800GT, I don't see too many being sold in the near future. The Radeon X800XT is even worse, at $800 CDN. WTF!? This narrows it down to the very hardcore of gamers, and they represent a very small percentage of the gaming population.
Many people likely will upgrade, but I just don't see this game selling $600+ cards to a large number of folks.
Re:wankfest (Score:2)
I always buy at the budget end of the curve, having just bought a great 9600XT for $230AUD, which more than doubles the performance of my last card. I upgrade every 12-18 months depending on how rich I'm feeling, and how the market looks compared to the way my games are running.
So I'm looking at these benchmarks with