
Official Doom 3 Benchmarks Released (573 comments)

Rogerpq3 writes "Before the game goes on sale, id Software has been kind enough to release some benchmarks for DOOM 3 with the latest video cards on the market from NVIDIA & ATI. HardOCP has published the five-page article, which should help anyone trying to decide if they should upgrade their video card for DOOM 3. There's also an introductory note from John Carmack, mentioning: 'The benchmarking was conducted on-site, and the hardware vendors did not have access to the demo beforehand, so we are confident that there is no egregious cheating going on.', and the HardOCP writers comment: 'As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience.'"
  • by SIGALRM ( 784769 ) * on Wednesday July 21, 2004 @11:12PM (#9766842) Journal
    the fact of the matter is that many of you will be just fine, although an upgrade may still be in your future
    I'm not an expert on the Carmackian magic in Doom 3... but apparently the engine's approach of working from the complex model downward and offering subsets to supported configurations is much more efficient, in FPS terms, than other engine architectures. However, as JC states, we should not all "live and die by the frame rate".

    I guess an upgrade is in my future, although I'm not sure I'll get to the "cinematic" level that's possible in D3's rendering.
    • I upgraded back when they announced a release date for HL2 a year ago lol. I didn't get a top of the line system but it was a lot better than what I was replacing and I figured it would do okay for HL2. If I had it to do over again I would of course have waited, but I'm relieved to find out that I'll be able to at least run Doom 3 somewhat decently.

      And it's not like I'll lose the game or anything, I figure I'll just wait another year or two to upgrade again and by that time I'll have a system that can run
    • Something to keep in mind when you're upgrading: ATI does a good job of keeping up with Nvidia on D3. Nvidia is obviously quicker, but ATI isn't bad. However, on the HL2 benchmarks that have been released, ATI has been smoking Nvidia.

      Here is a review of some cards that are actually in my price range and from the sounds of it might be in yours.

      http://tech-report.com/etc/2003q3/valve/index.x?pg=2

      Just something to keep in mind.
      • I'm going with nVidia next round simply because of the ATI driver nightmare and their awful application support. I use 3D applications like Blender on a regular basis and ATI has caused nothing but trouble with that. Crashes, slowdowns, display errors, etc. Users of other packages also recommend not using ATI for work. That's enough of a reason to drop ATI for me. Yes, I know nVidia's cards are more expensive and eat more power, but hell, if it means my system will run more stably I can look past that. If nV
  • Of course... (Score:2, Insightful)

    by DarkHelmet ( 120004 ) *

    Of course Nvidia's card is going to do better. Doom3 has a specialized codepath for nvidia hardware, while the ATI card does not.

    If a codepath were written for the X800 series of cards, I'm sure the scores would be closer to each other.

    I take the superiority of one card over the other with a grain of salt.

    • Carmack:

      It should be noted that all of the modern cards play the game very well, and benchmark scores should not be the be-all-end-all decision maker. Scores will probably improve somewhat with future driver releases, and other factors like dual slots or dual power connectors can weigh against some of the high end cards.
    • Re:Of course... (Score:4, Informative)

      by w00d ( 91529 ) on Wednesday July 21, 2004 @11:18PM (#9766887)
      Both the Nvidia 6800 and ATI X800 run on the same ARB2 rendering path. Older cards have their own paths.
      • Do you have a link to that?
      • Re:Of course... (Score:5, Informative)

        by Anonymous Coward on Thursday July 22, 2004 @12:07AM (#9767181)
        Here's the list from that PC Gamer clip.

        NV10 path: geforce4 mx.
        NV20 path: geforce3 and geforce4.
        R200 path: ati 8500/9000.
        ARB2 path: nvidia FX/ati r300+

        I assume radeon 9800 is included for arb2 because they use the r350 and r360 cores.

        The arb2 path and r200 path use 1 pass, the nv20 path uses 2 passes, and the nv10 path uses 5 passes.

        Also, arb2 is the only path using vertex/fragment programs, which adds slightly to a few effects (a heat-shimmer effect was mentioned).
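
        (Illustrative only: a minimal C++ sketch of how an engine might pick one of those
        render paths by probing the GL extension string. The names and the mapping here are
        hypothetical simplifications, not id's actual code; the extension strings themselves
        are real OpenGL extensions.)

        #include <iostream>
        #include <string>

        enum class RenderPath { NV10, NV20, R200, ARB2 };

        // Pick the most capable back end the driver advertises (simplified mapping).
        RenderPath choosePath(const std::string& ext) {
            auto has = [&](const char* e) { return ext.find(e) != std::string::npos; };
            if (has("GL_ARB_vertex_program") && has("GL_ARB_fragment_program"))
                return RenderPath::ARB2;   // GeForce FX / Radeon R300+: 1 pass, fragment programs
            if (has("GL_ATI_fragment_shader"))
                return RenderPath::R200;   // Radeon 8500/9000: 1 pass
            if (has("GL_NV_register_combiners2"))
                return RenderPath::NV20;   // GeForce3/GeForce4: 2 passes
            return RenderPath::NV10;       // GeForce4 MX class: 5 passes
        }

        int main() {
            // e.g. a GeForce FX or Radeon 9700+ would advertise both ARB programs:
            std::cout << static_cast<int>(choosePath("GL_ARB_vertex_program GL_ARB_fragment_program")) << "\n";
        }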
    • Re:Of course... (Score:5, Insightful)

      by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Wednesday July 21, 2004 @11:26PM (#9766944) Homepage
      BS.

      There is no way Carmack would neglect almost half of the gamers out there. The fact is, Radeons have always had less than stellar performance with OpenGL. They are built for D3D.
      • Re:Of course... (Score:3, Insightful)

        by mausmalone ( 594185 )
        And one of the reasons that Carmack says that the framerate is not the be-all/end-all benchmark is that even though the ATi cards run OpenGL a bit slower than nVidia cards, they usually render a slightly better picture (better AA, better AF, less color banding, etc ...). This is one of the reasons why he actually said he thought the Radeon would be a good choice for Doom 3 (last year at E3), as this was an obvious difference in the Radeon 9800/GeForce FX iteration. I don't know if nVidia has improved their
    • Re:Of course... (Score:2, Informative)

      by Dekar ( 754945 )
      This is from a pretty old .plan [bluesnews.com] from John Carmack, but the second quote seems to still be valid today:

      "The NV30 runs the ARB2 path MUCH slower than the NV30 path. Half the speed at the moment. This is unfortunate, because when you do an exact, apples-to-apples comparison using exactly the same API, the R300 looks twice as fast, but when you use the vendor-specific paths, the NV30 wins."

      "The reason for this is that ATI does everything at high precision all the time, while Nvidia internally supports three

      • Re:Of course... (Score:4, Informative)

        by RedWizzard ( 192002 ) on Thursday July 22, 2004 @12:32AM (#9767271)
        This is from a pretty old .plan from John Carmack, but the second quote seems to still be valid today ... Basically, Nvidia screws up when it comes down to standard ARB2 code path, but it does so well with their own path that developers have to code it, and Nvidia gives them a lot of support. It looks like a fair deal to me.
        However the NV30 path is gone. The Nvidia drivers now perform well enough that the standard ARB2 path performance is as good as the NV30 path performance.
      • Wasn't ATI's 'high precision' ARB2 path 24bit as opposed to NVidia's 32bit?
    • As others have pointed out, the Nvidia cards used in this benchmark do NOT have their own codepath.

      The reason Nvidia kicks ATI's ass in Doom3 is because Doom3 is HEAVY on the stencil buffer shadows. Nvidia's newer FX cards can render two-sided stencil buffer volumes in one pass, which is a huge speed win for stencil shadows. It also supports stencil shadow volume clipping, which speeds things up even further.

      The long and short of it is, any game that uses a unified lighting model like Doom3's, using s
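
      (For reference, a hedged C++/OpenGL sketch of the one-pass two-sided stencil setup
      described above -- illustrative only, not Doom 3's actual code. It uses the core
      GL 2.0 calls and assumes a current GL context whose depth buffer has already been
      filled by an ambient/depth pass.)

      #define GL_GLEXT_PROTOTYPES   // expose GL 2.0 entry points (Mesa-style headers)
      #include <GL/gl.h>

      // Configure a single stencil-only pass over a shadow volume using
      // z-fail ("Carmack's reverse") with two-sided stencil operations.
      void setupTwoSidedStencilShadowPass() {
          glDisable(GL_CULL_FACE);                              // draw front and back faces together
          glDepthMask(GL_FALSE);                                // no depth writes in the stencil pass
          glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);  // no color writes either
          glEnable(GL_STENCIL_TEST);
          glStencilFunc(GL_ALWAYS, 0, ~0u);
          // Back faces increment and front faces decrement the stencil count,
          // both when the depth test fails -- one geometry pass instead of two.
          glStencilOpSeparate(GL_BACK,  GL_KEEP, GL_INCR_WRAP, GL_KEEP);
          glStencilOpSeparate(GL_FRONT, GL_KEEP, GL_DECR_WRAP, GL_KEEP);
          // ...then draw the shadow volume geometry once.
      }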

    • Re:Of course... (Score:3, Interesting)

      by egarland ( 120202 )
      If a codepath were written for the X800 series of cards, I'm sure the scores would be closer to each other.

      Even if that never happens, I won't even consider purchasing any of the current GeForce 6800 series. NVidia has fallen into the trap that killed 3Dfx of forgetting that their products are a small part of a multi-purpose computer.

      You can pretty much throw a 9800 or X800 series card into any machine and get a really good gaming machine. With the new cards in the GeForce series you have expensive req
      • Re:Of course... (Score:5, Insightful)

        by Seft ( 659449 ) on Thursday July 22, 2004 @04:27AM (#9768072)
        With the new cards in the GeForce series you have expensive requirements like massive power supplies, extra slots, high-end cooling, and you need to not mind the dustbuster sound coming from your machine.

        Massive Power Supplies: 6800 GTs are happy in shuttles with 250W PSUs

        Extra Slots: The 6800 and GT are single-slot

        High-end Cooling: See what's cooling your CPU, then count the transistors on each. Besides, it's much better to have a good cooling solution with headroom for overclocking than something that barely makes the grade.

        Dustbuster Sound: I think you're confusing the 6800 series with a certain FX card. Besides, there is nothing stopping third-party manufacturers changing the fan, and many do.


        stupid things like bringing back SLI

        SLI is a really good idea - it allows those who want to to have a very fast setup without increasing the price for those who are content with a fast setup.


        Now NVidia is positioning itself in the difficult, obtrusive ultra-high end space where 3Dfx was when it died.

        Not at all. nVidia has sold zillions of FX5200s to OEMs.
        • Re:Of course... (Score:3, Insightful)

          by AftanGustur ( 7715 )

          Massive Power Supplies: 6800 GTs are happy in shuttles with 250W PSUs

          If you take out the CPU and Hard disks, yes.

          Actually, a *good* 350W PSU can handle the task.

          High-end Cooling: See what's cooling your CPU, then count the transistors on each.

          You miss the point: the complaint was that the video card was making too much noise. You can't explain or justify that by pointing at something else.


          Dustbuster Sound: I think you're confusing the 6800 series with a certain FX card. Besides, there is

      • Re:Of course... (Score:4, Informative)

        by Zakabog ( 603757 ) <.john. .at. .jmaug.com.> on Thursday July 22, 2004 @06:47AM (#9768471)
        Umm, I speak as someone who had just purchased two GeForce 6800 GTs (well I only got one, but my friend got one too, and I installed and played on both systems.) He has an AMD 64 FX-53 and I have a 3200+. The loudest part of my computer is the fan on my chip, second to that, my hard drive (my old 40 gig samsung, my new serial ata WD is pretty quiet, so is my older WD drive.) The video card takes up one slot, I only have a 450 watt PSU (was like $40, and I didn't buy it cause of the video card, I bought it because a while ago my 260 watt PSU died, and I figured why not get a 400 watt PSU in case I ever wanted to do water cooling and stuff like that.) My friend has a 500 watt PSU he bought, I figured he should get it just in case the card doesn't like his 300 watt PSU, it was only $50 (he could have gotten one for $30 that would supply more than enough power, but the 500 watt one looked really nice so we got it, and when you're buying a $400 video card, a $200 motherboard and an $800 CPU, a $50 PSU is so incredibly cheap.)

        I don't really know what you're talking about, ATI is winning? They charge $100 more for a video card that performs worse in what will be the hottest new game this year, and they're winning? NVidia is going to have support for 2 video cards (2 insanely fast video cards) with PCI express, and ATI is winning? Maybe you were just upset with the NVidia FX series (I was upset too, it really killed me, I love NVidia mainly for their linux support and opengl performance, but the FX was just total CRAP, and when I saw the 6800 was gonna be a monster I was a little upset and even feared it was the end for NVidia but I was VERY surprised when I saw the final product, especially the benchmarks.) With the 6800, I see them as being back on top. You just sound like someone who has read one article a long time ago when NVidia first showed off the 6800, I think you should really check out the 6000 series, you'd be surprised at how well NVidia did this new series.
      • Re:Of course... (Score:5, Informative)

        by woodhouse ( 625329 ) on Thursday July 22, 2004 @07:54AM (#9768778) Homepage
        My view couldn't be any more different. ATi is losing the battle, and by a long way. Here's why.

        Over the last 2 generations of cards, nvidia has made huge leaps in terms of features, particularly in terms of shaders. Pixel shaders can now be very long. They support conditional branching, so if statements and loops are possible without unrolling.

        Now the geforce FX series, while great in terms of features, had well documented problems with 32-bit performance. However, these problems have been completely resolved in the 6 series. The 6 series of cards are superior to ATI's offerings in every sense, except possibly power consumption (and FYI, the GT doesn't require 2 slots).

        OTOH, ATi has completely failed to innovate over the last 3 years. Every revision since the 9700 has been effectively just a speed increase. Their latest cards give basically nothing new in terms of features over the 9700 pro. In terms of capability, their latest cards are inferior to nvidia's FX cards.

        As an owner of a 9700 and a hobbyist developer, I'm very familiar with the limitations. The shader length is highly restricted, conditional branching can't be done, so loops have to be unrolled. For this reason, even the latest ATI cards can't fully support the OpenGL Shading Language. What can be done on an FX or a Geforce 6 in one pass could take 10 or more passes on an X800. Many important features for shadow mapping are hopelessly missing, such as rendering to a depth texture, and hardware linear filtering.

        So it looks to me like ATi are struggling to keep up in terms of performance, and they've put so many resources into just keeping the performance acceptable that they've completely failed to innovate. And while gamers might not have noticed this before, they are starting to with Doom 3, and as developers push shader tech to its limits, they will really start to see the limitations of their cards. Hopefully they can fix the situation with their next generation of cards, but my next card will certainly be an nvidia.
  • ATI (Score:2, Interesting)

    It'll be interesting to see how ATI responds to this. They pulled ahead in the last generation, but it seems Nvidia has learned from their mistakes. Nice to see that uberhardware isn't needed to get decent framerates. Too bad for the hardware industry though...
    • Re:ATI (Score:3, Insightful)

      Last I heard/saw, the 6800 still needed 2 molex connectors, and took up two expansion slots, sounded like a jet engine, and required a minimum 400 watt power supply. The ATI card uses much quieter cooling, requires one slot, and one power connector. For a machine that's on 24 hours a day in the same room I sleep in, noise is a big factor. If I needed the caliber of performance of the latest/greatest card, and had an extra 400 to spend on a video card whose price will most likely be half that in 6-8 months,
      • Re:ATI (Score:3, Informative)

        by stonedonkey ( 416096 )
        Last I heard/saw, the 6800 still needed 2 molex connectors, and took up two expansion slots, sounded like a jet engine, and required a minimum 400 watt power supply.


        This turns out not to be the case. The 6800GT uses one Molex, one slot, is not loud, and runs just fine with a 300W PSU or thereabouts. The 6800 Ultra, however, does indeed fit your description, although I have heard no particular complaints about noise.

        • Re:ATI (Score:3, Informative)

          by 10Ghz ( 453478 )
          Even the Ultra runs just fine on a more modest power supply. NVIDIA recommended a 400 watt PSU just to be on the safe side, but many reviewers ran the card with a 350 watt PSU just fine. Later NVIDIA reduced the requirements on the Ultra. So it does not need a 400 watt PSU. Not in theory, and not in practice.
    • Re:ATI (Score:3, Interesting)

      by MtViewGuy ( 197597 )
      I wouldn't be surprised that within a few months of Doom 3's release there will be a Version 1.1 of Doom 3 with internal code changes that will fully take advantage of the registers of ATI's R300 and newer graphics chipsets.
      • Re:ATI (Score:5, Informative)

        by randyest ( 589159 ) on Wednesday July 21, 2004 @11:49PM (#9767090) Homepage
        I wouldn't be surprised that within a few months of Doom 3's release there will be a Version 1.1 of Doom 3 with internal code changes that will fully take advantage of the registers of ATI's R300 and newer graphics chipsets.

        Funny, seems Carmack would:

        Looking at the cream of the crop in video cards, it is painfully obvious that ATI is going to have to make some changes in their product line to stay competitive, at least with DOOM 3 gamers. There is no way for a $500 X800XT-PE to compete with a $400 6800GT when the GT is simply going to outperform the more expensive card by a good margin. I am sure ATI is trying their best to figure out their next move and it will certainly be interesting to see if their driver teams pull a rabbit out of their hat or not.
        • Re:ATI (Score:3, Interesting)

          by Bodhammer ( 559311 )
          "if their driver teams pull a rabbit out of their hat or not."

          I'd like to see them pull their head out of their ass first - I still can't run KOTOR on a 9800Pro with any stability - I have a bastard mix of Cat 4.2 and 4.7 and that is only marginally stable. This is on a game that was very highly rated and sold a bunch.

          Their OpenGL drivers smell like crotch!

          • Indeed. ATI has some support issues. For a while, they definitely had the fastest card, but stability has never been their forte, especially compared to the likes of Matrox or nVidia.

            That said, their drivers have gotten much better (I'm referring to Windows binaries only here -- the Linux drivers leave a lot to be desired), but my point was I think ATI will have to fix this problem (if it's fixable), not just wait for iD to fix it for them as the OP suggested.
        • by adiposity ( 684943 ) on Thursday July 22, 2004 @01:07AM (#9767386)
          Are you implying that Carmack made the above statement? Because...he didn't. That's Kyle Bennett, the author of the HardOCP article, speaking. Carmack only made the brief statement at the beginning (it's color coded to help you spot it), which states that "all of the modern cards play the game very well," and "there is no egregious cheating going on," and most importantly, "Nvidia drivers have been tuned for Doom's primary light/surface interaction fragment program."

          I don't think Doom3 will be significantly changed to help out ATI, but I'm positive ATI will change their drivers to help out Doom3's performance. As Carmack pointed out, the Nvidia drivers have already been fine tuned for Doom. My guess is that ATI, after the fiasco with releasing the Doom alpha, hasn't had as much opportunity to optimize for Doom.

          On the other hand, it's no surprise to see ATI losing to a card that obviously has more horsepower. Frankly, I'm impressed that a card that's so much cooler, smaller, and quieter does so well against Nvidia's monster. But in this case, at least, we see Nvidia's power fully utilized. Hopefully, ATI gets some more performance out of theirs, though.

          -Dan
  • The Bottom Line (Score:4, Interesting)

    by rokzy ( 687636 ) on Wednesday July 21, 2004 @11:15PM (#9766865)
    "If I had to make a list of high end video cards to purchase to play DOOM 3, the GeForce 6800Ultra and GeForce 6800GT would easily take the number 1 and number 2 spots with the ATI Radeon X800XT-PE rounding out the number 3 place."

    The 6800GT continues to look like by far the best price/performance card currently available.
    • I'm not convinced. When the hardware specs were posted on Slashdot the other day, somebody insightfully noted that there's a pretty serious conflict of interest here, given the "exclusiveness" of these benchmarks. Would id have let them do anything but publish benchmarks that were heavily in nVidia's favor? Is this just another TCO study commissioned by Microsoft?

      On the whole, the article just read like a rah-rah advertisement for nVidia. Finally, 2 slots and a new power supply just for one game? Nuh-

      • That is a really bizarre comment. Think about it for a minute.

        Let's assume that you do more than just game on your PC - well fair enough, I check my email and use IM/IRC and do web browsing, anywhere up to 10% of the time! So maybe I don't feel like I need to upgrade my PC for the next generation of games, although I know I will, it's my pride and joy, and I enjoy gaming with decent hardware (I still buy on the budget curve nonetheless).

        So, you have these power connectors and PCI slots in your system, a
      • Re:The Bottom Line (Score:2, Insightful)

        by vehn23 ( 684035 )
        Just as an aside I picked up a geforce 6800 non-ultra last month and could not be more pleased with its performance - and it requires only one slot and a "standard" 300W power supply.
    • I know, I was going to buy the X800XT-PE when it becomes available later this month, but I might as well go out and get the 6800 ultra now, WOOT!
    • And I quote from the article:

      "NVIDIA has told us more than once that the 6800 series was "designed to play DOOM 3," and the truth of that statement is now glaringly obvious. "

      For Doom 3 the Nvidia cards do look to be a better choice, but the cards themselves are biased toward Doom 3, no doubt because Nvidia expects there to be many people waiting to upgrade for D3. HL2 performance may very well differ, as from what I recall HL2 is in ATI's camp.

      Cards have biases in how they work and some do some things we

      • One thing I forgot to add is that I hope PC game companies do not get so entrenched in a video card camp that their game will only run well on a particular card. That, I hope, would be bad business.
  • RE: Nvidia (Score:2, Funny)

    by rdilallo ( 682529 )
    I'm just glad I won't have to go out and upgrade my video card for this game. Seems like every time there's a new game out, I'm upgrading.

    Oh, yeah, Linux is better than Windows... blah blah blah.
  • by Amiga Lover ( 708890 ) on Wednesday July 21, 2004 @11:17PM (#9766877)
    Any news on the possibility of an Amiga port? The new Amigas have some awesome hardware: 800MHz G3 or 1GHz+ G4 CPUs, DDR, and some kind of Radeon.

    I think there's quite an obviously untapped market there for games authors: an entire community that grew up on THE games machine, clamoring for more.
    • Please, please, please let this be a joke. The thought of you being serious is just plain scary. Apple are shipping 2.5ghz water cooled G5s as standard and you're excited about an 800mhz G3? I owe my career to what I learned on my Amiga while at high school, but I moved on. OTOH, if this is a joke, then ha! You had me laughing.
    • by Anonymous Coward on Wednesday July 21, 2004 @11:52PM (#9767103)
      Any news on the possibility of an Amiga port? The new Amigas have some awesome hardware: 800MHz G3 or 1GHz+ G4 CPUs, DDR, and some kind of Radeon.

      What would be really cool is an iPod port because iPod is awesome and it has a screen, a processor, and some kind of scroll wheel with clickable buttons.
    • by BlackHawk-666 ( 560896 ) on Thursday July 22, 2004 @02:59AM (#9767780)
      I've been working with ID on a port of Doom III to the IBM XT for those die-hards who refuse to upgrade. Here's a sample from the first level which will be released for free:

      You are in a twisty little maze of passages all alike. There is a pink demon here.
      Use rocket launcher

      You died. Play again?

  • by Radix37 ( 670836 ) on Wednesday July 21, 2004 @11:20PM (#9766898) Homepage
    That's the most important question... would my p3-450 with a voodoo2 break 1 fps or not?
    • by Anonymous Coward on Wednesday July 21, 2004 @11:36PM (#9767016)

      I've got the solution to your Doom 3 problems.

      Ever heard of chess by e-mail? My company has just opened a subscription-based service--Doom3ByEmail.com [doom3byemail.com].

      You allocate a frame subscription of your chosen duration with any major credit card, we send you a rendered frame from your own personalized Doom 3 game, you send us an XML file containing directional commands, and we send you the resulting frame...

      Who said Doom 3 wouldn't run on your PDA?

  • Uh, hello? (Score:5, Insightful)

    by oGMo ( 379 ) on Wednesday July 21, 2004 @11:21PM (#9766905)

    How about some benchmarks for a card I actually have, like a ti4800? ;-) Saying "surprisingly good gaming experience" on a GF4MX means nothing... are you seeing a creepy title screen and playing a pong minigame, or actually seeing 30fps+?

    Sorry, but dropping $500 on a video card is just not an option; this would be more useful if we had some everyday specs.

    • Re:Uh, hello? (Score:2, Interesting)

      by spdycml ( 625849 )
      I agree. The article makes it seem like they tested a range of cards when they really only tested cards priced at 400 plus. For those of us who aren't doctors/lawyers/senators, we need some benchmarks for our cards. I have an All-In-Wonder 9600Pro and I wanna know if it will work. I don't see myself spending 500 bucks to play this game. well.....unless I have to...um....hmmm.....maybe.
    • they have only 3 cards listed in the test and none of them are widely in use

      I hope they did the same kind of hardware polling that valve is doing/has done with regards to hl2, to see where their customer base actually stands in terms of hardware, so that they don't end up with a flood of game returns for shit that doesn't work.

      I'm curious if there are console versions of the game planned that would require that it run on something set in stone and a couple years off from bleeding edge.
      • they have only 3 cards listed in the test and none of them are widely in use

        Keep R'ing TFA -- they test (1) nVidia Geforce6800 Ultra (1st place with a bullet), (2) nVidia Geforce6800GT (strong second), (3) ATI X800XT-PE (3rd and more costly than (2)), (4) and (5) nVidia GeForceFX5950 and ATI9800XT (pretty much a tie -- ATI is a tad faster with AF [anisotropic filtering] but no AA [anti-aliasing], add in the AA and nVidia edges ahead.)

        That's five, and at least two of them are what I'd call "widely in
        • I dunno, I would think:
          Radeon
          Radeon 7500
          Radeon 9200
          GeForce3
          GeForce4
          GeForce4MX

          Those are 'widely in use'.

          Essentially anything released before this year, and spanning the past three years since Doom3 was announced!
    • Re:Uh, hello? (Score:3, Informative)

      by randyest ( 589159 )
      Hey, the ATI 9800, which was benchmarked in the article, is only $147 [fatwallet.com] if you're in So. Cal. and $179.99 [fatwallet.com] anywhere else (BestBuy, even.) GeForceFX5950 isn't much more, if not less (online.)

      Yes, the ATI high end and amazingly-high-performance nVidia6800 Ultra are $500ish, but the nVidia6800GT trounced the $500 ATI card and it's $100 less. That's three choices $400 and under, two under $200!
    • Must also agree with parent - this article is useless. The worst card they benchmark is the absolute top of the line for the generation of cards that is only just starting to be replaced. If I recall correctly the X800 and the new nvidia cards have been known about for what, 3-4 months, and available for even fewer.

      I would like to see a benchmark from a Radeon 9600 or worse up. That might actually help.
    • Reading through the (many,many) comments on the HardOCP thread about this, the HardOCP staff member who did the reviews has stated that they are doing and in fact have completed benchmarks with many other, lower-end configurations, and the results will be posted as soon as they're available.
  • by stonedonkey ( 416096 ) on Wednesday July 21, 2004 @11:21PM (#9766913)
    I have a great amount of respect for hardocp.com, despite Kyle Bennett's occasional frothing rants. I've been reading the site for years. That said, these benchmarks are only partially useful without knowing the minimum framerate. Did it plummet anywhere? Did it only plummet on ATi cards?

    Second, they did not run these benchmarks, and they were done at the iD offices: "Today we are sharing with you framerate data that was collected at the id Software offices in Mesquite, Texas. Both ATI and NVIDIA were present for the testing and brought their latest driver sets." It sounds as though Hardocp was not even present for the tests.

    Their review of the BFG 6800GT OC convinced me to get that card. This article, however, does not convince me of...much of anything. I do have certain questions about their journalism, but they're best saved for a more appropriate time.

  • So my 750Mhz Duron w/GeForce MX 440 is *not* going to work? That sucks. I guess I'll go back to playing nethack...

    By the way, if people are still playing Doom 3* twenty-five years after it comes out, *then* we can start talking about the benefits of emphasizing gameplay over gee-whiz special effects that won't be gee-whiz in 6 months. Until then, call me elitist, call me old-fashioned, but don't call me bored!

    *or any game based on its engine
    • Well, Doom was originally released in December of '93 and plenty of people still play it.
      • Hell, by 3rd parties, doom/doom2 are still being updated/upgraded:

        see Doom Legacy [newdoom.com], ZDoom [zdoom.org]

        Now, Doom3 is not really original anymore in terms of theme, so it might not do as well. But it could very well become one of those "old classics" several years from now.

        Another big hotspot is the Doom3 engine, as we'll probably see several later games developed by companies that have licensed the engine for use in their own products.
  • by timeOday ( 582209 ) on Wednesday July 21, 2004 @11:26PM (#9766943)
    Well I didn't expect this. Not even released yet, Doom 3 runs at 1600x1200 on "high quality" at 68 fps on the Nvidia 6800 Ultra, or 42 fps with 4x antialiasing. In other words it can just barely make use of the best hardware at the time of its release. That's fairly conservative in my book.
    • Well, I am rather disappointed in the resolution they chose for the top end. I have a nice widescreen monitor which does 1920 x 1200. I know others who have high-end gear (for other purposes than gaming) which far exceeds that. In my case, I just want 1920x1200 at the highest quality textures and AA over 30fps, since I'm on an LCD. I have my doubts that I'll get there without dropping some cash on a new vid card since I run a Radeon 9600 Pro.
      I don't think that the current batch of cards is going to "handle" very hig
  • ...I won't need to sell some organs on the black market, after all?
  • For almost every game I've ever played there were problems - little glitches that demanded certain versions of drivers, stuff like that. Even if the game was well behaved, it ran like a dog on my PCs (which are all really old and crap).

    But when Quake 3 came out I could run it on a P233 (with MMX!), voodoo 2 12meg and 128MB ram. iD engines scale all the way.

    I will be interested in seeing how low people can get Doom 3 running.
  • a surprisingly good gaming experience

    Are they saying they were surprised it worked well, or surprised it was an enjoyable game?
  • for the weak graphics card. And if you have a MB that supports a P4 (not cheap at all) you most likely have AGP 8X so it's only ~$100 to spring for a GeForce FX 5200 and you're done.

    A new MB (if you can't support 8X AGP already), Barton or P4 (unless you've got a 1.5Ghz+ CPU and 8X AGP), plus new memory if you aren't already using DDR and the graphics card is going to run you under $500. You can pick up a GeForce FX 5200 for around $100. If you had to buy everything listed you'd come in under $500 if yo
  • by martin-boundary ( 547041 ) on Wednesday July 21, 2004 @11:39PM (#9767035)
    For those of us who are still stuck on Intel 386 hardware with a VGA card, can somebody please convert those benchmarks into something understandable? Also, if I did upgrade to more recent hardware, how many extra monsters could I have in DOOM1 for the same frame rate? Ach, mein Leben!
    • For those of us who are still stuck on Intel 386 hardware with a VGA card, can somebody please convert those benchmarks into something understandable?
      Basically, it means "Bend over and take it, bitch."
  • Page 1:

    "X800XT-PE may not be worthy of being included on the same graph"

    Later...

    If you would have told me a year ago that I could play DOOM 3 on a GeForce 3 64MB video card and 1.8GHz AthlonXP and have a good gaming experience, I would have called you crazy, but that is exactly what we are seeing.

    Translation:

    Save your money, DOOM 3 has the most insane graphics, and still plays just fine on the ~$150 cards. Which means most other games are totally fine. (I play Lineage 2 on a Rage fury pro with 32MB
  • That's reassuring. (Score:2, Interesting)

    by causality ( 777677 )
    'The benchmarking was conducted on-site, and the hardware vendors did not have access to the demo before hand, so we are confident that there is no egregious cheating going on.'



    It's comforting to know that said vendors are so honest and reliable that, if you make it physically impossible (or at least extremely improbable), they will not "egregiously cheat" on published benchmarks.
  • by d_jedi ( 773213 ) on Wednesday July 21, 2004 @11:46PM (#9767073)
    If you're running the most recent CPU/GPU with a $hitload of RAM.. you're going to have a good gaming experience

    WELL NO SHIT! What did you expect? The game to only run acceptably on hardware that doesn't exist yet? Geez..

    As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience

    Why no benchmarks of this? IMO much more useful than a benchmark of a P4 3.6GHz system with 4GB of RAM and a 6800 Ultra..
    • As the article says:

      Next week, before you can purchase DOOM 3, our goal is to publish the DOOM 3 [H]ardware Guide in order to give you an official resource to help you know what to expect out of your current hardware or to help make the right hardware buying decision should you be ready for an upgrade.

      This week they're publishing the high end graphics card benchmarks. They are putting together the data for those other boxes, and they'll be publishing that as a more complete guide next week. If you can
  • Weren't we seeing the ATi cards outperforming nVidia by disgusting margins on HL2 benchmarks?

    Is there some kind of under-the-table manipulation going on here? Is ATi trying to leverage HL2 to sell more cards? Is nVidia doing the same with Doom3?

    Or are both companies going to release new drivers soon and even the whole thing out?

    I'm just going to wait and see. And upgrade after HL2 has been out for six months. THEN I'll play these games. I usually buy a game after it has earned a reputation. Then I'll kno
    • "Weren't we seeing the ATi cards outperforming nVidia by disgusting margins on HL2 benchmarks?"

      Were you referring to these? [extremetech.com]
      Those benches are quite old (Sept 2003!!!) and you'll note that different generations of cards were used here. Also, the HL2 benches were run under DX9 and DX8, AFAIK the Doom3 benches were run under openGL.

      So no... there is no direct comparison. Different card gens and different rendering tech was used in the benches. Though it does look like nVidia is back on the ball after get
    • ATI paid Valve $5M to ensure that they had the better benchmarks, which is why you also see HL2 vouchers coming with your ATI graphics cards at the high end.

      Of course, Valve have been absolutely shocking in terms of their professional conduct compared to iD.

      DooM3 alpha leaks. iD: Oh wow, this is pretty sucky, we're going to look into it, see what we can find out, sack someone or something.

      HL-2 code leaks. Valve: OMG we got haxored by terrorists patriot act save us DMCA! FBI help help help now we're no
  • The question I have is... how well do the quadro cards perform???

    I have a brand new Quadro FX 1000 in my laptop and a year or so old Quadro 4 in my Desktop (Both with 128MB) - I wonder how well they'll run Doom3?

    They're fairly optimized for opengl - so I remain hopeful!

    Friedmud
  • ... those of us that can't (or won't) upgrade to the latest and greatest will just be stuck playing yet another FPS. The graphics are what will build the atmosphere; the fancy effects will take the experience and immersion to another level. My Ti4200 will just give me a pixelated experience. I'll stick with playing Splinter Cell on the PS2 just a bit longer in that case.
  • They spelled "bated breath" properly! That must be a first for the Internet.
  • 'As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience.'

    Presumably because they were able to play a hand of poker while waiting for each frame to be rendered.

  • They were surprised it even ran on a GeForce4 MX 440. That in itself is a good gaming experience. ;)
  • For all Radeon X800-Pro owners when I say.... "NOOOOOOOOOOOOOO"
    Well at least we have HL2 to look forward to :)
  • I'm going to buy a G4 powerbook, any comments on that, taking into consideration the architecture and hardware used?
  • You have to wonder if NVIDIA and ATI didn't get together and just pay ID to write a game to get everyone to upgrade their video cards.
  • Sweeeet! (Score:5, Funny)

    by Horizon_99 ( 58767 ) on Thursday July 22, 2004 @01:55AM (#9767574)
    This is the coolest thing I've heard so far about the game:
    Talking to John briefly about his overclocking comments made some things clear to us that many enthusiasts will need to be aware of. When he speaks of "new usage patterns" he is literally talking about transistors on some of the new GPUs that are going to be used for the first time when you play DOOM 3 on your video card. So be aware that pushing your GPU MHz may get you different results in DOOM 3 than with other games.
    Yeah, bring my card to its knees, JC!

    Hey, just realized while typing this that JC's initials are JC, it all makes sense...
