
Carmack on Doom 3 Video Cards

mr_sheel writes "In a GameSpy interview, John Carmack gives his take on how the current crop of video cards handles Doom 3: ATI Radeon 8500 is a better card, with a nicer fragment path, while NVidia still consistently runs faster due to better drivers. And of course, the GeForce SDR cards will not be "fast enough to play the game properly unless you run at 320x240 or so." And in a ShackNews interview with Carmack, he says that Doom 3 at E3 was only running at medium quality... wow."
  • by bogie ( 31020 ) on Friday May 31, 2002 @01:02AM (#3615257) Journal
    Cool cutting-edge graphics are great, but really it's still the gameplay that matters. It seems like all the gaming sites/rags/etc. only get off on talking about pixel shaders and game engines, when all the gamer wants is something original and fun to play. I just pray it can measure up to games like Half-life, No One Lives Forever, and Deus Ex. I want an ACTUAL STORYLINE, scripted events, and real NPC interaction. If it's just Doom/Quake/Serious Sam style gameplay with great graphics, I won't be buying this time around.
    • I think the last 3 or so games, in addition to Carmack's personal policies as evidenced in his .plans and emails, have illustrated that id's slant towards technology rather than storyline is here to stay. No big deal, so long as the games based on id engines are of sufficient quality (see JK2, Half-Life, etc.). This isn't necessarily a bad thing: id keeps pushing the tools and tech part, and others will take their tech and make great games out of it. I have little doubt that the Doom III technology will result in an awesome single-player game with a fantastic storyline, NPC interaction, and scripted events; I'm not sure that Doom III itself will be that game.
    • Real Author (Score:5, Informative)

      by Ted V ( 67691 ) on Friday May 31, 2002 @01:49AM (#3615429) Homepage
      They've hired a real science fiction author to write the story for the game. It's the same guy who did the 7th Guest story, if you remember that old (but excellent) game. I don't remember the guy's name off the top of my head though...
      • Re:Real Author (Score:5, Informative)

        by CBNobi ( 141146 ) on Friday May 31, 2002 @02:20AM (#3615539)
        That would be Matt Costello.

        From the id Software E3 interview [gamespy.com] at GameSpy:

        GameSpy: [7th Guest and now DOOM III writer] Matt Costello ... somehow I suspect you were involved with getting him involved in the project.

        Graeme Devine: [laughs] Oh yeah! I remember we were looking for a writer ... we'd talked to a bunch of writers, Tim and John were reading books and stuff, and I said "Well, I know a guy. I've worked with him before, he's really good: Matt Costello."

        So, we got some of his books and John read them and loved them, and it's just really weird, bringing him onto the project ... an old friend, bringing part of the old team back. It's been really fun.
    • I want an ACTUAL STORYLINE, scripted events,

      So you're saying you want a linear game. No thanks.
  • Great. (Score:2, Funny)

    by mu_wtfo ( 224511 )
    If I start saving *now*, I might just be able to afford hardware that'll actually run the game by the time it's released.

  • And it can be closed quickly if someone walks by.

    82 seconds on expert! Yay!
  • by Gizzmonic ( 412910 ) on Friday May 31, 2002 @01:07AM (#3615274) Homepage Journal
    So some hotshot Ferrari-drivin' game developer who makes more money than God likes to buy video cards every week to compare 'em?

    You know what? What if people were obsessed with lobsters the way that these guys were with fill rates?

    you know, bob down at the creek is like: "Hey, I caught this lobster, and its scurrying abilities are really great, but the sloppy curvature of its claws really kills it for me..." and then slim replies, "Well, shit, I'm gonna overclock my lobster boat and catch so many lobsters they're gonna elect me King of Red Lobster! And it's got bump-mapping too!"

    My point being: You can stay up too late and have your weird z-buffered, anti-aliased dreams, but you can't get back that $400 you just dropped on the latest Bligblagdoodlehopper of a card, and dontcha forget it!

    • by Anonymous Cowrad ( 571322 ) on Friday May 31, 2002 @01:15AM (#3615308)
      Ok, first off, nobody makes more money than god. Churches are very profitable businesses.

      Second off, sweet christ that was a terrible analogy, if only because maybe five guys in the world can relate.

      Thirdly (and lastly, my beer isn't getting any cooler), why shouldn't there be a high-end PC games market? Porsche doesn't have to use Geo Metro engines so that Geo Metro owners don't feel left out.

      Lastly (I lied about the last one), of course you can't get money back that you spend. This is one of the fundamental tenets of capitalism. I'm afraid you're just going to have to get used to it.
      • why shouldn't there be a high-end PC games market? Porsche doesn't have to use Geo Metro engines so that Geo Metro owners don't feel left out.

        Interesting thought. Why aren't there any truly high-end supercards out there? I'm talking custom built 8x AGP Pro + 2 PCI slot cards with 3 DVI outputs that perform game functions like a Wildcat 5110 does Maya...

        Probably wouldn't sell many of them at $2-5000 a pop, but they'd be there for geek bragging rights at least. Plus I could pick one up on Ebay a year or so after it comes out for a pittance :)
    • What if people were obsessed with lobsters the way that these guys were with fill rates

      People would get sick of lobster. In early colonial days (N. Amer.) lobsters were incredibly plentiful. They would be collected as fertilizer for farms, and there was a law limiting how often you could make your indentured servant eat lobster.
    • You're kidding, right?! Every serious game company does this for the exact reasons mentioned by Carmack.
      They have the crappy job of making sure that their game runs as well as possible on all plausible setups. Only one way to find out...
      I think someone here is a little envious of the success (and money) of one Mr. Carmack. And come on.. The lobster analogy..? That seriously makes NO sense to anyone.
    • So some hotshot Ferrari-drivin' game developer who makes more money than God likes to buy video cards every week to compare 'em?

      I highly doubt Mr. Carmack has to buy graphics cards; I would assume that as an ISV, id gets whatever they want essentially for free, since they drive hardware sales in their niche of the industry to such a degree. Hardware manufacturers would be only too eager to do anything in return for him recommending their products to his millions and millions of fans. But I suspect that as a purist, he's only swayed by superior technology and not by perks!

      You can stay up too late and have your weird z-buffered, anti-aliased dreams, but you can't get back that $400 you just dropped on the latest Bligblagdoodlehopper of a card

      Sure you can. Sell it on eBay, or think of the money you aren't wasting at bars when you're at home playing :-P
    • If people want to drop $400 every couple of years in order to enjoy the newest high-end video games at the highest resolution and refresh rate possible, why should you care? To you, it may be a waste of money, but it isn't to them.

      - A.P.
  • I wonder if any of the current laptops will be able to run Doom 3... I'm considering buying a laptop with a GF4 Go, as the Radeon 7500 based ones seem to be slower... I wonder if it's really worth it to go from 32 megs to 64 megs of RAM?
    • I just bought one of these. It's a Dell Inspiron 8200. P4-M 1.7GHz, Enhanced UXGA, GeForce4 440Go 64MB, 512MB DDR SDRAM, 60GB 5400rpm HD.

      It runs RTCW in 1600x1200 with everything turned all the way up very comfortably. I'm VERY happy. It'll do fine with Doom 3.
    • Re:Laptops...? (Score:4, Informative)

      by ToLu the Happy Furby ( 63586 ) on Friday May 31, 2002 @04:53AM (#3615895)
      I wonder if any of the current laptops will be able to run Doom 3... I'm considering buying a laptop with a GF4 Go, as the Radeon 7500 based ones seem to be slower... I wonder if it's really worth it to go from 32 megs to 64 megs of RAM?

      Unfortunately, as both the GF4 Go and Mobile Radeon 7500 lack hardware pixel shaders, they will not be able to render Doom3 in its full glory. Of course they will be able to run it, but many of the graphical goodies will either be missing or will need to be (very slowly) computed on the CPU.

      As for 32 vs. 64 MB, I'd go for the latter if you want to run Doom3. Surfaces in Doom3 can contain up to 5 texture maps, which means tons of RAM usage at anything but low texture detail. If you run out of room on the card, you need to store textures in main memory and access them over the AGP bus, which is too slow for that sort of thing. IIRC both the GF4 Go and Mobile Radeon 7500 are available with 64 MB, although I suppose one sometimes doesn't get the choice when buying a laptop. (A rough back-of-the-envelope sketch at the end of this comment puts numbers on the texture memory point.)

      Basically, the top-of-the-line 3D cards of today are going to be necessary to run Doom3 decently, so the top-of-the-line mobile 3D cards--which are about a generation behind the desktop--are going to be able to run it, but somewhat mediocrely. Of course, Doom3 probably won't be out for at least a year, maybe a year and a half. By that time you'll be able to buy a laptop which runs the game beautifully. If you have to buy a laptop now then it'll be a bit tougher. Kind of makes you wish laptop 3D cards were upgradable like desktop ones...
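
      As a very rough back-of-the-envelope sketch of the texture memory point above (the 512x512 size, 32-bit texels, and five maps per surface are assumptions for illustration, not figures from id):

      /* Hypothetical estimate only: assumes 512x512 textures, 32-bit texels,
         and the five maps per surface mentioned above. Ignores mipmaps,
         texture compression, and the frame/depth buffers. */
      #include <stdio.h>

      int main(void)
      {
          const double width = 512.0, height = 512.0;  /* assumed texture size */
          const double bytes_per_texel = 4.0;          /* 32-bit RGBA          */
          const double maps_per_surface = 5.0;         /* per the parent post  */

          double mb = width * height * bytes_per_texel * maps_per_surface
                      / (1024.0 * 1024.0);

          printf("~%.0f MB of texture data per unique surface material\n", mb);
          printf("a 32 MB card holds roughly %.0f such materials; 64 MB, %.0f\n",
                 32.0 / mb, 64.0 / mb);
          return 0;
      }

      Which is just another way of saying that once the textures stop fitting on the card, you are living on the AGP bus -- exactly the situation described above.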
    • by squaretorus ( 459130 ) on Friday May 31, 2002 @08:44AM (#3616412) Homepage Journal
      This is the key question.

      What better way to attract women than to be playing Doom III on a train within a week of launch and to be kicking ASS!

      If your laptop has a nice velvety 'keyboard nipple' pointer you have a second angle with which to get them going! Chicks really dig those! "ooooh! it feels so soooft!"

      All aboard the love train!!!
  • Uhm (Score:3, Informative)

    by Anonymous Coward on Friday May 31, 2002 @01:08AM (#3615283)
    He did not say that the Radeon 8500 was better than the GeForce4 at all. In fact, he said that the GeForce4 was better than current ATI offerings. However, he said that next-gen ATI offerings, which he used for the E3 demo, are currently ahead of next-gen NVIDIA offerings (rumors are that NVIDIA's next card is just a souped-up GF4, something like a GF4 Ultra).
  • That's funny. (Score:2, Interesting)

    by blair1q ( 305137 )
    What Carmack actually said is,

    "The GeForce 4 Ti is the best card you can buy."

    So I'm wondering if we aren't being spammed by ATI marketing here.

    --Blair
  • by bertok ( 226922 ) on Friday May 31, 2002 @01:11AM (#3615291)
    I suspect that when Doom III is released, a lot of people are going to upgrade to the GeForce 5 just to be able to play the game. This has happened in the past. "New id game? Time to upgrade..." is a line I've repeated like a parrot myself over the past few years. However, as this cartoon [penny-arcade.com] points out, id Software is best at making engines, not games. Will upgrading be worth it for most people, or are they better off waiting a year or two until interesting games are released that utilize the Doom 3 engine?

    Consider this: the three games I've played almost exclusively in recent years were all Half-Life mods: Counter-Strike, Day of Defeat, and Team Fortress Classic. However, with my current GeForce 3 based video card, I get the maximum 100fps at the highest supported resolution of 1280x960. So what exactly is the point of upgrading? Even if I upgraded to be able to play Doom III, I'd play it for at most a month, then go back to DoD/CS/TFC.

    PS: While we're on the topic of Half-Life, does anyone know why the engine doesn't allow resolutions above 1280x960? It seems like an arbitrary limit that could be easily removed. Maybe some of the people that invest months of time into writing HL cheats should try to figure out how to remove that limit instead...

    • Have you read ANY of the reviews of Doom 3? The only reason Doom 3 is called Doom 3 and not another name is that it exists in the same world. The gameplay itself is totally different; they are trying to make it like a horror movie. Read the reviews, be less ignorant.

      Wait a second. The mods you listed are multiplayer only; there is no plot (beyond the individual map). Quake 3 is an awesome game as long as you don't want anything else besides DM and TDM.

      P.S. As for the HL cheat thing, I heard somewhere that the newest version of OGC will fade in music from Winamp whenever someone dies in CS (and fade it out on rebirth).
    • PS: While we're on the topic of Half-Life, does anyone know why the engine doesn't allow resolutions above 1280x960? It seems like an arbitrary limit that could be easily removed. Maybe some of the people that invest months of time into writing HL cheats should try to figure out how to remove that limit instead...

      They might not have textures at high enough resolution on the disc. It would probably look crappy magnifying them any more than that.

  • by Xunker ( 6905 ) on Friday May 31, 2002 @01:11AM (#3615292) Homepage Journal
    Seriously.

    I'm an "old timer", but still I'm not old enough to have been concious of when this phenomenon actually began; there was a fundamental change somewhere in the last 15 years where things shifted from games using existing hardware fully to where games became the reason themselves to create new, faster hardware devices.

    Not that this is bad, not by any means, but it does give one interesting meat to consider; no one will argue that games are what's driving things like new video card technologies -- when did the chicken outdo the egg?
    • Right about the time companies realized gamers would buy new hardware to play a game. Sounds retarded, and yet it's true...
    • by BusterB ( 10791 ) on Friday May 31, 2002 @01:17AM (#3615317)
      Wing Commander was the first game to start the hardware-upgrade-for-a-game craze. I have the PC Computing magazine that discusses this; it probably drove the move to 386s more than Windows 3!
      • I agree with you absolutely. Origin was definitely pushing hardware to the limits with its games of the early 90s, annoyingly so if you ask me. Whereas Wing Commander ran comfortably on my 386-33, just a couple years later I'd end up needing a 486-66 to play Strike Commander, Privateer, and System Shock. A couple years after that, Ultima Online was about to be released, and my PC was again useless.

        The constant need to upgrade to enjoy the latest games just seems like a fact of geek life now. Thank you for reminding me of whom to blame for all this.
    • by ObligatoryUserName ( 126027 ) on Friday May 31, 2002 @01:26AM (#3615347) Journal
      As far as 3D accelerators go, the point when people started buying hardware just for games can fairly accurately be pinpointed to the release of GLQuake - a free download, released after Quake shipped, that enabled hardware acceleration. For a few years after that, games shipped with both hardware and software rendering, but all the reviews for such games would say "this game looks wicked cool with hardware acceleration, but looks like dog vomit in software mode - only buy this spiffy new game if you have a 3D card". Slowly, games went from software rendering only, to both software and hardware rendering, to where we are today, where all games require hardware acceleration. This trend has repeated itself for various features built into different generations of 3D accelerators.
      • Kinda sad, but probably the biggest seller of all, back in the day, for hardware acceleration was the game "Tomb Raider" by EIDOS. I recall it coming out with 3dfx acceleration, and people crapping themselves in amazement about it.

        • Games with big, nicely rendered breasts ALWAYS do well, regardless of any actual quality.

          You want Doom III to have the same forced-upgrade appeal as the last two? Just put some big, nicely rendered breasts in it somewhere. Maybe on those fat grey guys from the screenshots.....
      • I'm not sure it was glQuake that did it. I remember Quake looking OK in software mode.

        What I remember of that time is that the game that made me buy a 3DFX card even before the game came out was the (now) much laughed at Tomb Raider. That was the first game I remember ever *needing* a 3D card for, as it just looked amazing.

        Even if you don't like Tomb Raider now, remember that at the time Tomb Raider was amazing and offered a kind of cinematic experience really not seen before in games - an experience that was greatly enhanced by the 3D card (like the waterfall, or the T-Rex).

        I recall a couple of other friends saying that they bought their first 3D card for that game. I think you'd be surprised at how many people did so.
    • by NanoGator ( 522640 ) on Friday May 31, 2002 @02:13AM (#3615507) Homepage Journal
      "...no one will argue that games are what's driving things like new video card technologies -- when did the chicken outdo the egg? "

      It probably happened when people spent $3,000 on the latest computer hardware and demanded an immediate return on their investment. At least, that was my experience. My dad got me a 486-33 MHz machine back when they were seriously top of the line. That computer was like my supercomputer for many, many months. My dad dropped a pretty hefty chunk of change on it. He and I both felt that for all the money spent on it, it'd better be a night-and-day difference over the old 286 I had.

      Fortunately, I had Wing Commander II. And boy was it superior on the 486! The game took advantage of the extra RAM to draw more stuff on the screen (like the pilot's hand controlling the ship), and it had the voice pack so your wingman could talk! And the game was smooooooooooooooth.

      I think that game did more to impress my dad with his investment than the 3D stuff I ended up doing later on it. Any queasiness he had about buying me that machine melted away that night.

      I can tell you something, it's satisfying to buy new hardware and have it blow your old hardware away. That's why games like Halo are so important to the XBOX. Quake 3 was the game to do that on PC, but it looks like Doom 3 will easily take its place.

      In any case, I think that explains the shift. To tell you the truth, if I didn't run Lightwave so much, I probably wouldn't have much idea how much faster one computer is over another. Guess I should play games s'more. ;)

    • When I whined and complained to my dad to buy an extra 4MB of RAM for $250 to put into the 486DX2 to play Doom.

      Kinda funny, I'll be buying a Geforce4 (Geforce5?) when DoomIII comes out.

    • I know 10 years ago I bought a faster PC so that I COULD play games. Games have almost always been more 'intense' for the hardware than simple desktop applications.

      WordPerfect etc. still ran great on my 8086, but if you wanted to play, for example, Doom, you needed a faster machine (a 486 in my case).

      Now look at it from the other side: would there ever be so much money invested in the development of faster hardware if there were no games? You don't really need a fast CPU to type a letter or make some spreadsheets.
    • I've often said that if it wasn't for games, we'd probably all still be using 486s for our day-to-day desktop office machines. I've always thought that it's been the games market that's driven the technology advances, particularly in sound and video. Of course, there are lots of other applications that require fast hardware (site servers, video and still graphic design, university research, etc.), but for the average joe home user, it's been games which have led to them having a 1GHz+ machine on their desktop for writing docs and surfing the web.
    • there was a fundamental change somewhere in the last 15 years where things shifted from games using existing hardware fully to where games became the reason themselves to create new, faster hardware devices.

      Not that this is bad, not by any means, but it does give one interesting meat to consider; no one will argue that games are what's driving things like new video card technologies -- when did the chicken outdo the egg?


      When games started taking a long time to make. Used to be even a revolutionary game could be made from start to finish in months. Wolfenstein 3-D took about 6 months to make. The original Doom took about a year. id has been working on Doom 3 for about two years, and they've still probably got at least a year to go.

      When Carmack decided the technical parameters of the Doom3 engine back in 2000, he would have been an idiot if he designed it to only take advantage of the features of existing hardware. Instead he designed it to use features and require performance which he knew would be entering the mainstream by the time the game was released.

      Of course, Carmack is unique in that he can actually influence Nvidia, ATi, etc. a bit into supporting the features he wants. On the other hand, you have to realize that back in 2000 Carmack, Nvidia, ATi, etc. already had a very good idea what sort of features would be supported in the 3D cards of 2003. Of course there is some guessing and tweaking involved (which Carmack seems to be particularly good at), but a good 3D engine designer has to design for the hardware of the time when the game will be released, not the hardware of the time when he's designing.

      As for why games take so much longer these days, that's another story, but the basic point is that they are not only more complex technically, but there is also ever-increasing detail in the art, scripting, level design, etc., so it takes much larger teams with much better tools much longer to make a game than in the old days.
      • When games started taking a long time to make. Used to be even a revolutionary game could be made from start to finish in months. Wolfenstein 3-D took about 6 months to make. The original Doom took about a year.

        Doom took a year? If I remember correctly, Doom was about a year late when it came out...am I wrong?

        Tim

    • A few years ago, my parents bought a Mac, a Performa 5260 (one of the most un-upgradable and unsupported machines on the planet, which I knew at the time), and I kept saying, don't get a Performa, don't get a Performa. Well, they got the Performa (and later got quite angry when I told them that I'd told them not to get it).

      My stepfather asked me, one day, after he'd had his computer for a while, how you compare one computer to another, in terms of speed. My simple reply was 'games'. See how a game plays on both systems. He didn't believe me that games are used for benchmark numbers, if not entire benchmarks. 'Why would anyone buy a better computer just for games?' he asked.

      Our Performa came with a ton of useless crap on CD, but it also came with a copy of FA/18 Hornet 2.0, a flight sim. Stepfather is very into planes, so he started playing it one day. Over the next few months, he was more and more into the game.

      When Hornet 3.0 came out, he purchased it. Same with A-10 Cuba, and the Hornet Korea upgrade. He even bought a game he couldn't play (Falcon 4.0), just for when he later could play it (i.e. got a new machine). He was also very disappointed when I told him he couldn't add a 3D accelerator to the Performa, to get the beautifully textured goodness of Hornet 3.0. I think it was at this point I told him I'd told him so, and he shouted that I had not.

      We run a home-based business, or rather, they do, and I used to help. They needed a new computer, and the local Mac shop had a great deal on a G4, 17" monitor, laser printer, and so on, so they leased it (the whole purchase = tax deductible as a lease). GeForce 2MX (great at the time) and a sweet, sweet 533 G4 processor.

      Wouldn't you know it, Falcon 4.0's hardware acceleration only supports RAVE and ATI cards directly, neither of which is supported on the GeForce 2, and there's no OpenGL support. What's the first thing he thinks of? Buy an ATI card for it, spend a few hundred bucks that they really don't have, and upgrade, just so that one game plays nicer than it did before (it plays very smoothly in software mode).

      I agree with the other posters, and my anecdote supports the claim. 3D is what drives sales. I remember WC3's 3D gameplay (basically software 3D done beautifully) on my friend's 486, and it was amazing. Let me tell you, if you didn't have the hardware to play it, you damn well wanted to buy the hardware to play it. That was the major turning point (for me). En masse, QGL sounds about right. DirectX was another important turning point, too. By making games faster (in Windows), people could write more complex games with better graphics, and they didn't have to bother with a DOS version. Then, people who didn't have Windows 95 had to get it, and people who didn't have the hardware for Windows 95, or barely had it, had to get that. If you wanted to game, you HAD to have W95, or you were stuck playing legacy games until eternity (which, for a hardcore gamer, is not an option).

      So DX, OpenGL, GLide (which sucked), and use of these technologies are, to me, what really turned the tables. Game development took off, and so did hardware purchases. Now, everyone's chasing their first 3D high.

      I'm just waiting for a holodeck.

      --Dan
  • by Trepalium ( 109107 ) on Friday May 31, 2002 @01:15AM (#3615309)
    It's never been that their cards are junk, it's just that for every card, they start anew with completely untested drivers, which never quite mature before the card is discontinued and new ones are introduced. Nvidia's "unified" drivers, on the other hand, tend to be refinements from version to version and card to card, rather than completely different drivers.

    If ATI could just finally fix their drivers once and for all, they'd be on even standing with Nvidia.

    • by ergo98 ( 9391 ) on Friday May 31, 2002 @01:48AM (#3615427) Homepage Journal
      I would say that an analysis at Nvidia and at ATI would also show a completely different corporate philosophy regarding driver development (I can't vouch for this, nor do I have any first-hand knowledge: it's just a hunch). With Nvidia hardware, the drivers (I'm normally a Windows guy, so we're talking Wintel here) install professionally, they work superbly, they continually support even ancient chipsets (TNT users are seeing performance improvements with each Detonator release), and they are feature rich. With ATI, in every experience that I've had, the installs have been horribly amateurish, the drivers have been GPFing nightmares, and the documentation is horrible, and usually accusatory of the customer (I recently came across one of these "All your problems are belong to you" sort of documents with an ATI TV Wonder PCI). ATI also likes to orphan products, so even only slightly dated products often get relegated to the un-updated trash heap. I suspect, and again this is only a hunch, that ATI treats driver and application development as a nuisance, and only as something to be done when the product is on retail shelves and to entice customers (a very short-term approach), whereas Nvidia treats it as a scientific, continual pursuit of perfection for all their customers.

      If I sound down on ATI, I'm not really: they have proven themselves to have extraordinary hardware guys who make, literally, the best stuff in the business; however, their ability to continually shoot themselves in the foot with a horrible software development record is hard to fathom. Talk to anyone about ATI, and 95% of the time they'll relate some driver nightmare they've had with an ATI card.
    • It's never been that their cards are junk, it's just that for every card, they start anew with completely untested drivers,

      That was the story with Nvidia for the longest time, too, lest people forget. They only started getting really good drivers after the GeForce 2 was released (and early GeForce 2 drivers were horrible).
  • by Francis ( 5885 ) on Friday May 31, 2002 @01:37AM (#3615385) Homepage
    Once upon a time, I turned to my friend and said, "When in God's name did graphics cards become more expensive than your CPU?"

    Without missing a beat he replies, ".. Well, it's got more transistors..."
  • Id should talk to Visa to get a co-branded credit card going -- that's about the only way people are going to get to play their games -- charging video cards :)

    On another note, is it just me, or are id and John Carmack dangerously close to becoming the George Lucas of the game industry? I find myself incapable of getting excited about DOOM 3.

    • is it just me, or are id and John Carmack dangerously close to becoming the George Lucas of the game industry? I find myself incapable of getting excited about DOOM 3.

      WTF are you talking about? Have you even seen the movie from E3? Pixel-shaded bumpmapping *drool*. Every object casts a shadow *drool*. Trent Reznor doing the music *drool*. Realistic physics *drool*. Real-time 5.1 mixing *drool*.

      Now excuse me while I use my Phantom Menace DVD to wipe the slobber off my computer.

      • Lucas does the same thing. All you just described was a tech demo and a horribly void sex life. People who care can get pixel-shaded bump mapping from... well... anything with pixel-shaded bump mapping. And last time I was in the weird world of "outside," I saw a lot of stuff casting shadows.

        The question is, will id include all of that crap with a game worth playing, or is "the guy from Doom" gonna end up wandering around casting a shadow and telling Natalie Portman "I don't like the sand, it's rough and irritating" in Dolby Digital 5.1, just like another recent heavily hyped tech demo with a few good action scenes.
  • by jcsehak ( 559709 ) on Friday May 31, 2002 @02:01AM (#3615462) Homepage
    Because by the time a Mac version of the game is released, those expensive video cards will have been low-end for at least a couple years.

    [Me 3 years from now]: Hey, I just got this cool new game, Doom III !

    [Everybody else]: ...

    [Everybody else] (to each other, turning away): C'mon, let's go play Tribes 4.
  • Interesting review (Score:4, Informative)

    by olman ( 127310 ) on Friday May 31, 2002 @03:16AM (#3615689)
    You can see a different angle here. [msnbc.com] Carmack's saying that the R300 kicks the GF4's ass, and that's why they demoed Doom III on ATI hardware at E3.

    Here's the relevant bit:

    Doom III is very much hardware driven, and one of the controversies of this year's E3 was that the game was demonstrated on the latest ATI graphics card rather than a card from NVidia. "NVidia has been stellar in terms of driver quality and support and doing all of the things right," says Carmack, who has been an outspoken evangelist for NVidia's GeForce technology. "For the past few years, they have been able to consistently outplay ATI on every front. The problem is that they are about one-half step out of synch with the hardware generation because they did Xbox instead of focusing everything on their next board. So they are a little bit behind ATI."

    • by Zathrus ( 232140 )
      And if you bother to go find follow-up comments to that statement, you'll discover Carmack saying that it's an apples-to-oranges comparison.

      You're testing the next generation card vs. the current generation. May as well compare a GF4 Ti4600 to a Radeon 7500 and see which one does better.

      The NV30 hasn't been taped out yet. There's no silicon to test. So while you can't say whether or not the NV30 will be better than the R300, it's still a faulty comparison for NV25 vs R300. And since the NV30 is supposed to be released in August/September (color me doubtful, since they don't have prelim silicon yet), there's not going to be much of a gap between their releases either.

      Frankly, even if the NV30 doesn't have the edge on the R300 on paper, I'll buy it in a second over ATI. Why? Because ATI's drivers suck, their support sucks, and anyone who's been burned by ATI over the past 20 years will know what I'm talking about. They have long had a tendency to release poor to middling drivers and then rapidly desupport the card. Nvidia, on the other hand, is still supporting the original TNT with current drivers - the card they made 4 years or so ago. Plus, as Carmack observes, Nvidia's drivers make their cards surpass ATI - which any benchmark will show you.

      Now if only Nvidia would put some decent output stages on the reference design... output quality at high resolutions is one area where ATI has long been better. And Matrox trounces them both.
  • Even Alan Cox agrees (Score:2, Interesting)

    by Steffen ( 84872 )
    I had the pleasure of seeing Alan Cox speak in Dublin a couple of months back, and he made the point that it was unreasonable for people to expect Nvidia to release the full source for their drivers. The Nvidia drivers were what gave them the edge. He reckoned that the ATI cards were generally a bit faster if you looked purely at the hardware, but Nvidia have had the advantage of working on the same codebase for their drivers for years now.
  • by DarkHelmet ( 120004 ) <mark&seventhcycle,net> on Friday May 31, 2002 @07:25AM (#3616180) Homepage
    he says that Doom 3 at E3 was only running at medium quality... wow.

    He couldn't find any damn quantum processors on pricewatch, or else he would have taken some higher quality shots.

  • by citanon ( 579906 ) on Friday May 31, 2002 @08:54AM (#3616454)
    ATI Radeon 8500 is a better card, with a nicer fragment path, while NVidia still consistently runs faster due to better drivers.

    Wrong!

    What Carmack actually says is this:

    In order from best to worst for Doom:

    I still think that overall, the GeForce 4 Ti is the best card you can buy. It has high speed and excellent driver quality.

    Based on the feature set, the Radeon 8500 should be a faster card for Doom than the GF4, because it can do the seven texture accesses that I need in a single pass, while it takes two or three passes (depending on details) on the GF4. However, in practice, the GF4 consistently runs faster due to a highly efficient implementation. For programmers, the 8500 has a much nicer fragment path than the GF4, with more general features and increased precision, but the driver quality is still quite a ways from Nvidia's, so I would be a little hesitant to use it as a primary research platform.

    The GF4-MX is a very fast card for existing games, but it is less well suited to Doom, due to the lower texture unit count and the lack of vertex shaders.

    On a slow CPU with all features enabled, the GF3 will be faster than the GF4-MX, because it offloads some work. On systems with CPU power to burn, the GF4 may still be faster.

    The 128 bit DDR GF2 systems will be faster than the Radeon-7500 systems, again due to low level implementation details overshadowing the extra texture unit.

    The slowest cards will be the 64 bit and SDR ram GF and Radeon cards, which will really not be fast enough to play the game properly unless you run at 320x240 or so.

    With regards to the 8500 vs. GF4, he meant that the 8500 has better hardware on paper, but the GF4's efficient hardware implementation makes it faster in practice. He mentioned driver quality as a separate issue from speed. (A rough pass-count sketch at the end of this comment spells out the single-pass vs. multi-pass arithmetic.)

    In talking about ATI's next generation hardware, the R300, he says the following in separate emails. From www.rage3d.com [rage3d.com].

    Doom III is very much hardware driven, and one of the controversies of this year's E3 was that the game was demonstrated on the latest ATI graphics card rather than a card from NVidia.

    "NVidia has been stellar in terms of driver quality and support and doing all of the things right," says Carmack, who has been an outspoken evangelist for NVidia's GeForce technology. "For the past few years, they have been able to consistently outplay ATI on every front. The problem is that they are about one-half step out of synch with the hardware generation because they did Xbox instead of focusing everything on their next board. So they are a little bit behind ATI."

    "I told everyone that I was going to demonstrate Doom III on the best hardware, and there has been no collusion or kickbacks or anything like that going on. Our objective is the technical merit." "The new ATI card was clearly superior. I don't want to ding NVidia for anything because NVidia has done everything they possibly could; but in every test we ran, ATI was faster."

    However, he was comparing R300 to a GF4, not NV30. In this email to nvnews [nvnews.com]:

    It [The ATI card used] was compared against a very high speed GF4. It shouldn't be surprising that a next-generation card is faster than a current generation card. What will be very interesting is comparing the next gen cards (and the supporting drivers) from both vendors head to head when they are both in production.

    Everyone working on DOOM still uses GF4-Ti cards at the moment, and if someone needs to buy a new video card today, that is what I tell them to get.

    John Carmack
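
    Coming back to the 8500-vs-GF4 point above, here is a short sketch of the pass arithmetic (illustrative only: the per-pass texture limits below are assumptions for the example; the only hard numbers in the quote are the seven accesses, one pass on the 8500, and two or three on the GF4):

    /* Illustrative sketch of multi-pass cost. The per-pass limits below are
       assumed for the example; Carmack's quote gives only the totals. */
    #include <stdio.h>

    static int passes_needed(int texture_accesses, int accesses_per_pass)
    {
        /* ceiling division: each pass covers at most accesses_per_pass lookups */
        return (texture_accesses + accesses_per_pass - 1) / accesses_per_pass;
    }

    int main(void)
    {
        const int doom_accesses = 7; /* "seven texture accesses" per surface */

        printf("8500-style (assume 7+ per pass): %d pass\n",
               passes_needed(doom_accesses, 7));
        printf("GF4-style  (assume 4 per pass):  %d passes\n",
               passes_needed(doom_accesses, 4));
        printf("GF4-style  (assume 3 per pass):  %d passes\n",
               passes_needed(doom_accesses, 3));
        return 0;
    }

    Every extra pass re-transforms and re-rasterizes the same geometry, which is why the 8500 "should" win on paper; the point of the quote is that the GF4's implementation is efficient enough to win anyway.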

  • Misrepresented. (Score:4, Informative)

    by John Carmack ( 101025 ) on Friday May 31, 2002 @03:21PM (#3619007)
    This batch of comments from me has let people draw conclusions that leave me scratching my head, wondering how they managed to get from what I said to what they heard.

    Other people have outlined the issues in detail in comments already, but the crux is that, even with driver quality removed from the discussion (not counting conformance issues, running at fill limited resolutions), GF4 hardware is still faster than 8500 hardware on basically everything I tested. The 8500 SHOULD have been faster on paper, but isn't in real life.

    The hardware we used at E3 was not an 8500, and while the drivers were still a bit raw, the performance was very good indeed.

    Take with a grain of salt any comment from me that has been paraphrased, but if it is an actual in-context quote from email, I try very hard to be precise in my statements. Read carefully.

    John Carmack
  • by John Carmack ( 101025 ) on Friday May 31, 2002 @04:23PM (#3619372)
    We know for sure that we will be excluding some of the game buying public with fairly stiff hardware requirements, but we still think it is the right thing to do.

    The requirement for GF1/Radeon 7500 as an absolute minimum is fundamental to the way the technology works, and was non-negotiable for the advances that I wanted to make. At the very beginning of development, I worked a bit on elaborate schemes to try and get some level of compatibility with Voodoo / TNT / Rage128 class hardware, but it would have looked like crap, and I decided it wasn't worth it.

    The comfortable minimum performance level on this class of hardware is determined by what the artists and level designers produce. It would be possible to carefully craft a DOOM engine game that ran at good speed on an original SDR GF1, but it would cramp the artistic freedom of the designers a lot as they worried more about performance than aesthetics and gameplay.

    Our "full impact" platform from the beginning has been targeted at GF3/Xbox level hardware. Slower hardware can disable features, and faster hardware gets higher frame rates and rendering quality. Even at this target, designers need to be more cognizant of performance than they were with Q3, and we expect some licensee to take an even more aggressive performance stance for games shipping in following years.

    Games using the new engine will be on shelves FIVE YEARS (or more) after the initial design decisions were made. We had a couple licensees make two generations of products with the Q3 engine, and we expect that to hold true for DOOM as well. The hardware-only decision for Q3 was controversial at the time, but I feel it clearly turned out to be correct. I am confident the target for DOOM will also be seen as correct once there is a little perspective on it.

    Unrelated linux note: yes, there will almost certainly be a linux binary for the game. It will probably only work on the nvidia drivers initially, but I will assist any project attempting to get the necessary driver support on other cards.

    John Carmack
