

Carmack on Doom 3 Video Cards
mr_sheel writes "According to a GameSpy interview with John Carmack, the ATI Radeon 8500 is the better card, with a nicer fragment path, but NVidia still consistently runs faster due to better drivers. And of course, the GeForce SDR cards will not be "fast enough to play the game properly unless you run at 320x240 or so." And in a ShackNews interview with Carmack, he says that Doom 3 at E3 was only running at medium quality... wow."
All I want for Doom III (Score:3, Insightful)
Re:All I want for Doom III (Score:3, Insightful)
Real Author (Score:5, Informative)
Re:Real Author (Score:5, Informative)
From the id Software E3 interview [gamespy.com] at GameSpy:
GameSpy: [7th Guest and now DOOM III writer] Matt Costello
Graeme Devine: [laughs] Oh yeah! I remember we were looking for a writer
So, we got some of his books and John read them and loved them, and it's just really weird, bringing him onto the project
Re:All I want for Doom III (Score:2)
So you're saying you want a linear game. No thanks.
Great. (Score:2, Funny)
Minesweeper runs great on my Orchid Fahrenheit (Score:4, Funny)
82 seconds on expert! Yay!
Re:Minesweeper runs great on my Orchid Fahrenheit (Score:2)
Re:Minesweeper runs great on my Orchid Fahrenheit (Score:2)
Re:Minesweeper runs great on my Orchid Fahrenheit (Score:2)
Re:Minesweeper runs great on my Orchid Fahrenheit (Score:3, Funny)
Coincidence or consequence? YOU BE THE JUDGE.
Re:Funny? (Score:2, Funny)
The laughter would be distracting?
yay. this is fun. (Score:4, Funny)
You know what? What if people were obsessed with lobsters the way these guys are with fill rates?
you know, Bob down at the creek is like: "Hey, I caught this lobster, and its scurrying abilities are really great, but the sloppy curvature of its claws really kills it for me..." and then Slim replies, "Well, shit, I'm gonna overclock my lobster boat and catch so many lobsters they're gonna elect me King of Red Lobster! And it's got bump-mapping too!"
My point being: You can stay up too late and have your weird z-buffered, anti-aliased dreams, but you can't get back that $400 you just dropped on the latest Bligblagdoodlehopper of a card, and dontcha forget it.
Re:yay. this is fun. (Score:5, Funny)
Second off, sweet christ that was a terrible analogy, if only because maybe five guys in the world can relate.
Thirdly (and lastly, my beer isn't getting any cooler), why shouldn't there be a high end pc games market? Porsche doesn't have to use Geo Metro engines so that Geo Metro owners don't feel left out.
Lastly (I lied about the last one), of course you can't get money back that you spend. This is one of the fundamental tenets of capitalism. I'm afraid you're just going to have to get used to it.
Re:yay. this is fun. (Score:3, Interesting)
Interesting thought. Why aren't there any truly high-end supercards out there? I'm talking custom-built 8x AGP Pro + 2 PCI slot cards with 3 DVI outputs that perform game functions like a Wildcat 5110 does Maya...
Probably wouldn't sell many of them at $2,000-$5,000 a pop, but they'd be there for geek bragging rights at least. Plus I could pick one up on eBay a year or so after it comes out for a pittance.
Law limits amount of lobster you can feed people (Score:3, Funny)
People would get sick of lobster. In early colonial days (N. Amer.), lobsters were incredibly plentiful. They would be collected as fertilizer for farms, and there was a law limiting how often you could make your indentured servant eat lobster.
Re:yay. this is fun. (Score:2)
They have to be concerned with fill rates. They have the crappy job of making sure their game runs as well as possible on all plausible setups. Only one way to find out...
I think someone here is a little envious of the success (and money) of one Mr. Carmack. And come on.. The lobster analogy..? That seriously makes NO sense to anyone.
Re:yay. this is fun. (Score:2)
I highly doubt Mr. Carmack has to buy graphics cards; I would assume that as an ISV, id gets whatever they want essentially for free, since they drive hardware sales in their niche of the industry to such a degree. Hardware manufacturers would be only too eager to do anything in return for him recommending their products to his millions and millions of fans. But I suspect that as a purist, he's only swayed by superior technology and not by perks!
You can stay up too late and have your weird z-buffered, anti-aliased dreams, but you can't get back that $400 you just dropped on the latest Bligblagdoodlehopper of a card
Sure you can. Sell it on eBay, or think of the money you aren't wasting at bars when you're at home playing.
Your point is really lame. (Score:3, Insightful)
- A.P.
Re:yay. this is fun. (Score:2, Funny)
So what? I don't even notice that shit in real life, let alone video games.
You get rockets shot at you in real life?
Where do you work??
Re:yay. this is fun. (Score:2)
Laptops...? (Score:2)
Re:Laptops...? (Score:2)
It runs RTCW at 1600x1200 with everything turned all the way up, very comfortably. I'm VERY happy. It'll do fine with Doom 3.
Re:Laptops...? (Score:4, Informative)
Unfortunately, as both the GF4 Go and Mobile Radeon 7500 lack hardware pixel shaders, they will not be able to render Doom3 in its full glory. Of course they will be able to run it, but many of the graphical goodies will either be missing or will need to be (very slowly) computed on the CPU.
As for 32 vs. 64 MB, I'd go for the latter if you want to run Doom3. Surfaces in Doom3 can contain up to 5 texture maps, which means tons of RAM usage at anything but low texture detail. If you run out of room on the card, you need to store textures in main memory and access them over the AGP bus, which is too slow for that sort of thing. IIRC both the GF4 Go and Mobile Radeon 7500 are available with 64 MB, although I suppose one sometimes doesn't get the choice when buying a laptop.
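The memory pressure described above is easy to ballpark. Here's a minimal C sketch; the five-maps-per-surface figure comes from the parent post, while the 512x512 size, the uncompressed 32-bit format, and the 50-material budget are purely illustrative assumptions:

#include <stdio.h>

/* Rough VRAM estimate for a Doom3-style material system: each surface
 * can reference up to 5 texture maps (per the parent post).  Assumes
 * uncompressed 32-bit texels and a full mipmap chain (~4/3 overhead). */
int main(void)
{
    const int res = 512;             /* texture width/height, illustrative */
    const int maps_per_surface = 5;  /* figure from the parent post        */
    const int unique_materials = 50; /* hypothetical level texture set     */

    double base_bytes = (double)res * res * 4;   /* 32 bits per texel */
    double with_mips  = base_bytes * 4.0 / 3.0;  /* mipmap chain      */
    double total      = with_mips * maps_per_surface * unique_materials;

    printf("~%.0f MB of texture data\n", total / (1024.0 * 1024.0));
    /* ~333 MB -- which is why a 32 MB card ends up swapping textures
     * over AGP at anything above low texture detail. */
    return 0;
}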
Basically, the top-of-the-line 3D cards of today are going to be necessary to run Doom3 decently, so the top-of-the-line mobile 3D cards--which are about a generation behind the desktop--are going to be able to run it, but somewhat mediocrely. Of course, Doom3 probably won't be out for at least a year, maybe a year and a half. By that time you'll be able to buy a laptop which runs the game beautifully. If you have to buy a laptop now then it'll be a bit tougher. Kind of makes you wish laptop 3D cards were upgradable like desktop ones...
Re:Laptops...? (Score:4, Funny)
What better way to attract women than to be playing Doom III on a train within a week of launch and to be kicking ASS!
If your laptop has a nice velvety 'keyboard nipple' pointer, you have a second angle with which to get them going! Chicks really dig those! "ooooh! it feels so soooft!"
All aboard the love train!!!
Uhm (Score:3, Informative)
Re:Uhm (Score:5, Informative)
"I still think that overall, the GeForce 4 Ti is the best card you can buy. It has high speed and excellent driver quality."
He said the Radeon 8500 should be faster but isn't, and "the driver quality is still quite a ways from Nvidia's, so I would be a little hesitant to use it as a primary research platform."
That's hardly the glowing endorsement of the Radeon that the story poster made it out to be.
Re:Uhm (Score:2, Insightful)
The comparison was based on the R300 (next-gen ATI) vs. a super-pumped GF4. nVidia didn't have any working NV30 samples from what I can tell.
Carmack also stated that nVidia is 6 months behind in development due to the time they spent on the xbox.
Seems to me that nvidia is trying to play catch up by throwing MHz at the problem...can you say Pentium III?
That's funny. (Score:2, Interesting)
"The GeForce 4 Ti is the best card you can buy."
So I'm wondering if we aren't being spammed by ATI marketing here.
--Blair
Doom III and video cards (Score:3, Insightful)
Consider this: Of the three games I've played almost exclusively in recent years, all three were Half-Life mods: Counter-Strike, Day Of Defeat, and Team Fortress Classic. However, with my current GeForce 3 based video card, I get the maximum 100fps at the highest supported resolution of 1280x960. So what exactly is the point of upgrading? Even if I upgraded to be able to play Doom III, I'd play it for at most a month, then go back to DoD/CS/TFC.
PS: While we're on the topic of Half-Life, does anyone know why the engine doesn't allow resolutions above 1280x960? It seems like an arbitrary limit that could be easily removed. Maybe some of the people that invest months of time into writing HL cheats should try to figure out how to remove that limit instead...
Re:Doom III and video cards (Score:3, Interesting)
Wait a second. The mods you listed are multiplayer-only; there is no plot (beyond the individual map). Quake 3 is an awesome game as long as you don't want anything else besides DM and TDM.
P.S. As for the HL cheat thing, I heard somewhere that the newest version of OGC will fade in music from Winamp whenever someone dies in CS (and fade out on rebirth).
Re:Doom III and video cards (Score:2)
They might not have textures at high enough resolution on the disc. It would probably look crappy magnifying them any further.
When did games dictate the need for faster hrdwre? (Score:4, Interesting)
I'm an "old timer", but still I'm not old enough to have been concious of when this phenomenon actually began; there was a fundamental change somewhere in the last 15 years where things shifted from games using existing hardware fully to where games became the reason themselves to create new, faster hardware devices.
Not that this is bad, not by any means, but it does give one interesting meat to consider; no one will argue that games are what's driving things like new video card technologies -- when did the chicken outdo the egg?
Re:When did games dictate the need for faster hrdw (Score:2)
Re:When did games dictate the need for faster hrdw (Score:4, Insightful)
Re:When did games dictate the need for faster hrdw (Score:2)
The constant need to upgrade to enjoy the latest games just seems like a fact of geek life now. Thank you for reminding me of whom to blame for all this.
Re:When did games dictate the need for faster hrdw (Score:4, Interesting)
Re:When did games dictate the need for faster hrdw (Score:2, Informative)
Re:When did games dictate the need for faster hrdw (Score:2, Funny)
You want Doom III to have the same forced-upgrade appeal as the last two? Just put some big, nicely rendered breasts in it somewhere. Maybe on those fat grey guys from the screenshots.....
Not how I remember things... (Score:2)
What I remember of that time is that the game that made me buy a 3DFX card even before the game came out was the (now) much-laughed-at Tomb Raider. That was the first game I remember ever *needing* a 3D card for, as it just looked amazing.
Even if you don't like Tomb Raider now, remember that at the time Tomb Raider was amazing and offered a kind of cinematic experience really not seen before in games - an experience that was greatly enhanced by the 3D card (like the waterfall, or the T-Rex).
I recall a couple of other friends saying that they bought their first 3D card for that game. I think you'd be surprised at how many people did so.
Re:When did games dictate the need for faster hrdw (Score:4, Interesting)
It probably happened when people spent $3,000 on the latest computer hardware and demanded an immediate return on their investment. At least, that was my experience. My dad got me a 486/33 MHz machine back when they were seriously top of the line. That computer was like my supercomputer for many, many months. My dad dropped a pretty hefty chunk of change on it. He and I both felt that for all the money spent on it, it'd better be a day-to-night difference over the old 286 I had.
Fortunately, I had Wing Commander II. And boy was it superior on the 486! The game took advantage of the extra RAM to draw more stuff on the screen (like the pilot's hand controlling the ship), and it had the voice pack so your wingman could talk! And the game was smooooooooooooooth.
I think that game did more to impress my dad with his investment than the 3D stuff I ended up doing later on it. Any queasiness he had about buying me that machine melted that night.
I can tell you something, it's satisfying to buy new hardware and have it blow your old hardware away. That's why games like Halo are so important to the XBOX. Quake 3 was the game to do that on PC, but it looks like Doom 3 will easily take its place.
In any case, I think that explains the shift. To tell you the truth, if I didn't run Lightwave so much, I probably wouldn't have much idea how much faster one computer is over another. Guess I should play games s'more.
Re:When did games dictate the need for faster hrdw (Score:2)
Kinda funny, I'll be buying a GeForce4 (GeForce5?) when Doom III comes out.
Re:When did games dictate the need for faster hrdw (Score:2, Interesting)
WordPerfect etc. still ran great on my 8086, but if you wanted to play, for example, Doom, you needed a faster machine (a 486 in my case).
Now look at it from the other side: would there ever be so much money invested in the development of faster hardware if there were no games? You don't really need a fast CPU to type a letter or make some spreadsheets.
Re:When did games dictate the need for faster hrdw (Score:2)
Re:When did games dictate the need for faster hrdw (Score:2)
Not that this is bad, not by any means, but it does give one interesting meat to consider; no one will argue that games are what's driving things like new video card technologies -- when did the chicken outdo the egg?
When games started taking a long time to make. It used to be that even a revolutionary game could be made from start to finish in months. Wolfenstein 3-D took about 6 months to make. The original Doom took about a year. id has been working on Doom3 for about two years and they've still probably got at least a year to go.
When Carmack decided the technical parameters of the Doom3 engine back in 2000, he would have been an idiot if he designed it to only take advantage of the features of existing hardware. Instead he designed it to use features and require performance which he knew would be entering the mainstream by the time the game was released.
Of course, Carmack is unique in that he can actually influence Nvidia, ATi, etc. a bit into supporting the features he wants. On the other hand, you have to realize that back in 2000 Carmack, Nvidia, ATi, etc. already had a very good idea what sort of features would be supported in the 3D cards of 2003. Of course there is some guessing and tweaking involved (which Carmack seems to be particularly good at), but a good 3D engine designer has to design for the hardware of the time when the game will be released, not the hardware of the time when he's designing.
As for why games take so much longer these days, that's another story, but the basic point is that they are not only more complex technically, but there is so much more detail in the art, scripting, level design, etc. that it takes much larger teams with much better tools much longer to make a game than in the old days.
Re:When did games dictate the need for faster hrdw (Score:2)
Doom took a year? If I remember correctly, Doom was about a year late when it came out...am I wrong?
Tim
Re:When did games dictate the need for faster hrdw (Score:3, Interesting)
My stepfather asked me, one day, after he'd had his computer for a while, how you compare one computer to another, in terms of speed. My simple reply was 'games'. See how a game plays on both systems. He didn't believe me that games are used for benchmark numbers, if not entire benchmarks. 'Why would anyone buy a better computer just for games?' he asked.
Our Performa came with a ton of useless crap on CD, but it also came with a copy of F/A-18 Hornet 2.0, a flight sim. Stepfather is very into planes, so he started playing it one day. Over the next few months, he was more and more into the game.
When Hornet 3.0 came out, he purchased it. Same with A-10 Cuba, and the Hornet Korea upgrade. He even bought a game he couldn't play (Falcon 4.0), just for when he later could play it (i.e. got a new machine). He was also very disappointed when I told him he couldn't add a 3D accelerator to the Performa, to get the beautifully textured goodness of Hornet 3.0. I think it was at this point I told him I'd told him, and he shouted that I had not.
We run a home-based business, or rather, they do, and I used to help. They needed a new computer, and the local Mac shop had a great deal on a G4, 17" monitor, laser printer, and so on, so they leased it (the whole purchase is tax-deductible as a lease). GeForce 2MX (great at the time) and a sweet, sweet 533 G4 processor.
Wouldn't you know it, Falcon 4.0's hardware acceleration supports only RAVE and ATI cards directly, neither of which works with the GeForce 2, and there's no OpenGL support. What's the first thing he thinks of? Buy an ATI card for it, spend a few hundred bucks that they really don't have, and upgrade, just so that one game plays nicer than it did before (it plays very smoothly in software mode).
I agree with the other posters, and my anecdote supports the claim. 3D is what drives sales. I remember WC3's 3D gameplay (basically software 3D done beautifully) on my friend's 486, and it was amazing. Let me tell you, if you didn't have the hardware to play it, you damn well wanted to buy the hardware to play it. That was the major turning point (for me). En masse, QGL sounds about right. DirectX was another important turning point, too. By making games faster (in Windows), people could write more complex games with better graphics, and they didn't have to bother with a DOS version. Then, people who didn't have Windows 95 had to get it, and people who didn't have the hardware for Windows 95, or barely had it, had to get that. If you wanted to game, you HAD to have W95, or you were stuck playing legacy games until eternity (which, for a hardcore gamer, is not an option).
So DX, OpenGL, GLide (which sucked), and use of these technologies are, to me, what really turned the tables. Game development took off, and so did hardware purchases. Now, everyone's chasing their first 3D high.
I'm just waiting for a holodeck.
--Dan
The eternal story for ATI (Score:3, Insightful)
If ATI could just finally fix their drivers once and for all, they'd be on even standing with Nvidia.
Re:The eternal story for ATI (Score:5, Insightful)
If I sound down on ATI, I'm not, really: they have proven themselves to have extraordinary hardware guys who make, literally, the best stuff in the business. However, their ability to continually shoot themselves in the foot with a horrible software development record is hard to fathom: talk to anyone about ATI, and 95% of the time they'll relate some driver nightmare they've had with an ATI card.
Re:The eternal story for ATI (Score:2)
I'd never go ATI again, nor would I ever touch Matrox again either. It's the same story for Matrox, btw.
Re:The eternal story for ATI (Score:2)
That was the story with Nvidia for the longest time, too, lest people forget. They only started getting really good drivers after the GeForce 2 was released (and early GeForce 2 drivers were horrible).
Re: (Score:2, Informative)
Re:The biggest surprise of the interview (Score:2)
Faster GFX card insanity (Score:3, Funny)
Without missing a beat he replies, ".. Well, it's got more transistors..."
Id branded credit card? (Score:2)
On another note, is it just me or are id and John Carmack dangerously close to becoming the George Lucas of the game industry? I find myself incapable of getting excited about DOOM 3.
Re:Id branded credit card? (Score:2)
WTF are you talking about? Have you even seen the movie from E3? Pixel-shaded bumpmapping *drool*. Every object casts a shadow *drool*. Trent Reznor doing the music *drool*. Realistic physics *drool*. Real-time 5.1 mixing *drool*.
Now excuse me while I use my Phantom Menace DVD to wipe the slobber off my computer.
exactly his point (Score:2, Insightful)
The question is, will id include all of that crap with a game worth playing, or is "the guy from Doom" gonna end up wandering around casting a shadow and telling Natalie Portman "I don't like sand, it's rough and irritating" in Dolby Digital 5.1, just like another recent heavily hyped tech demo with a few good action scenes.
Once again, Mac users have the edge (Score:4, Funny)
[Me 3 years from now]: Hey, I just got this cool new game, Doom III !
[Everybody else]:
[Everybody else] (to each other, turning away): C'mon, let's go play Tribes 4.
Interesting review (Score:4, Informative)
Here's the relevant bit:
Doom III is very much hardware driven, and one of the controversies of this year's E3 was that the game was demonstrated on the latest ATI graphics card rather than a card from NVidia. "NVidia has been stellar in terms of driver quality and support and doing all of the things right," says Carmack, who has been an outspoken evangelist for NVidia's GeForce technology. "For the past few years, they have been able to consistently outplay ATI on every front. The problem is that they are about one-half step out of synch with the hardware generation because they did Xbox instead of focusing everything on their next board. So they are a little bit behind ATI."
Re:Interesting review (Score:3, Interesting)
You're testing the next generation card vs. the current generation. May as well compare a GF4 Ti4600 to a Radeon 7500 and see which one does better.
The NV30 hasn't taped out yet. There's no silicon to test. So while you can't say whether or not the NV30 will be better than the R300, it's still a faulty comparison for NV25 vs. R300. And since the NV30 is supposed to be released in August/September (color me doubtful, since they don't have preliminary silicon yet), there's not going to be much of a gap between their releases either.
Frankly, even if the NV30 doesn't have the edge on the R300 on paper, I'll buy it in a second over ATI. Why? Because ATI's drivers suck, their support sucks, and anyone who's been burned by ATI over the past 20 years will know what I'm talking about. They have long had a tendency to release poor-to-middling drivers and then rapidly desupport the card. Nvidia, on the other hand, is still supporting the original TNT with current drivers - the card they made 4 years or so ago. Plus, as Carmack observes, Nvidia's drivers make their cards surpass ATI's - which any benchmark will show you.
Now if only Nvidia would put some decent output stages on the reference design... output quality at high resolutions is one area where ATI has long been better. And Matrox trounces them both.
Even Alan Cox agrees (Score:2, Interesting)
high quality (Score:3, Funny)
He couldn't find any damn quantum processors on pricewatch, or else he would have taken some higher quality shots.
Article Misinterprets Carmack (Score:4, Informative)
Wrong!
What Carmack actually says is this:
With regards to 8500 vs. GF4, he meant that the 8500 has better hardware on paper, but GF4's efficient hardware implementation makes it faster. He mentioned driver quality as a separate issue from speed.
In talking about ATI's next generation hardware, the R300, he says the following in separate emails. From www.rage3d.com [rage3d.com].
However, he was comparing R300 to a GF4, not NV30. In this email to nvnews [nvnews.com]:
Misrepresented. (Score:4, Informative)
Other people have outlined the issues in detail in comments already, but the crux is that, even with driver quality removed from the discussion (not counting conformance issues, running at fill limited resolutions), GF4 hardware is still faster than 8500 hardware on basically everything I tested. The 8500 SHOULD have been faster on paper, but isn't in real life.
The hardware we used at E3 was not an 8500, and while the drivers were still a bit raw, the performance was very good indeed.
Take with a grain of salt any comment from me that has been paraphrased, but if it is an actual in-context quote from email, I try very hard to be precise in my statements. Read carefully.
John Carmack
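For anyone wondering what Carmack means by "running at fill limited resolutions": the standard trick for taking the CPU and driver out of a hardware comparison is to check whether frame time scales with pixel count as you raise the resolution. A small C sketch with made-up timings, just to show the logic:

#include <stdio.h>

/* Fill-limited vs. CPU/driver-limited: if frame time grows roughly in
 * proportion to pixel count, the GPU's fill rate is the bottleneck.
 * The two sample timings below are invented for illustration. */
int main(void)
{
    double lo_pixels = 640.0 * 480.0,   lo_ms = 8.0;   /* hypothetical */
    double hi_pixels = 1600.0 * 1200.0, hi_ms = 44.0;  /* hypothetical */

    double pixel_ratio = hi_pixels / lo_pixels;  /* 6.25x the pixels */
    double time_ratio  = hi_ms / lo_ms;          /* 5.5x the time    */

    if (time_ratio > 0.75 * pixel_ratio)
        printf("fill-limited: time scaled %.1fx for %.1fx the pixels\n",
               time_ratio, pixel_ratio);
    else
        printf("CPU/driver-limited: time scaled %.1fx for %.1fx the pixels\n",
               time_ratio, pixel_ratio);
    return 0;
}

At fill-limited resolutions the driver's per-frame overhead is noise, so the comparison measures the hardware itself.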
High end hardware reasoning (Score:5, Informative)
The requirement for GF1/Radeon 7500 as an absolute minimum is fundamental to the way the technology works, and was non-negotiable for the advances that I wanted to make. At the very beginning of development, I worked a bit on elaborate schemes to try and get some level of compatibility with Voodoo / TNT / Rage128 class hardware, but it would have looked like crap, and I decided it wasn't worth it.
The comfortable minimum performance level on this class of hardware is determined by what the artists and level designers produce. It would be possible to carefully craft a DOOM engine game that ran at good speed on an original SDR GF1, but it would cramp the artistic freedom of the designers a lot as they worried more about performance than aesthetics and gameplay.
Our "full impact" platform from the beginning has been targeted at GF3/Xbox level hardware. Slower hardware can disable features, and faster hardware gets higher frame rates and rendering quality. Even at this target, designers need to be more cognizant of performance than they were with Q3, and we expect some licensee to take an even more aggressive performance stance for games shipping in following years.
Games using the new engine will be on shelves FIVE YEARS (or more) after the initial design decisions were made. We had a couple licensees make two generations of products with the Q3 engine, and we expect that to hold true for DOOM as well. The hardware-only decision for Q3 was controversial at the time, but I feel it clearly turned out to be correct. I am confident the target for DOOM will also be seen as correct once there is a little perspective on it.
Unrelated linux note: yes, there will almost certainly be a linux binary for the game. It will probably only work on the nvidia drivers initially, but I will assist any project attempting to get the necessary driver support on other cards.
John Carmack
Re:I'd exchange speed of rendering (Score:5, Informative)
If you want an even cheaper solution, go for a GF3 Ti200. It's still fast enough to play everything (including, I assume, Doom III), and goes for like ~$120.
Whatever you do, don't get a GF4 MX. They aren't actually that slow, but their architecture is on the level of the old GF2s.
Re:I'd exchange speed of rendering (Score:2, Informative)
Re:I'd exchange speed of rendering (Score:2)
Re:I'd exchange speed of rendering (Score:2)
Frankly, if you're going to buy a GF4, the best buy is the Ti4400. You can find them for about $230 - only $10-20 more than a 128 MB Ti4200 and considerably faster.
I definitely agree with the previous post though - the absolute best bang for the buck right now is the GF3 Ti200. These cards were twice the price 2 months ago and are only 6 months old. The GF4 is not a good buy ATM -- the NV30 is coming out in 2-4 months and should absolutely blow the old stuff out of the water (as will the R300, the Parhelia, and 3DLabs's card). Both the ATI R300 and the NV30 should be fully DX9 compatible too (which the Parhelia and the 3DLabs card are not).
I'm in kind of a tight spot... I'm seriously looking at buying a new computer, but don't know what graphics card to get, if any. Unfortunately my old card is a 32MB GF2, which is rather constricting at this point (if only because of the memory). I'll probably go GF3 Ti200, since I think Doom3 will be the next big thing that would want more, and it's over a year away.
Re:I'd exchange speed of rendering (Score:2, Informative)
Re:I'd exchange speed of rendering (Score:2)
I didn't buy a faster card because I was waiting for a reason to need a faster card, and because it was fifty bucks. Spending $400 so that I could run 1280x1024 with AA just wasn't worth it for me. But when Doom 3 comes out, and my card doesn't cut it anymore, you're damn straight I'll upgrade.
Re:I'd exchange speed of rendering (Score:2)
Of course, it is UT2, so most people will turn off the eye candy just because it's annoying.
Re:I'd exchange speed of rendering (Score:2)
The step from GF3 to GF4 is not that important in that way; existing functions were enhanced and sped up, but no similarly groundbreaking functions were added. This seems to be quite a common thing for Nvidia cards, as the same was true for TNT and TNT2, the original GeForce and the GF2 (that was when hardware T&L was introduced, remember that hype?), and now GF3 and GF4.
Re:I'd exchange speed of rendering (Score:3, Funny)
When building a system to play modern 3d games, I've started thinking about the video CPU as the "main processor", and the Athlon or Intel CPU on the motherboard as the "coprocessor". This way, I can sort of trick myself into being comfortable spending $300+ on the "main processor" and a mere $150 on the "coprocessor".
If you're not in it for the games, that philosophy doesn't really apply. Since I want to play the latest games right away, I need to have MS Windows on that system. The OS condemns the machine to being a toy, so my philosophy above pretty much makes sense. ;)
Re:I'd exchange speed of rendering (Score:2)
I just paid £300 for my GF4 Ti4600. Over two years, that's:
£150 per year
£2.88 per week
But after the 2 years I don't throw them away. I could sell it for maybe 50 quid. Actually, they just shuffle down the computers: my best goes into the spare PC, and the spare PCs might get donated to a friend.
I can live with that.
roll on GF5
:)
Re:I'd exchange speed of rendering (Score:2)
Re:I'd exchange speed of rendering (Score:2)
I had no serious problems when I upgraded to a Radeon either and am currently enjoying the dual monitor support.
Try installing the latest version of X.
Re:Woah... (Score:5, Insightful)
In the "interview" with Shacknews (actually it's just one email), Carmack says that high quality settings opposed to medium ones would mean "uncompressed textures" and "anisotropic filtering". While especially anisotropic filtering is nice, it's not that big of a deal. The game would look better, but not stunningly so, and I'm not actually sure if you'd notice the higher quality in the low res movies that are available on the net.
The interview is quite interesting, though, even though it doesn't really tell us anything we didn't already know (Nvidia faster than ATI, ATI's drivers suck, GF4 Ti best buy). Please note that the story (for some reason) links to page two of the interview; page one [gamespy.com] is available, too.
Re:Woah... (Score:2)
> While especially anisotropic filtering is
> nice, it's not that big of a deal. The game
> would look better, but not stunningly so, and
> I'm not actually sure if you'd notice the higher
> quality in the low res movies that are available
> on the net.
But I noticed that the shadows had some annoying stair-step effect!
I wonder if anisotropic filtering would have corrected this (it depends on how it's applied, of course).
Anyway, I was trying to analyse the way it looks; I suspect that when you're playing the game, you're much less sensitive to such details.
Re:Woah... (Score:2)
Re:Woah... (Score:2)
2) Besides, as long as it renders it at consistently higher than 30 fps, I'm fine.
3) Is UT2003 one of those "newest Unreal-engine based games"? UT 2003 is slated for a Q3/02 release. Claiming that a Q3/02 game won't run on Q2/02 high-end graphics cards is nothing short of ridiculous.
Re:The Console winner will be? (Score:2, Offtopic)
A user who prefers PC games would think you're a loon for suggesting that. Remember, games are what you play, not graphics.
Re:The Console winner will be? (Score:2)
If Doom 3 comes to any console, it's the Xbox.
a) originally a
b) no other console is powerful enough to do it, if a GF3 is going to be a "moderately performing" card when the game comes out.
Re:The Console winner will be? (Score:2)
The vast majority of console games suck compared to even mediocre PC games. Console controllers can't even compare to the ease, flexibility, and precision of a keyboard/mouse combo or a PC joystick. Split-screen sucks for multiplayer. Even with a hard drive and networking capability, the Xbox still won't have nearly the kind of game communities that Quake or Half-Life have. Hell, it won't even compare to the much smaller game communities that have sprung up around lesser-selling games. Console games are merely somewhat entertaining distractions. The PC is where you'll find the real games.
Re:The Console winner will be? (Score:3, Insightful)
Also, launch a super nintendo emulator on your pc, then try to tell me you wouldn't rather have a controller. Controllers are simply the best input device for certain games.
Re:The Console winner will be? (Score:5, Funny)
c) and on a TV.
On a TV. I mean really. You want to take a game like that, meant to be seen at 1024x768, and put it on a screen that can squeeze out only 400x500 if you're lucky? Would you like me to kick you in the nuts while you're playing, too?
FYI on TV res. (Score:2, Informative)
NTSC(North America) 720x480
PAL(Europe) 720x576
Anything else needs to be shot, burnt, scraped, or just plain thrown away.
Re:FYI on TV res. (Score:2)
That's actually 768x576, at 50Hz. NTSC runs at 60Hz.
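Raw pixel counts make the monitor-vs-TV gap concrete. A quick C sketch, using the active resolutions quoted above and a common 1024x768 desktop as the PC reference:

#include <stdio.h>

/* Pixel counts: PC desktop vs. standard-definition TV frames. */
int main(void)
{
    printf("PC   1024x768: %7d pixels\n", 1024 * 768);  /* 786432 */
    printf("NTSC  720x480: %7d pixels\n", 720 * 480);   /* 345600 */
    printf("PAL   720x576: %7d pixels\n", 720 * 576);   /* 414720 */
    /* A 1024x768 desktop pushes ~2.3x the pixels of an NTSC frame,
     * and the TV is interlaced on top of that. */
    return 0;
}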
Re:The Console winner will be? (Score:3, Interesting)
If you watched TV that was at 400X500 you'd be pretty upset with the picture quality.
My el-cheapo 19-inch TV does the test pattern that shows it has the capability to separate pixels at 724x485 resolution... if your TV can't, then your TV is really crappy.
Re:The Console winner will be? (Score:2, Insightful)
Re:The Console winner will be? (Score:2, Insightful)
Never mind the fact that the console versions would be running at a lower resolution and thus require much less video capability to render the scenes; the fact that the game will be coded closer to the metal will also take a huge percentage off the required system specs.
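That resolution point is easy to quantify with back-of-the-envelope fill-rate math. A C sketch; the 3x overdraw figure is a hypothetical average depth complexity, not anything from id:

#include <stdio.h>

/* Fill rate required = width * height * fps * average overdraw. */
static double fill_mpix(int w, int h, int fps, double overdraw)
{
    return (double)w * h * fps * overdraw / 1e6;  /* megapixels/sec */
}

int main(void)
{
    printf("console 640x480 @ 30fps: %4.0f Mpix/s\n",
           fill_mpix(640, 480, 30, 3.0));    /* ~28 Mpix/s  */
    printf("PC     1024x768 @ 60fps: %4.0f Mpix/s\n",
           fill_mpix(1024, 768, 60, 3.0));   /* ~142 Mpix/s */
    return 0;
}

By that estimate the console target needs roughly a fifth of the PC's fill rate, before the closer-to-the-metal savings even enter into it.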
I personally would be very amazed if Doom III didn't at least make it to the Xbox. Kid yourself all you want, but by about this time TWO YEARS FROM NOW, PCs will JUST be catching up to the Xbox. This is of course based purely on the assumption that Xbox/Gamecube developers don't continue to outdo themselves well on into that time frame and show us stuff we assumed the machines simply wouldn't do.
Simple case in point: Quake III on the Dreamcast outperforms Quake III on a Pentium II 400 with an 8-megabyte video card and 24 megs of system memory. In fact, I'm not sure Quake III would play on a PC with those specs at all, yet it kicks much ass on a Dreamcast.
Expect to be blown away.
A mouse? (Score:2, Funny)
Re:The Console winner will be? (Score:2, Interesting)
That game looks 2x better on PC than on Xbox. PCs have ALREADY eclipsed the power of the current consoles, at least at the high end of the market.
Also, there are plenty of games coming that look better than anything I've seen on consoles (Doom III, SWG, and Unreal 2 being three of them).
Re:The Console winner will be? (Score:2)
Simple case in point: Quake III on the Dreamcast outperforms Quake III on a Pentium II 400 with an 8-megabyte video card and 24 megs of system memory. In fact, I'm not sure Quake III would play on a PC with those specs at all, yet it kicks much ass on a Dreamcast.
It's reasonably playable on a PC like that with a TNT2. I think it looks like crap on both systems though. ;)
Re:The Console winner will be? (Score:5, Insightful)
Re:Disappointing... (Score:4, Interesting)
I play games on the PC. Am I a PC gamer? I like single-player games. I'm quite excited about Doom III's focus being on single-player. I am quite likely to buy it, and to spend whatever I need to be able to run it properly.
I remember the first two Dooms fondly because they were engrossing single-player games. Quake I was good as well, but Quake II, Arena and games like Unreal, etc. catered to the multi-player crowd. Fine, that's what some people want, but not me.
I think the main reason that I don't like multiplayer FPS games is that I suck. My friends (when we can co-ordinate something) kick my ass, and I get tired really quickly of having my ass fragged on the net by some 14 year-old who runs circles around me. I don't have my whole life to devote to improving my Quake skills. Therefore, I like to play single-player, where I can set my own handicap.
Moreover, there is a real repetitiveness to deathmatch-type games, IMHO. Give me something engrossing, like Half Life was.
As for the complaint about a $300 video card, well:
a/ games like this are graphics-dependent and I would rather have mind-blowing graphics and realism than have it suck because they want to be backwards-compatible with your Voodoo 1 card **
b/ you are going to use this $300 for more than Doom III, because
c/ by making such an advanced and neat-o engine (if it is all it is hyped to be), id is improving the quality of ALL FPS games. First, they are raising the bar for their competitors and second, many will license their technology. Maybe even some people who will make a nice multi-player FPS for people like you.
Therefore, I think you should retract your silly comments and support what id is doing for the good of gamers everywhere.
** Don't knock me for this, I play Nethack too, and I posted my YAFAP today, for those who would like to congratulate me.
Re:Disappointing... (Score:2, Funny)
Not to mention that when you do manage to kill someone, they start whining "cheater, wallhack, <weapon>whore!!"
Today's prices are meaningless (Score:2)
Today's prices are meaningless. By the time this game arrives, GF4 Tis and Radeon 85xxs will be medium- to low-end cards and reasonably priced.
yeah (Score:2, Funny)
Not new (Score:2)