Half-Life 2, ATI, NVIDIA, and a Sack of Cash
Latent IT writes "If you're into games, and unless you've been living under a rock for the past few days, you've heard a bit of a rumble from Valve on the relative quality of ATI vs. NVIDIA cards. Starting with articles like this one (previously reported), Valve told the world that the ATI 9800 Pro was nearly three times faster in some cases than the formerly competitive NVIDIA offering, the 5900 Ultra. Curiously, this happened at an ATI-sponsored event, "Shader Day". But the story hasn't stopped there. NVIDIA released this response, essentially claiming that their new drivers, which were available to Valve at the time of their press conference, would deliver vast, legitimate performance improvements. An interview with Massive, the creators of the AquaMark3 benchmark, seems to confirm this opinion - that the NV3x chipset wasn't designed tightly around any particular API, and that drivers are critical to achieving good performance. Anandtech writes here about the restrictions Valve placed on which benchmarks could be run. However, the key to this whole story may be this: an article which I haven't seen get much coverage in all this seems to make everything a little clearer - Valve stated that their OEM bundling deal with ATI came about because ATI's cards were so superior, and because they were "performance enthusiasts". However, if the Inquirer is to be believed, the bundling deal was the result of an outright auction for what will probably be the most popular game of the year. Which year that might be is another issue altogether. Whatever happened to just making hardware, and making games?"
gaming is big business now... (Score:5, Insightful)
Re:gaming is big business now... (Score:5, Funny)
Re:gaming is big business now... (Score:5, Interesting)
Ah, but games arnt written for hardware (Score:3, Insightful)
You're both right (Score:2, Insightful)
Most developers will find at some point that they need to optimize their graphics API code for specific chip sets.
Oh, and mentioning DirectX before OpenGL in the same breath is what Microsoft WANTS you to do.
Re:You're both right (Score:5, Insightful)
Oh, and mentioning DirectX before OpenGL in the same breath might mean you like serializing items in a list in alphabetical order... oh no!
Bullshit (Score:3, Insightful)
That said, if I was a game development company, I would be putting the boots to nVidia any way I could right now. Today, it's "We'll get around to making your game work with our drivers when it's popular" but tomorrow it could be "You want your game to work well with our drivers? That will be $3,000,000 please." The shit that nVidia are
Re:gaming is big business now... (Score:4, Insightful)
OT, one thing I like about Id software is that they are down-to-earth and very objective about the strengths and weaknesses of vid cards.
cant be that bad (Score:4, Funny)
share the damn drivers! (Score:5, Insightful)
Betterment serves no profitable purpose unless it is unattainable by one's competitor. If someone can show how they'll make more money by making a better product while also aiding their competitor in the same endeavor, they might help us out a bit more.
Re:share the damn drivers! (Score:2, Informative)
http://games.slashdot.org/comments.pl?sid=78019
Re:share the damn drivers! (Score:2)
Re:cant be that bad (Score:5, Interesting)
The GeForce FX processes PS2.0 instructions in a whole different way. Compiling PS2.0 shaders with Microsoft's compiler produces code that runs slowly on it. Nvidia still doesn't have a JIT compiler in their drivers to reorder the PS2.0 instructions for maximum performance. The Detonator 50 series drivers are supposed to fix this. How well it's fixed is still up in the air.
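To see why instruction order matters on hardware like this, here is a minimal sketch in plain C++ (nothing to do with any real driver; the opcodes, latencies, and the four-instruction "shader" are all invented for illustration). It models a pipeline where texture fetches have long latency, and shows that issuing both fetches up front lets the arithmetic hide that latency:

    #include <iostream>
    #include <string>
    #include <vector>

    struct Instr { std::string op; int producesReg; int needsReg; int latency; };

    // Trivial cost model: one issue per cycle, and an instruction stalls until
    // the register it reads has finished its producer's latency.
    int cycles(const std::vector<Instr>& prog) {
        int t = 0;
        int readyAt[8] = {0};                   // cycle when each temp reg is usable
        for (const auto& i : prog) {
            if (i.needsReg >= 0 && readyAt[i.needsReg] > t)
                t = readyAt[i.needsReg];        // stall waiting for the operand
            ++t;                                // issue the instruction
            if (i.producesReg >= 0)
                readyAt[i.producesReg] = t + i.latency;
        }
        return t;
    }

    int main() {
        // Naive order: fetch a texture, use it immediately, repeat.
        std::vector<Instr> naive = {
            {"texld", 0, -1, 4}, {"mul", 2, 0, 0},
            {"texld", 1, -1, 4}, {"mad", 3, 1, 0},
        };
        // Reordered: both fetches issued first, arithmetic hides the latency.
        std::vector<Instr> reordered = {
            {"texld", 0, -1, 4}, {"texld", 1, -1, 4},
            {"mul", 2, 0, 0},    {"mad", 3, 1, 0},
        };
        std::cout << "naive: " << cycles(naive) << " cycles, reordered: "
                  << cycles(reordered) << " cycles\n";   // 12 vs 7 under this toy model
        return 0;
    }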
Re:cant be that bad (Score:5, Insightful)
"And thats why I'm with NVIDIA"
Re:cant be that bad (Score:5, Insightful)
Times have changed, however, and DirectX development has leapt forward in a way that would be nearly impossible for OpenGL to match as quickly. Maintaining platform compatibility is great, but it does severely limit the development speed of the language when it comes to new features that developers need. With DirectX, there's a single codebase for all developers that's updated fairly frequently, with new features available to everyone.
I'm not bashing OpenGL; it's a great language that is well suited to jobs where platform cross-compatibility is of paramount importance: industrial graphics applications, 3D, etc. That said, most of those applications now support DirectX as well, but retain OpenGL for compatibility reasons.
OpenGL is just not all that valuable for games anymore, with DirectX being a better alternative for Windows games where porting to other platforms isn't a concern.
N.
Re:cant be that bad (Score:5, Informative)
Re:cant be that bad (Score:3, Interesting)
Fast forward to today when developers are still somewhat mystified with DirectX, being the moving target that it is. OpenGL is still a standard
Re:cant be that bad (Score:3, Informative)
Re:cant be that bad (Score:2, Interesting)
R350:
8 pipelines
8 FPUs
NV35:
4 pipelines
4 FPUs
NV35 will always be half as fast as R350 per clock when using fully compliant precision. All the Det 50 drivers will do is introduce more cheats and even lower image quality (driverheaven.net previewed 51.75 in AquaMark3, a DX9 program, and found IQ to be significantly worse than 44.03 and 45.23).
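As a back-of-envelope check of that per-clock claim (only the pipeline/FPU counts above come from the post; the one-full-precision-op-per-FPU-per-clock assumption is mine and is a simplification):

    #include <iostream>

    int main() {
        // Counts from the parent post; assume each FPU retires one
        // full-precision shader op per clock on both parts.
        const int r350_fpus = 8;
        const int nv35_fpus = 4;
        std::cout << "R350/NV35 per-clock full-precision throughput ratio: "
                  << static_cast<double>(r350_fpus) / nv35_fpus << "x\n";  // prints 2x
        return 0;
    }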
Does it really matter? (Score:5, Funny)
Re:Does it really matter? (Score:5, Funny)
Cash (Score:4, Funny)
GOD (Score:2, Interesting)
Just give me FFVI or give me... well, Metal Gear Solid.
mac
Re:GOD (Score:2)
There are certain things that can be said again and again, without being overly redundant.
I was a huge fan of the first Tecmo Bowl [gamespot.com]. It truly was an absolute masterpiece. The sad thing is, the ONLY time I ever got to the final game, my wife (ex now...) actually came out and unplugged my Nintendo, because I had been playing for like 15 hours straight.
So- keep on talking about Tecmo, because it really is one of the great ones. I don't know if I would play it today (some things are best left in the fog o
Yes but (Score:2, Interesting)
What concerns me is whether the practice of producing games that work with _nothing_ other than recent nVidia and ATI cards continues. Game after game comes out which simply does not work on other brands' video cards.
Re:Yes but (Score:2)
If Half-Life 2 isn't going to work on nVidia cards, or if they decide to completely ignore everyone but the Windows market, I'll just wait for Doom 3. I can be
Not all that deep and mysterious... (Score:5, Insightful)
I don't think that article says anything about one hardware platform being better than the other, and I don't doubt that had NVidia won the bundling deal, they would've had a "NVidia Shader Day" event, regardless of the performance of the product.
I still find the most interesting point being that Valve says they had to put in a lot more time and effort making the gaming experience on NVidia cards good than on ATI cards, to the point of developing a separate graphics path for NVidia chips.
If the solution to the performance issues was a simple driver update from NVidia (WITHOUT degrading quality in any way), then surely Valve would've left it to NVidia to handle and proceeded to spend their time working on the game itself...
N.
Re:Not all that deep and mysterious... (Score:2)
Err, I think that was sort of the point of the article...
If the solution to the performance issues was a simple driver update from NVidia (WITHOUT degrading quality in any way), then surely Valve would've left it to Nvidia to handle and proceeded to spend their
Re:Not all that deep and mysterious... (Score:2)
At least that's the way I read it.
N.
Re:Not all that deep and mysterious... (Score:3, Insightful)
Valve has said that they do not have said drivers. To wit, Valve has stated that until drivers are public, it's not appropriate to bench with them. NVidia says they do have them, but that's irrelevant by the above argument.
So Valve has spent a *LOT* of time optimising code, since until it's actually in a release it really isn't a usable driver. And as history has shown, a "benchmark driver" and a public official driver are often very different performance-wise.
If I wa
I think it's safe to rule out linux, even more now (Score:2)
Hardware release and driver quality (Score:4, Insightful)
Maybe Not the Bestselling Game (Score:5, Interesting)
Re:Maybe Not the Bestselling Game (Score:2)
Re:Maybe Not the Bestselling Game (Score:5, Interesting)
It would make business sense to not have them clash and get released at the same time, so I expect Doom 3 won't ship this year, but in the first quarter of next (unless they aim for Christmas, though I can't see it being much of a 'Christmas Title', what with the evil-scary-hell-spawned-zombies that make you want to turn all the lights on and hide under the sink with a big kitchen knife).
As impressive as HL2's physics/environment engine (and use of DX9) clearly is, Doom 3 is still going to have the edge in rendering jaw-dropping indoor environments with stupid amounts of eye candy, so at least it won't look 'aged' or suffer from the later release date.
bla bla bla bla (Score:5, Insightful)
Re:bla bla bla bla (Score:5, Interesting)
The 50 series drivers were incomplete during HL2 development. The driver samples that nVidia was providing to Valve were milestone drivers - incomplete featurewise, but each completed feature was "complete" (written to specs and considered stable). The fact that fog was not rendering is likely not a speed hack, but an as-yet incomplete (as in not even started in that driver release) feature.
Trust is a hard thing to earn, and easy to lose. I'm withholding judgement until nVidia's promised 50 series drivers come out.
Re:bla bla bla bla (Score:5, Insightful)
The 50 series drivers were incomplete during HL2 development. The driver samples that nVidia was providing to Valve were milestone drivers - incomplete featurewise, but each completed feature was "complete" (written to specs and considered stable). The fact that fog was not rendering is likely not a speed hack, but an as-yet incomplete (as in not even started in that driver release) feature.
Even if this is a driver bug and not a speed hack, if there are missing graphical elements in Half-Life 2 with the 50.xx drivers then Valve certainly did the right thing when they asked reviewers not to use them for benchmarking.
[]s Badaro
Re:bla bla bla bla (Score:2)
Don't single out nVidia for this. You must have a short or selective memory. Remember the Quack 3 [tech-report.com] fiasco?
Bottom line is, any company's going to do whatever they think they can get away with to sell more cards. Doesn't make nVidia any more evil than ATI.
Conspiracy Theorists (Score:5, Interesting)
Re:Conspiracy Theorists (Score:2)
Doesn't that defeat the purpose of having a generic DirectX API? One size fits all, and all that.
To hell with both Nvidia and Ati! (Score:5, Funny)
Let's see how Half Life 2 will run on my 3DFX Voodoo 1 & S3 Virge!
ati and nvidia dx9 (Score:3, Interesting)
Re:ati and nvidia dx9 (Score:5, Interesting)
Hardly... Having used the most recent versions of both the linux ATI drivers and the linux nVidia drivers, I can honestly say that ATI's drivers are much more stable, and perform just as well as nVidia's drivers. In my opinion, each release from nVidia (in the last year or so) has gotten much less stable, while ATI's drivers keep improving.
Dinivin
Re:ati and nvidia dx9 (Score:3, Informative)
Re:ati and nvidia dx9 (Score:5, Informative)
The driver even handled an upgrade to Kernel 2.6 without flinching. NVidia AGPGART support doesn't have to be hacked in any more either, it would seem.
No more mucking around with the FireGL drivers from the German branch of ATI.
Re:ati and nvidia dx9 (Score:3)
This classic AFU post [google.com] does a very good job of explaining why that's nothing to be peeved about. Worth a read.
The Dark Horse! (Score:3, Funny)
BitBoys will come back I tell you!
with two main card companies? (Score:2)
here's hoping that the same thing doesn't happen in the future with doom.
however, as long as the games work, regardless of which card you choose, doesn't matter in the end. i think this might be one case where microsoft is helping rather than hurting- were it not for directx, i think we'd be in a really confusing situation. i sure don't miss dos games.
i can't believe i said that about microsoft. ah well.
What about OpenGL? (Score:5, Insightful)
What about OpenGL? I only purchase OpenGL games, because I can mostly make them run in Linux, and WineX is only an ugly workaround for running games in a non-native environment. If I ran a game company, I'd take care of potential Linux customers.
Re:What about OpenGL? (Score:2)
Re:What about OpenGL? (Score:2)
Re:What about OpenGL? (Score:2)
Re:What about OpenGL? (Score:5, Insightful)
Aside from the comments about how Linux users might be more likely to pirate the games instead of buying them.....I'd like to point out the fact that the Linux userbase is literally NOTHING compared to the Windows userbase. I'm sorry, Linux is nice...but you Linux advocates have to realize that while your system might be superior in many ways....it still just lacks the pure numbers of Windows, or even Mac.
So, of course you'll get karma for making a pro-linux comment, but you'd never get modded down here for the fact that your idea of taking care of linux users is just a BAD BUSINESS IDEA (at the time). It's a waste of money on support and development when you could make a lot more money for a lot less by developing for and supporting the Windows market.
So in summary, wishful thinking never hurt anyone... but your idea is not a good business move. No hard feelings.
Re:What about OpenGL? (Score:2, Interesting)
I'm not saying "you, support Linux", I'm saying "let's make games using standard APIs such as OpenGL, and with minimal tweaking and minimal effort, you can support various OS platforms."
Clairity (Score:2, Interesting)
Is a powerful system with no cohesive graphics standard really that much better than a consistent, albeit more primitive piece of hardware?
Both sides (Score:5, Funny)
The view of ATi fanboys is this: Anyone who bought a GeForce FX is an idiot, as they obviously should have had a stolen timedemo of Half-Life 2 on hand to benchmark with. If they didn't break into Valve's offices and steal the code, that's their own fault. Also, nVidia is clearly exactly like 3dfx, because they slipped up, JUST LIKE 3DFX! Dun dun dunnn!(The Quake/Quack scandal involving ATi never existed, of course.)
The view of most sane, rational human beings is that this is just another stage of the highly competitive video card market, and that anyone who spends time arguing over which company is better needs to be tranquilized, preferably with something meant for very large animals.
Re:Both sides (Score:4, Informative)
For me, there were other considerations (Score:4, Interesting)
Which I don't. So when it came time to upgrade my system (about 2 weeks ago), Nvidia won hands-down -- and it was because they are Linux friendly, not because some rigged benchmark somewhere said they are a few frames per second faster than the other guy. Nvidia has been providing quality Linux drivers for their products for a long time, and I hope they'll continue to do so.
I've been playing a lot of Neverwinter Nights on my 5900 and it looks beautiful. I'm planning to purchase more Linux games as soon as my budget permits. Yes, there are people out there running Linux who appreciate high-end graphics cards. Probably more than the marketing types think; after all, most hacker types I know are also hardcore gamers.
Re:For me, there were other considerations (Score:2)
Not to judge people....but how many 'hacker types' do you know that buy all of their software, especially games?
Re:For me, there were other considerations (Score:2)
Screw Valve and HL2 (Score:3, Interesting)
Re:Screw Valve and HL2 (Score:3, Funny)
Re:Screw Valve and HL2 (Score:2)
I originally downloaded an ISO of HalfLife off of the net. I'll admit it! I played the entire game through to the end and went out the next day and bought an original (before I was even thinking about online play). The game was so immersive and enjoyable that I knew it was a classic keeper for my collection.
I've still got my original CD-R burn of HL sitting beside my nice "Game of the Year" edition of HL.
Granted, I've enjoyed any number o
but.... (Score:5, Interesting)
then how come these programs also show Nvidia shader performance as pathetic:
Halo PC
Tomb Raider: Angel of Darkness
ShaderMark
3DMark03
and why have the Det 50 drivers, which nVidia recommended Valve use, been proven to reduce image quality by a substantial amount?
is ATI really rich enough to buy off all of these companies and also manage to sabotage Nvidia's drivers and PR team?
Valve optimised HL2 for nVidia (Score:2)
BTW, where is this "proof" of reduced image quality to which you refer? All I'd heard of was some incorrect fogging (which is obviously bad, and is doubtless a bug that we can hope will be fixed before the Det50s are out of beta).
Re:but.... (Score:2, Informative)
they inserted static clipping planes and swapped shaders out
take a look at this article using the latest build of Halo PC and take note of the developer comments at the top
i think you will find you're wrong
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm
also, Tomb Raider: Angel of Darkness being a shite game is irrelevant in this matter
its one of the only DX9 programs out at the moment and s
A Couple Things (Score:2, Insightful)
Second, check out this image quality comparison [driverheaven.net] over at DriverHeaven with Aquamark 3. It sure looks to me like nVidia is back to their old tricks again.
Video card benchmarks: the epitomy of dishonesty. (Score:5, Interesting)
At the same time, both card makers are really putting out insane results that wouldn't have been thought of even a couple of years ago.
My decision in graphics cards is based on my past experience and driver support. In this area nVidia still wins hands down. If ATI wants to sell me a card, they're going to need to beef up their Linux driver support big time.
Re:Video card benchmarks: the epitomy of dishonest (Score:3, Insightful)
Re:Video card benchmarks: the epitomy of dishonest (Score:3, Informative)
As Valve has said, if the driver isn't an official release, it's not appropriate to bench with it. NVidia may turn their drivers around, but given the industry's history, Valve's caution is reasonable. Valve knows, and has stated a number of times in a couple of recent interviews, that the majority of their customers are NVidia users. Hence the great deal of time spent optimising their code.
The auction is just good business.
Re:Video card benchmarks: the epitomy of dishonest (Score:2)
Check out this comment [slashdot.org] a few posts above yours. Looks like ATI is cleaning up its act when it comes to Linux.
I'll tell you what happened (Score:4, Interesting)
Actually... (Score:2, Informative)
not so interesting (Score:2)
Step 1. Produce crap
Step 2. Convince enough people to buy it before they realize it's crap
Step 3. Profit
Now, the reason Step 2 is so difficult is that I can always download a demo, try it at the store,
The rest of the story (Score:5, Interesting)
Why? Well, one stated reason was a policy to test only with "publicly available hardware, and publicly available software". Laudable enough, considering that non-public drivers could have any number of bugs or "optimisations" that could render the game incorrectly and thus misrepresent its performance.
Indeed, Valve referred to an issue where fog was completely left out of an entire level, and though they didn't point any fingers, it was later revealed that yes, the beta Det 50s were the culprit.
For further info, you should read this [amdmb.com] report on the performance of the beta Release 50 Detonators. Summary: not much difference - at least for DX8-level games. DX9 is where the focus supposedly was, and there is a 25% gain in the PS2.0 test in 3DMark03, which is something.
However, who knows if it'll translate to a 25% gain in HalfLife 2 - probably not, in itself. And given recent 3DMark/nVidia events, even that much is uncertain, until the drivers are released for public examination. In any case, it's a long way short of the 100% gain needed for the 5900 Ultra to just draw even with the 9800 Pro.
nVidia apparently have a strong lead in Doom 3 scores, though (admittedly with the partial-precision NV3X-specific code path), so they will no doubt be hoping that Doom 3 outsells HalfLife 2... Myself, I have a 9600 Pro in my sights, just in time for the HL2 release :-)
BTW, regarding the release delay? According to Gabe Newell, "First I've heard of it". So there you are. Only 16 days to go...
Re:The rest of the story (Score:5, Informative)
They would have to deliver a 50% gain in HL2 for the FX 5900 to catch up to the Radeon 9800 Pro, not 100%: the graph with the customized nVidia code path shows 40 fps vs. 60 fps, and going from 40 to 60 is a 50% gain. Although, of course, the nVidia path is lower quality, since the 5900 doesn't do 24-bit precision.
Also, I wouldn't call it a CLEAR lead in Doom 3. The nVidia card scores 20% higher on medium quality, but the Radeon takes the lead on high quality. Again, nVidia calls it a driver problem.
Myself, I will be upgrading for Christmas, when I will know for sure which one works best, and how the drivers are. This is also the time when the FX6000 Super Mega Turbo and Radeon10K Elite Pro Plus Plus(Or whatever) push the prices down on the "older" cards ;)
Re:The rest of the story (Score:2)
Ironically, I bought my 9700 Pro the day (well, afternoon) that I got ahold of the Doom 3 leaked alpha, so I could tinker with it at considerably better performance than my old GeForce2 GTS allowed.
The fact that it turns out to have far supe
Another DX9 Benchmark (Score:2)
Re:Another DX9 Benchmark (Score:3, Informative)
From driverheaven.net [driverheaven.net]:
UM (Score:3, Funny)
From the "michael works for NVidia" department (Score:2, Informative)
Putting things into perspective. (Score:2, Informative)
NVIDIA required a patch (Score:2, Interesting)
Likely: There is no sabotage (Score:3, Informative)
It can't run "true" DX9 spec games worth crap.
Why?
Because to save die space, nVIDIA engineers decided it'd be best to use 32-bit FP units, compared to ATi's more numerous 24-bit FP units. The DX9 spec calls for 24-bit precision computations, which is ATi's native precision (and which can then be mapped down to 16-bit or extended to 32-bit precision, if asked for), whereas the FX, which has to operate in 32, 16, or 12(?) bit modes, basically loses half its registers (or more, if you are comparing to 12-bit registers) because it must run in 32-bit mode to be compliant.
End result? Fewer high-speed registers on the FX part, more swapping from RAM, and less FP computational power to go around.
And this is only a simple example. I believe it has been noted that Carmack alluded to many ugly optimizations (lower-precision math, proprietary shader paths) that he had to make to the D3 engine so that the FX wouldn't suck utterly in terms of performance. It isn't really a playable DX9 part, all in all.
If Valve says they spent serious time working on the GeForce codepath, then they probably did so; either that, or they would have mentioned nothing. (And indeed, it is quite a bit faster in hybrid mode, but now they are making it well known that it isn't running "true" DX9, which is the truth. It should also be noted that this hybrid mode is what the D3 benchmark was run in, which offered the nVIDIA part such stellar performance, as specifically noted by Carmack.)
Drop the "it must be corporate scandal" bit. If you read some of the specs and dev notes, you will note that developers more or less universally have gripes about getting DX9 performance out of the FX part.
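A toy model of that register-pressure point (the per-pipeline register-file size below is an invented number, not a real spec for either chip; it only illustrates why running everything at 32-bit leaves fewer temporaries than 16-bit):

    #include <iostream>

    int main() {
        // Hypothetical register-file budget per pipeline, in bits.
        const int registerFileBits = 1024;
        const int componentWidths[] = {16, 24, 32};   // FP16 / FP24 / FP32
        for (int w : componentWidths) {
            // A shader temp register holds 4 components (x, y, z, w).
            std::cout << "FP" << w << ": " << registerFileBits / (w * 4)
                      << " four-component temp registers available\n";
        }
        // Output: FP16 -> 16 temps, FP24 -> 10, FP32 -> 8. Running at 32-bit
        // halves the temps relative to 16-bit, so shaders spill and stall more,
        // which is the "less to go around" point made above.
        return 0;
    }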
Blame Microsoft (Score:3, Insightful)
After nVidia's falling out with them over the Xbox chipset pricing, it's likely MS changed the DX9 spec mid-development and only gave the new specs to ATI. That's why ATI's cards are perfectly designed to run DX9 while nVidia's specs are off. For example, DX9 calls for 24-bit FP, which ATI does, while nVidia only supports 16- or 32-bit, forcing developers to choose between correct rendering and improved performance.
Also, nVidia is to blame for their driver-cheating fiasco, which makes developers especially wary of trusting beta or "optimized" drivers, and for expecting every game company to optimize for their cards just because they're the biggest.
Capitalism 101 (Score:2)
Um, capitalism unless I missed my guess. More specifically, the relationship between gaming hardware and software is finally maturing to the point to realize one of the more advanced techniques used in making money-- Networking. Both markets are now not only making more money than before, but are increasingly reliant on one another. Something like this was only a matter of time, IMO. You may have noticed it in that "Exclusive Game Demo" story (
Views (Score:3, Interesting)
Now this issue is quite different. There was a write up recently on why NVIDIA hardware is so much slower than ATI hardware when using 2.0 pixel shaders. I don't remember the URL, so if anyone would be so kind to post it that would be great. Basically, it was stating that the Detonator 40 drivers needed to be rewritten to better take advantage of 2.0 pixel shaders. Detonator 50 drivers are a lot faster and fix this problem, but they do reduce image quality [driverheaven.net] quite noticeably. This could be the reason that swayed Valve's decisions.
The fact of the matter is, we need next generation GeForce chips.
Microsoft Hails 'Half-Life 2' as New Benchmark (Score:2, Interesting)
See here [yahoo.com] for the full ad
Whatever happened to evidence? (Score:3, Interesting)
"Whatever happened to just making hardware, and making games?"
Whatever happened to the good ole days when people didn't believe everything they heard or read?
I'm just skeptical of an article that says we "heard from a friend of a friend." It's all too speculative, with little evidence of any real wrongdoing. Newell expressed concerns about the drivers that Nvidia was offering. He also said it took three times as long to write the codepath for NVIDIA, implying that they had to account for a lot more problems. If you want to speculate, look at the slides from "shader day." [tomshardware.com]
To quote: "During the development of that benchmark demo Valve found a lot of issues in current graphic card drivers of unnamed manufacturers:
Camera path-specific occlusion culling
Visual quality tradeoffs e.g. lowered filtering quality, disabling fog
Screen-grab specific image rendering
Lower rendering precision
Algorithmic detection and replacement
Scene-specific handling of z writes
Benchmark-specific drivers that never ship
App-specific and version specific optimizations that are very fragile"
And we know that several of these have been explicitly tied to NVIDIA.
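For anyone wondering what something like "camera path-specific occlusion culling" looks like in practice, here is a purely hypothetical sketch (invented names and numbers, not taken from any real driver): if the camera sits on a known, canned timedemo path, geometry the canned flight never sees can be clipped away, which inflates the benchmark score and falls apart the moment the camera moves freely.

    #include <iostream>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // Hypothetical: camera positions recorded from a benchmark's fixed timedemo.
    static const std::vector<Vec3> kKnownBenchmarkPath = {
        {0.0f, 0.0f, 0.0f}, {1.0f, 0.0f, 0.0f}, {2.0f, 0.0f, 1.0f} /* ... */
    };

    bool onKnownPath(const Vec3& cam) {
        for (const auto& p : kKnownBenchmarkPath) {
            const float dx = cam.x - p.x, dy = cam.y - p.y, dz = cam.z - p.z;
            if (dx * dx + dy * dy + dz * dz < 0.01f)
                return true;            // camera matches the canned flight path
        }
        return false;
    }

    // In the draw path: drop objects behind a hand-placed static clip plane,
    // but only while the camera is on the recognised benchmark path.
    bool shouldDraw(const Vec3& cam, bool behindStaticClipPlane) {
        if (onKnownPath(cam) && behindStaticClipPlane)
            return false;               // "optimization" invisible to the canned demo
        return true;
    }

    int main() {
        Vec3 benchCam{1.0f, 0.0f, 0.0f};        // on the canned path
        Vec3 playerCam{5.0f, 3.0f, 2.0f};       // a real player, off the path
        std::cout << shouldDraw(benchCam, true) << " "    // 0: culled for the benchmark
                  << shouldDraw(playerCam, true) << "\n"; // 1: drawn for real play
        return 0;
    }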
Deal is irrelevant... (Score:4, Insightful)
Here's Valve's problem: They make moddable games. That's at the core of their business. They didn't just make HalfLife as a game (although they did that, and very well) -- they made it as a platform upon which anyone was free to develop their own FPS games: CounterStrike being the most famous, but there are many others, such as Natural Selection or Day of Defeat.
Likewise, they are not just making HalfLife 2, but a platform upon which mods will be made. But why is this relevant to the videocard debate? Here's where we get back to the drivers.
The drivers -- the mythological r50 drivers that no one's actually gotten their hands on yet -- might well provide a speed boost to HL2 as it stands. Maybe. But if they do, it is because they have hand-tuned those drivers for HL2. See Mr Burke's quote:
What he omits is: the best experience possible for the specific subset of videocard functionality currently present in HL2, at this time. A little background for those of you who haven't kept up on recent videocard technology: modern videocards have Vertex Shaders and Pixel Shaders. These are essentially short programs, written in assembler (and now a variant on C), that the driver compiles and executes on the videocard, not the CPU (taking load off of it), and that customise rendering in various ways. Vertex shaders typically perform lighting, animation or mesh deformation effects, while pixel shaders provide surface material effects, such as water distortion or bump mapping.
ATI's cards appear to be able to handle any pixel shader program you throw at them. Whether this is because the cards are just that fast and general they can cope with it, or whether the compiler in their driver cunningly optimises any GPU program you throw at it (the same way a C compiler optimises CPU code, by reordering instructions to avoid stalls, factoring out loop invariants, etc) we don't know. Frankly, we don't care: The important thing is, we write code, and it works.
NVidia's cards do not work this way. NVidia's cards are fast, but only if you hand-tune your assembler to precisely match their architecture. Except we don't know enough about their rules to do this (proprietary NVidia technology blah blah).
When Valve have written shaders, found them to be fast on ATI cards and slow on NVidia's cards, NVidia have released new drivers and, lo... it's fast on NVidia's cards. NVidia go "hey, uh, our bad... driver bug... fixed now...". But make even a tiny, trivial change to the shader, and bam: it's slow again. With a little more experimentation along these lines, it's easy to come to the conclusion that there was no bug, there is no fix, NVidia simply have a lookup table of shaders they 'recognise', and when one of those comes along, they replace it with one they wrote themselves, hand-tuned for their card.
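If that theory is right, the mechanism would be something like this toy sketch (entirely hypothetical; the map, the shader strings, and the function names are invented, and a real driver would key on compiled bytecode rather than source text): recognised shaders get swapped for hand-tuned replacements, and any edit, however trivial, misses the lookup and falls back to the slow generic path.

    #include <iostream>
    #include <string>
    #include <unordered_map>

    // Hypothetical table of "recognised" shaders and their hand-tuned replacements.
    static const std::unordered_map<std::string, std::string> kHandTuned = {
        {"ps_2_0 texld r0,t0,s0 mul r0,r0,c0", "ps_2_0 /* hand-scheduled fast version */"},
    };

    std::string compileShader(const std::string& source) {
        const auto it = kHandTuned.find(source);
        if (it != kHandTuned.end())
            return it->second;              // exact match: substitute the fast version
        return "/* slow generic compile of: */ " + source;
    }

    int main() {
        // The shader the game shipped with: recognised, so it gets the fast path.
        std::cout << compileShader("ps_2_0 texld r0,t0,s0 mul r0,r0,c0") << "\n";
        // A modder changes one constant register: lookup misses, back to square one.
        std::cout << compileShader("ps_2_0 texld r0,t0,s0 mul r0,r0,c1") << "\n";
        return 0;
    }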
There's a problem with this, of course. For a start, if you're not as big as Valve, NVidia aren't going to set aside an engineer to go around optimising shaders for your game or release new drivers. Secondly, if you make any changes you're back to square one and need to resubmit your shader to them and get it fixed up. Thirdly, if like Valve you care about modders, you're not going to be happy with this "solution" -- because even once your game is complete and on store shelves, and NVidia have stopped making new driver releases to 'fix' it, modders can make new shaders. And suddenly find their game runs like ass. You think NVidia are going to go chasing after modders? Bwahaha.
I suspect this is why Valve were careful about the benchmarks they let be
I never thought I'd see /. get it so wrong! :( (Score:3, Interesting)
It IS a conspiracy, but one entirely of nVidia's own doing and creation... their hardware simply can't do DX9 well, as it was never designed to. There are many reasons for this, but it mainly comes down to this: nVidia tried to redefine the standards of the graphics industry and failed, and now they are paying the consequences for their hubris.
The only thing surprising here is the size of Gabe Newell's balls, to come out and directly address this in such a fashion, and I truly respect and admire him for it. He HAD to; the game is going to come out, and if he didn't, customers would be blaming him and Valve for the FX's shortcomings!
I'm terribly disappointed in the coverage I've seen of this on Slashdot; I really thought you folks would be able to appreciate the subtle (and not so subtle) aspects of a giant company that has been resting on its laurels and using PR FUD to make up for its hardware's shortcomings... it's just that now there is a game coming out that will highlight this, and the rest of the world seems to be noticing it.
There is excellent coverage of this at www.beyond3d.com for in depth analysis, and www.nvnews.net has the best of the fanboys/ex-fanboys discussing it. (Our team at www.elitebastards.com is still the best at keeping up with all the latest stories though...
Re:Payola (Score:2, Interesting)
I wonder why they had to decrease IQ if their shader support was as good as you claim.
Re:Payola (Score:5, Informative)
Re:Payola (Score:2)
Re:Payola (Score:5, Informative)
The Nvidia FX series has been plagued with problems from the get-go, with Nvidia resorting to a massive PR blitz and outright cheating in their drivers to compete with ATI.
Parent post is truly laughable and shows an ignorance of what has transpired over the last year in the video card industry.
R300 was a new design studio for ATI (Score:5, Informative)
In the R300, ATI decided to do all their calculations in 24-bit floating point: essentially a pure next-gen chip. The NVidia GeForce FX design was based on their DX8 chips, which were far and away the industry leaders in fixed-point calculations; NVidia didn't figure that floating-point performance would be very important this generation and tacked it on. What they ended up with was a chip that had a high transistor count and was very good at legacy fixed-point operations, but could not keep up with ATI in floating point. Even then (about a year ago) NVidia's chip might have been competitive, but they had process problems that made the chip clock slower than expected and arrive about 9 months late.
ATI's superiority in floating point shaders has been demonstrated by various benchmarks (including some open-source benchmarks, which are the only ones I really take seriously) time and again. NVidia can only be competitive this generation when they 'tweak' their drivers for particular benchmarks. These tweaks sometimes consist of rewriting floating-point shaders to use their legacy fixed-point functionality, and on some occasions of even using pre-generated shadow models to replace the dynamically generated models of benchmarks that run over a known scene.
NVidia's NV3x generation seems weak, compared to ATI, and very weak unless game coders ignore API standards and write custom shaders that do as much as possible in NVidia's legacy hardware. Of course, by historical standards NVidia's NV3x isn't weak at all--they blow away all their competitors and ATI's pre-R300 products. It's just that the design choices made by ATI's new designers allowed them to leapfrog a generation.
Re:R300 was a new design studio for ATI (Score:4, Interesting)
[disclaimer: I couldn't care less which one sucks less, ATI or nVidia; I personally like Matrox, as I like "solid state" cards with dual head and have no need for 3D stuff]
You, hypocrite bastard, forgot to mention that NV3x supports 16 *and* 32-bit floating point, while ATI only supports 24. Of course, when you do things at 32-bit precision, NV3x is slower than ATI at 24-bit, and when you do it at 16-bit it's faster, but the quality isn't that good.
God Carmack has written more than enough about this in his
http://slashdot.org/comments.pl?sid=65617&cid=605
and
http://finger.planetquake.com/plan.asp?userid=joh
Now, what *I* want is for the video card manufacturers (and hardware manufacturers in general) to stop behaving like 3-year-olds and fucking document their products and release open source drivers. That would get rid of most of this bullshit that is just driven by marketoids, wastes everyone's time, and contributes nothing to improving the technology, which is what really matters.
And maybe then we could get some decent 3D system under Plan9 that kicked everyone else's ass, just like draw* did for 2d
* Not related in any way to DirectDraw, you ignorant idiot.
ATI has been ahead since the original Radeon (Score:4, Interesting)
ATI WAS the first to market with a DirectX8.1 solution, in the Radeon 8500. The Radeon 8500's Pixel Shader v1.4 was more advanced than any nVidia product until the release of the Geforce FX. The Geforce4 Ti only supported PS1.3, which is significantly less advanced.
ATI WAS the first to market with a DirectX9.0 solution, the Radeon 9700 Pro. nVidia still lags behind, with the Geforce FX offering well below average shader performance even when using their reduced accuracy shader programs.
The best proof of the R300+ platform's superiority is that nVidia's own, in-house developed DirectX9.0 demos run faster and look better on Radeon hardware than on the Geforce FX. If that isn't a damning indictment of the poor quality of the NV30 architecture, I don't know what is.
Re:Seriously, what are they thinking? (Score:2)
How is this akin to that? Day of Defeat is totally free. Why does it matter where you get it from? Your concept of the importance of this matter appears to be minimal at best.
No this is terrible. Deceit, lies, cover-ups, buy-outs. I, for one, am not buying Half-Life 2. After Steam and all this baloney, I just don't need it. No, I'm not going to download it either. I also will not be patching any Half-life 1 products until S