NVidia Fights Back Against ATI At Editor's Day
Thanks to FiringSquad for their feature covering NVidia's recent editor's day, discussed in context of the graphics card company's continuing rivalry with ATI. The writer suggests: "It's become rather trendy to bash NVIDIA lately. People like winners and people love underdogs. ATI is both right now - they've climbed their way out of the abyss and even disregarding the NV30 production delays, their timetable was catching up to NVIDIA's." But, after an interview with Tim Little at Ion Storm Austin and technical questions answered by Tim Sweeney of Epic, the writer concludes: "What the benchmarks have proven is that NVIDIA's hardware is as fast as ATI's, depending on the game. Yes, it does take more work - NVIDIA admitted as much. The NV3X platform isn't as easy to program fast as R300 and R350 are."
So, this is what I'm getting out of that: (Score:1)
I can see already that I would be terribly unenthused about working on nVidia-specific performance enhancements.
Re:So, this is what I'm getting out of that: (Score:5, Insightful)
1) nVidia has fewer hard-wired limitations on the complexity of the code being run and the accuracy of the calculations being made, though each could come at the cost of speed if used heavily
2) nVidia might be easier to develop for under OpenGL because you have better access to the hardware through vendor extensions (see the sketch below), whereas DirectX9 in certain areas tends to more closely follow the ATI hardware (which was available to developers and MS before DirectX9 was complete)
3) As the two companies progress, the performance difference will diminish: nVidia's drivers will be more heavily optimized, and both manufacturers will release new hardware. On nVidia's side that means more speed to throw at the existing feature set; on ATI's side, improvements to the feature set to better leverage the growing speed of the hardware.
In other words, this is the closest things have ever been in this particular race, and neither company is out of it yet. The winner won't be determined by the current crop of games or hardware, but instead by what developers (and the 2 manufacturers) do after UT2004, Doom 3, and Half-Life 2.
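Point 2, for what it's worth, shows up right at the API level: under OpenGL an engine can check the extension string and pick a vendor-specific fast path. A rough C sketch of the idea (the function and messages are made up for illustration; GL_NV_fragment_program is nVidia's NV30-only extension, GL_ARB_fragment_program the vendor-neutral one):

    #include <GL/gl.h>
    #include <string.h>
    #include <stdio.h>

    /* Requires a current OpenGL context. Real engines do something
       similar when choosing between vendor-specific render paths. */
    void pick_render_path(void)
    {
        const char *exts = (const char *)glGetString(GL_EXTENSIONS);

        if (exts && strstr(exts, "GL_NV_fragment_program")) {
            printf("using NV30-specific fragment path\n");
        } else if (exts && strstr(exts, "GL_ARB_fragment_program")) {
            printf("using generic ARB fragment path\n"); /* R300 and NV30 both export this */
        } else {
            printf("falling back to fixed function\n");
        }
    }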
Competition is usually good for the consumer (Score:4, Insightful)
NVidia got some very unexpected competition while sitting on their laurels. I think that this was a real wake-up call and lesson for them, not in the realm of technology so much, but in the realm of promotion and advertising. Their FUD actually got turned on them, and hard, when drivers were shown to be tuned for benchmarks and such.
However, once they accepted ATI as a real contender, it seems they started working on their technology again, instead of putting out whiny press releases and bad PR.
And though consumers took a hit with hastily released drivers and hardware, it looks like things are turning around for the good of us all.
Re:Competition is usually good for the consumer (Score:2)
I wouldn't say they were sitting on their laurels. A more plausible analysis would be that they bit off more than they could chew when they decided to build a chip for the Xbox, move to 0.13-micron technology, and acquire 3dfx all at the same time.
Cross platform compatibility (Score:2)
Asmo
Re:Cross platform compatibility (Score:3, Interesting)
But, as someone else has already mentioned, sometimes it's the sheer scope (1600x1200 with full features) and not just the framerate.
Well... (Score:3, Informative)
50 is sort of a silly number - most people have their refresh at 60 or 72. To a seasoned FPS gamer, 60 is distinguishable from 50. Whether 130 is distinguishable from 120 is another question - the answer to which is definitely no, even if you had a monitor capable of such silliness.
However, these numbers are really not what we're worried about a lot of the time - we're worried about absolute minimum framerate. Often a game will be chugging along fine on average while dipping hard in its busiest scenes, and it's those dips you actually feel.
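To make that concrete, here's a toy C sketch of a benchmark loop that reports the minimum framerate alongside the average (render_frame() is an empty placeholder, and clock() stands in for a proper high-resolution timer):

    #include <stdio.h>
    #include <time.h>

    static void render_frame(void) { /* the real rendering work goes here */ }

    int main(void)
    {
        const int frames = 1000;
        double worst = 0.0, total = 0.0; /* frame times in seconds */

        for (int i = 0; i < frames; i++) {
            clock_t t0 = clock();
            render_frame();
            double dt = (double)(clock() - t0) / CLOCKS_PER_SEC;
            total += dt;
            if (dt > worst) worst = dt; /* track the single slowest frame */
        }
        if (total > 0.0 && worst > 0.0)
            printf("average: %.1f fps, minimum: %.1f fps\n",
                   frames / total, 1.0 / worst);
        return 0;
    }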
Framerate and our eyes (Score:3, Informative)
As I recall, visual research indicates that humans can successfully discern fluid motion from frame-based motion up until about 400fps. Of course, no one has a monitor that goes up that high, but still, the point stands.
I did try to find a cite; the closest I could find was this page [amo.net], which notes that framerates of 220fps have been shown to be distinguishable.
That's really quite impressive. (Score:2)
Re:Cross platform compatibility (Score:2)
Yes, higher detail. More polygons, more features, higher resolution. If you get 50fps in a highly-detailed scene, you'll get >200fps in minimally-detailed ones. Benchmarks work on the theory that the reverse is true as well (if P then Q; Q, therefore P; that's simplified, but it's not actually fallacious reasoning here, because the performance relationship runs both ways). This is why Q3 benchmarks that yield >100fps for all cards still matter, because the benchmarks usually don't test highly-detailed scenes.
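The arithmetic behind that 50fps-to->200fps claim is just reciprocals of frame time; a toy C illustration, assuming (purely for the sake of the example) that a minimal scene is about a quarter of the rendering work:

    #include <stdio.h>

    int main(void)
    {
        double detailed_fps = 50.0;
        double work_ratio   = 4.0; /* assumed: minimal scene = 1/4 the work */

        /* fps = 1 / frame_time, so quartering the work quadruples the rate */
        printf("minimal-detail estimate: %.0f fps\n", detailed_fps * work_ratio);
        return 0;
    }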
Drivers? (Score:1)
I always gathered that ATI was best for DirectX, and NVIDIA best for OpenGL, though that might be wrong. Surely, though, regardless of what is easy to program for, simple raw performance under GL/DX is pretty important for all the apps that don't optimise for the major graphics cards?
Re:Drivers? (Score:1)
I especially enjoy playing Return to Castle Wolfenstein: Enemy Territory in Linux, which reduces the need to reboot into Windows!
Re:Drivers? (Score:1)
As a Radeon 8500 owner, I'm eagerly awaiting a patch to this annoyance.
Re:Drivers? (Score:1)
Big bummer. I spent Saturday evening, after upgrading my 9000 Pro to a 9600 Pro, cursing and swearing while Morrowind kept crashing whenever I'd load a zone. Once I figured out that downgrading to the Catalyst 3.7 drivers fixed it, all worked well.
On performance (Score:2)
Re:On performance (Score:1)
The article pretty much counters your statement, as it specifically mentions that the ATI cards have limits on shader complexity and accuracy well below those of the nVidia cards. The speed gap, on the other hand, will have to be accepted until nVidia can produce better drivers and optimized games are released.
Re:On performance (Score:2)
ATI's F-buffer renders the 64-instruction limit moot, offering support for essentially unlimited pixel shader program lengths. Realistically, nVidia cards are going to be running in low-accuracy mode simply to attain usable framerates. Also, even in high-accuracy mode, final frame output on a GeForce FX is still lower quality than on a Radeon due to the poor-quality filtering forced on the FX to improve performance.
The fact of the matter is that updated drivers and games aren't going to make a difference
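For anyone who wants to see those instruction limits first-hand, OpenGL's ARB_fragment_program extension lets you query them. A hedged C sketch (assumes a current GL context and that the extension function pointer has already been loaded elsewhere, e.g. via wglGetProcAddress or glXGetProcAddress):

    #include <GL/gl.h>
    #include <GL/glext.h>
    #include <stdio.h>

    extern PFNGLGETPROGRAMIVARBPROC glGetProgramivARB; /* loaded elsewhere */

    void print_fragment_program_limits(void)
    {
        GLint total = 0, alu = 0, tex = 0;
        glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                          GL_MAX_PROGRAM_INSTRUCTIONS_ARB, &total);
        glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                          GL_MAX_PROGRAM_ALU_INSTRUCTIONS_ARB, &alu);
        glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                          GL_MAX_PROGRAM_TEX_INSTRUCTIONS_ARB, &tex);
        printf("fragment program limits: %d total, %d ALU, %d TEX\n",
               (int)total, (int)alu, (int)tex);
    }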
Re:Drivers? (Score:2)
Since then ATI has recognized this as a huge obstacle to people using their products and done a complete 180. They now release drivers in a timely manner that work extremely well, as well as adding cool new features: 3.8 lets you make use of shaders in games that originally weren't programmed with them.
Granted, 3.8 has been reported as causing problems. But I never got around to upgrading an
Re:Drivers? (Score:2)
Also, I didn't think ATI released a 3.7... didn't they go from 3.6 to 3.8? Or am I thinking of 3.4 to 3.6?
Re:Drivers? (Score:2)
But yes, they did release 3.7; I'm running them right now.
Seems ATI got busted cheating again (Score:5, Informative)
The accusations leveled against ATi at NVIDIA's Editors' Day two days ago thus become that much more serious. Epic's Mark Rein confirmed that in some cases, high-res detail textures were not displayed in some areas by ATi's drivers and that standard, lower-res textures were used instead. Randy Pitchford of the Halo development team also mentioned that there were optimizations present in ATi's drivers which are detrimental to Halo's image quality.
The relevant link is here. [tomshardware.com]
Now that NVidia seems to be the image-quality king and is owning the mid-range card market again with the FX 5700 Ultra, it makes me wonder how ATI's performance would measure up if they didn't cheat.
Re:Seems ATI got busted cheating again (Score:2)
How much does it cost? $925
Has anyone besides THG benchmarked one?
Yanno, [google.com]
if you annoying, [firingsquad.com]
lazy, fanboy fuckers actually [tecchannel.de]
took the time to look for the answers to your own fucking questions, [google.com]
you wouldn't be so fucking annoying. [digital-daily.com]
That's from a 30 SECOND FUCKING GOOGLE SEARCH.
I call that very biased. Benchmarking something that's not out yet. It's not out and that is why AMD wins.
ON WHAT FUCKING PLANET. Jesus. EVERYONE benchmarks stuff before it's released.
Re:Seems ATI got busted cheating again (Score:1)
Hard to say either one is ahead of the other at "cheating", though either side could claim development error. I wish cheating weren't a part of the game, but you can't claim one or
I'll bite... (Score:3, Informative)
An excerpt from ATI's Response to recent allegations of benchmark cheating [rage3d.com]
AquaMark3: We are currently investigating our rendering in AquaMark3. We have identified that we are rendering an image that is slightly different than the reference rasterizer, but at this point in time we are unable to identify why that is. We believe that this does not have any impact on our performance. Our investigation will continue to identify the cause and resolve it as soon as possible. One point to note is that we rende
THG (Score:1)
THG has had a poor reputation for years now. I personally don't trust his site. I know that many, if not most, feel the same way. You may find your arguments more successful if you avoid mentioning THG as a source.
Re:THG (Score:2)
THG has a reputation for being a corporate whore, but that doesn't mean the guy's not bright. Just too big for his britches in some ways - he thinks no one will notice when he's a corporate playboy. You have to remember that along with the dumbest people, the smartest people are browsing this here internet thing on a regular basis and following links, and they may come at you unexpectedly and call you on your bullshit.
On the other hand, just read that ATI press release snippet once more, slowly. They're
Re:Seems ATI got busted cheating again (Score:2)
hardocp review [hardocp.com]
The 9600XT comes with a free copy of HL2. So that's $50 right there.
They are practically neck and neck in real-world performance, so I guess it's your call. I know which one I would pick, though. $200 and HL2 or $200 and no HL2... hmmm, tough one.
Re:Seems ATI got busted cheating again (Score:2)
Really?
When you buy the new RADEON(TM) 9800 XT or RADEON(TM) 9600 XT you will get a FREE copy of Half-Life(R) 2! [ati.com]
Re:Seems ATI got busted cheating again (Score:2)
right from ATI [ati.com]
Also, Best Buy has the coupon for HL2 on ALL ATI cards now; check here for a good deal [slickdeals.net]
I don't know what benchmarks you refer to, but please check out the hardocp review. They actually give you graphs of in-game performance rather than crap like 3dMark2003. Last I checked, I didn't play benchmarks.
Please show me these benchmarks where the 5700 "utterly destroys the 9600XT". I really don't care one way or the other, as long as I get the best deal.
Re:Seems ATI got busted cheating again (Score:3, Informative)
That article is all about nVidia's latest little shortcut. Rather than doing linear interpolation between mip-map levels, they've introduced little plateaus where they only sample one mip-map level. It saves on the amount of memory bandwidth they have to use when reading a texture.
The thing I find strangest about this little 'optimization' is that the GeForce FXs have heaps and heaps of memory bandwidth. It's not the area they really have to work on.
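A toy C illustration of what's being described (not actual driver code): true trilinear filtering blends two mip levels continuously, while the plateau trick clamps the blend weight so that only one level gets sampled over part of the range.

    /* lod_frac is the fractional part of the level-of-detail, in [0,1). */
    float trilinear_weight(float lod_frac)
    {
        return lod_frac; /* continuous linear blend between mip levels */
    }

    float plateau_weight(float lod_frac, float plateau) /* e.g. plateau = 0.25f */
    {
        /* Inside the plateaus, stick to a single mip level (weight 0 or 1);
           in between, remap the remaining range to a steeper linear ramp. */
        if (lod_frac < plateau)        return 0.0f;
        if (lod_frac > 1.0f - plateau) return 1.0f;
        return (lod_frac - plateau) / (1.0f - 2.0f * plateau);
    }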
nVidia - a developer writes (Score:2)
For example, I'd appreciate it if they could fix it so that on my GeForce 4MX, antialiased lines with width >1 pixel draw properly, rather than being drawn as width-1 lines with no antialiasing. You know, little details like that.
I know I'm hardly likely to spend time trying to use nVidia-only optimizations when even core OpenGL doesn't work properly.
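For reference, the core-GL setup being complained about is nothing exotic. Per the spec, something like this sketch should produce a blended, 3-pixel-wide antialiased line, not a 1-pixel aliased one:

    #include <GL/gl.h>

    /* Assumes a current OpenGL context inside a draw callback. */
    void draw_wide_smooth_line(void)
    {
        glEnable(GL_LINE_SMOOTH);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
        glLineWidth(3.0f); /* width > 1 should stay antialiased */

        glBegin(GL_LINES);
        glVertex2f(-0.8f, -0.5f);
        glVertex2f( 0.8f,  0.6f);
        glEnd();
    }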
Re:Engineers (Score:1)
My GeForce 5600FX takes up a single slot.
My GeForce 5600FX (256MB) takes all the power it needs from the AGP slot.
What's your point again?
Karem
"Depending on the game" (Score:2)
Re:"Depending on the game" (Score:1)
I got a Leadtek 5600FX, non-Ultra, and I play Max Payne 2 without any problems at a respectable and lovely 1280x1024x32 (the max my 17-inch LCD screen goes to). HW T&L, everything high, trilinear (no anti-aliasing), all fogging, flares, pixel-shader skins, etc. You know what... smooth as silk... I haven't had a slowdown that I noticed...
Now, I originally bought a 9600 (non pro) and you know what happened? EVERY D3D game that I tried would lock up after 5 seconds of switching into D3D mode. I
nVidia got cocky... (Score:3)
nVidia chose not to go to the initial meetings on DX9... That was their loss. DX9 has Y amount of features... designing in any more is just wasted space, because the chip is out-of-date in 9 months anyway! In a sense they got bit by their own grinding machine. ATI was catching up, and nVidia management lost the chance to keep pushing the specs... ATI turned up the heat just enough to come out on top RIGHT NOW...
But this is just one round... Aside from what nVidia did to 3DFX, ATI is just gaining some turf back. What NOBODY is saying is that it's not R350 & GFX duking it out anyway... it's the built-in stuff [Compaq, HP, etc.] and the el-cheapos that are still buying TNT & 128 [should be banned, I say!] where both companies sell their units. The stuff we play with is just icing on the product lines. This is just one round in the long-term match... but it serves to keep nVidia honest... and that's a good thing!
ATI lost a customer long ago (Score:2)
I was repeatedly burned by them with how quickly they would drop support for their older cards once they came out with a newer product line.
Their included video tuner/capture software was bloated and poorly designed, and their drivers were constantly failing me.
This was about five years ago. I do not intend to change my mind despite any improvements they've made. They *really* pissed me off back then with consta