NVidia Fights Back Against ATI At Editor's Day

Thanks to FiringSquad for their feature covering NVidia's recent editor's day, discussed in the context of the graphics card company's continuing rivalry with ATI. The writer suggests: "It's become rather trendy to bash NVIDIA lately. People like winners and people love underdogs. ATI is both right now - they've climbed their way out of the abyss and even disregarding the NV30 production delays, their timetable was catching up to NVIDIA's." But, after an interview with Tim Little at Ion Storm Austin and technical questions answered by Tim Sweeney of Epic, the writer concludes: "What the benchmarks have proven is that NVIDIA's hardware is as fast as ATI's, depending on the game. Yes, it does take more work - NVIDIA admitted as much. The NV3X platform isn't as easy to program fast as R300 and R350 are."
  • Were I a game developer, I would have the option of supporting ATI, which produces fine performance and is easy to develop for, and nVidia, which produces fine performance and is a pain to develop for.

    I can see already that I would be terribly unenthused about working on nVidia-specific performance enhancements.
    • by PainKilleR-CE ( 597083 ) on Monday November 03, 2003 @04:00PM (#7380119)
      The article also made a few other things fairly clear, though:

      1) nVidia has fewer hard-wired limitations on the complexity of the code being run and the accuracy of the calculations being made, though each could come at the cost of speed if used heavily

      2) nVidia might be easier to develop for under OpenGL because you have better access to the hardware, whereas DirectX9 in certain areas tends to more closely follow the ATI hardware (which was available to developers and MS before DirectX9 was complete); a short sketch of such an extension check follows at the end of this comment

      3) As the two companies progress, the performance difference will diminish as nVidia's drivers are more heavily optimized and both manufacturers release new hardware which, on nVidia's side, means more speed to throw at the existing feature set, and on ATI's side improvements in the feature set to better leverage improvements in the speed of the hardware.

      In other words, this is the closest things have ever been in this particular race, and neither company is out of it yet. The winner won't be determined by the current crop of games or hardware, but instead by what developers (and the 2 manufacturers) do after UT2004, Doom 3, and Half-Life 2.
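
      A rough sketch of the extension check point 2 is getting at, assuming a C++ program with an OpenGL context already current; the helper function here is hypothetical, but GL_NV_fragment_program and GL_ARB_fragment_program are real extension strings from this hardware generation:

      // Hypothetical startup check: prefer the NVIDIA-specific fragment
      // program path, fall back to the vendor-neutral ARB path.
      // Assumes an OpenGL context is already current.
      #include <GL/gl.h>
      #include <cstdio>
      #include <cstring>

      // Crude substring match against the driver's extension string
      // (fine for a sketch; a robust check would match whole tokens).
      static bool hasExtension(const char* name)
      {
          const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
          return ext != nullptr && std::strstr(ext, name) != nullptr;
      }

      void chooseFragmentPath()
      {
          if (hasExtension("GL_NV_fragment_program"))
              std::printf("using the NVIDIA-specific fragment program path\n");
          else if (hasExtension("GL_ARB_fragment_program"))
              std::printf("using the vendor-neutral ARB fragment program path\n");
          else
              std::printf("falling back to fixed-function rendering\n");
      }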
  • by bugnuts ( 94678 ) on Monday November 03, 2003 @03:29PM (#7379797) Journal
    argh, original article /.'ed.

    NVidia got some very unexpected competition while sitting on their laurels. I think that this was a real wake-up call and lesson for them, not in the realm of technology so much, but in the realm of promotion and advertising. Their FUD actually got turned on them, and hard, when drivers were shown to be tuned for benchmarks and such.

    However, once they accepted ATI as a real contender, it seems they started working on their technology again instead of putting out whiny press releases and bad PR.

    And though consumers took a hit with hastily released drivers and hardware, it looks like things are turning around for the benefit of us all.
    • NVidia got some very unexpected competition while sitting on their laurels. I think that this was a real wake-up call and lesson for them, not in the realm of technology so much, but in the realm of promotion and advertising.

      I wouldn't say they were sitting on their laurels. A more plausible analysis would be that they bit off more than they could chew when they decided to build a chip for the XBox, move to .13 micron technology, and acquire 3DFX all at the same time.

  • Given the lack of control over the end user environment, and the general capability of modern cards, one wonders if it is worth trying to squeeze 1M frames a second out by writing vendor-specific code, or if it's better to go for minimal extensions to the standard API to enhance compatibility. Does anyone really have a justification for more than 50fps?

    Asmo
    • Re: (Score:3, Insightful)

      Comment removed based on user account deletion
      • Eh. There's a really good reason for it. The framerate is not going to be steady in many cases/games. So if you're averaging 50fps, you're going above and below that - the hope is that you don't drop down to 30fps or below, of course.
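
        A quick sketch of why the average hides this (the frame times are made-up numbers, purely illustrative):

        // Average fps can look fine while individual frames still stutter.
        #include <algorithm>
        #include <cstdio>
        #include <vector>

        int main()
        {
            // Frame times in milliseconds; two long frames among short ones.
            std::vector<double> frameMs = {16, 17, 15, 16, 60, 16, 15, 55, 16, 17};

            double totalMs = 0.0, worstMs = 0.0;
            for (double ms : frameMs) {
                totalMs += ms;
                worstMs = std::max(worstMs, ms);
            }

            double averageFps = 1000.0 * frameMs.size() / totalMs; // ~41 fps
            double minimumFps = 1000.0 / worstMs;                  // ~17 fps on the worst frame

            std::printf("average %.1f fps, minimum %.1f fps\n", averageFps, minimumFps);
            return 0;
        }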
    • With the push on modern hardware, I'm surprised there aren't more stereoscopic LCD shutter glasses in use. You'd need to do a consistent 150fps to give each eye 75fps.

      But, as someone else has already mentioned, sometimes it's the sheer scope (1600x1200 with full features) and not just the framerate.
    • Well... (Score:3, Informative)

      by JMZero ( 449047 )
      Does anyone really have a justification for more than 50fps?

      50 is sort of a silly number - most people have their refresh at 60 or 72. To a seasoned FPS gamer, 60 is distinguishable from 50. Whether 130 is distinguishable from 120 is another question - the answer to which is definitely no, even if you had a monitor capable of such silliness.

      However, these numbers are really not what we're worried about a lot of the time - we're worried about absolute minimum framerate. Often a game will be chugging al
      • by Alereon ( 660683 )

        As I recall, visual research indicates that humans can successfully discern fluid motion from frame-based motion up until about 400fps. Of course, no one has a monitor that goes up that high, but still, the point stands.

        I did try to find a cite; the closest I could find was this page [amo.net], which notes that framerates of 220fps have been proven distinguishable.

    • Does anyone really have a justification for more than 50fps?

      Yes, higher detail. More polygons, more features, higher resolution. If you get 50fps in a highly-detailed scene, you'll get >200fps in minimally-detailed ones. Benchmarks work on the theory that the reverse is true as well (if P then Q; Q, therefore P; that's simplified, it's not actually fallacious reasoning). This is why Q3 benchmarks that yield >100fps for all cards still matter, because the benchmarks usually don't test highly-
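
      As a toy illustration of that scaling claim (my own assumption of a purely fill-rate-bound card; every number here is invented):

      // Toy model: if a card is purely fill-rate bound, frame rate scales
      // inversely with the per-frame pixel work (resolution x overdraw).
      #include <cstdio>

      int main()
      {
          const double fillRate     = 1.6e9;                     // pixels/second, hypothetical card
          const double detailedWork = 1600.0 * 1200.0 * 16.0;    // high res, heavy overdraw
          const double simpleWork   = 1024.0 * 768.0  * 2.0;     // low res, little overdraw

          std::printf("detailed scene: ~%.0f fps\n", fillRate / detailedWork); // ~52 fps
          std::printf("simple scene:   ~%.0f fps\n", fillRate / simpleWork);   // ~1017 fps
          return 0;
      }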

    • I don't think there is a justification for more than 50 *fps* as long as it stays that high, as other people have pointed out. Having your monitor at, say, a 100 Hz refresh rate along with that is justifiable, though, since it's kinder on the eyes...
  • What are ATI drivers like these days? When I think of what they used to be like... But I read somewhere they're OK these days; can anyone confirm this?

    I always gathered that ATI was best for DirectX, and NVIDIA best for OpenGL, though that might be wrong. Surely, though, regardless of what is easy to program for, simple raw performance under GL/DX is pretty important for all the apps that don't optimise for the major graphics cards?

    • I run a Radeon 9700 Pro in Linux and Windows XP and the recent Catalyst drivers seem to do very well. Now I mainly bought this card because it was the top end when I updated my machine, so I didn't buy it for any specific games, but for those I do play, it does well.

      I especially enjoy playing Return to Castle Wolfenstein: Enemy Territory in Linux, which reduces the need to reboot into Windows!

    • Re:Drivers? (Score:3, Interesting)

      by cgenman ( 325138 )
      The latest Catalyst drivers, 3.8, break 2D rendering in Empires: Dawn of the Modern World. This wouldn't be such a big deal if 3.8 hadn't shipped after we went gold but before we hit shelves. Users tell us it has happened in other games, but I can't confirm this.

      • HALO PC has bluescreen problems with the latest Catalyst drivers with the 7500, 8000, 8500 and 9000 Radeon cards. Same situation; if you downgrade your drivers to the version Microsoft did their QA testing with, the game runs flawlessly.

        As a Radeon 8500 owner, I'm eagerly awaiting a patch to this annoyance.
      • Morrowind also broke.

        Big bummer. I spent Saturday evening after I updated my 9000pro to a 9600pro cursing and swearing while it kept crashing in Morrowind whenever I'd load a zone. Once I figured out that I needed to downgrade the drivers to Catalyst 3.7, all was well.
      • ATI drivers do have a habit of breaking in certain instances. They normally fix it in the next release.
    • ATI is just the best all around. Modern Radeon series cards can render faster with better image quality than Geforce FX cards, regardless of API or rendering features in use.
      • Modern Radeon series cards can render faster with better image quality than Geforce FX cards, regardless of API or rendering features in use.

        The article pretty much counters your statement, as it specifically mentions that the ATI cards have limitations on the complexity and accuracy of shaders well below those on the nVidia cards. The speed, on the other hand, will have to be generally accepted until nVidia can produce better drivers and optimized games are released.
        • ATI's F-buffer renders the 64-instruction limit moot, offering support for essentially unlimited pixel shader program lengths. Realistically, nVidia cards are going to be running in low-accuracy mode simply to attain usable framerates. Also, even in high-accuracy mode, final frame output on a GeForce FX is still lower quality than on a Radeon due to the poor-quality filtering forced on the FX to improve performance.

          The fact of the matter is that updated drivers and games aren't going to make a difference

      • As long as you aren't running Empires or Morrowind or a bunch of other games with their latest supposedly release-quality drivers. I got the pleasure of talking my sister and her hubby through removing these shit drivers from their new 9600 Pro-equipped Gateway and putting the previous ones back. Nice to see ATI still putting the same level of attention to detail into their driver releases.
    • I too loathed using ATI's drivers back when they released the first Radeon. Nothing worked whereas Nvidia's always did.

      Since then ATI has recognized this as a huge obstacle to people using their products and did a complete 180. They now release drivers in a timely manner that work extremely well, and they keep adding cool new features; 3.8 lets you make use of shaders in games that weren't originally programmed with them.

      Granted, 3.8 has been reported as causing problems. But I never got around to upgrading an
      • I concur with the pi radian turn regarding drivers. I've had little trouble with them, although I don't play some of the games reported with problems. As a side-note, I've had more games crash with my GF3 (random restarts -- ugh), which forced me to find some beta drivers that somehow worked around the problem.

        Also, I didn't think ati released a 3.7... didn't they go from 3.6 to 3.8? Or am I thinking 3.4 to 3.6?
  • by Sevn ( 12012 ) on Monday November 03, 2003 @04:19PM (#7380325) Homepage Journal
    To be fair, it was their turn I guess. Next year it will be NVidia's turn no doubt. The person that busted them this time was Tom at Tom's Hardware Guide.

    The accusations leveled against ATi at NVIDIA's Editors' Day two days ago thus become that much more serious. Epic's Mark Rein confirmed that in some cases, high-res detail textures were not displayed in some areas by ATi's drivers and that standard, lower-res textures are used instead. Randy Pitchford of the Halo development team also mentioned that there were optimizations present in ATi's drivers which are detrimental to Halo's image quality.

    The relevant link is here. [tomshardware.com]

    Now that NVidia seems to be the image quality king and to own the mid-range card market again with the FX5700 Ultra, it makes me wonder how ATI's performance would measure up if they didn't cheat.
    • Nvidia is far from the image quality king. The ATI stuff is bugs, not cheats. Nvidia in fact has some problems with the same games. Of course you won't hear that at Nvidia's Editors' Day or on Tom's Hardware.
      • You'd need to actually look at the screenshots at THG to know what you are talking about. The new forceware drivers made a huge difference. To *me* the nvidia caps look better, but to be fair to THG, they said they looked identical. And for the record, THG has been a very loud and very vocal ATI supporter and NVidia downer in the past. Just like he's given Intel shit when they are sucking and AMD shit when they are sucking also. He's perhaps the only impartial reviewer I know of. Instead of doing straight D
        • I have both Nvidia and ATI cards, so I'm not a ranting fanboy. The forceware drivers did improve the image quality, but at the cost of performance. THG had to finally admit ATI was good when it was getting 50 to 100% better benchmarks. It's funny you bring up AMD and Intel. AMD is winning right now and THG still doesn't want to admit that.
    • Don't forget that NVIDIA's best comeback to ATi's HL2 deal was that they didn't use their (then beta) 5x.xx drivers, which at the time didn't even show fog (a major part of the demo)! Even if ATi cheated as well, we're not exactly talking about an honest competition either. I suspect that we'll find another slip-up by NVIDIA next.

      Hard to say either one is ahead of the other at "cheating", though either side could claim development error. I wish cheating weren't a part of the game, but you can't claim one or
      • And I completely agree. They both suck. They've gotten themselves into this pathetic squabble where they both have resorted to doing very unethical stuff and they both take turns trying to take the highground when the other one messes up. This time it's ATI's turn to eat crow but they'll admit no wrongdoing as usual just like NVidia will the next time it's their turn to get caught doing dumb shit.
    • I'll bite... (Score:3, Informative)

      by Alereon ( 660683 )

      An excerpt from ATI's Response to recent allegations of benchmark cheating [rage3d.com]

      AquaMark3: We are currently investigating our rendering in AquaMark3. We have identified that we are rendering an image that is slightly different than the reference rasterizer, but at this point in time we are unable to identify why that is. We believe that this does not have any impact on our performance. Our investigation will continue to identify the cause and resolve it as soon as possible. One point to note is that we rende

    • Uh, owning the mid-range market with the 5700? Not really; it costs $200, which puts it up against the $200 ATI 9600XT.

      hardocp review [hardocp.com]

      The 9600XT comes with a free copy of HL2. So that's $50 right there.

      They are practically neck and neck in real-world performance, so I guess it's your call. I know which one I would pick though. $200 and HL2 or $200 and no HL2.... hmmmm tough one.

    • Cheating? nVidia? You mean like this [3dcenter.org]?

      That article is all about nVidia's latest little shortcut. Rather than do linear interpolation between mip-map levels, they've introduced little plateaus where they only sample one mip-map level. It saves on the amount of memory bandwidth they have to use when reading a texture.

      The thing that I find strangest about this little 'optimization', is that the GeForce FXs have heaps and heaps of memory bandwidth. It's not the area they really have to work on.
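
      Roughly what's being described, as a toy blend-weight function (the plateau width here is a guess, not a measured value):

      // Trilinear filtering blends two mip levels by the fractional LOD;
      // the "plateau" shortcut pins the weight to 0 or 1 near each level,
      // so only one mip level has to be sampled there.
      #include <cstdio>

      double trilinearWeight(double lodFrac)
      {
          return lodFrac; // straight linear blend between the two mip levels
      }

      double plateauWeight(double lodFrac, double plateau)
      {
          if (lodFrac < plateau)       return 0.0; // stick to the nearer level
          if (lodFrac > 1.0 - plateau) return 1.0; // stick to the farther level
          return (lodFrac - plateau) / (1.0 - 2.0 * plateau); // linear in between
      }

      int main()
      {
          for (double f = 0.0; f <= 1.0; f += 0.125)
              std::printf("lodFrac %.3f  trilinear %.3f  plateau %.3f\n",
                          f, trilinearWeight(f), plateauWeight(f, 0.25));
          return 0;
      }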
  • Perhaps nVidia could spend more of their valuable time fixing the bugs in their OpenGL drivers, and less of it whining.

    For example, I'd appreciate it if they could fix it so on my GeForce 4MX, antialiased lines with width >1 pixel draw properly, rather than being drawn as width 1 lines with no antialiasing. You know, little details like that.

    I know I'm hardly likely to spend time trying to use nVidia-only optimizations when even core OpenGL doesn't work properly.
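
    For reference, the kind of core OpenGL setup I'm talking about (a minimal sketch, assuming a context is already current; these are all standard fixed-function calls):

    // Antialiased lines wider than one pixel in classic OpenGL.
    #include <GL/gl.h>

    void drawWideSmoothLine()
    {
        glEnable(GL_LINE_SMOOTH);                           // request antialiased lines
        glEnable(GL_BLEND);                                 // line smoothing needs blending
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
        glLineWidth(3.0f);                                  // width > 1 pixel

        glBegin(GL_LINES);
        glVertex2f(-0.8f, -0.5f);
        glVertex2f( 0.8f,  0.5f);
        glEnd();
    }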
  • Keep this phrase in mind. I've owned nVidia cards back to the GeForce2 MX (although I'm sure many of you go back farther than that), but the point is, I'm not just being a fanboy when I have negative things to say about the NV30+ architecture. It just doesn't have the DX9 horsepower. The high-end cards score quite well in 3DMark03 despite this, which I cannot account for without making shaky claims. However, NV30+ simply falls on its face when forced to do pure DX9 gaming environment instructions, particularly with the late
    • Ok, enough of this...

      I got a Leadtek 5600FX, non-Ultra, and I play Max Payne 2 without any problems at a respectable and lovely 1280x1024x32 (the max my 17-inch LCD screen goes to). Hardware T&L, everything high, trilinear (no anti-aliasing), all fogging, flares, pixel shader skins etc. You know what...smooth as silk...I haven't had a slowdown that I noticed...

      Now, I originally bought a 9600 (non pro) and you know what happened? EVERY D3D game that I tried would lock up after 5 seconds of switching into D3D mode. I

  • by mabhatter654 ( 561290 ) on Monday November 03, 2003 @09:56PM (#7383346)
    This looks more like nVidia's management lost this round than the engineers. ATI is old school...they are like IBM..they know how to talk to suits and make the right product to SELL, even if it isn't the greatest.

    nVidia chose not to go to the initial meetings on DX9...That was their loss. DX9 has Y amount of features...designing in any more is just wasted space because the chip is out-of-date in 9 months anyway! In a sense they got bit by their own grinding machine. ATI was catching up, and nVidia management lost the chance to keep pushing the specs...ATI turned down the heat just enough for them to come out on top RIGHT NOW...

    But this is just 1 round...Aside from what nVidia did to 3DFX, ATI is just gaining some turf back. What NOBODY is saying is that it's not R350 & GFX duking it out anyway...it's the built-in stuff [Compaq, HP, etc.], the el-cheapos that are still buying TNT & 128 [should be banned, I say!], where both companies sell their units. The stuff we play with is just icing on the product lines. This is just one round in the long-term match...but it serves to keep nVidia honest...and that's a good thing!

  • I will never buy another ATI card, unless they are the last video card company left on earth.

    I was repeatedly burned by them with how quickly they would drop support for their older cards once they came out with a newer product line.

    Their included video tuner/capture software was bloated and poorly designed, and their drivers were constantly failing me.

    This was about five years ago. I do not intend to change my mind despite any improvements they've made. They *really* pissed me off back then with consta
