ATI & Nvidia Duke It Out In New Gaming War

geek_on_a_stick writes "I found this PC World article about ATI and Nvidia battling it out over paper specs on their graphics cards. Apparently ATI's next board will support pixel shader 1.4, while Nvidia's GeForce3 will only go up to ps 1.3. The bigger issue is that developers will have to choose which board they want to develop games for, or, write the code twice--one set for each board. Does this mean that future games will be hardware specific?"
  • This sucks (Score:3, Insightful)

    by levik ( 52444 ) on Wednesday August 01, 2001 @09:14AM (#1431) Homepage
    Whatever happened to standards? Remember when things were "100% compatible"? IBM-PC compatible. SoundBlaster compatible. VESA compatible. Compatibility in hardware was nice: your software would work on any OS with a piece of compatible hardware, no special drivers needed.

    Now the hardware industry has moved away from that, instead giving us free drivers for Windows. Those drivers are not only crappy in their first release, but also useless on every other platform the vendor decides not to support.

    Bring hardware standards back, and MS will lose much of the power it leverages through the high degree of hardware support their system provides. I for one would sacrifice a little technological progress for the ability to have things work together as expected, out of the box.

  • by LordZardoz ( 155141 ) on Wednesday August 01, 2001 @09:01AM (#6273)
    It's much like the choice to support AMD's 3DNow! or Intel's SSE instructions. If you use DirectX 8 or OpenGL, the issue is usually dealt with by the graphics library and the card drivers. Some bleeding-edge features initially require card-specific code, but that is the exception.
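
    The "dealt with by the library" part mostly boils down to a capabilities check. A minimal sketch of what that looks like under Direct3D 8 (device creation and error handling assumed; SupportsPS14 is just an illustrative name):

        #include <d3d8.h>

        // Ask the driver which pixel shader version the hardware
        // supports, then pick a code path accordingly.
        bool SupportsPS14(IDirect3D8* pD3D)
        {
            D3DCAPS8 caps;
            if (FAILED(pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                           D3DDEVTYPE_HAL, &caps)))
                return false;
            // D3DPS_VERSION packs major/minor the same way the
            // caps structure reports it.
            return caps.PixelShaderVersion >= D3DPS_VERSION(1, 4);
        }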

    END COMMUNICATION
  • by volpe ( 58112 ) on Wednesday August 01, 2001 @09:20AM (#15625)
    ...is that developers shouldn't HAVE to develop for specific hardware. I don't work in the game industry specifically, but I don't see how this is necessarily good for software in general, or graphics software in particular. This doesn't give developers "more choice in the hardware they develop for." It gives them less choice, because they have to decide how to allocate limited resources on a per-platform basis. When you have a common API, you're not forced to choose in the first place. That's why hardware-specific features and capabilities ought to be abstracted out into a common API. What these guys should do is come up with a dozen or so different kinds of high-level magic (e.g. water waves, flame, smoke, bullet-holes, whatever) that they can implement with their pixel and vertex shaders, lobby Microsoft to get that magic incorporated into the DirectX spec, and then supply drivers that meet those specs by sending a few pre-packaged routines to the pixel/vertex shaders, rather than have game developers worry about this stuff directly. Or am I missing something?
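
    To make the suggestion concrete, the sort of interface being proposed might look like this (purely hypothetical: these names are invented for illustration and are not part of any real DirectX API):

        // A game asks for an effect by name; the driver decides how to
        // realize it on whatever shader hardware is actually present.
        class IEffect {
        public:
            virtual ~IEffect() {}
            virtual void Apply(float timeSeconds) = 0;  // bind shaders/state
        };

        // Factory functions the driver would supply, one per kind of
        // "high-level magic" in the spec.
        IEffect* CreateWaterWavesEffect();
        IEffect* CreateFlameEffect();
        IEffect* CreateBulletHoleEffect();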
  • by Znork ( 31774 ) on Wednesday August 01, 2001 @10:21AM (#16456)
    Oh, but you forgot the fourth option:

    Say screw'em both and develop for neither, just using lowest common denominator stuff, and spend the saved time on improving the other parts of the game.

    If your game can't stand on its own using that... well, maybe, just maybe, it sucks?
  • by HelloKitty ( 71619 ) on Wednesday August 01, 2001 @09:00AM (#21940) Homepage
    why use something like DirectX when OpenGL is an open standard, with source code and specification open to all?

    It's scary that so many people are relying on M$'s proprietary graphics technology. At any time they could discontinue it, or change the API in a way that breaks every game. I wouldn't put it past them.

    subatomic
    http://www.mp3.com/subatomicglue [mp3.com]
  • I believe (Score:3, Insightful)

    by RyuuzakiTetsuya ( 195424 ) <<taiki> <at> <cox.net>> on Wednesday August 01, 2001 @08:48AM (#24883)
    The Pixel Shader technology will be backwards compatible as far as the DirectX 8.0 API is concerned. Imagine that. Microsoft using an API to bring software developers together across various hardware choices. Now if only they could get Win32 cleaned up and a decent kernel, then I'd THINK about purchasing that OS. I'm not saying that there won't be card-specific code, but as far as Pixel Shader tech goes, as long as the drivers are DX 8 compatible, there's no problem with code for one card not working on the other. Besides, most systems sold in the last year have 810/810e/815E chipsets and are stuck with those old i740 Starfighter chips.
  • Re:This sucks (Score:2, Insightful)

    by Francis ( 5885 ) on Wednesday August 01, 2001 @12:50PM (#25165) Homepage

    Standards are always developed later. Maybe you fail to grasp how new this technology is. The GeForce 3 was the first video card to support hardware vertex/pixel shaders, and it was released two months ago.

    Remember when things were "100% compatible"? ... SoundBlaster compatible

    Do you really remember what those days were like, when sound cards *just* came out? You had to pick which sound card you wanted to lock your life into. Adlib? SoundBlaster? ELS? I can hardly remember anymore.

    ATI and nVidia *are* arguing about standards right now. They're working from a common frame of reference. It's not that bad. You're just exaggerating the problem.

  • Re:why arent.... (Score:2, Insightful)

    by cougio ( 205014 ) on Wednesday August 01, 2001 @04:06PM (#25791)
    Heh. The two libraries you named are proprietary (3DFX's Glide and M$'s DirectX)... the standard is OpenGL.
  • by Earlybird ( 56426 ) <slashdot @ p u r e f i c t ion.net> on Wednesday August 01, 2001 @09:19AM (#26026) Homepage

    Does this mean that future games will be hardware specific?

    Well, no. Game developers do prefer the state of the art, but common sense dictates that you target something that exists and is popular.

    Comparisons to browser market share are appropriate here: when Internet Explorer became the norm, web sites tended to take advantage of IE's superior DHTML and DOM support, but developers mostly strived to keep pages backwards-compatible with Netscape and other less capable browsers. Even after Mozilla caught up, most web sites still don't target it specifically.

    Keep in mind that, according to the article, the board does not currently exist. One's desire to write custom code for a nonexistent board is contingent on several factors, such as the manufacturer's present and potential future market share.

    Case in point: developers used to target Glide, 3Dfx's low-level rendering API. Games these days don't bother: 3Dfx has DirectX support, the effort to squeeze a few extra FPS by writing "straight to the metal" usually isn't worth the time and money, and most importantly, 3Dfx is dead. Its user base is dwindling, and there is no incentive to use the (generally) hardware-specific Glide over the generic DirectX.

    As for the development effort: As a former game developer and Direct3D user, I agree with the claim that when targeting both shaders, "they'll have to write more code". A few hundred lines, perhaps, for detecting and using the two extra texture shaders per pass. It's not like it's a new, different API.
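
    For a sense of what those extra lines do, the split usually looks something like this (a rough sketch; the Set*/Draw* helpers are hypothetical stand-ins for real engine code):

        #include <d3d8.h>

        // Hypothetical helpers: each one binds a pixel shader and the
        // texture stages that go with it.
        void SetPixelShaderPS14(IDirect3DDevice8* dev);       // 6 textures, one pass
        void SetPixelShaderPS11_PassA(IDirect3DDevice8* dev); // first 4 textures
        void SetPixelShaderPS11_PassB(IDirect3DDevice8* dev); // remaining textures
        void EnableAdditiveBlend(IDirect3DDevice8* dev);
        void DrawGeometry(IDirect3DDevice8* dev);

        // Same visual effect: one pass on ps.1.4 hardware, two passes
        // on ps.1.1-1.3 hardware.
        void DrawSurface(IDirect3DDevice8* dev, bool hasPS14)
        {
            if (hasPS14) {
                SetPixelShaderPS14(dev);
                DrawGeometry(dev);
            } else {
                SetPixelShaderPS11_PassA(dev);
                DrawGeometry(dev);
                EnableAdditiveBlend(dev);
                SetPixelShaderPS11_PassB(dev);
                DrawGeometry(dev);
            }
        }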

  • by mallan ( 37663 ) on Wednesday August 01, 2001 @01:09PM (#27577) Homepage
    I'm surprised at the lack of comments about platform support for these new features.

    If you own a GeForce3 *today*, you can access all of the hardware's features on Linux, Windows and Mac through OpenGL.

    I don't know about ATI's Mac support, but under Linux the Radeon drivers still don't support T&L, cube maps, 3D textures or all three texture units. The card has been available for well over a year, but the driver only enables Rage128-level features. How long do you think it's going to take for the pixel and vertex shader capabilities to make it into the Linux drivers? And what about the Mac?

    I've been extremely impressed by the balanced approach NVIDIA has been taking: they do a great deal of work on D3D 8 with Microsoft, but they simultaneously create OpenGL extensions for interesting hardware features, allowing Windows developers to target OpenGL, and also allowing alternate platforms to access the new features. Their OpenGL support surpasses any other consumer-grade hardware manufacturer's, and they offer better cross-platform support than any graphics company.
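
    Checking for one of those extensions is the same on every platform, which is the point. A minimal sketch (requires a current GL context; the token-matching loop avoids the classic substring false positive):

        #include <GL/gl.h>
        #include <string.h>

        // True if 'name' appears as a whole space-delimited token in
        // the GL_EXTENSIONS string.
        bool HasExtension(const char* name)
        {
            const char* exts = (const char*)glGetString(GL_EXTENSIONS);
            if (!exts) return false;
            size_t len = strlen(name);
            for (const char* p = exts; (p = strstr(p, name)) != 0; p += len) {
                if ((p == exts || p[-1] == ' ') &&
                    (p[len] == ' ' || p[len] == '\0'))
                    return true;
            }
            return false;
        }

        // e.g. HasExtension("GL_NV_register_combiners")
        //      HasExtension("GL_ARB_texture_cube_map")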

    The safest choice any game developer can make is NVIDIA.

    -Mark
  • by bribecka ( 176328 ) on Wednesday August 01, 2001 @08:58AM (#28738) Homepage
    A bipolar competition is ALWAYS good for the consumer.

    You mean like when Netscape and IE were competing? In case you haven't noticed, HTML rendering between the two browsers hasn't exactly meshed.

  • Re:General8 (Score:2, Insightful)

    by Gingko ( 195226 ) on Wednesday August 01, 2001 @09:17AM (#36316)
    Say what? NVidia's cards have always rocked (except the ZX chipset, admittedly), I agree. But NVidia provides a level of community support *far and away* better than ATI's. NVidia hosts conferences for grad students and their professors. They have developer conferences in many different countries. Matt and Cass from NVidia hang out on opengl.org's [opengl.org] discussion forums and help everyone out (newbies, old hands, the lot). The developer documentation is sublime - and everyone can get at it. Plus their drivers *just work* 9 times out of 10.

    I couldn't care less about driver specs. The 3dfx ones are around if I want to see how modern-ish graphics cards are set up. And NVidia's drivers are such good quality, I can see why they don't want mutations springing up all over the web. I certainly don't have a problem with such a pleasant company to work with wanting to hold on to a few secrets.

    Henry
  • Deja vu. (Score:4, Insightful)

    by AFCArchvile ( 221494 ) on Wednesday August 01, 2001 @09:38AM (#40600)
    "Does this mean that future games will be hardware specific?"

    If so, it won't be the first time; remember the days of 3dfx? The original Unreal would only run hardware-accelerated on Glide; if you didn't have a 3dfx card, you were forced to run it in software. Of course, this didn't sit well with the growing NVidia user base, who consistently pointed out that Quake 2 and Half-Life both rendered on anything running OpenGL (including 3dfx cards; remember those mini-driver days?), and OpenGL and Direct3D renderers were finally introduced in a patch. That's about when 3dfx started to go down the toilet; delayed product releases and missing features (32-bit color and large texture support being two of the most blatant omissions) eventually tainted the 3dfx brand to the point of extinction.

    Since then, 3D gaming has been a less lopsided world. Linux gaming was taken seriously. Standardised APIs that could run on almost anything were the rule; if it wasn't OpenGL, it would at least be Direct3D. Then the GL extensions war heated up, with NVidia developing proprietary extensions that would work only on their cards. But this wasn't a problem; you could still run OpenGL games on anything that could run OpenGL; you'd just be missing out on a few features that would only slightly enhance the scenery.

    Leave it to Microsoft to screw it all up with DirectX 8. They suddenly started talking about pixel shaders and other new ideas. John Carmack has already described the shortfalls and antics of DX8 [planetquake.com]. And now 3D programmers will have to program for multiple rendering platforms, but at least you can still run it with anything.

    Sure, this entire disagreement between ATI and NVidia is bad for the 3D industry, but things could be worse. A LOT worse.

  • by Xugumad ( 39311 ) on Wednesday August 01, 2001 @08:55AM (#43340)

    How is this meant to be good for developers, or consumers? Developers now have three options:

    • Develop for NVidia-based cards, which run slower on ATI cards
    • Develop for ATI-based cards, completely ignoring the NVidia market
    • Develop for both, significantly adding to development effort (a sketch of what that branching looks like follows this list)
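
    A rough sketch of the "develop for both" option, assuming a Direct3D 8 setup (ChoosePath is an invented name; the version macros are real d3d8.h ones):

        #include <d3d8.h>

        enum RenderPath { PATH_PS14, PATH_PS13, PATH_FIXED_FUNCTION };

        // Decide once at startup from D3DCAPS8::PixelShaderVersion,
        // then branch on the result throughout the renderer.
        RenderPath ChoosePath(DWORD psVersion)
        {
            if (psVersion >= D3DPS_VERSION(1, 4)) return PATH_PS14;
            if (psVersion >= D3DPS_VERSION(1, 1)) return PATH_PS13;  // 1.1-1.3 class
            return PATH_FIXED_FUNCTION;
        }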

    This is also terrible for the consumer. Sorry, but that new card you just spent a small fortune on doesn't support the pixel shader version the game you want uses. Oh well, you'll just have to upgrade to the next card when it comes out; hope that's okay. But don't worry, it will have lots of new features too (which no one else's card will support).

  • by Anonymous Coward on Wednesday August 01, 2001 @07:49PM (#43441)
    Five passes sounds like too many to get decent FPS on a GeForce 2.
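
    The cost is geometry: every extra pass re-transforms and re-rasterizes the whole scene. A typical additive multipass loop looks like this (GL sketch; drawScene and bindTexturesForPass are hypothetical stand-ins):

        #include <GL/gl.h>

        void drawScene();                // hypothetical: submits all geometry
        void bindTexturesForPass(int n); // hypothetical: textures for pass n

        void renderMultipass()
        {
            glDepthFunc(GL_LEQUAL);
            drawScene();                    // pass 1: depth + base color
            glEnable(GL_BLEND);
            glBlendFunc(GL_ONE, GL_ONE);    // add later passes on top
            glDepthFunc(GL_EQUAL);          // only touch pixels laid down in pass 1
            for (int pass = 2; pass <= 5; ++pass) {
                bindTexturesForPass(pass);
                drawScene();                // full geometry cost, every time
            }
            glDisable(GL_BLEND);
            glDepthFunc(GL_LEQUAL);
        }
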
  • by Skynet ( 37427 ) on Wednesday August 01, 2001 @08:46AM (#43550) Homepage
    This is good for hardware because ATI and NVidia will continue to push the envelope, developing more and more advanced graphics boards. Features will creep from one end to the other, just staggered a generation.

    This is good for software because developers will have more choice in the hardware that they develop for. ATI doesn't support super-duper-gooified blob rendering? Ah, NVidia does in their new GeForce 5. No worries, ATI will have to support it in their next generation of boards.

    A bipolar competition is ALWAYS good for the consumer.
