ATI Talks Game Support, Future Of Graphics Cards 41

Sergio writes "Slo-Tech have posted an interview with ATI's representative in game developer support, Richard Huddy. He touched on many subjects, including the evolution of DirectX and OpenGL, why ATI doesn't provide much information to Linux driver developers, and the most common mistakes of game developers ('Nine out of ten games under-use the graphics card. That's amazing, and it's been true for the last three or four years.')"
  • ...because I don't play 3D games on PC. For the price of a high-end graphics card I can buy an Xbox or PS2 already, and they provide better gaming experiences. On the Windows desktop, 3D acceleration is completely unused, and many people spend money on expensive graphics cards for no reason. I am looking forward to seeing how Longhorn utilizes 3D acceleration for the next generation Windows UI.
    • Comment removed based on user account deletion
    • I can buy an Xbox or PS2 already, and they provide better gaming experiences.

      Really? That depends on what experience you're looking for. I play both PC and console games and I generally prefer PC games to consoles for the exact reason you cited: Better gaming experiences.

      I like the available titles better. I like the graphics better. I even like the controls better. For example, playing an FPS with an Xbox or PS2 controller doesn't come close to the keyboard/mouse or keyboard/controller-of-choice combo.

      • by DrSkwid ( 118965 )
        Full PAL resolution is 720x576

        However, that doesn't negate your argument.

        & another reply to the parent - I'd like to see your cheap card run my dual head setup at 4096x1536 in 32 bit @ 75Hz!

        Treat almost all computer parts as consumables and the sums are quite simple: every two years you're gonna need (1) a new VGA card and (2) a new CPU/MB/RAM.

        If you're clever you will offset these a year apart.
        The required budget for these will be in the region of $500 each. So every month pop $50 into your savings.
        • I'd like to see your cheap card run my dual head setup at 4096x1536 in 32 bit @ 75Hz!

          You are asking me to drive my Honda Civic @ 300mph. Nice if you have a Ferrari and like to drive that fast, but for me 70mph is enough. As for the refresh rate issue, I had an SVGA monitor and my cheap card was able to display at 75Hz.

    • According to this page:

      http://www.pricewatch.com/1/37/3846-1.htm [pricewatch.com]

      the cheapest video card is $4.95

      So you have one of those, yeah?

      Or are you talking out of your arse?

      • A few years back when I built my 60MHz Pentium desktop, I specifically asked the shop to include the cheapest video card they had, and it turned out to be similar to the one you pointed out (PCI, 1MB). I was running Windows NT 4.0 without any games installed, so the cheap video card was plenty for me. Anyway, my current PC is a laptop, so I don't get to choose the video card.
        • Hmmm, I'm not going to give you a hard time about not wanting a top-end card for your computer, and as you have a laptop it's not such an issue.

          However, do consider that a reasonably high-end card can help with eye strain. Having a high refresh rate is a major plus on CRTs. Anything below 75Hz is tough on my eyes; I shoot for 85Hz, though. Most cards now will do that, but cards short on memory still have trouble doing it at a high resolution (1600x1200) in 24/32-bit color.
          • Don't you realize that refresh rate is balanced directly against persistence to create a stable image? Depending on the monitor's persistence, a lower scan rate can be as good or better for the eyes than your very high ones. High persistence of course makes rapid-motion stuff suffer, so it's not so common, but I've definitely used systems at 60Hz without any sort of problem and others (more recently) at 80Hz with unpleasantness.
      • This deserves a +1 Funny, b/c I have never seen a $4.95 graphics card - and that is amusing.

        Watch the sass Captain Sassy Pants
  • by Yeechang Lee ( 3429 ) on Monday June 23, 2003 @11:12PM (#6280847)
    The reason I don't know what kind of card I have is that, although I purchased a brand new, shrinkwrapped Radeon 9500 Pro retail box, I noticed that the card looked somewhat different from the OEM (I think Sapphire) 9500 Pro I'd used for a few days previously. Didn't think much of this until I started seeing indications in places like the XFree86 log that the card might be a 9700 instead of a 9500 Pro (I do realize the two cards both use the R300 NE chipset). Also, that Antalus flyby score is meaningfully higher than what Tom's Hardware [tomshardware.com] found for 1280x1024x32 on a much faster system than my two-year-old Athlon 1.4GHz. Haven't bothered to crack the case open again since, so until I do I'll happily enjoy the illusion that I somehow ended up with a 9700 for the price of a 9500 Pro.

    The driver I use is a binary-only one from a German reseller [schneider-digital.de]. They appear to be betas of forthcoming versions of ATi's own drivers. As I have XFree86 4.3.0 this has been a great blessing, as ATi's own Web site [ati.com] only has drivers for XFree86 4.1 and 4.2.
  • by reaper20 ( 23396 ) on Monday June 23, 2003 @11:40PM (#6281049) Homepage
    To hear an ATI engineer criticize NVIDIA drivers?

    I read it as a blatant attempt to persuade buyers that NVIDIA hardware is better than it actually is.

    Well gee Rich ... my NVIDIA card works at nearly the same performance in both Windows AND Linux. Kind of hard to criticize NVIDIA when your drivers barely work on ANY OS.

    when they should be producing better drivers which could enhance the value of all their existing hardware in the market.

    Maybe ATI should be the ones doing a little "enhancement of value" ... I can't count how many ATI users try Linux just to find that their brand new shiny Radeon is worthless in that OS. In the meantime, NVIDIA users are enjoying full support from their manufacturer.

    Keep whining about IP issues with Linux and trumpeting DirectX 9 compliance. What good is a card with horrible drivers?
    • Not all of us care about Linux, though. There aren't even any of the latest games being released on it that make use of high-end graphics.
      • Maybe if the drivers were better, more games would be released on Linux. More people might play games on Linux if the performance were higher, which would lead to developers seeing an interest in Linux ports, or to more money going to WineX [transgaming.com].
      • 4096x1536 in 32bit colour @ 75Hz across two 21" monitors.

        Worth every penny.

      • Not all of us care about Linux, though. There aren't even any of the latest games being released on it that make use of high-end graphics.

        Some people do care and they do not care how far your head is shoved up your...
        Ever hear of Unreal Tournament 2003/2004, Doom 3 or Neverwinter Nights?
        All are graphics-heavy games that demand the use of high-end graphics cards.
        • Yawn... 4 games out of how many that have been (or are about to be) released?

          Would that even total 1%?

          Maybe you should remove your head first...
          • I only listed 4 because I did not want to be listing them all day. It's still a young market, but it has a steady increase as more and more games are released for Linux. The Mac has a few more released titles, but there are a lot of projects that make Linux more of a gaming platform than the Mac. I've been seeing more titles released for Linux than for Macs recently.

            ATi refuses to see this and devotes all their time to making their Windows customers happy over and over and over again... Fine with me, if ATi does not want t

    • Everyone knows ATI has shit driver support.

      One of the most annoying things about ATI is their drivers. I'm not talking Radeon series here, b/c I have no experience with them, but

      I have an Xpert 2000 in my PC (I know, I know - I got it in '99, I think), and the driver support is absolutely horrible. When I first got the card, drivers wouldn't come out nearly as often as NVidia's, which is a major problem when the few drivers that did come out were complete crap.

      Granted, if ATI released drivers once in a
      • One of the most annoying things about ATI is their drivers. I'm not talking Radeon series here, b/c I have no experience with them

        ATI has really cleaned up their act with respect to drivers. They've moved to a unified driver architecture (CATALYST) for their Radeon series, not unlike NVidia's Detonator drivers. So they might be a little slower with driver rollout currently than NVidia is. ATI has impressed me with the speed at which they caught up with NVidia architecturally, and given the bad rap the
        • I was just fighting with a Radeon card last night that would REFUSE to install on a PC with a week-old install of XP. It gave a cryptic error that isn't mentioned in the documentation, nor on ATI's site, nor on the manufacturer's (PowerColor) site; hell, I only found a few hits on Google for it, most of which were in German.

          It's all about the drivers. I've owned 4 ATI products. Starting with a first gen Rage chip card that gave me no problems, then two cards over the next couple years (both All In Wonde
  • by Screaming Lunatic ( 526975 ) on Tuesday June 24, 2003 @12:02AM (#6281209) Homepage
    Well duh. It's called the lowest common denominator. The game has to run well on the minimum requirements machine. And the game has to be solid on the recommended requirements machine. It's called scale. You have to support the greatest number of users by putting in a reasonable amount of effort.

    Newer features are exposed in newer drivers, so they may be buggy and perform sub-optimally. There's a learning curve. And if newer hardware is difficult enough for the developer to get, how many users will have it?

    Build a modular engine, and try to squeeze that feature into the next iteration.
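    As a rough sketch of what that kind of modularity can look like in C++ (the names here - IRenderPath, FixedFunctionPath, ShaderPath - are hypothetical, not taken from any particular engine):

        // Hypothetical sketch of a "modular engine" with swappable render paths.
        // One path targets the minimum-requirements machine, another the
        // recommended one; a new path can be slotted into the next iteration
        // without touching the rest of the engine.
        struct IRenderPath {
            virtual ~IRenderPath() {}
            virtual void DrawScene() = 0;
        };

        struct FixedFunctionPath : IRenderPath {   // minimum-spec machine
            void DrawScene() { /* multipass, fixed-function rendering */ }
        };

        struct ShaderPath : IRenderPath {          // recommended-spec machine
            void DrawScene() { /* per-pixel lighting via pixel shaders */ }
        };

        // Chosen once at startup from whatever capability query the API provides.
        IRenderPath* CreateRenderPath(bool hasPixelShaders) {
            if (hasPixelShaders) return new ShaderPath;
            return new FixedFunctionPath;
        }

        int main() {
            IRenderPath* path = CreateRenderPath(false);  // pretend the caps check failed
            path->DrawScene();
            delete path;
            return 0;
        }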

    • Personally, I couldn't care less if a game uses all the fancy features of a graphics card... if the game isn't fun, all the graphics in the world won't help it. At least ATI has a card with ASCII acceleration [bbspot.com], heheh.

      Seriously though, I used to have an ATI a while back, and eventually bought a GeForce2 MX mostly because it was supported in Linux. I completely expect this card to still be in my system 4 years from now, since I hardly ever use 3D acceleration. If I want video games, I have my Dreamcast.
    • by Apreche ( 239272 ) on Tuesday June 24, 2003 @01:11AM (#6281582) Homepage Journal
      That is true. However, if you install the newest DirectX (which will install no matter how shitty your PC is) and you try to play a game written for, say, a GeForce4 when you have a TNT2, then your PC can figure out what the card should do and what the CPU should do. So there is no reason for game designers to code to the least common denominator; they just have to code for the newest DirectX/OpenGL, and the user's PC turns on available options for their card and disables the others.
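      As a minimal sketch of that capability detection (assuming Direct3D 9 and its standard d3d9.h header; the particular features checked here are just illustrative):

          #include <d3d9.h>   // link against d3d9.lib

          int main() {
              IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
              if (!d3d) return 1;

              // Ask the runtime what the default adapter's HAL device can do.
              D3DCAPS9 caps;
              d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

              // Turn options on or off based on what the card reports.
              bool usePixelShaders  = caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0);
              bool useVertexShaders = caps.VertexShaderVersion >= D3DVS_VERSION(2, 0);
              bool useLargeTextures = caps.MaxTextureWidth >= 2048;

              // ... a real game would create the device here and pick its render
              // paths from these flags; lower-end cards simply get the cheaper paths.

              d3d->Release();
              return 0;
          }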

      The graphics card is under-used for two reasons. The first is that game designers, as you said, want the game to work well on lower-end platforms. If your CPU is good enough you can add better graphical features even if your video card doesn't support them. For example, Descent 3 adds motion blur if your CPU is l33t enough. However, the main reason that video cards are underused is that it's a fucking pain in the ass.

      Let's say, for instance, that your name is not John Carmack and you want to make a 3D game. You code it in C++, Cg, Objective C, Visual C++, OpenGL, DirectX 9, whatever. You have all these different layers of 3D graphics to deal with, unless you just use someone else's engine. You've got shadows, lighting, colors, textures, mip mapping, bump mapping, shading, animating; it goes on and on with the insane number of things you have to do. When new APIs emerge that let you use the video card to do more stuff, it is very difficult to learn that much more crap.

      It happens all the time no matter what you are coding. Ohhh, there's a function in that library that does that for me! I just wasted my time! Ohhh, there's a way that I can make the video card do that really easily and I will save some cpu and some ram! Nobody, sans Carmack, knows this stuff well enough that they can use the graphics card 100% of the time it is possible and better to do so. Except of course in the most graphically simple of programs. However, I do urge game developers to try to use the GPU more than the CPU. It results in many advantages for the gamer.
      • That is true. However, if you install the newest DirectX (which will install no matter how shitty your PC is) and you try to play a game written for, say, a GeForce4 when you have a TNT2, then your PC can figure out what the card should do and what the CPU should do.

        If you're saying that the performance of a GF4 can be equivalent to the performance of a TNT2 + fast CPU, then you my friend are on crack.

        So, there is no reason for game designers to code to the least common denominator, they just have to co

        • If you're saying that the performance of a GF4 can be equivalent to the performance of a TNT2 + fast CPU, then you my friend are on crack.

          If you'd read the post, you'd see that that's not what he's saying at all. What he's saying is that by coding to an abstracted API that is common across many different chipsets, you don't have to code to the chipset at all. So if the game uses DirectX, you can run the game on a TNT2 or a GeForce4 and the application will still *work*. He didn't say it would work well on a TNT2 if it's written to GeForce4 level capabilities.
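          For the OpenGL side of the same idea, the usual pattern (sketched here with a made-up lightmapping example; it assumes a current GL context already exists) is to ask the driver at runtime which extensions it exposes and pick a code path accordingly:

              #include <GL/gl.h>
              #include <cstring>

              // Classic (if slightly sloppy) substring test of the era; a robust
              // check would match whole extension tokens.
              static bool HasExtension(const char* name) {
                  const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
                  return exts != 0 && std::strstr(exts, name) != 0;
              }

              void DrawLitSurface() {
                  if (HasExtension("GL_ARB_multitexture")) {
                      // single-pass base texture + lightmap on cards that expose it
                  } else {
                      // multipass fallback that any GL 1.1 card can run
                  }
              }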
          • What he's saying is that by coding to an abstracted API that is common across many different chipsets, you don't have to code to the chipset at all. So if the game uses DirectX, you can run the game on a TNT2 or a GeForce4 and the application will still *work*. He didn't say it would work well on a TNT2 if it's written to GeForce4 level capabilities.

            But, if you code to an abstracted API in such a way that it will work on a TNT2, you are definitely *not* using the full capabilities of the card, unless you'
      • That is true. However, if you install the newest DirectX (which will install no matter how shitty your PC is) and you try to play a game written for, say, a GeForce4 when you have a TNT2, then your PC can figure out what the card should do and what the CPU should do. So there is no reason for game designers to code to the least common denominator; they just have to code for the newest DirectX/OpenGL, and the user's PC turns on available options for their card and disables the others.

        Well, you have some cl
  • by blincoln ( 592401 ) on Tuesday June 24, 2003 @12:11AM (#6281263) Homepage Journal
    "As we've seen from the recent furore over driver cheats it seems likely that they don't plan to let their own inferior hardware come between them and first place."

    I kept expecting him to rip into Nvidia's momma at some point.
  • Yeah, right (Score:1, Insightful)

    by Anonymous Coward
    "ATI gives Linux drivers quite a high priority"

    Can you say bullshit?
    • Re:Yeah, right (Score:3, Insightful)

      by mahdi13 ( 660205 )
      Can you say bullshit?

      BULLSHIT, yup...I can say it

      This guy has no idea what he's talking about, claiming that ATi makes good drivers for Linux... they barely have ANY drivers for Linux. They have 2 RPMs, one for XFree86 4.1 and one for XFree86 4.2; XFree86 4.3 has been available for 6 months and they don't have squat for it. Unless you trust a third-party driver for a card that is not even a Radeon, I would much rather see the CHIP MAKER releasing any kind of driver that works... binary or not.

      Not to mention these
  • by Anonymous Coward
    What balls! ATI were the first to do 'application specific optimisations' [slashdot.org]! And their DX table fog (or was it vertex fog?) still doesn't work properly!
  • Well gee whiz. And I suppose we should thank them?
    After all, you already practically have to change PCs every 18 months to keep framerates fluid.
  • ...the graphic card companies hardly ever release information on how to optimize cards for a given operating system.

    Dolemite