ATI Talks Game Support, Future Of Graphics Cards
Sergio writes "Slo-Tech have posted an interview with ATI's representative in game developer support, Richard Huddy. He touched on many subjects, including the evolution of DirectX and OpenGL, why ATI doesn't provide much information to Linux driver developers, and the most common mistakes of game developers ('Nine out of ten games under-use the graphics card. That's amazing, and it's been true for the last three or four years.')"
I always buy the cheapest graphics card (Score:2, Interesting)
Re: (Score:2)
Better gaming experience? (Score:3, Insightful)
I can buy an Xbox or PS2 already, and they provide better gaming experiences.
Really? That depends on what experience you're looking for. I play both PC and console games and I generally prefer PC games to consoles for the exact reason you cited: Better gaming experiences.
I like the available titles better. I like the graphics better. I even like the controls better. For example, playing an FPS with an Xbox or PS2 controller doesn't come close to the keyboard/mouse or keyboard/controller-of-choice combination.
FYI (Score:2)
However, that doesn't negate your argument.
And another reply to the parent: I'd like to see your cheap card run my dual-head setup at 4096x1536 in 32-bit @ 75Hz!
Treat almost all computer parts as consumables and the sums are quite simple: every two years you're going to need 1. a new VGA card and 2. a new CPU/MB/RAM.
If you're clever you will offset these a year apart.
The required budget for each will be in the region of $500. So every month pop $50 in your savings.
Re:FYI (Score:2)
You are asking me to drive my Honda Civic @ 300mph. Nice if you have a Ferrari and like to drive that fast, but for me 70mph is enough. About the refresh rate issue: I had an SVGA monitor, and my cheap card was able to display at 75Hz.
Liar (Score:2)
http://www.pricewatch.com/1/37/3846-1.htm [pricewatch.com]
the cheapest video card is $4.
So you have one of those, yeah?
Or are you talking out of your arse?
Did you have to call me liar? (Score:2)
Re:Did you have to call me liar? (Score:1)
However, do consider that a reasonably high-end card can help with eye strain. Having a high refresh rate is a major plus on CRTs. Anything below 75Hz is tough on my eyes; I shoot for 85Hz, though. Most cards now will do that, but under-memoried cards still have trouble managing it at a high resolution (1600x1200) in 24/32-bit color.
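For a rough sense of why memory matters at that resolution, here's a back-of-the-envelope sketch in C++ (my own illustrative numbers, not anything from the interview): a double-buffered 32-bit framebuffer plus a depth buffer at 1600x1200 eats most of a small card before a single texture is loaded.

    // Rough estimate of on-card memory needed just for the framebuffer at
    // 1600x1200, 32-bit colour, double-buffered, with a 32-bit depth buffer.
    #include <cstdio>

    int main() {
        const long width = 1600, height = 1200;
        const long bytesPerPixel = 4;   // 32-bit colour
        const long colourBuffers = 2;   // front + back (double buffering)
        const long depthBytes = 4;      // 32-bit Z buffer

        long bytes = width * height * (bytesPerPixel * colourBuffers + depthBytes);
        printf("Framebuffer alone: %.1f MB\n", bytes / (1024.0 * 1024.0));
        // Roughly 22 MB before any textures -- tight on a 16 or 32 MB card.
        return 0;
    }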
Re:Did you have to call me liar? (Score:1)
Re:Liar (Score:1)
Watch the sass, Captain Sassy Pants.
A couple of clarifications (Score:3, Informative)
The driver I use is a binary-only one from a German reseller [schneider-digital.de]. They appear to be betas of forthcoming versions of ATi's own drivers. As I have XFree86 4.3.0 this has been a great blessing, as ATi's own Web site [ati.com] only has drivers for XFree86 4.1 and 4.2.
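For anyone wondering what installing those binary drivers amounts to, it mostly comes down to dropping the kernel and X modules in place and pointing XF86Config at them. A minimal Device section might look something like this (a sketch only; I'm assuming the package registers itself under the usual fglrx module name, so check the README that ships with it):

    Section "Device"
        Identifier "ATI FireGL"
        Driver     "fglrx"    # binary module installed by the driver package
    EndSection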
Re:A couple of clarifications (Score:1)
Re:A couple of clarifications (Score:2)
Anyone find it ironic.... (Score:3, Interesting)
I read it as a blatant attempt to persuade buyers that NVIDIA hardware is better than it actually is.
Well gee Rich
when they should be producing better drivers which could enhance the value of all their existing hardware in the market.
Maybe ATI should be the ones doing a little "enhancement of value"
Keep whining about IP issues with Linux and trumpeting DirectX 9 compliance. What good is a card with horrible drivers?
Re:Anyone find it ironic.... (Score:1)
Re:Anyone find it ironic.... (Score:2, Interesting)
graphics isn't just games (Score:1)
Worth every penny.
Re:Anyone find it ironic.... (Score:2)
Some people do care and they do not care how far your head is shoved up your...
Ever hear of Unreal Tournament 2003/2004, Doom 3 or Neverwinter Nights?
All are graphically demanding games that call for high-end graphics cards.
Re:Anyone find it ironic.... (Score:1)
Would that even total 1%?
Maybe you should remove your head first...
Re:Anyone find it ironic.... (Score:2)
ATi refuses to see this and devotes all their time to making their Windows customers happy over and over and over again... Fine with me, if ATi does not want t
Re:Anyone find it ironic.... (Score:1)
Everyone knows ATI has shit driver support.
One of the most annoying things about ATI is their drivers. I'm not talking about the Radeon series here, b/c I have no experience with them, but
I have an Xpert 2000 in my PC (I know, I know - I got it in '99, I think), and the driver support is absolutely horrible. When I first got the card, drivers wouldn't come out nearly as often as NVidia's, which was a major problem when the few drivers that did come out were complete crap.
Granted, if ATI released drivers once in a
Re:Anyone find it ironic.... (Score:3, Informative)
ATI has really cleaned up their act with respect to drivers. They've moved to a unified driver architecture (CATALYST) for their Radeon series, not unlike NVidia's Detonator drivers. They might be a little slower with driver rollout currently than NVidia is. ATI has impressed me with the speed at which they caught up with NVidia architecturally, and given the bad rap the
Re:Anyone find it ironic.... (Score:2)
It's all about the drivers. I've owned 4 ATI products. Starting with a first-gen Rage chip card that gave me no problems, then two cards over the next couple of years (both All In Wonder
Nine out of ten games under-use the graphics card (Score:5, Informative)
Newer features are exposed in newer drivers, so they may be buggy and perform sub-optimally. There's a learning curve. And if the newer hardware is hard enough for the developer to get hold of, how many users will actually have it?
Build a modular engine, and try to squeeze that feature into the next iteration.
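A minimal sketch of what "modular" means here, with made-up class names: hide the render paths behind an interface so a shader-based path can be slotted in during a later iteration without disturbing the rest of the game.

    // Sketch only -- names and the capability flag are invented for illustration.
    #include <memory>

    struct RenderCaps { bool hasPixelShaders = false; };

    class RenderPath {
    public:
        virtual ~RenderPath() = default;
        virtual void drawScene() = 0;
    };

    class FixedFunctionPath : public RenderPath {   // works on older cards
    public:
        void drawScene() override { /* multitexture blending, etc. */ }
    };

    class ShaderPath : public RenderPath {          // added in a later iteration
    public:
        void drawScene() override { /* per-pixel lighting via shaders */ }
    };

    std::unique_ptr<RenderPath> makeRenderPath(const RenderCaps& caps) {
        if (caps.hasPixelShaders)
            return std::make_unique<ShaderPath>();
        return std::make_unique<FixedFunctionPath>();
    }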
Graphics are nice, but gameplay is where it's at (Score:2)
Seriously though, I used to have an ATI a while back, and eventually bought a GeForce2 MX mostly because it was supported in Linux. I completely expect this card to still be in my system 4 years from now, since I hardly ever use 3D acceleration. If I want video games, I have my Dreamcast.
Re:Nine out of ten games under-use the graphics ca (Score:4, Interesting)
The graphics card is under-used for two reasons. The first is that game designers, as you said, want the game to work well on lower-end platforms. If your CPU is good enough you can add better graphical features even if your video card doesn't support them. For example, Descent 3 adds motion blur if your CPU is L33t enough. However, the main reason that video cards are underused is that it's a fucking pain in the ass.
Let's say, for instance, that your name is not John Carmack, and you want to make a 3D game. You code it in C++, Cg, Objective C, Visual C++, OpenGL, DirectX 9, whatever. You have all these different layers of 3D graphics to deal with, unless you just use someone else's engine. You've got shadows, lighting, colors, textures, mip mapping, bump mapping, shading, animating; it goes on and on with the insane number of things you have to do. When new APIs emerge that let you use the video card to do more stuff, it is very difficult to learn that much more crap.
It happens all the time no matter what you are coding. Ohhh, there's a function in that library that does that for me! I just wasted my time! Ohhh, there's a way that I can make the video card do that really easily and save some CPU and some RAM! Nobody, sans Carmack, knows this stuff well enough to use the graphics card 100% of the time it is possible and better to do so, except of course in the most graphically simple of programs. However, I do urge game developers to try to use the GPU more than the CPU. It results in many advantages for the gamer.
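To make that concrete, here's roughly what "make the video card do that" can look like in practice: a sketch (function and extension names are just examples, and it assumes a GL context is already current) that checks the OpenGL extension string the classic pre-GL-3.0 way and only takes the GPU path if the feature is there.

    // Sketch: pick a GPU or CPU path at runtime based on what the driver exposes.
    #include <GL/gl.h>
    #include <cstring>

    static bool hasExtension(const char* name) {
        // Classic (pre-GL 3.0) extension check; assumes a current GL context.
        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return exts && std::strstr(exts, name) != nullptr;
    }

    void setupVertexProcessing() {
        if (hasExtension("GL_ARB_vertex_program")) {
            // Let the GPU transform and light vertices.
        } else {
            // Fall back to doing that work on the CPU each frame.
        }
    }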
Re:Nine out of ten games under-use the graphics ca (Score:2)
If you're saying that the performance of a GF4 can be equivalent to the performance of a TNT2 + fast CPU, then you my friend are on crack.
So, there is no reason for game designers to code to the least common denominator; they just have to co
Re:Nine out of ten games under-use the graphics ca (Score:1)
If you'd read the post, you'd see that that's not what he's saying at all. What he's saying is that by coding to an abstracted API that is common across many different chipsets, you don't have to code to the chipset at all. So if the game uses DirectX, you can run it on a TNT2 or a GeForce4 and the application will still *work*. He didn't say it would work well.
Re:Nine out of ten games under-use the graphics ca (Score:1)
But, if you code to an abstracted API in such a way that it will work on a TNT2, you are definitely *not* using the full capabilities of the card, unless you'
Re:Nine out of ten games under-use the graphics ca (Score:1)
Well, you have some cl
Best quote ever from a hardware company (Score:4, Funny)
I kept expecting him to rip into Nvidia's momma at some point.
Yeah, right (Score:1, Insightful)
Can you say bullshit?
Re:Yeah, right (Score:3, Insightful)
BULLSHIT, yup...I can say it
This guy has no idea what he's talking about, claiming that ATi makes good drivers for Linux... they barely have ANY drivers for Linux. They have 2 RPMs, one for XFree86 4.1 and one for XFree86 4.2... XFree86 4.3 has been available for 6 months and they don't have squat for it. Unless you trust a third-party driver for a card that is not even a Radeon, I would much rather see the CHIP MAKER releasing any kind of driver that works... binary or not.
Not to mention these
Pot! Kettle! Black! Pot! Kettle! Black! (Score:2, Informative)
under-use the graphics card? (Score:1)
After all, you already practically have to replace your PC every 18 months just to keep framerates fluid.
Re:under-use the graphics card? (Score:2)
That would be yours. Someone putting a gun to your head to make you buy that shit-hot game? Didn't think so.
9 out of 10 gamers underuse the cards because... (Score:2)
Dolemite