DirectX Architect — Consoles as We Know Them Are Gone
ThinSkin writes "DirectX architect Alex St. John swims against the current and predicts the demise not of PC gaming, but of game consoles, in an exclusive two-part interview at ExtremeTech. In part one, Alex blasts Intel for pushing its inferior onboard graphics technology to OEMs, insists that fighting piracy is the main reason for the existence of gaming consoles, and explains how the convergence of the GPU and the CPU is the next big thing in gaming. Alex continues in part two with more thoughts on retail and 3D games, and discusses in detail why he feels 'Vista blows' and what's to become of DirectX 10."
Why Microsoft Dislikes Intel Graphics (Score:5, Informative)
Bruce
WildTangent has been a dead end since 2001 (Score:4, Informative)
Bottom line: Nothing to see here, move along.
Re:Why Microsoft Dislikes Intel Graphics (Score:5, Informative)
I'll enumerate the primary reasons quickly, since I don't expect you to be intimately familiar with the relationship between graphics programmers and graphics driver developers (it's drastically different from Intel's relationship with the X developers):
1) Intel graphics drivers are possibly the most inconsistent drivers on the market. Any given user with a particular Intel chipset might have one of a hundred different driver configurations, because the chips are bundled with different motherboards, each of which ships with its own driver package... and when you add pre-built machine vendors into the mix, the situation only gets worse. If their driver quality were extremely high across the board, this wouldn't be an issue, but...
2) Intel graphics drivers have a bad stability track record, at least on Windows. They have a tendency to return invalid/nonsensical error codes from driver calls that shouldn't be able to fail, or to silently fail out inside a driver call instead of returning the error code they're supposed to... resulting in graphics programmers having to special-case handling of individual Intel graphics chipsets (and even driver revisions). In my case, I ended up just having to shut off entire blocks of my hardware-accelerated pipeline on Intel chipsets and replace them with custom software implementations to avoid the incredible hassle involved in coming up with specific fixes. (The wide variety of chipsets and drivers out there meant that for my particular project - an indie game - it was impossible to ensure that I had worked around every bug a user was likely to hit, so I had to just opt out of hardware accel in problem areas entirely).
3) Intel graphics chipsets have sub-par performance across the board, despite marketing claims otherwise. This is mostly a problem for people developing 'cutting-edge' games, where it creates a 'he-said-she-said' situation: a game developer/publisher claims that a user's video chipset is insufficient to run a game, while Intel claims the complete opposite. (In most cases, Intel is lying.) This is particularly troublesome in areas like support for cutting-edge shader technology, where an Intel chipset may 'support' a feature like Pixel Shader Model 3.0 but implement it in such a way as to make it completely unusable. Users don't benefit from this, and neither do developers.
4) Intel graphics chipsets harm the add-on graphics market by discouraging users from picking up a (significantly better) bargain video card from NVidia/ATI for $50 and dropping it into their machine. This hurts everyone, because even though that bargain card is significantly better (and most likely more reliable), the user has already 'paid' for the integrated chipset on their motherboard, and the documentation that comes with it tries to convince them they don't need a video card. I consider this a dramatic step backward from the situation years ago, when integrated graphics chipsets were unheard of: people had the option of 'bargain 2D' video cards from Trident or Matrox that did everything needed for desktop 2D, plus the option of fairly affordable 3D accelerator cards if they wanted to play games occasionally.
On the bright side, most integrated ATI/NVidia GPUs these days are mature enough to be able to run games acceptably and meet the needs of a typical user. The only thing really holding the market back here, in my opinion, is Intel's insistence on marketing inferior products instead of partnering with ATI or NVidia to please their customers.
Of course, this is unrelated to your point that their Linux/Free Software support is superb, as is their documentation - I'm inclined to agree with you here, but it unfortunately doesn't do much to outweigh their other grievous sins.
Re:Piracy? (Score:3, Informative)
First, correlation is not causation.
Second, NPD showed that PS3 has been outselling 360 in Jan '08 and Feb '08.
Re:WildTangent has been a dead end since 2001 (Score:4, Informative)
I try to avoid doing business with companies that employ those kinds of tactics.
Re:Why consoles will win (Score:3, Informative)
Re:That's great, but this isn't a hardware problem (Score:3, Informative)
When was the last time your PlayStation got a virus?
PCs may be notorious for viruses, but only if you don't keep them secure.
Besides, a game-only PC wouldn't have to worry about viruses if it never downloaded anything from the internet. Granted, even if you download stuff, it takes, what, under 20 seconds to scan a file? I've caught a couple of game patches with viruses that way.
How much do you spend on your PlayStation's anti-virus software every month?
AVG, many FOSS alternatives, etc. are free as in beer.
How many controllers can you plug into your PC?
Let's see: joystick, keyboard, mouse, gamepad, guitar...
You don't even need some of those. A standard keyboard has over 100 keys and replaces gamepads. Then the mouse replaces joysticks and, again, gamepads.
When was the last time you had to install a game on your XBox?
Good point.
Or install drivers for your newest controller?
Never, since all mine are plug and play. And when was the last time a wireless controller came standard with your PC, only for you to have to buy extras so you don't burn through batteries?
PC: Mouse to USB, Keyboard to USB, headphones to headphone jack, microphone to microphone jack
360: Batteries to controller, trial and error making the controller work since I didn't read the manual, headset which I never use except on Live, batteries to trash after only 12 hours of straight playing, then find more batteries. Or: go to the store, try to find a freaking charge pack, plug in the controller, then plug in the 360.
Or work through compatibility issues between your latest game and your PS3's GPU?
Or had the ability to work through customizing graphics to meet your tastes?
It's also true that for the price of a microwave, I can get a nice laptop, that connects to the internet and all that. But it kinda sucks at heating food, doesn't it?
What? That makes no sense. Okay, it makes sense, but not in context.
There's a reason the Wii is selling so well, even though it doesn't even support HD graphics. People don't want something with internet, that can do their taxes, that catches viruses, that they can read their email on, or that has the bestest fastest hardware.
So I suppose Xbox Live is wasted since people don't want internet? I suppose people don't have PCs, but have Xboxes now?
They want something they can play fun games on, with other people, in their living area where the television is, on something that isn't the size of a desktop PC.
How big is a PC case? You do also realize that there are S-Video hookups, right? There's also other ways to hook up a PC to the TV. No monitor required.
And they want those games to work when they plug them in, every time. About the limit you can expect from a console consumer is blowing the dust off the cartridge pins.
The secret to stable PC gaming: Clean installations of Windows without viruses and other malware.
Are PCs more powerful? Sure. But there's a whole bunch of overhead that comes with the PC's advantages over a game console, and it's just not worth it to the majority of console players.
So, customizable graphics, modding, (generally) free internet play, 100+ keys, a mouse, and fully customizable controls are not worth it? Their loss.
As far as I'm concerned, they both have their pros and cons. The PC's cons are major compatibility issues if you have borderline hardware, a dirty system, or old drivers. Drivers also have to be updated all the freaking time.
Consoles, on the other hand, seem to fall behind the PC in graphics after the first year and a half (unless they launch with super powerful hardware unavailable to the PC market). They also have forced control schemes, especially in FPSs: maybe I want melee to be 'right trigger'? But, no! It won't let me! It must be 'B', 'ri
3D? Meh. (Score:2, Informative)
One of my fav games was Beyond Good and Evil. I *liked* the stylised, cartoonish characterisations. Anyone who loves anime feels some trepidation at the rise of completely 3D-rendered visuals. They have their place, but better 3D doesn't make a better movie or a better game.
I wish they'd put more effort into AI and character movement. What we really need for *immersion* (and better 3D is not equivalent to better immersion either) is dynamic character movement and AI. Sod all this 3D stuff; it's just serving the hardware industry while real innovation gets sidelined.