DirectX Architect — Consoles as We Know Them Are Gone

ThinSkin writes "DirectX architect Alex St. John swims against the current and predicts the demise not of PC gaming, but of game consoles, in an exclusive two-part interview at ExtremeTech. In part one, Alex blasts Intel for pushing its inferior onboard graphics technology to OEMs, insists that fighting piracy is the main reason for the existence of gaming consoles, and explains how the convergence of the GPU and the CPU is the next big thing in gaming. Alex continues in part two with more thoughts on retail and 3D games, and discusses in detail why he feels 'Vista blows' and what's to become of DirectX 10."
  • by Bruce Perens ( 3872 ) * <bruce@perens.com> on Friday March 21, 2008 @07:32PM (#22825154) Homepage Journal
    Microsoft dislikes Intel graphics because they're publicly documented for full 3D use by Linux and other Free Software. Intel has put a tremendous amount of time into developing X for them, employing many of the key X developers. I use them on a laptop and a desktop, and they work excellently. They are not yet as fast as some other graphics chips. But then again they are better than anything we had at Pixar when I was there :-) Time flies.

    Bruce

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Friday March 21, 2008 @07:42PM (#22825226)
    WildTangent actually gained some attention back in 2001, when they offered a web 3D plugin and a dev environment that didn't cost a bazillion dollars. They dragged their heels, only offered their platform for Windows, and basically ignored any opinion leaders in multimedia or VM-based gaming & 3D. WildTangent today is next to insignificant, and their 'Orb' VM console (which afaict only runs on MS OSes) is nothing but a pimped WildTangent Plugin/Player and won't gain any traction beyond some niche group that wants to play a console game on the PC, for whatever reason.

    Bottom line: Nothing to see here, move along.
  • That's interesting, but this article is about someone who doesn't work for Microsoft anymore, and hates Intel graphics chips for the same reason any other game developer hates them: They're utter garbage.

    I'll enumerate the primary reasons quickly, since I don't expect you to be intimately familiar with the relationship between graphics programmers and graphics driver developers (it's drastically different from Intel's relationship with the X developers):

    1) Intel graphics drivers are possibly the most inconsistent drivers on the market. Any given user with a particular Intel chipset might have one of a hundred different driver configurations, because the chips are bundled with different motherboards which then come with their own driver packages... and when you add pre-built machine vendors into the mix, the situation only gets worse. If their driver quality were extremely high across the board, this wouldn't be an issue, but...

    2) Intel graphics drivers have a bad stability track record, at least on Windows. They have a tendency to return invalid/nonsensical error codes from driver calls that shouldn't be able to fail, or to silently fail inside a driver call instead of returning the error code they're supposed to... resulting in graphics programmers having to special-case the handling of individual Intel graphics chipsets (and even driver revisions). In my case, I ended up having to shut off entire blocks of my hardware-accelerated pipeline on Intel chipsets and replace them with custom software implementations to avoid the incredible hassle involved in coming up with specific fixes. (The wide variety of chipsets and drivers out there meant that for my particular project - an indie game - it was impossible to ensure that I had worked around every bug a user was likely to hit, so I had to opt out of hardware accel in problem areas entirely.)

    3) Intel graphics chipsets have sub-par performance across the board, despite marketing claims otherwise. This is mostly problematic for people developing 'cutting-edge' games software, where it creates a 'he-said-she-said' situation with a game developer/publisher claiming that a user's video chipset is insufficient to run a game while Intel claims the complete opposite. (In most cases, Intel is lying.) This is particularly troublesome in areas like support for cutting-edge shader technology, where an Intel chipset may 'support' a feature like Pixel Shader Model 3.0 but implement it in such a way as to make it completely unusable. Users don't benefit from this, and neither do developers. (A rough sketch of the kind of vendor/caps special-casing this forces on developers appears at the end of this comment.)

    4) Intel graphics chipsets harm the add-on graphics market by discouraging users from picking up a (significantly better) bargain video card from NVidia/ATI for $50 and dropping it into their machine. This hurts everyone because even though that bargain card is significantly better (and most likely more reliable), the user already 'paid' for the integrated chipset on their motherboard, and the documentation that comes with it attempts to make them believe that they don't need a video card. I consider this a dramatic step backward compared to the situation years ago, when integrated graphics chipsets were unheard of and people instead had the option of 'bargain 2d' video cards like Trident or Matrox that would do everything needed for desktop 2D, but also had the option of fairly affordable 3D accelerator cards if they wanted to play games occasionally.

    On the bright side, most integrated ATI/NVidia GPUs these days are mature enough to be able to run games acceptably and meet the needs of a typical user. The only thing really holding the market back here, in my opinion, is Intel's insistence on marketing inferior products instead of partnering with ATI or NVidia to please their customers.

    Of course, this is unrelated to your point that their Linux/Free Software support is superb, as is their documentation - I'm inclined to agree with you here, but it unfortunately doesn't do much to outweigh their other grievous sins.
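
    A minimal sketch of what that detect-and-special-case dance looks like in practice, assuming a Direct3D 9 engine. The function name, the printf, and the all-or-nothing fallback policy are invented for illustration; the D3D9 calls and Intel's PCI vendor ID (0x8086) are real.

        #include <d3d9.h>
        #include <cstdio>
        #pragma comment(lib, "d3d9.lib")

        // Decide whether to abandon the hardware-accelerated path entirely,
        // in the spirit of points 2 and 3 above. Returns true = software fallback.
        bool ShouldUseSoftwareFallback()
        {
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d)
                return true;  // couldn't even create the D3D9 object: take the safe path

            D3DADAPTER_IDENTIFIER9 id = {};
            if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
                d3d->Release();
                return true;  // a call that "shouldn't fail" failed anyway (point 2)
            }

            D3DCAPS9 caps = {};
            HRESULT hr = d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
            d3d->Release();
            if (FAILED(hr))
                return true;

            const bool isIntel    = (id.VendorId == 0x8086);  // Intel's PCI vendor ID
            const bool claimsPS30 = (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0));

            // Caps bits only say a feature is exposed, not that it runs at a usable
            // speed (point 3), so this sketch simply refuses the HW path on Intel parts.
            if (isIntel) {
                std::printf("Intel chipset detected (%s); using software fallback\n",
                            id.Description);
                return true;
            }
            return !claimsPS30;
        }

    In a real engine you'd presumably key finer-grained workarounds off id.DeviceId and id.DriverVersion instead of dropping hardware acceleration wholesale, but the detect-then-special-case shape is the same.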
  • Re:Piracy? (Score:3, Informative)

    by Jeff DeMaagd ( 2015 ) on Friday March 21, 2008 @08:55PM (#22825738) Homepage Journal
    Whereas the PS3, still a long way from being hacked, doesn't sell as well. Go figure.

    First, correlation is not causation.

    Second, NPD showed that PS3 has been outselling 360 in Jan '08 and Feb '08.
  • by jadin ( 65295 ) on Friday March 21, 2008 @10:06PM (#22826166) Homepage
    Didn't one of their products forcefully install also? (maybe just sneaking in with another install or something) I seem to remember uninstalling something with the name WildTangent that I _know_ I didn't agree to, with the possible exception of some EULA fine print /grumble.

    I try to avoid business with companies that employ those kinds of tactics.
  • by Tycho ( 11893 ) on Friday March 21, 2008 @11:12PM (#22826524)
    This may make you feel worse. Gameplay in Unreal Tournament III on the PS3 runs 15% slower than in the Windows version. That is, independent of the actual system or frame rate, the PS3 version will perceptually not feel as fast, though the controls should respond in a similar way. The speed difference is perceptible; it may have been introduced to bring the PS3 version of UT3 down to more of a Halo 3-like speed, which console gamers would have expected. The speed difference also ruins any sort of chance for cross-platform play between the Windows and PS3 versions, and is probably bad enough to require some reacclimation time for people who are used to one version and move to the other. Besides, any console controller is a poor input device for UT3 or any other FPS, because the difference in the level of competitiveness is skewed far in the direction of the mouse and keyboard. Roughly speaking, experienced controller players lose on a consistent basis to experienced mouse and keyboard players. It is significant enough that the PS3 version of UT3 allows an admin to ban mouse and keyboard players.
  • by Minozake ( 1227554 ) <ltdonny@gmail.com> on Saturday March 22, 2008 @01:02AM (#22827004) Journal

    When was the last time your PlayStation got a virus?

    PCs may be notorious for viruses, but only if you don't keep them secure.

    Besides, a games-only PC wouldn't have to worry about viruses if it never downloaded anything from the internet. Granted, even if it downloads stuff, it takes, what, under 20 seconds to scan a file? I've gotten a couple of game patches with viruses.

    How much do you spend on your PlayStation's anti-virus software every month?

    AVG, many FOSS alternatives, etc. are free as in beer.

    How many controllers can you plug into your PC?

    Let's see: joystick, keyboard, mouse, gamepad, guitar...

    You don't even need some of those. A standard keyboard has over 100 keys and replaces gamepads. Then the mouse replaces joysticks and, again, gamepads.

    When was the last time you had to install a game on your XBox?

    Good point.

    Or install drivers for your newest controller?

    Never, since all mine are plug and play. When's the last time a wireless controller was standard with your PC and you had to buy extra things to make it so you don't have to use batteries?

    PC: Mouse to USB, Keyboard to USB, headphones to headphone jack, microphone to microphone jack

    360: Batteries to controller, trial and error making the controller work since I didn't read the manual, headset which I never use except on Live, batteries to trash after only 12 hours of straight playing, then find more batteries. OR: Go to the store, try to find a freaking charge pack, plug in the controller, then plug in the 360.

    Or work through compatibility issues between your latest game and your PS3's GPU?

    Or had the ability to work through customizing graphics to meet your tastes?

    It's also true that for the price of a microwave, I can get a nice laptop, that connects to the internet and all that. But it kinda sucks at heating food, doesn't it?

    What? That makes no sense. Okay, it makes sense, but not in context.

    There's a reason the Wii is selling so well, even though it doesn't even support HD graphics. People don't want something with internet, that can do their taxes, that catches viruses, that they can read their email on, or that has the bestest fastest hardware.

    So I suppose Xbox Live is wasted since people don't want internet? I suppose people don't have PCs, but have Xboxes now?

    They want something they can play fun games on, with other people, in their living area where the television is, on something that isn't the size of a desktop PC.

    How big is a PC case? You do realize that there are S-Video hookups, right? There are also other ways to hook up a PC to the TV. No monitor required.

    And they want those games to work when they plug them in, every time. About the limit you can expect from a console consumer is blowing the dust off the cartridge pins.

    The secret to stable PC gaming: Clean installations of Windows without viruses and other malware.

    Are PC's more powerful? Sure. But there is a whole bunch of overhead that comes with the advantages of the PC over a game console that are just not worth it to the majority of console players.

    So, customizable graphics, modding, (generally) free internet play, 100+ keys, a mouse, and fully customizable controls are not worth it? Their loss.

    As far as I am concerned, they both have their pros and cons. The PC's cons are major compatibility issues if you have borderline hardware, a dirty system, or old drivers. Drivers also have to be updated all the freaking time.

    Consoles, on the other hand, seem to fall behind the PC in terms of graphics after the first year and a half (unless they have super powerful hardware unavailable to the PC market). They also force control schemes on you, especially in FPSs: Maybe I want melee to be 'right trigger'? But, no! It won't let me! It must be 'B', 'ri

  • 3D? Meh. (Score:2, Informative)

    by cavebison ( 1107959 ) on Saturday March 22, 2008 @08:48AM (#22828594)
    His whole argument rests on the assumption that better 3D = better games. Everyone knows that's essentially untrue. UT3 is a case in point. Is it more fun to play than UT2004 simply because the gfx are way better? WoW is another case. Of course it would look nicer with better gfx, but would it be more fun or more popular because of it? Doubt it.

    One of my fav games was Beyond Good and Evil. I *liked* the stylised, cartoonish characterisations. Anyone who loves Anime feels some trepidation at the rise of completely 3D-rendered visuals. They have their place, but better 3D doesn't make a better movie or a better game.

    I wish they'd put more effort into AI and character movement. What we really need for *immersion* (and better 3D is not equivalent to better immersion either) is dynamic character movement and AI. Sod all this 3D stuff; it's just serving the hardware industry, and in the meantime real innovation is being sidelined.
