
MS Says Windows 7 Will Run DirectX 10 On the CPU

arcticstoat writes "In what could be seen as an easy answer to the Vista-capable debacle, Microsoft has introduced a 'fully conformant software rasterizer' called WARP (Windows Advanced Rasterization Platform) 10, which does away with the need for a dedicated hardware 3D accelerator altogether. Microsoft says that WARP 10 will support all the features and precision requirements of Direct3D 10 and 10.1, as well as up to 8x multi-sampled anti-aliasing, anisotropic filtering and all optional texture formats. The minimum CPU spec needed is just 800MHz, and it doesn't even need MMX or SSE, although it will work much quicker on multi-core CPUs with SSE 4.1. Of course, software rendering on a single desktop CPU isn't going to be able to compete with decent dedicated 3D graphics cards when it comes to high-end games, but Microsoft has released some interesting benchmarks that show the system to be quicker than Intel's current integrated DirectX 10 graphics. Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics."
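The core job a software rasterizer like WARP performs can be sketched in a few lines. The C++ below is purely illustrative - a naive edge-function triangle coverage test, not WARP's actual design (which, per Microsoft's description, uses JIT-compiled shader code, SSE, and multi-threaded tile processing):

```cpp
#include <cstdint>
#include <vector>

// Conceptual sketch only: shows the core job of any software rasterizer -
// deciding, per pixel, whether it falls inside a triangle, via edge functions.
struct Vec2 { float x, y; };

// Signed area test: > 0 means p is to the left of edge a->b.
static float edge(const Vec2& a, const Vec2& b, const Vec2& p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// Returns a width*height coverage mask (1 = pixel centre inside triangle).
std::vector<uint8_t> rasterize(const Vec2& v0, const Vec2& v1, const Vec2& v2,
                               int width, int height) {
    std::vector<uint8_t> covered(width * height, 0);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};  // sample at the pixel centre
            float w0 = edge(v1, v2, p);
            float w1 = edge(v2, v0, p);
            float w2 = edge(v0, v1, p);
            // Inside if all three edge functions agree in sign (CCW winding).
            if (w0 >= 0 && w1 >= 0 && w2 >= 0)
                covered[y * width + x] = 1;
        }
    }
    return covered;
}
```

In Direct3D 10.1 itself nothing like this is application-visible; WARP is simply requested via the driver-type flag at device creation, and the rest of the API behaves as if a hardware device were present.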
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by Pr0xY ( 526811 ) on Sunday November 30, 2008 @04:40AM (#25931427)

    Running Crysis isn't the point of the demo. The point was that it was a DX 10 application running entirely in software. In the end, this means that systems without higher end 3D cards would be able to run Aero. THAT's the point.

    They are trying to address the main complaint of the "Vista Capable" debacle. Running Crysis was just a way of demonstrating the capability.

  • Re:From the summary: (Score:5, Informative)

    by Pr0xY ( 526811 ) on Sunday November 30, 2008 @04:41AM (#25931433)

    As I said in another post:

    Running Crysis isn't the point of the demo. The point was that it was a DX 10 application running entirely in software. In the end, this means that systems without higher end 3D cards would be able to run Aero. THAT's the point.

    They are trying to address the main complaint of the "Vista Capable" debacle. Running Crysis was just a way of demonstrating the capability.

  • by Sparks23 ( 412116 ) * on Sunday November 30, 2008 @05:41AM (#25931703)

    Because, quite frankly, people were upset that their 'Vista Capable' computers couldn't run Vista with Aero enabled. The integrated cards don't have the 'oomph' for Aero's glassy transparency effects, but Microsoft had tooted the horn of 'Look! Shiny!' loud and long, so people expected that functionality. In addition, there are other places extended graphics capabilities are used (the Vista DVD maker program, for instance), where if your card isn't up to snuff, you can't use those programs.

    By showing 'we can make this work in software, slowly, but work,' they're trying to address that. This isn't for gaming, despite the demo. This is an attempt to solve the problem out of the gate in Windows 7 so that they don't have another Vista Capable type class action suit.

  • Re:From the summary: (Score:1, Informative)

    by Anonymous Coward on Sunday November 30, 2008 @06:11AM (#25931823)

If they did Aero in OpenGL they'd have had software rendering from the beginning. Oh right, not invented here... my mistake...

  • Re:From the summary: (Score:5, Informative)

    by vux984 ( 928602 ) on Sunday November 30, 2008 @06:31AM (#25931905)

Well... then they better try again. It still sounds like a complete failure to me. Since the integrated graphics is equivalent, there is no advantage, and no resolution to the problem. What exactly are you trying to get at?

Except the integrated graphics on a bunch of 'Vista Capable' laptops DON'T do DirectX 10 or Aero... but if a patch to Vista (or Windows 7) will get Aero working on DirectX 10 on the CPU... a buttload of PCs that CAN'T currently do Aero, now CAN.

  • Re:lol (Score:1, Informative)

    by Anonymous Coward on Sunday November 30, 2008 @06:37AM (#25931931)

Amusingly, the Larrabee drivers, i.e. the software renderer which runs on the Larrabee cores, work in exactly the same way.
Also, the main guy behind those is an ex-MS guy (Abrash) who previously worked on a software DX renderer for RAD...

  • by Ralish ( 775196 ) <sdl@@@nexiom...net> on Sunday November 30, 2008 @07:00AM (#25931993) Homepage

    I'm sorry, but Peter Gutmann is not a reputable source for accurate information on Vista graphics, or anything related to Vista at all. Several of his claims have been widely proven to be exaggerated or downright false, and when asked to provide proof, he has refused. His claims have been picked apart on numerous sites both directly and indirectly through the sourcing of benchmarks.

    I suggest you read these articles for instance, which provide a good overview:
    http://blogs.zdnet.com/Ou/?p=673 [zdnet.com]
    http://blogs.zdnet.com/Ou/?p=718 [zdnet.com]

    Some of his points are admittedly valid, there are genuine flaws in the new graphics driver device spec., but he's clearly most concerned with pushing an anti-Vista agenda, even if that requires resorting to FUD.

    Choose your "experts" carefully.

  • by cbhacking ( 979169 ) <been_out_cruisin ... m ['hoo' in gap]> on Sunday November 30, 2008 @07:17AM (#25932065) Homepage Journal

I presume you're referring to the article in which he described Vista as something like "quite possibly the longest suicide note in history"? I read it back in '06, shortly after it was initially published. I didn't know Gutmann's work terribly well before reading it, but he came highly recommended.

However, that article cost him about 98% of his credibility with me. Some of it - even some of the really bad stuff - might in fact be true. However, there were trivially verifiable claims he made which were blatantly untrue (an example being that ATI, nVidia, and other graphics companies were going to need to switch away from unified drivers and provide a different driver for each card model - which by the time the article went public was an obvious falsehood, since you could download and install the beta Vista drivers for any card in a given family and they would work fine).

If the man can't be bothered to do even that minimal amount of research (it also didn't help that he refused to disclose any of his sources), then he has no business publishing in anything but tabloids, nor does he have any place in academic circles. I am a student, not a professor, but if I had written such tripe and submitted it to anybody who knew what I was talking about, I'd have been laughed out of the department.

Incidentally, the article has been edited at least three times since its initial publication. While I have no objection to revising, it is usually done prior to publication, not afterwards. Furthermore, while some of the more blatantly false claims are missing from the latest version, Gutmann neither addresses nor explicitly retracts those statements. It is as though he wishes to remove the original statements entirely, though nothing controversial on the Internet ever vanishes so thoroughly as that.

  • by mike_sucks ( 55259 ) on Sunday November 30, 2008 @07:51AM (#25932209) Homepage

Actually, the whole point of OpenGL was to provide a software- and hardware-vendor-agnostic API for writing applications that perform 3D rendering. You've clearly been living in a monoculture too long if you can't see that.

Software fallback is nice to have, but it's certainly not the reason OGL exists.

    /Mike

  • Re:Ummmm (Score:4, Informative)

    by Sycraft-fu ( 314770 ) on Sunday November 30, 2008 @08:10AM (#25932295)

No, those soundcards aren't anything more than that. They just get the signal and convert it for the computer. I've owned a few, and worked with many more. Thus far I haven't seen any that do anything past conversion, routing, and perhaps basic mixing (basically those that have more advanced routing). Their function is to convert the sound to a format the PC can use and hand it off, nothing more. That they are external has nothing to do with it. That is done for convenience (hard to pack a lot of inputs on a PCI card) and noise (no need to worry about dealing with all the RF from the computer). Firewire is often used since it has DMA and thus works well for low-latency sound, but there are others that use their own PCI card and interface (MOTU does both, for example).

    Now I leave open the possibility there are ones I haven't encountered that do something more, but those I've seen are just soundcards.

    You forget that timing isn't an issue on the computer. Everything on there is asynchronous, clockless. The audio is just a stream of bits on disk. The computer never processes it at the sample rate, that is just a number stored in the file. So soundcards don't do anything special in this regard other than have a good clock to which everything is slaved (or perhaps a word clock input for external clocking). Once the audio has been converted and handed off to the system, timing isn't an issue anymore. The only difference between a cheap consumer card and an expensive pro card in this regard is the quality of timing source, and perhaps if everything is locked to a single source.
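The point that the sample rate is just a number stored in the file can be made concrete. In this illustrative C++ sketch (hypothetical types, not any real audio API), duration and buffer latency are plain arithmetic on that stored number; nothing on the host runs at the audio rate:

```cpp
#include <cstdint>

// Sketch: the sample rate is only metadata. The same block of samples
// represents a different duration depending on the rate the converter's
// clock runs at; the computer side is not "clocked" at that rate at all.
struct AudioClip {
    uint64_t frames;       // number of sample frames on disk
    uint32_t sample_rate;  // nominal playback rate, e.g. 44100 Hz
};

// Duration in milliseconds once the clip is played through a converter.
double duration_ms(const AudioClip& c) {
    return 1000.0 * static_cast<double>(c.frames) / c.sample_rate;
}

// Latency contributed by a hardware buffer of 'buffer_frames' frames.
double buffer_latency_ms(uint32_t buffer_frames, uint32_t sample_rate) {
    return 1000.0 * static_cast<double>(buffer_frames) / sample_rate;
}
```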

In fact, you'll find that there is often more processing done on consumer cards than on pro cards. Pro cards just convert the signal from analogue or S/PDIF or whatever and feed it in to the computer. Consumer cards often do sample rate conversion, and sometimes various other kinds of processing. In fact, the card with the most muscle I'm aware of (leaving out dedicated hardware like the HDAccel) is the SoundBlaster X-Fi. That can handle 128 different sound sources in hardware, do SRC on all of them, place them in space, and perform effects on them. Compare that to a MOTU HD192, which does little more than deal with audio input and output, and mix/route it as you specify.

    The money/hardware in pro cards is in high quality circuitry, mostly in the analogue phase, not any special processing.

  • Re:Yes. (Score:3, Informative)

    by Targon ( 17348 ) on Sunday November 30, 2008 @08:47AM (#25932451)

If you have people running DX 10 games on a server, then you either have major staffing problems, or the server is nothing more than a glorified workstation. There is ZERO need for DX 10 graphics on a true server, and really, the need for a GUI should be near zero (unless the server software vendor can't figure out how to code for a true server).

Yes, there are good uses for having a GUI available on a server, but for normal maintenance, a command line SHOULD be all that is needed, to reduce the overhead the GUI places on the system.

  • by abigsmurf ( 919188 ) on Sunday November 30, 2008 @08:57AM (#25932489)
Because Microsoft, amazingly enough, realised there were going to be plenty of cheapo DX10 cards which aren't fully featured, DX10 will run the unsupported elements using DX9 functions or emulation, taking a visual or performance hit. These hacks almost certainly mean that when DirectX looks at your DX10 card in XP, all it will be able to see is a cheapo card which can't handle a lot of functions.

    With XP you're never going to have full DX10 support. The kernel can't physically do a lot of the functions itself. With DX10, Microsoft wanted to give developers a whole new framework without having to worry about legacy DX code.

Get someone to code a tech demo comprised of nothing but DX10-specific functions (the large texture sizes etc.) and you'll be able to see the difference. It's hard to tell the difference at the moment because stuff like Crysis implements DX10 poorly.

  • Re:Oh boy. (Score:5, Informative)

    by RedK ( 112790 ) on Sunday November 30, 2008 @09:16AM (#25932571)
Windows 95 didn't invent disk compression. Stacker and DoubleSpace were products that did the same thing for DOS.
  • Re:Grrrreat! (Score:3, Informative)

    by UserChrisCanter4 ( 464072 ) * on Sunday November 30, 2008 @11:25AM (#25933307)

    Just a heads up: the PCI 6200 has some known problems with video playback. They were all driver-related, but as far as I know, NVidia never fixed them because the 6200 was always a fairly low volume unit and has now been dropped altogether.

    Google "GeForce 6200 video won't play" or something similar and you'll see the number of forum threads and posts where people complain about how this version of the drivers works but not this version and so on.

    The solution that's usually thrown about: disable hardware acceleration.

  • by TheRaven64 ( 641858 ) on Sunday November 30, 2008 @11:29AM (#25933333) Journal

    The client-server model of OpenGL works well because GPU programming is a client-server model - the application is the client, running on the CPU, and the server is on the GPU. You need to transfer data to a coprocessor and process it remotely. Direct3D does exactly the same thing.
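The client-server submission model described above can be caricatured in a few lines of C++ - hypothetical types, not any real graphics API - where the "client" (application on the CPU) records commands and the "server" (driver/GPU) later drains and executes them:

```cpp
#include <functional>
#include <queue>

// Conceptual sketch of client/server command submission. The application
// records work without executing it; execution happens later, in a batch,
// on the "server" side - mirroring how commands are queued for a GPU.
class CommandQueue {
public:
    // Client side: record a command for later execution.
    void record(std::function<void()> cmd) { pending_.push(std::move(cmd)); }

    // Server side: drain and execute everything recorded so far.
    void flush() {
        while (!pending_.empty()) {
            pending_.front()();
            pending_.pop();
        }
    }

private:
    std::queue<std::function<void()>> pending_;
};
```

The same decoupling is why both OpenGL and Direct3D batch state changes and draw calls rather than executing them synchronously: the coprocessor runs behind the application, and the API only guarantees ordering, not immediacy.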

    If you'd looked at the OpenGL 3 spec, instead of reading tabloid reports, then you'd see that it has some pretty major changes. The entire fixed-function pipeline is basically gone (although it can be emulated easily in shaders) and a load of stuff is marked as deprecated, and will be completely removed in the new release. There is a clean and simple subset of OpenGL 3 that is forwards-compatible, and another subset that is backwards compatible. Bringing out OpenGL 3 which was completely different to OpenGL 2 would have been pointless - why switch to it rather than another API?

  • The argument is dead (Score:1, Informative)

    by Anonymous Coward on Sunday November 30, 2008 @12:16PM (#25933647)
Exactly how delusional are you to believe that a piece of software can't be run on another OS just because some company says that it can't? Have you never heard of Wine? Emulators? Virtual machines?

The fact that the Wine crew have already reverse-engineered part of DX10 and have it running over OpenGL should make it exceptionally obvious that there is nothing DX10 needs that cannot be provided by some other abstraction beneath it. DX10 is not a kernel; it's not its own operating system. And even if it were, we could still sneak another piece of software under it and have it running on other hardware.

    Microsoft's chief argument for DX10 was some bullshit argument about missing features in hardware. Guess what? They just wrote a damned software rasterizer, which does the exact same damned thing that they said wasn't possible for XP. The argument is dead.

    When you wake up from your delusional coma, maybe you'll understand that.
  • Re:Oh boy. (Score:5, Informative)

    by Belial6 ( 794905 ) on Sunday November 30, 2008 @03:15PM (#25935225)
And in fact, DOS 6 also had disk compression. You could even see that they had illegally copied Stacker's code, because they forgot to take out Stacker's copyright notice.
  • Re:Oh boy. (Score:5, Informative)

    by oakgrove ( 845019 ) on Sunday November 30, 2008 @04:05PM (#25935767)
I would prescribe a healthy dose of Arch Linux for this problem you're having. I have an old Toshiba laptop lying around here that I had given up for dead. 600MHz Celeron, 192MB RAM, 12GB HDD. It came with Windows 2000 and was tolerable, I suppose. Only problem is, I don't know anything about Windows and none of my command-line-fu worked on it, so off it went. I tried Ubuntu first, which was horrible. Even with a lightweight window manager like IceWM and most of the unneeded services like bluetooth, etc. turned off, it bumped against 100 MB RAM doing nothing. Load Firefox with a couple of tabs (don't care for Opera, and Konqueror needs more extensions to be useful for me), and it was over. Swap city. So, to get to the point, I tried Gentoo, and after waiting 7 hours for KDE to compile and then ending up with an error, I wiped it in disgust.

Enter Arch Linux. Installed to a CLI in about 10 minutes. Getting the wi-fi working from the CLI with wpa_supplicant and the zd1211 firmware for my card was a breeze. Then I proceeded to download and install xorg and IceWM. All told, at a CLI with wi-fi working it idles at 11 MB. Logged in to IceWM it sits at 17 MB. And with Firefox running, a grand total of 51 MB. And of course, it's blazing. With Firefox 2, it's at least as fast as my Pentium 4 laptop running Debian with Firefox 3. And, of course, everything works in Firefox. Flash 10, etc.

    Although what I've said doesn't speak completely to your point, suffice it to say, depending on your setup, you aren't doomed to a slower computer when running reasonably up to date software.

  • Re:Oh boy. (Score:4, Informative)

    by Belial6 ( 794905 ) on Sunday November 30, 2008 @07:17PM (#25937469)
    At the time that DOS 6.0 came out, and the Stacker copyright was still in the code, MS had NOT licensed anything from Stac Electronics. They did not buy the software. They did illegally copy it. So, no joke, AND not uninformed.
  • by ozphx ( 1061292 ) on Sunday November 30, 2008 @08:11PM (#25937925) Homepage

If it can't do simple things better, faster, how is it going to handle the more complex stuff better?

    I guess you failed CS 101. A more sophisticated API is always going to be slower than "poke xxx".

    Graphics hardware is moving from being a specialised device which can handle basic primitive drawing to a full fledged massive vector processor. Doing this makes it less efficient at the original task.

Your time might be worthless, but in the real world developer time costs money - and at the rate of hardware improvement it's pretty clear that MS and its customers are happy to take a few % performance hit to have a more featured/safe/simpler/etc API.

    v10 brought much better memory management and reworked the shader model to add geometry shaders. The smallish hit in state management / data stream overhead is made up by the fact that half the procedural geometry can be done in the GPU.

    v11 will introduce compute shaders. Presumably the older API functions will be slower on equivalent hardware again. But I'll have compute shaders to play with (and it won't be by using some dodgy ass API which is the equivalent of poke).

Only Counter-Strike tards care about getting 100fps vs 110fps. Hint: that shiny new Intel quad core would get smoked at performing FFTs by an ASIC. Does that make the Q6600 the lesser CPU because it's traded raw performance on certain tasks for generalisation?

TBH I don't get where all this whinging is coming from. I have a 9600GT, it's one of the cheapest cards you can get, and it runs everything fine under Vista. Why the butthurt?
