
MS Says Windows 7 Will Run DirectX 10 On the CPU 503

arcticstoat writes "In what could be seen as an easy answer to the Vista-capable debacle, Microsoft has introduced a 'fully conformant software rasterizer' called WARP (Windows Advanced Rasterization Platform) 10, which does away with the need for a dedicated hardware 3D accelerator altogether. Microsoft says that WARP 10 will support all the features and precision requirements of Direct3D 10 and 10.1, as well as up to 8x multi-sampled anti-aliasing, anisotropic filtering and all optional texture formats. The minimum CPU spec is just 800MHz, and it doesn't even need MMX or SSE, although it will run much faster on multi-core CPUs with SSE 4.1. Of course, software rendering on a single desktop CPU isn't going to compete with a decent dedicated 3D graphics card in high-end games, but Microsoft has released some interesting benchmarks showing the system to be quicker than Intel's current integrated DirectX 10 graphics: running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics."
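
For readers wondering what "targeting WARP" actually looks like from code, here is a minimal sketch of creating a Direct3D 10.1 device on the WARP driver type. It assumes a DirectX SDK new enough to expose D3D10_DRIVER_TYPE_WARP, and trims error handling:

    #include <d3d10_1.h>

    // Request a Direct3D 10.1 device backed by the WARP software
    // rasterizer instead of a hardware adapter. The adapter pointer
    // stays NULL; the driver type alone selects the software path.
    ID3D10Device1 *device = NULL;
    HRESULT hr = D3D10CreateDevice1(
        NULL,                      // no DXGI adapter needed for WARP
        D3D10_DRIVER_TYPE_WARP,    // the software rasterizer
        NULL,                      // no external rasterizer DLL
        0,                         // no creation flags
        D3D10_FEATURE_LEVEL_10_1,  // full 10.1 feature level
        D3D10_1_SDK_VERSION,
        &device);
    // FAILED(hr) here would mean WARP isn't available on this system.
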
This discussion has been archived. No new comments can be posted.

  • Oh boy. (Score:2, Insightful)

    by Anonymous Coward

    So we can play things at 7fps with ultra low settings. Whoopee.

    Seriously, buy a goddamn graphics card.

    • Re:Oh boy. (Score:5, Funny)

      by White Flame ( 1074973 ) on Sunday November 30, 2008 @03:55AM (#25931225)

      Seriously, buy a goddamn graphics card.

      I did, but then I only got 5fps. :-P

    • Re:Oh boy. (Score:5, Funny)

      by jadedoto ( 1242580 ) on Sunday November 30, 2008 @04:00AM (#25931255)
      But what if I want to play Crysis on my EeePC during that boring office meeting!?
    • by WiiVault ( 1039946 ) on Sunday November 30, 2008 @04:13AM (#25931309)
      Just imagine the demo. "Here is the slooooow Intel Extreme, geez what a dog, they should be ashamed! Now check out the BRAND NEW, straight-out-of-the-labs tech, this will blow your mind (cues 7fps slideshow). I know, I know, we do seriously kick butt."
    • Re:Oh boy. (Score:5, Funny)

      by Joce640k ( 829181 ) on Sunday November 30, 2008 @06:16AM (#25931851) Homepage

      Eight cores at 3GHz beat one core at 400MHz!!!

      Film at eleven.

    • Re: (Score:3, Insightful)

      Perhaps if you tried some cognition before typing, you'd see this is more about running the interface graphics than about gaming graphics.

  • by Anonymous Coward on Sunday November 30, 2008 @03:52AM (#25931217)

    What a revolutionary & useful idea.

  • Yes. (Score:5, Insightful)

    by James_Duncan8181 ( 588316 ) on Sunday November 30, 2008 @03:52AM (#25931219) Homepage
    In other news, Intel graphics chips are said to be designed for minimal power draw rather than all-out performance. That power draw is decidedly not beaten by running a software renderer that stresses the CPU until it sucks power like an electric chair, since the CPU is general-purpose hardware, not specialized. More at 11.
  • by Anonymous Coward on Sunday November 30, 2008 @03:57AM (#25931245)

    How much is an 8-core system going to cost vs the system with integrated graphics? At that point, it seems wiser to invest more money in a graphics card than in faster CPUs if that's what you're going to be doing.

    By far the more useful thing is that it's probably better for development because the driver developers will have a reference point of how the graphics are supposed to render. Also, larger game companies will be able to point out these differences to get bug fixes out of the graphics card companies. "Your graphics card renders this incorrectly with regards to the reference, fix it" is much more forceful than "your graphics card behaves differently than your competitor".

    • by Lord Crc ( 151920 ) on Sunday November 30, 2008 @04:43AM (#25931447)

      Also, larger game companies will be able to point out these differences to get bug fixes out of the graphics card companies. "Your graphics card renders this incorrectly with regards to the reference, fix it" is much more forceful than "your graphics card behaves differently than your competitor".

      DirectX already contains a reference rasterizer, which is better suited for that. This thing seems instead to be meant for applications that don't necessarily need more than "interactive" frame rates, but do need to run on a broad class of machines. Or for easing development of applications which could benefit from hardware acceleration when available (image processing, for instance).

      From the MSDN page [microsoft.com] on WARP:

      We don't see WARP10 as a replacement for graphics hardware, particularly as reasonably performing low end Direct3D 10 discrete hardware is now available for under $25. The goal of WARP10 was to allow applications to target Direct3D 10 level hardware without having significantly different code paths or testing requirements when running on hardware or when running in software.
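
      That "no significantly different code paths" goal boils down to a try-hardware-first, fall-back-to-WARP pattern. A hedged sketch using the same D3D10.1 call (untested, error handling trimmed):

          // Prefer real hardware; if no Direct3D 10.1-capable adapter is
          // present, retry with WARP. Both paths hand back the same
          // ID3D10Device1 interface, so the rest of the app is unchanged.
          ID3D10Device1 *dev = NULL;
          HRESULT hr = D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_HARDWARE,
              NULL, 0, D3D10_FEATURE_LEVEL_10_1, D3D10_1_SDK_VERSION, &dev);
          if (FAILED(hr))
              hr = D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_WARP,
                  NULL, 0, D3D10_FEATURE_LEVEL_10_1, D3D10_1_SDK_VERSION, &dev);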

    • Re: (Score:3, Insightful)

      by ceoyoyo ( 59147 )

      No no, see: now, when Windows 7 requires video cards that nobody has but MS puts Windows 7 Ready stickers on all of the new computers anyway, and people say "my Windows 7 Ready computer won't run Windows 7!", MS can point out that yes, it does. Any version of Windows 7. Sure, it takes ten minutes to draw a menu, but it runs!

  • From the summary: (Score:4, Insightful)

    by ben0207 ( 845105 ) <ben.burton@g m a i l . com> on Sunday November 30, 2008 @03:59AM (#25931251)

    "Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics."

    So the game went from unplayable at the lowest settings possible, to being still unplayable at the lowest settings possible?

    Great move, MS, you've really solved a problem there.

    • Re: (Score:2, Funny)

      managed an average frame rate of 7.36fps, compared with 5.17fps

      But, but, that's like, a 42% improvement! That's like, massive, man! MS are awesome!
    • Re:From the summary: (Score:5, Informative)

      by Pr0xY ( 526811 ) on Sunday November 30, 2008 @04:41AM (#25931433)

      As I said in another post:

      Running Crysis isn't the point of the demo. The point was that it was a DX 10 application running entirely in software. In the end, this means that systems without higher end 3D cards would be able to run Aero. THAT's the point.

      They are trying to address the main complaint of the "Vista Capable" debacle. Running Crysis was just a way of demonstrating the capability.

      • by jonaskoelker ( 922170 ) <`jonaskoelker' `at' `yahoo.com'> on Sunday November 30, 2008 @08:48AM (#25932455)

        be able to run Aero. Running Crysis was just a way of demonstrating the capability.

        I think running Aero itself would be a better way to demonstrate that capability.

    • by Sycraft-fu ( 314770 ) on Sunday November 30, 2008 @04:43AM (#25931445)

      Seriously, this is a good thing. One could compare it to Mesa 3D. You have the option of running graphics in software, if you lack the hardware to accelerate it. This is highly useful in two situations:

      1) You have something intensive and need to see it on a computer that lacks the requisite accelerator. Though it won't be fast, at least you can see the output rather than just being SOL.

      2) You have a non-intensive task and don't wish to purchase dedicated hardware. While Crysis crawls, I'm going to guess something like, say, Thief wouldn't.

      This is just a software layer to allow the OS to do 3D rendering even if there's no accelerator present. I'm sure that 99.99% of people who do 3D in any capacity will use an accelerator, as they are extremely cheap and extremely high performance. However, it isn't a bad thing to have a software implementation. MS has actually had one for a long time, but it only comes with the development version of DirectX. It lets you check a program's output on an actual card against the expected output from the reference renderer.

      Sounds like this is the same thing, just sped up and packaged for end-user use rather than just for developers.

      It could have applications in the future too. For example, what will computer hardware be capable of in 15 years? Processors are likely to be much faster than today's. Well, this might allow 3D to keep working when emulating Windows to run old programs. Remember, people emulate DOS today (see DOSBox) for various purposes. I don't think it is out of the question that a decade or two from now people will emulate Windows 7, and part of that will be dealing with the 3D layer. A large number of apps today make use of Direct3D. Well, if Windows 7 has a software 3D layer, and processors are blazing fast, you are good: just use that. If it doesn't, you then have to make your emulator emulate the 3D hardware, since I'm guessing a decade from now the 3D subsystem will be vastly different than it is now.

      This is not intended to be an "oh, you don't need a graphics card ever" thing. It is intended to give people the option to get 3D without having to have a graphics card. It won't be as good, but at least it'll work.
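
      (For the curious: the developer-only reference renderer mentioned above is selected the same way as WARP, just with a different driver type. A sketch, under the same D3D10.1-header assumptions as the snippets earlier in the thread:)

          // The reference rasterizer: bit-accurate but extremely slow,
          // meant for verifying a card's output during development.
          ID3D10Device1 *ref = NULL;
          HRESULT hr = D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_REFERENCE,
              NULL, 0, D3D10_FEATURE_LEVEL_10_1, D3D10_1_SDK_VERSION, &ref);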

    • Comment removed (Score:5, Informative)

      by account_deleted ( 4530225 ) on Sunday November 30, 2008 @04:50AM (#25931477)
      Comment removed based on user account deletion
  • Grrrreat! (Score:5, Insightful)

    by Chordonblue ( 585047 ) on Sunday November 30, 2008 @04:01AM (#25931257) Journal

    Does anyone else remember the "good old days", when certain 3D graphics cards (the ViRGE comes to mind) were actually SLOWER than software renderers?

    The term used then was "decelerator", and I think MS's stupid decision to (once again) bow to Intel on this deserves the same label.

    How long will it take for true 3D acceleration to become an expected standard feature on PCs?

    • Sadly, never; as long as the GUI works, most Joe and Jane Sixpacks will be just fine. And yes, I do know about the Vista debacle, but I think the point is still valid.
      • by vux984 ( 928602 )

        Sadly, never; as long as the GUI works, most Joe and Jane Sixpacks will be just fine. And yes, I do know about the Vista debacle, but I think the point is still valid.

        Then you'd be mistaken. Both OS X and Windows Vista effectively require hardware 3D acceleration for their desktop effects. All new Macs, and any PC that actually meets Vista's real requirements, feature 3D acceleration.

      • Re:Grrrreat! (Score:5, Insightful)

        by A Life in Hell ( 6303 ) <jaymz@artificial-stupidity.net> on Sunday November 30, 2008 @04:32AM (#25931387) Homepage

        Sadly, never; as long as the GUI works, most Joe and Jane Sixpacks will be just fine. And yes, I do know about the Vista debacle, but I think the point is still valid.

        How is that sad? If people don't need it, it seems like a waste of money to me.

      • Comment removed (Score:5, Interesting)

        by account_deleted ( 4530225 ) on Sunday November 30, 2008 @05:06AM (#25931549)
        Comment removed based on user account deletion
        • Re: (Score:3, Informative)

          Just a heads-up: the PCI 6200 has some known problems with video playback. They were all driver-related, but as far as I know, Nvidia never fixed them, because the 6200 was always a fairly low-volume unit and has now been dropped altogether.

          Google "GeForce 6200 video won't play" or something similar and you'll see the number of forum threads and posts where people complain about how this version of the drivers works but not this version and so on.

          The solution that's usually thrown about: disable hardware ac

    • Ummmm (Score:5, Interesting)

      by Sycraft-fu ( 314770 ) on Sunday November 30, 2008 @04:56AM (#25931493)

      3D accelerators already are an expected feature on standard PCs. I can't think of one you can get these days without one. All the current integrated Intel, ATI and Nvidia chips are 3D accelerators. Not powerful ones, but they do the trick. And any add-in card is, of course, an accelerator.

      However, here's a better question: how long until we don't need that anymore? Personally, I'm not thrilled with the idea of having to have lots of dedicated hardware. The whole point of a PC is a general-purpose machine that can do pretty much anything because it is all programmed in software. You replace dedicated units that did only one thing with a general-purpose computer that does everything. OK, well, that is somewhat undermined by the requirement for specialized hardware.

      Now, I understand the need for it. Graphics are intense and there is just no way, at this time, for a CPU to handle it. A dedicated processor optimized for the kind of math graphics need is the way to go. However wouldn't it be nice if that weren't the case? Wouldn't it be nice if the CPU again did everything?

      We won't see that day tomorrow, but perhaps we'll see it in a decade or two.

      I look back to the changes in audio production and hope to see it come to graphics as well:

      Originally, PCs used in audio production were little more than interfaces for complex dedicated hardware. A normal PC simply couldn't handle it. You had a PC loaded full of Pro Tools cards, which were massive bunches of specialized hardware, to do anything. Well, as CPUs got better, you started to be able to do more on a regular PC. At first it was still nothing really useful for the pro market: you had to do everything non-realtime, spending lots of time rendering a change and then listening to it, and so on. But at least you could actually do it on normal computers. Yet more time passed, and non-destructive realtime software became available on normal systems. You could overload it pretty easily, you still had to bounce tracks and such, and it wasn't the unrestricted power of an accelerated solution, but it worked pretty well, and in fact lots of project studios ran just that.

      Then we come to now. Now the hardware-accelerated audio production system is a relic. They are still made, but they are unpopular. Most professional studios don't bother; they just get a nice powerful PC (by PC I mean personal computer; Macs are included in this) with a couple of multi-core processors and go to town. The CPUs easily handle large numbers of tracks with multiple effects and so on, all in realtime. There is simply no need for dedicated hardware, and not using it means much greater flexibility: everything is just changed in software.

      So, I'd love to see that same sort of thing come to graphics. At this point, CPUs have a long way to go. But then, technology moves fast. Everything I'm talking about in the audio world has happened in about 2 decades. In just 20 years or so it went from something you could only do with amazingly expensive special hardware to something that is easy for a $1000 computer to handle.

      Twenty years from now, it may be the same deal with graphics.

      • Re: (Score:3, Insightful)

        Now, the hardware accelerated audio production system is a relic. They are still made, but they are unpopular.

        This isn't quite true. Certainly the mixing, EQ, effects processing and a lot of signal generation (softsynths, etc.) is done on board the host PC nowadays, but where the rubber meets the road and there's a need to have really good sample-accurate synchronized input/output in real time without the possibility of clicks and pops, people are still relying on outboard hardware, usually in the
        • Re:Ummmm (Score:4, Informative)

          by Sycraft-fu ( 314770 ) on Sunday November 30, 2008 @08:10AM (#25932295)

          No, those soundcards aren't anything more than that. They just get the signal and convert it for the computer. I've owned a few, and worked with many more. Thus far I haven't seen any that do anything past conversion, routing, and perhaps basic mixing (basically those that have more advanced routing). Their function is to convert the sound to a format the PC can use and hand it off, nothing more. That they are external has nothing to do with it. That is done for convenience (hard to pack a lot of inputs on a PCI card) and noise (don't need to worry about dealing with all the RF from the computer). Firewire is often used since it has DMA and thus works well for low latency sound, but there's others that use their own PCI card and interface (MOTU does both, for example).

          Now I leave open the possibility there are ones I haven't encountered that do something more, but those I've seen are just soundcards.

          You forget that timing isn't an issue on the computer. Everything on there is asynchronous, clockless. The audio is just a stream of bits on disk. The computer never processes it at the sample rate, that is just a number stored in the file. So soundcards don't do anything special in this regard other than have a good clock to which everything is slaved (or perhaps a word clock input for external clocking). Once the audio has been converted and handed off to the system, timing isn't an issue anymore. The only difference between a cheap consumer card and an expensive pro card in this regard is the quality of timing source, and perhaps if everything is locked to a single source.

          In fact, you'll find that there is often more processing done on consumer cards than on pro cards. Pro cards just convert the signal from analogue or S/PDIF or whatever and feed it in to the computer. Consumer cards often do sample rate conversion, and sometimes various other kinds of processing. In fact, the card with the most muscle I'm aware of (leaving out dedicated hardware like the HDAccel) is the SoundBlaster X-Fi. That can handle 128 different sound sources in hardware, do SRC on all of them, place them in space, and perform effects on them. Compare that to a MOTU HD192, which does little more than deal with audio input and output, and mix/route it as you specify.

          The money/hardware in pro cards is in high quality circuitry, mostly in the analogue phase, not any special processing.
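
          To make the "sample rate is just a number stored in the file" point concrete, here is a sketch that reads it out of a canonical 44-byte PCM WAV header. (Real parsers walk the RIFF chunks; this is illustrative only.)

              #include <cstdint>
              #include <cstdio>

              // In a canonical PCM WAV header, bytes 24-27 hold the sample
              // rate, little-endian. The PC never processes audio "at" this
              // rate; only the soundcard's converter clock does.
              int main(int argc, char **argv) {
                  if (argc < 2) return 1;
                  std::FILE *f = std::fopen(argv[1], "rb");
                  if (!f) return 1;
                  unsigned char hdr[44];
                  if (std::fread(hdr, 1, sizeof hdr, f) == sizeof hdr) {
                      std::uint32_t rate = hdr[24] | (hdr[25] << 8)
                          | (hdr[26] << 16) | ((std::uint32_t)hdr[27] << 24);
                      std::printf("sample rate stored in file: %u Hz\n",
                                  (unsigned)rate);
                  }
                  std::fclose(f);
                  return 0;
              }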

      • As an engineer... (Score:3, Interesting)

        by gillbates ( 106458 )

        the brain-dead architecture of the x86 PC strikes me as funny.

        Here, you've got 2, 4, what - now 8 cores which can't compete with a decent FPGA?! The problem isn't the CPU speed. The problem is that CPUs make very poor DSPs. A TI DSP can encode/decode HD video in realtime using only a few percent of the power required by the desktop CPU. A large part of that GPU's performance comes from the fact that it has hardware optimized for video, which, of course, Intel has steadfastly refused to add to their p

  • by tftp ( 111690 ) on Sunday November 30, 2008 @04:03AM (#25931273) Homepage

    Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics.

    So they compared one unusable (and dirt cheap) setup to another, super-expensive and still unusable one, and then they brag about sucking 20% less?

    This is typical of MS. They are mostly a software company, and there are too many people there who advocate software-only solutions that make no sense, just because that's the only thing they know how to do (maybe).

  • Many server motherboards come with some chintzy onboard video, yet have plenty of CPU and RAM to throw around.

    But who is going to be running D3D10.1 apps on a server? Is MS going to rewrite their GUI layers on top of their 3d API a la Apple?

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Sunday November 30, 2008 @04:26AM (#25931359)
      Comment removed based on user account deletion
    • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Sunday November 30, 2008 @04:31AM (#25931379) Journal

      Is MS going to rewrite their GUI layers on top of their 3d API a la Apple?

      They did that in Vista. They did it so poorly that customers sued over being sold "Vista Capable" machines which weren't; that included machines whose Intel video chips weren't up to the job.

      Meanwhile, Ubuntu runs Compiz, which does just fine on Intel graphics. And Apple has been so far ahead that someone took the audio from one of the original Vista presentations and combined it with video from Tiger, showing that everything "new" about Vista was just playing catch-up with Tiger, while Leopard was just around the corner.

      More to the point: I believe it's now possible to run a Windows Server without a video card -- or, indeed, any GUI at all, depending on what apps you need.

  • by Antlerbot ( 1419715 ) on Sunday November 30, 2008 @04:11AM (#25931295)

    Say you get a new computer with a decent CPU, but no graphics card for work. You guys remember that thing, right? Work? Spreadsheets and documents and...yeah. That stuff.

    Anyway, now you can play Tomb Raider on it. The original one. Sweet.

  • by Hymer ( 856453 ) on Sunday November 30, 2008 @04:11AM (#25931297)
    ...about the impossibility of running DirectX 10 on Windows XP.
    If you can run it in software, you'll be able to run it on any OS version. Gee, that was another lie from Redmond; why am I not surprised? Maybe 'cause I do run the DirectX 10 hack on my XP, and no, it didn't raise the CPU usage (as claimed by the union of MS Windoze Vista Fanboyz)... it lowered it.
    • Re: (Score:3, Insightful)

      How does that even make sense? Not to defend Microsoft's bullshit, but how does coding a software renderer on one OS suddenly mean it should work with every OS? There's no possible logical leap there. Hell, why not DOS?

      Maybe 'cause I do run the DirectX 10 hack on my XP, and no, it didn't raise the CPU usage (as claimed by the union of MS Windoze Vista Fanboyz)... it lowered it.

      What? There is no way to use DX10 on XP at this time; the only "hacks" are game-specific, allowing you to use DX10 games on DX9, or bump up the graphics detail on games when in DX9 mode to something closer to what they do in DX10 mode. All that proves is that these particular games don't actu

  • lol (Score:5, Insightful)

    by DigitalisAkujin ( 846133 ) on Sunday November 30, 2008 @04:12AM (#25931305) Homepage

    /. is silly

    they made this to run the desktop effects

    not crysis xD

    • Re:lol (Score:5, Funny)

      by WiiVault ( 1039946 ) on Sunday November 30, 2008 @04:21AM (#25931351)
      If so then why would they demo Crysis?
      • To prove that their implementation is complete, and doesn't completely suck, even if it mostly sucks.

        And, if you think about it, this could be good for Larrabee, which is supposed to be just a bunch of x86 CPUs on a card.

        • I certainly see the point you are making; it's valid. However, as a company I would never trot out the fact that the fastest Intel CPUs available (and only just recently) with the most cores can produce a slideshow of Crysis at crap resolution and detail.
      • by Cyberax ( 705495 )

        Why not?

        It certainly shows that their software renderer is fast and feature-complete enough to run fairly recent games.

      • Do you run benchmarks by copying one file from home to tmp and back again?
        Crysis is a good way to test DX10 performance in a way people will understand. Would you rather they said they could do x thousand polygon operations per second vs. n thousand?

    • Argh, the first insightful post, but in such a stupid form!

      On the one hand, I want you to get modded up, since being able to render desktop effects in software on a modern system will be useful (I wonder what the FPS on Aero is, though). On the other hand, I also wish you a horrible and painful death for using the subject "lol" and ending an unpunctuated post with an emoticon.

  • by WiiVault ( 1039946 ) on Sunday November 30, 2008 @04:18AM (#25931337)
    To think that anybody would want to run a DX10 game on an 800MHz CPU with no SSE is insane, even considering the company involved. Perhaps for DX 7, 8, and maybe 9 games this might be reasonable (though not likely), but Jesus, no thanks!
  • by FranTaylor ( 164577 ) on Sunday November 30, 2008 @04:18AM (#25931339)

    "Every time Andy gives us more power, Bill takes it away".

  • by Whiteox ( 919863 ) on Sunday November 30, 2008 @05:32AM (#25931661) Journal

    Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics.

    and this is ball-slapping good news?

    • Re: (Score:3, Insightful)

      by Legion303 ( 97901 )

      Man, that's like 2 whole fps more. With further optimization they might even crank it up to 15fps, which would get it close to the framerate I got from Crysis on medium settings with my laptop. And the best part is, you can run it on your enterprise-class server when it isn't busy serving up hundreds of thousands of SQL queries! Why pay $400 for a lousy video card when you can buy a $20K server instead?

  • Yay! (Score:3, Insightful)

    by zmollusc ( 763634 ) on Sunday November 30, 2008 @05:58AM (#25931767)

    Hurrah! In the future, when I switch off the pointless Aero crap, it will free up lots more CPU cycles for the annoying Microsoft apps I need to run to see simple 2D spreadsheet data sent to me by retards who use proprietary Microsoft file formats. Microsoft FTW!!

  • by abigsmurf ( 919188 ) on Sunday November 30, 2008 @06:20AM (#25931869)
    How dare MS maximise compatibility for Windows 7 and implement what will be a handy feature for low-end systems, particularly netbooks (it's the chipset that draws all the power in Atom-based systems, not the CPU).

    Improving performance over a dedicated graphics chip (albeit a weak one) is still a respectable achievement, especially when you consider that games typically use ~100% of the CPU anyway. Whilst it may be unplayable for Crysis, I can see it giving a solid frame rate on things like WoW.

  • by Ed Avis ( 5917 ) <ed@membled.com> on Sunday November 30, 2008 @09:40AM (#25932705) Homepage

    The news here is not the existence of a software renderer, but of one with good performance (such that a high-end CPU is competitive with a low-end GPU for some tasks). I wonder how the trusty Mesa GL renderer compares to Microsoft's latest offering? (They implement different APIs, but Wine provides DirectX 10 on top of OpenGL, so you could get an apples-to-apples comparison.)
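
    If anyone attempts that comparison, one pitfall is knowing which renderer you are actually measuring on the GL side. A tiny sketch (assumes an OpenGL context is already current; glGetString is standard OpenGL):

        #include <GL/gl.h>
        #include <cstdio>

        // With a context current, the renderer string names either the GPU
        // driver or Mesa's software rasterizer, so you know which path
        // your benchmark numbers actually came from.
        void print_renderer(void) {
            std::printf("GL_RENDERER: %s\n",
                        (const char *)glGetString(GL_RENDERER));
            std::printf("GL_VERSION:  %s\n",
                        (const char *)glGetString(GL_VERSION));
        }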

  • by soupforare ( 542403 ) on Sunday November 30, 2008 @10:56AM (#25933171)

    I doubt big multicore chips will be cheap enough for the Xbox 3, but win7 probably won't be released in time for it anyway.

    This isn't for Aero on GMAs. This is so you can target both the Xbox 4 and Win7/GFW without even bothering to think.
    One set-top box, one platform. They've had a hard-on for it for a decade-plus; it's coming.
