MS Says Windows 7 Will Run DirectX 10 On the CPU

arcticstoat writes "In what could be seen as an easy answer to the Vista-capable debacle, Microsoft has introduced a 'fully conformant software rasterizer' called WARP (Windows Advanced Rasterization Platform) 10, which does away with the need for a dedicated hardware 3D accelerator altogether. Microsoft says that WARP 10 will support all the features and precision requirements of Direct3D 10 and 10.1, as well as up to 8x multi-sampled anti-aliasing, anisotropic filtering and all optional texture formats. The minimum CPU spec needed is just 800MHz, and it doesn't even need MMX or SSE, although it will work much quicker on multi-core CPUs with SSE 4.1. Of course, software rendering on a single desktop CPU isn't going to be able to compete with decent dedicated 3D graphics cards when it comes to high-end games, but Microsoft has released some interesting benchmarks that show the system to be quicker than Intel's current integrated DirectX 10 graphics. Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics."
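
For the curious, asking for the new rasterizer from application code should look roughly like the sketch below. This is a minimal illustration against the public Direct3D 10.1 device-creation call, assuming the updated SDK exposes a D3D10_DRIVER_TYPE_WARP driver type; it is not Microsoft's sample code, and error handling is omitted.

    // Minimal sketch: create a Direct3D 10.1 device on the WARP software
    // rasterizer instead of a hardware adapter.
    #include <d3d10_1.h>

    ID3D10Device1 *CreateWarpDevice(void)
    {
        ID3D10Device1 *device = NULL;
        HRESULT hr = D3D10CreateDevice1(
            NULL,                      // no specific adapter; let the runtime pick
            D3D10_DRIVER_TYPE_WARP,    // the CPU rasterizer described above
            NULL,                      // no external software rasterizer DLL
            0,                         // no creation flags
            D3D10_FEATURE_LEVEL_10_1,  // WARP claims full 10.1 conformance
            D3D10_1_SDK_VERSION,
            &device);
        return SUCCEEDED(hr) ? device : NULL;
    }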
  • by Hymer ( 856453 ) on Sunday November 30, 2008 @04:11AM (#25931297)
    ...about the impossibility of running DirectX 10 on Windows XP.
    If you can run it in software, you'll be able to run it on any OS version. Gee... that was another lie from Redmond, why am I not surprised... maybe 'cause I do run the DirectX 10 hack on my XP and no, it didn't raise the CPU usage (as claimed by the union of MS Windoze Vista Fanboyz)... it lowered it.
  • by Anonymous Coward on Sunday November 30, 2008 @04:38AM (#25931417)

    Did we not do this already back in 1993? An excerpt from the Mesa project's history: "August, 1993: I begin working on Mesa in my spare time. The ..."
    So what is the news?

  • by Sycraft-fu ( 314770 ) on Sunday November 30, 2008 @04:43AM (#25931445)

    Seriously, this is a good thing. One could compare it to Mesa 3D. You have the option of running graphics in software, if you lack the hardware to accelerate it. This is highly useful in two situations:

    1) You have something intensive and need to see it on a computer that lacks the requisite accelerator. Though it won't be fast, at least you can see the output rather than just being SOL.

    2) You have a non-intensive task and don't wish to purchase dedicated hardware. While Crysis crawls, I'm going to guess something like, say, Thief wouldn't.

    This is just a software layer to allow the OS to do 3D rendering even if there's no accelerator present. I'm sure that 99.99% of people who do 3D in any capacity will use an accelerator, as they are extremely cheap and extremely high performance. However, it isn't a bad thing to have a software implementation. MS has actually had one for a long time; however, it only comes with the development version of DirectX. It allows you to check the expected output of a program against the reference renderer, as compared to an actual card.

    Sounds like this is the same thing, just sped up and packaged for end-user use rather than just for developers.

    Could have applications in the future too. For example, what will computer hardware be capable of in 15 years? Processors are likely to be much faster than today's. Well, this might allow 3D to keep working when emulating Windows for old programs. Remember, people emulate DOS today (see DOSBox) for various purposes; I don't think it is out of the question that a decade or two from now people will emulate Windows 7. Part of that will be dealing with the 3D layer, since a large number of apps today make use of Direct3D. Well, if Windows 7 has a software 3D layer and processors are blazing fast, you are good: just use that. If it doesn't, you then have to make your emulator emulate the 3D hardware, since I'm guessing a decade from now the 3D subsystem will be vastly different than it is now.

    This is not intended to be an "Oh, you don't need a graphics card ever" thing. It is intended to give people the option of getting 3D without having to have a graphics card. It won't be as good, but at least it'll work. (A rough sketch of that hardware-then-software fallback follows below.)
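
    To make that concrete, a hardware-then-software fallback could look roughly like this. It is a rough illustration against the public Direct3D 10.1 device-creation API, assuming the updated SDK's D3D10_DRIVER_TYPE_WARP value; it is not Microsoft's code.

        // Rough sketch: prefer a real accelerator, fall back to the WARP
        // software rasterizer, and finally to the (very slow) reference
        // rasterizer normally used by developers for output validation.
        #include <d3d10_1.h>

        ID3D10Device1 *CreateBestAvailableDevice(void)
        {
            const D3D10_DRIVER_TYPE preference[] = {
                D3D10_DRIVER_TYPE_HARDWARE,   // dedicated or integrated GPU
                D3D10_DRIVER_TYPE_WARP,       // CPU rasterizer (the new bit)
                D3D10_DRIVER_TYPE_REFERENCE   // developer reference rasterizer
            };

            for (int i = 0; i < 3; ++i) {
                ID3D10Device1 *device = NULL;
                if (SUCCEEDED(D3D10CreateDevice1(NULL, preference[i], NULL, 0,
                                                 D3D10_FEATURE_LEVEL_10_1,
                                                 D3D10_1_SDK_VERSION, &device)))
                    return device;            // first driver type that works wins
            }
            return NULL;                      // no Direct3D 10.1 device at all
        }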

  • Ummmm (Score:5, Interesting)

    by Sycraft-fu ( 314770 ) on Sunday November 30, 2008 @04:56AM (#25931493)

    3D accelerators are an expected feature on standard PCs. I can't think of one you can get these days without one. All the current integrated Intel, ATi and nVidia chips are 3D accelerators. Not powerful ones, but they do the trick. Any add-in card is, of course, an accelerator.

    However, here's a better question: how long until we don't need that any more? Personally, I'm not thrilled with the idea of having to have lots of dedicated hardware. The whole point of a PC is to be a general-purpose machine that can do pretty much anything because it is all programmed in software. You replace dedicated units that did only one thing with a general-purpose computer that does everything. OK, well, that is somewhat undermined by the requirement for specialized hardware.

    Now, I understand the need for it. Graphics are intense and there is just no way, at this time, for a CPU to handle it. A dedicated processor optimized for the kind of math graphics need is the way to go. However wouldn't it be nice if that weren't the case? Wouldn't it be nice if the CPU again did everything?

    We won't see that day tomorrow, but perhaps we'll see it in a decade or two.

    I look back to the changes in audio production and hope to see it come to graphics as well:

    Originally, PCs used in audio production were little more than interfaces for complex dedicated hardware. A normal PC simply couldn't handle it. You had a PC that was loaded full of Pro Tools cards, which were massive bunches of specialized hardware, to do anything. Well, as CPUs got better, you started to be able to do more on a regular PC. At first it was still nothing really useful in the pro market. You had to do everything non-realtime, spend lots of time rendering a change and then listening to it, and so on. But at least you could actually do it on normal computers. Yet more time passed, and then non-destructive realtime software was available on normal systems. You could overload it pretty easily, you still had to bounce tracks and such, and it wasn't the unrestricted power of an accelerated solution, but it worked pretty well, and in fact lots of project studios did just that.

    Then we come to now. Now, the hardware-accelerated audio production system is a relic. They are still made, but they are unpopular. Most professional studios don't bother; they just get a nice powerful PC (by PC I mean personal computer, Macs included) with a couple of multi-core processors and go to town. The CPUs easily handle large numbers of tracks with multiple effects and so on, all in realtime. There is simply no need for dedicated hardware, and not using it means much greater flexibility. Everything is just changed in software.

    So, I'd love to see that same sort of thing come to graphics. At this point, CPUs have a long way to go. But then, technology moves fast. Everything I'm talking about in the audio world has happened in about 2 decades. In just 20 years or so it went from something you could only do with amazingly expensive special hardware to something that is easy for a $1000 computer to handle.

    20 years from now, it may be the same deal with graphics.

  • Re:lol (Score:3, Interesting)

    by Barny ( 103770 ) on Sunday November 30, 2008 @04:59AM (#25931509) Journal

    But then, isn't the whole point of Aero (excusing the prettiness) to get load OFF the CPU and onto something else?

  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Sunday November 30, 2008 @05:06AM (#25931549)
    Comment removed based on user account deletion
  • by wwahammy ( 765566 ) on Sunday November 30, 2008 @05:21AM (#25931609)
    I wish I had mod points to use on the parent. The GPU companies (emphasis on Nvidia though) knew the Vista driver model 18 months prior to its release, and they still couldn't come up with decent drivers on time, or even two years later. I finally gave up on Nvidia's shitty drivers when a driver update in June caused all AVI files to skip when eMule was open. Combine that with Nvidia refusing to implement DVD anti-aliasing in hardware for Vista (something they have in the XP drivers), and I had had enough of being a free beta tester for Nvidia. My new ATI card works just fine and I don't have to install additional crapware for its drivers. I don't plan on ever going back to Nvidia.
  • by LingNoi ( 1066278 ) on Sunday November 30, 2008 @06:05AM (#25931795)

    Isn't this the point of OpenGL? An API for dedicated graphics hardware, with a backup software renderer if the hardware isn't supported?

    Whose idea is this again? It doesn't look like much of an idea, more like a step backwards.

  • by abigsmurf ( 919188 ) on Sunday November 30, 2008 @06:20AM (#25931869)
    How dare MS maximise compatibility for Windows 7 and implement what will be a handy feature for low-end systems, particularly netbooks (it's the chipset that draws all the power in Atom-based systems, not the CPU).

    Improving performance over a dedicated graphics chip (albeit a weak one) is still a respectable achievement, especially when you consider that games typically use ~100% CPU anyway. Whilst it may be unplayable for Crysis, I can see it giving a solid frame rate in things like WoW.

  • Re:Oh boy. (Score:5, Interesting)

    by Joce640k ( 829181 ) on Sunday November 30, 2008 @06:35AM (#25931927) Homepage

    Just use the Intel ray tracer...

    http://video.google.com/videoplay?docid=7079133482718383307 [google.com]

    Much more impressive than "DX10 rasterizer".

  • Re:Oh boy. (Score:2, Interesting)

    by neumayr ( 819083 ) on Sunday November 30, 2008 @07:19AM (#25932081)
    Looks interesting, yes.
    But what's this talk about a 3D internet in the Details box? How does this technology enable something like a 3D internet, when people have been distributing information in 2D forever and computer interfaces are built around that?
  • Re:Oh boy. (Score:2, Interesting)

    by SupremoMan ( 912191 ) on Sunday November 30, 2008 @07:38AM (#25932145)
    Mod parent up +1 nostalgia. I also had Win95 on a 486. It did the job, though I still ran most games in DOS mode. I was able to use Win95 to compress part of my hard drive for added storage, something I wouldn't have been able to do without it.
  • by miknix ( 1047580 ) on Sunday November 30, 2008 @08:53AM (#25932471) Homepage

    Sure, just imagine Vista running on an 800MHz computer with a software renderer.

  • Re:Yes. (Score:4, Interesting)

    by Bert64 ( 520050 ) <bert AT slashdot DOT firenzee DOT com> on Sunday November 30, 2008 @09:07AM (#25932529) Homepage

    A GUI on a server should be entirely optional, and never the default...
    Serial consoles enable me to rebuild my servers without traveling to the location where they are hosted. Even if the OS is screwed to the point that I can't log in using its existing remote logon features, I can get on via serial and fix it or do a complete reinstall.
    I've never found a need for a GUI on any of my servers, because everything I've ever needed to do was possible from the CLI. I would try to avoid any server software which required a GUI (as you pointed out, it tends to be poorly coded), and failing that I would install it locally and copy the configuration over if possible. Having to install GUI libraries and the like would end up doubling the footprint on most of my machines, and therefore doubling the patching requirements.

  • by Ed Avis ( 5917 ) <ed@membled.com> on Sunday November 30, 2008 @09:40AM (#25932705) Homepage

    The news here is not the existence of a software renderer, but one with good performance (such that a high-end CPU is competitive with a low-end GPU for some tasks). I wonder how the trusty Mesa GL renderer compares to Microsoft's latest offering? (They implement different APIs, but Wine provides DirectX 10 on top of OpenGL, so you can get an apples-to-apples comparison.)

  • by soupforare ( 542403 ) on Sunday November 30, 2008 @10:56AM (#25933171)

    I doubt big multicore chips will be cheap enough for the Xbox 3, but win7 probably won't be released in time for it anyway.

    This isn't for Aero on GMAs. This is so you can target both Xbox4 and Win7/GFW without even bothering to think.
    One set-top box, one platform. They've had a hard-on for it for a decade+, it's coming.

  • by Anonymous Coward on Sunday November 30, 2008 @10:59AM (#25933187)

    Gutmann was right about one thing: content protection mechanisms will require parts of the hardware specification to be kept confidential.

    We see this with AMD: they are careful about releasing specs for their hardware to the X.org community because they have to omit documenting certain components (e.g. the UVD engine), since their hardware is not yet safe against hacking to extract unencrypted content. A render test is performed to check whether the card is genuine. With the hardware spec, someone could easily build a fake card (or hook up some kind of emulation on the PCIe port) and rip out the unencrypted video bitstream.

    Fortunately for them, people are currently hacking software players, but in a year or two that might become unavailable, so they will have to turn to hardware.

  • by Anonymous Coward on Sunday November 30, 2008 @01:50PM (#25934329)

    Even with 3D hardware, this can be useful. If you have a previous-generation 3D adapter, you will be able to run 80-90% of the code accelerated, but you will still be able to see the effects that are otherwise only available on newer adapters. This encourages developers to just go for the newest version of DirectX instead of facing a chicken-and-egg problem, where there need to be enough adapters developed and sold before game developers start using the standard, and vice versa.

    Apple is working on technology making this even better. They basically do runtime optimization of the code, so all the conditionals (use software or use hardware) are optimized away and only the correct path is used, essentially allowing a single software implementation to automatically adapt itself to all adapters.

    This, of course, is also beneficial for hardware developers, who can release feature-complete drivers for new adapters much earlier by simply not implementing the difficult parts at first and releasing updated drivers later.
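
    As a loose, purely hypothetical illustration of what "optimizing the conditionals away" buys you (none of these names are Apple's or Microsoft's; the idea is just to resolve the software-versus-hardware choice once rather than branching on every call):

        // Hypothetical sketch: pick the code path once at start-up and
        // dispatch through a function pointer afterwards, so the hot loop
        // never tests "do we have an accelerator?" again.
        #include <cstdio>

        typedef void (*BlendFunc)(const float *src, float *dst, int count);

        // Software fallback path.
        static void BlendSoftware(const float *src, float *dst, int count) {
            for (int i = 0; i < count; ++i) dst[i] = 0.5f * (src[i] + dst[i]);
        }

        // Stand-in for a path that would hand the work to an accelerator.
        static void BlendHardware(const float *src, float *dst, int count) {
            BlendSoftware(src, dst, count);  // real GPU submission omitted in this sketch
        }

        // Hypothetical capability probe.
        static bool HaveAccelerator(void) { return false; }

        int main() {
            // The conditional runs exactly once; every later call is direct.
            BlendFunc blend = HaveAccelerator() ? BlendHardware : BlendSoftware;

            float src[4] = {1, 2, 3, 4}, dst[4] = {0, 0, 0, 0};
            blend(src, dst, 4);
            std::printf("%g %g %g %g\n", dst[0], dst[1], dst[2], dst[3]);
            return 0;
        }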

  • by Anonymous Coward on Sunday November 30, 2008 @05:21PM (#25936479)

    First, I should note that I seem to be missing how the conversation shifted from hardware vendors not supporting devices properly to Gutmann's paper. That said, let's discuss that.

    If you want to argue that "numerous sites" have debunked Gutmann's paper, you should at least cite two different ones. If you're going to cite only one site to debunk Gutmann, I don't recommend citing George Ou.

    In the first link, Ou attacks Gutmann's paper on two fronts. First, Ou argues against Gutmann's concerns about downsampling by pointing out that the studios, as of his 2007 article (I have no idea whether this remains the case), had not yet turned on the ICT flag.

    That may be a fair point for much of the market. It might make sense for someone who needn't be sensitive about the issue to just buy and assume that the media protections won't cause them any major issues.

    However, Gutmann is not presenting arguments for the broad market, he's presenting arguments for an audience whose focus is security. Gutmann is paid not to take unnecessary bets, and he wouldn't be doing his job if he recommended betting that studios wouldn't flip on the ICT flag anytime soon.

    Even worse, Ou expands his attack to note that Gutmann hasn't tested his ideas. Frankly, this attack is hypocritical, since Ou has just pointed out that the studios are not producing media with the ICT flag enabled. As a result, one has to ask how Gutmann could be expected to reasonably test his concerns for lack of media...and how Ou could have done any more testing than Gutmann. (Theoretically, one or both could pay the required license fees and produce their own media, but the cost of such an effort dwarfs that of buying the OS and suitable test hardware.) Furthermore, the scant testing results Ou cites in the article fall far short of that required to rebut Gutmann's concerns about false positives triggering media protection.

    That's the point where Ou performs better. The second point he claims to address (as derived from the quote he gives) was that media protection would require that more components remain active when a PC was in power saving mode. To rebut this argument, Ou notes that the CPU utilization he measures when playing videos when the computer isn't "asleep" is reasonable. Chewbacca defense anyone?

    If one assumes that Ou just wrote poorly and wanted to counter the processing power concerns Gutmann raises, then one just has to look at what Gutmann wrote to see that Ou's counter is tripe.

    Gutmann quotes Microsoft (a hint that Ou needs to think harder about what he's writing about):

    In the case of premium content, whether video can play back smoothly when using regular AES with uncompressed video will be a function of the resolution of the uncompressed video and the power of the processor. It is unlikely to work well in 2006 for uncompressed HD premium content

    There is a reason the quote refers to 2006; it was the time when Gutmann wrote the bulk of his paper. (He's made a few updates since, the last of which was apparently in 2007.)

    To debate this point, Ou runs some CPU requirement calculations using 3.5 megabytes per second for a 1080p HD stream. Since a quick calculation (1920 * 1080 * 24) shows that a 1080p HD stream contains nearly 50M pixels a second, it seems pretty clear that Ou is running computational requirements for a compressed stream, rather than the uncompressed one Gutmann worries about. (A rough back-of-the-envelope check of these numbers follows after this comment.) Then, to top it all off, Ou compares his number to a processor that didn't exist when Gutmann wrote his paper. Apples to oranges, anyone?

    I'll grant some points against Gutmann. His paper is very speculative, and his concerns may not be relevant to a good chunk of the market. (In Gutmann's defense, I'll point out that I vaguely recall reading it before Vista was even publicly released, making a healthy dose of speculation unavoidable. Furthermore, the concerns in Gutmann's paper are valid and worthy of strenuous investigation in the security circles for which it was originally written.)

    How
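
    For anyone who wants to check the arithmetic the parent refers to, here is a rough back-of-the-envelope version. The 24 frames per second comes from the parent's own figure; the 3 bytes per pixel is an assumption of this sketch.

        // Back-of-the-envelope check: an uncompressed 1080p stream dwarfs the
        // ~3.5 MB/s compressed figure Ou used. Assumes 24 frames/s and 3 bytes
        // per pixel (RGB, no alpha).
        #include <cstdio>

        int main() {
            const double pixels_per_second = 1920.0 * 1080.0 * 24.0;  // ~49.8 million
            const double bytes_per_second  = pixels_per_second * 3.0; // ~149 MB/s
            std::printf("%.1fM pixels/s, %.0f MB/s uncompressed vs 3.5 MB/s compressed\n",
                        pixels_per_second / 1e6, bytes_per_second / 1e6);
            return 0;
        }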

  • Re:Oh boy. (Score:3, Interesting)

    by nschubach ( 922175 ) on Sunday November 30, 2008 @07:10PM (#25937417) Journal

    Yeah, I also had the "manual" way of doing it. We had an old DOS menu system that would run batch files for programs. I created a batch file to extract the program to be run, run it, and on exit zip it back up. That was my solution to making the most of that 40MB hard drive. ;)

  • Re:Oh boy. (Score:3, Interesting)

    by Pathwalker ( 103 ) * <hotgrits@yourpants.net> on Sunday November 30, 2008 @07:54PM (#25937789) Homepage Journal

    There were lots of third party compression utilities before DOS 6.

    I used to use one called diet. It would intercept calls to read from files, and check to see if it had compressed them. If it had, it would unpack them to another location (I used a resizable ramdisk) and redirect the read to the uncompressed copy.

    When the file was closed, it would delete the decompressed copy.

    It would only work on read-only files, but it worked pretty well. In the days before disk caching, uncompressing to the ramdisk actually made things faster despite the overhead of the decompression.
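
    The same trick is easy to sketch today with an ordinary compression library instead of a DOS TSR. The helper below is made up for illustration: zlib's gzip reader does the unpacking, and a scratch file stands in for the ramdisk.

        // Sketch of the "decompress on open" idea behind tools like diet.
        // If a gzip-compressed copy of `path` exists, unpack it to a scratch
        // file (the ramdisk in the parent comment) and return that path.
        #include <cstdio>
        #include <string>
        #include <zlib.h>

        std::string ResolveRead(const std::string &path)
        {
            gzFile in = gzopen((path + ".gz").c_str(), "rb");
            if (!in) return path;                      // no compressed copy: read as-is

            std::string scratch = "/tmp/diet_scratch"; // stand-in for the ramdisk copy
            FILE *out = std::fopen(scratch.c_str(), "wb");
            if (!out) { gzclose(in); return path; }

            char buf[4096];
            int n;
            while ((n = gzread(in, buf, sizeof(buf))) > 0)
                std::fwrite(buf, 1, n, out);

            std::fclose(out);
            gzclose(in);
            return scratch;                            // caller reads the unpacked copy
        }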

  • As an engineer... (Score:3, Interesting)

    by gillbates ( 106458 ) on Sunday November 30, 2008 @11:09PM (#25939351) Homepage Journal

    the brain-dead architecture of the x86 PC strikes me as funny.

    Here, you've got 2, 4, what - now 8 cores which can't compete with a decent FPGA?! The problem isn't the CPU speed. The problem is that CPUs make very poor DSPs. A TI DSP can encode/decode HD video in realtime using only a few percent of the power required by a desktop CPU. A large part of a GPU's performance comes from the fact that it has hardware optimized for video, which, of course, Intel has steadfastly refused to add to their processors. Instead, they push multimedia instructions which, as hard as they try, are still hamstrung by the memory architecture and hence non-competitive with a GPU.

    What we really need is for the PC architecture to include a standard FPGA which can be reprogrammed on the fly by the OS. You need a GPU? Simply program the FPGA for 3D tasks (you need not emulate the entire GPU - just the parts you need at the moment for your application). You want to do audio processing? Implementing a filter in the FPGA is as simple as loading the correct configuration. Instead of writing the algorithm in software and having it executed by software, you configure the hardware to do the computations you need directly. That way, you get the flexibility of software with the speed of dedicated hardware.

    But, alas, market forces trump all others. I remember seeing $20 motherboards recently!? When even a Spartan FPGA costs $10 in quantity, I'm not going to hold my breath for a standard FPGA. But it sure would be nice.
