MS Says Windows 7 Will Run DirectX 10 On the CPU 503
arcticstoat writes "In what could be seen as an easy answer to the Vista-capable debacle, Microsoft has introduced a 'fully conformant software rasterizer' called WARP (Windows Advanced Rasterization Platform) 10, which does away with the need for a dedicated hardware 3D accelerator altogether. Microsoft says that WARP 10 will support all the features and precision requirements of Direct3D 10 and 10.1, as well as up to 8x multi-sampled anti-aliasing, anisotropic filtering and all optional texture formats. The minimum CPU spec needed is just 800MHz, and it doesn't even need MMX or SSE, although it will work much quicker on multi-core CPUs with SSE 4.1. Of course, software rendering on a single desktop CPU isn't going to be able to compete with decent dedicated 3D graphics cards when it comes to high-end games, but Microsoft has released some interesting benchmarks that show the system to be quicker than Intel's current integrated DirectX 10 graphics. Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics."
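(For developers, "targeting WARP" is just ordinary Direct3D 10.1 device creation with a different driver-type flag. A minimal, untested sketch using the public API; WARP is selected by passing D3D10_DRIVER_TYPE_WARP where an application would normally pass D3D10_DRIVER_TYPE_HARDWARE:)

#include <d3d10_1.h>

int main() {
    ID3D10Device1* device = nullptr;
    // D3D10_DRIVER_TYPE_WARP selects the software rasterizer; no GPU,
    // display driver, or DXGI adapter is involved.
    HRESULT hr = D3D10CreateDevice1(
        nullptr,                   // default adapter (unused for WARP)
        D3D10_DRIVER_TYPE_WARP,    // software rasterizer instead of hardware
        nullptr,                   // no external rasterizer DLL
        0,                         // no creation flags
        D3D10_FEATURE_LEVEL_10_1,  // WARP claims the full 10.1 feature set
        D3D10_1_SDK_VERSION,
        &device);
    if (FAILED(hr)) return 1;
    // ...create a swap chain and render exactly as with a hardware device...
    device->Release();
    return 0;
}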
Oh boy. (Score:2, Insightful)
So we can play things at 7fps with ultra low settings. Whoopee.
Seriously, buy a goddamn graphics card.
Re:Oh boy. (Score:5, Funny)
I did, but then I only got 5fps. :-P
Re:Oh boy. (Score:4, Insightful)
I had always been confused by software advertising (especially Microsoft's). When they said a new version of Windows would run faster than previous versions, I thought: "Hey! This will work great on my old computer!" - until I saw that the product requirements included the next generation of CPUs. WTF?
Granted, it may be because of new CPU instructions that reduce latency, but still, I felt kinda deceived.
Re: (Score:3, Interesting)
There were lots of third party compression utilities before DOS 6.
I used to use one called diet. It would intercept calls to read from files, and check to see if it had compressed them. If it had, it would unpack them to another location (I used a resizable ramdisk) and redirect the read to the uncompressed copy.
When the file was closed, it would delete the decompressed copy.
It would only work on read only files, but it worked pretty well. In the days before disk caching, uncompressing to the ramdisk actu
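(The scheme diet used boils down to a simple pattern: on open, check for a compressed copy, unpack it to scratch space, and redirect the caller to the unpacked file. A hypothetical C++ sketch of that pattern follows; the real TSR hooked DOS file calls rather than wrapping them like this, and decompress_file is a placeholder for whatever codec diet actually used:)

#include <cstdio>
#include <filesystem>
namespace fs = std::filesystem;

// Placeholder for diet's decompressor; the real codec is not modeled here.
static void decompress_file(const fs::path& packed, const fs::path& out) {
    fs::copy_file(packed, out, fs::copy_options::overwrite_existing);
}

// Open 'name' for reading; if a compressed sibling "<name>.z" exists
// (hypothetical naming), unpack it to 'scratch' (the "ramdisk") and
// redirect the read there. Deleting the scratch copy on close is omitted.
std::FILE* open_maybe_compressed(const fs::path& name, const fs::path& scratch) {
    fs::path packed = name;
    packed += ".z";
    if (fs::exists(packed)) {
        const fs::path tmp = scratch / name.filename();
        decompress_file(packed, tmp);
        return std::fopen(tmp.string().c_str(), "rb");
    }
    return std::fopen(name.string().c_str(), "rb");
}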
Re: (Score:3, Interesting)
Yeah, and I also had the "manual" way of doing it as well. We had an old DOS menu system that would run batch files for programs. I created a batch file to extract the program to be run, run it and on exit zip it back up. That was my solution to making the most of that 40MB hard drive. ;)
Re:Oh boy. (Score:5, Informative)
Enter Arch Linux. Installed to a CLI in about 10 minutes. Getting the wi-fi working from the CLI with wpa_supplicant and the zd1211 firmware for my card was a breeze. Then I proceeded to download and install xorg and icewm. All told, at a CLI with wi-fi working it idles at 11MB. Logged in to icewm it sits at 17MB. And with Firefox running, a grand total of 51MB. And of course, it's blazing. With Firefox 2, it's at least as fast as my Pentium 4 laptop running Debian with Firefox 3. And, of course, everything works in Firefox. Flash 10, etc.
Although what I've said doesn't speak completely to your point, suffice it to say, depending on your setup, you aren't doomed to a slower computer when running reasonably up to date software.
Re: (Score:3, Insightful)
With Arch installed, according to free, there are 17 MB being used by actual programs at the desktop
Re:Oh boy. (Score:5, Funny)
But what if I want to play Crysis on my EeePC during that boring office meeting!?
Your 8 core Core i7 EeePC?
Re:Oh boy. (Score:4, Funny)
Nope, the correct question would be:
your EeePC with 800x600 resolution?
Re:Oh boy. (Score:5, Funny)
Yes, the battery life is awesoNOCARRIER
Re:Oh boy. (Score:5, Interesting)
Just use the Intel ray tracer...
http://video.google.com/videoplay?docid=7079133482718383307 [google.com]
Much more impressive than "DX10 rasterizer".
Re:Oh boy. (Score:5, Funny)
At one frame per meeting, you're at least better off than people who play Quake over email ;)
Re:Oh boy. (Score:5, Funny)
To: "John"
Subject: Re: Quake
Boom headshot!
Re:Oh boy. (Score:5, Funny)
Much better frame rate than Halo over e-mail, where half the e-mails are spent teabagging each other.
Re:Oh boy. (Score:5, Insightful)
Please hand over your geek card.
Re:Oh boy. (Score:5, Funny)
Eight cores at 3GHz beat one core at 400MHz!!!
Film at eleven.
Re: (Score:3, Insightful)
Perhaps if you would try some cognition before you type: this is more about running the interface graphics than it is about gaming graphics.
Software rendering (Score:5, Funny)
What a revolutionary & useful idea.
Re: (Score:3, Funny)
Still won't be vista capable.
Re:Yes. (Score:4, Funny)
Running DX10 games on servers? Get back to work you lazy servermonkey!
Re:Yes. (Score:5, Insightful)
Aye; "wannabe computer companies worry about clock speed. Real computer companies worry about cooling."
Re: (Score:3, Informative)
If you have people running DX 10 games on a server, then you either have major staffing problems, or the server is nothing more than a glorified workstation. There is ZERO need for DX 10 graphics on a true server, and really, the need for a GUI should be near zero (unless the server software vendor can't figure out how to code for a true server).
Yes, there are good uses for having a GUI available on a server, but for normal maintenance, a command line SHOULD be all that is needed to reduce the overhead. th
Re:Yes. (Score:4, Interesting)
A GUI on a server should be entirely optional, and never the default...
Serial consoles enable me to rebuild my servers without traveling to the location where they are hosted. Even if the OS is screwed to the point I can't log in using its existing remote logon features, I can get on via serial and fix it or do a complete reinstall.
I've never found a need for a GUI on any of my servers, because everything I've ever needed to do was possible from the CLI. I would try to avoid any server software that required a GUI (as you pointed out, poorly coded), and failing that I would install it locally and copy the configuration if possible. Having to install GUI libraries and the like would end up doubling the footprint on most of my machines, and therefore double the patching requirements.
Unbalanced comparison: cost (Score:5, Insightful)
How much is an 8-core system going to cost vs the system with integrated graphics? At that point, it seems wiser to invest more money in a graphics card than in faster CPUs if that's what you're going to be doing.
By far the more useful thing is that it's probably better for development because the driver developers will have a reference point of how the graphics are supposed to render. Also, larger game companies will be able to point out these differences to get bug fixes out of the graphics card companies. "Your graphics card renders this incorrectly with regards to the reference, fix it" is much more forceful than "your graphics card behaves differently than your competitor".
Re:Unbalanced comparison: cost (Score:5, Insightful)
Also, larger game companies will be able to point out these differences to get bug fixes out of the graphics card companies. "Your graphics card renders this incorrectly with regards to the reference, fix it" is much more forceful than "your graphics card behaves differently than your competitor".
DirectX already contains a reference rasterizer, which is better suited for that. This thing seems instead to be meant for applications that don't necessarily need more than "interactive" frame rates, but do need to run on a broad class of machines. Or for easing development of applications which could benefit from hardware acceleration when available (image processing, for instance).
From the MSDN page [microsoft.com] on WARP:
We don't see WARP10 as a replacement for graphics hardware, particularly as reasonably performing low end Direct3D 10 discrete hardware is now available for under $25. The goal of WARP10 was to allow applications to target Direct3D 10 level hardware without having significantly different code paths or testing requirements when running on hardware or when running in software.
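(That "same code path" goal falls naturally out of the API: the driver type is a single parameter at device creation, so an application can probe for hardware and fall back to WARP, and even to the reference rasterizer, without branching anywhere else. A rough, untested sketch:)

#include <d3d10_1.h>

// Try hardware first, then WARP, then the (slow, exact) reference
// rasterizer. Everything downstream of device creation is identical.
ID3D10Device1* create_best_device() {
    const D3D10_DRIVER_TYPE order[] = {
        D3D10_DRIVER_TYPE_HARDWARE,
        D3D10_DRIVER_TYPE_WARP,
        D3D10_DRIVER_TYPE_REFERENCE,
    };
    for (D3D10_DRIVER_TYPE type : order) {
        ID3D10Device1* device = nullptr;
        if (SUCCEEDED(D3D10CreateDevice1(nullptr, type, nullptr, 0,
                                         D3D10_FEATURE_LEVEL_10_1,
                                         D3D10_1_SDK_VERSION, &device)))
            return device;
    }
    return nullptr; // not even REFERENCE available (no D3D10 runtime)
}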
Re: (Score:3, Insightful)
No no, see, now when Windows 7 requires video cards that nobody has but MS puts Windows 7 Ready stickers on all of the new computers anyway, when people say "my Windows 7 Ready computer won't run Windows 7!" MS can point out that yes, it does. Any version of Windows 7. Sure, it takes ten minutes to draw a menu, but it runs!
From the summary: (Score:4, Insightful)
"Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics."
So the game went from unplayable at the lowest settings possible, to being still unplayable at the lowest settings possible?
Great move MS, you've really solved a problem there.
Re: (Score:2, Funny)
But, but, that's like, a 42% improvement! That's like, massive, man! MS are awesome!
Re:From the summary: (Score:5, Informative)
As I said in another post:
Running Crysis isn't the point of the demo. The point was that it was a DX 10 application running entirely in software. In the end, this means that systems without higher end 3D cards would be able to run Aero. THAT's the point.
They are trying to address the main complaint of the "Vista Capable" debacle. Running Crysis was just a way of demonstrating the capability.
Re:From the summary: (Score:5, Insightful)
...systems without higher end 3D cards would be able to run Aero. Running Crysis was just a way of demonstrating the capability.
I think running Aero would be a better way to demonstrate that capability.
Re:From the summary: (Score:5, Informative)
Well... then they'd better try again. It still sounds like a complete failure to me. Since the integrated graphics is equivalent, there is no advantage, and no resolution to the problem. What exactly are you trying to get at?
Except the integrated graphics on a bunch of 'Vista Capable' laptops DON'T do DirectX10 or Aero... but if a patch to Vista (or Windows 7) will get Aero working on directX10 on the CPU... a buttload of PCs that CAN'T currently do Aero, now CAN.
Re:From the summary: (Score:5, Insightful)
Except the integrated graphics on a bunch of 'Vista Capable' laptops DON'T do DirectX10 or Aero... but if a patch to Vista (or Windows 7) will get Aero working on directX10 on the CPU... a buttload of PCs that CAN'T currently do Aero, now CAN.
But at what performance cost? If we are talking about the whole "Vista Capable" debacle, aren't we talking about low spec machines that coughed and wheezed when running the low-end version of the OS? Great, let's add 3D rendering to the processor load on those machines.
I like the idea of rendering the graphics in the CPU rather than an expensive accelerator card for one-off situations, as long as that feature can be turned off. But then, I'm not a gamer, and I'm not into all the eye-candy. If I were a gamer or into eye candy, there's no way this side of hell that I would want to render the graphics in the CPU. I would get the best video card money could buy.
Re:From the summary: (Score:4, Insightful)
And if the CPU is pegged rendering the GUI, what effect is this going to have on whatever the user is actually trying to do?
Re: (Score:3, Insightful)
Aero does not require DirectX 10 [wikipedia.org]; it only needs DirectX 9 with the right features, enough memory, and a suitable driver.
How many "Vista Capable" laptops aren't Aero-compatible? The Intel 945GM chipset runs Aero, and it began shipping in January 2006 (a year before Vis
So does MS hate get an automatic upmod? (Score:5, Interesting)
Seriously, this is a good thing. One could compare it to Mesa 3D. You have the option of running graphics in software, if you lack the hardware to accelerate it. This is highly useful in two situations:
1) You have something intensive and need to see it on a computer that lacks the requisite accelerator. Though it won't be fast, at least you can see the output rather than just being SOL.
2) You have a non-intensive task and don't wish to purchase dedicated hardware. While Crysis crawls, I'm going to guess something like, say, Thief wouldn't.
This is just a software layer to allow the OS to do 3D rendering even if there's no accelerator present. I'm sure that 99.99% of people who do 3D in any capacity will use an accelerator, as they are extremely cheap and extremely high performance. However, it isn't a bad thing to have a software implementation. MS has actually had one for a long time; however, it only comes with the development version of DirectX. It allows you to check the expected output for a program against the reference renderer as compared to an actual card.
Sounds like this is the same thing, just sped up and packaged for end-user use, rather than just for developers.
It could have applications in the future too. For example, what will computer hardware be capable of in 15 years? Processors are likely to be much faster than today's. Well, this might allow 3D to be useful when emulating Windows for old programs. Remember, people emulate DOS today (see DOSBox) for various purposes. I don't think it is out of the question that a decade or two from now people will emulate Windows 7. Part of that will be dealing with the 3D layer. A large number of apps today make use of Direct3D. Well, if Windows 7 has a software 3D layer, and processors are blazing fast, you are good. Just use that. If it doesn't, you then have to make your emulator emulate the 3D hardware, since I'm guessing a decade from now the 3D subsystem will be vastly different than it is now.
This is not intended to be a "Oh you don't need a graphics card ever," thing. It is intended to give people the option to get 3D without having to have a graphics card. It won't be as good, but at least it'll work.
Re:From the summary: (Score:5, Funny)
No, the game goes full speed; you just die randomly as someone runs in, headshots you, and runs out while your computer is still trying to render the first frame.
Grrrreat! (Score:5, Insightful)
Does anyone else remember the 'good old days' when certain 3D graphics cards (the ViRGE comes to mind), were actually SLOWER than software renderers?
The term used then was 'decelerator' and I think MS's stupid decision to (once again) bow to Intel on this should share the same term.
How long will it take for true 3D acceleration to become an expected standard feature on PCs?
Re: (Score:2)
Sadly, never; as long as the GUI works, most Joe and Jane Sixpacks will be just fine. And yes, I do know about the Vista debacle, but I think the point is still valid.
Then you'd be mistaken. Both OSX and Microsoft effectively require hardware 3D acceleration for their desktop effects. All new Macs and any PC that actually meets Vista's real requirements feature 3D acceleration.
Re:Grrrreat! (Score:5, Insightful)
Sadly, never; as long as the GUI works, most Joe and Jane Sixpacks will be just fine. And yes, I do know about the Vista debacle, but I think the point is still valid.
How is that sad? If people don't need it, it seems like a waste of money to me.
Re: (Score:3, Informative)
Just a heads up: the PCI 6200 has some known problems with video playback. They were all driver-related, but as far as I know, NVidia never fixed them because the 6200 was always a fairly low volume unit and has now been dropped altogether.
Google "GeForce 6200 video won't play" or something similar and you'll see the number of forum threads and posts where people complain about how this version of the drivers works but not this version and so on.
The solution that's usually thrown about: disable hardware ac
Ummmm (Score:5, Interesting)
3D accelerators are an expected feature on standard PCs. I can't think of one you can get these days without one. All the current integrated Intel and ATi and nVidia chips are 3D accelerators. Not powerful ones, but they do the trick. Any add-in card is, of course, an accelerator.
However, here's a better question: how long until we don't need that anymore? Personally, I'm not thrilled with the idea of having to have lots of dedicated hardware. The whole point of a PC is a general purpose machine that can do pretty much anything because it is all programmed in software. You replace dedicated units that did only one thing with a general purpose computer that does everything. OK, well, that is somewhat undermined by the requirement of specialized hardware.
Now, I understand the need for it. Graphics are intense and there is just no way, at this time, for a CPU to handle it. A dedicated processor optimized for the kind of math graphics need is the way to go. However wouldn't it be nice if that weren't the case? Wouldn't it be nice if the CPU again did everything?
We won't see that day tomorrow, but perhaps we'll see it in a decade or two.
I look back to the changes in audio production and hope to see it come to graphics as well:
Originally, PCs used in audio production were little more than interfaces for complex dedicated hardware. A normal PC simply couldn't handle it. You had a PC that was loaded full of Pro Tools cards, which were massive bunches of specialized hardware, to do anything. Well, as CPUs got better, you started to be able to do more on a regular PC. At first it was still nothing really useful in the pro market. You had to do everything non-realtime, spend lots of time rendering a change then listening to it, and so on. But at least you could actually do it on normal computers. Yet more time passed, and now non-destructive realtime software was available on normal systems. You could overload it pretty easily, you still had to bounce tracks and such, it wasn't the unrestricted power of an accelerated solution, but it worked pretty well, and in fact lots of project studios did just that.
Then we come to now. Now, the hardware-accelerated audio production system is a relic. They are still made, but they are unpopular. Most professional studios don't bother; they just get a nice powerful PC (by PC I mean personal computer, Macs are included in this) with a couple of multi-core processors and go to town. The CPUs easily handle a large number of tracks with multiple effects and so on, all in realtime. There is simply no need for dedicated hardware, and not using it means much greater flexibility. Everything is just changed in software.
So, I'd love to see that same sort of thing come to graphics. At this point, CPUs have a long way to go. But then, technology moves fast. Everything I'm talking about in the audio world has happened in about 2 decades. In just 20 years or so it went from something you could only do with amazingly expensive special hardware to something that is easy for a $1000 computer to handle.
20 years from now, may be the same deal with graphics.
Re: (Score:3, Insightful)
This isn't quite true. Certainly the mixing, EQ, effects processing and a lot of signal generation (softsynths, etc.) is done on board the host PC nowadays, but where the rubber meets the road and there's a need to have really good sample-accurate synchronized input/output in real time without the possibility of clicks and pops, people are still relying on outboard hardware, usually in the
Re:Ummmm (Score:4, Informative)
No, those soundcards aren't anything more than that. They just get the signal and convert it for the computer. I've owned a few, and worked with many more. Thus far I haven't seen any that do anything past conversion, routing, and perhaps basic mixing (basically those that have more advanced routing). Their function is to convert the sound to a format the PC can use and hand it off, nothing more. That they are external has nothing to do with it. That is done for convenience (hard to pack a lot of inputs on a PCI card) and noise (no need to worry about dealing with all the RF from the computer). Firewire is often used since it has DMA and thus works well for low-latency sound, but there are others that use their own PCI card and interface (MOTU does both, for example).
Now I leave open the possibility there are ones I haven't encountered that do something more, but those I've seen are just soundcards.
You forget that timing isn't an issue on the computer. Everything on there is asynchronous, clockless. The audio is just a stream of bits on disk. The computer never processes it at the sample rate, that is just a number stored in the file. So soundcards don't do anything special in this regard other than have a good clock to which everything is slaved (or perhaps a word clock input for external clocking). Once the audio has been converted and handed off to the system, timing isn't an issue anymore. The only difference between a cheap consumer card and an expensive pro card in this regard is the quality of timing source, and perhaps if everything is locked to a single source.
In fact, you'll find that there is often more processing done on consumer cards, than on pro cards. Pro cards just convert the signal from analogue or S/PDIF or whatever and feed it in to the computer. Consumer cards often do sample rate conversion, and sometimes various other kinds of processing. In fact the card with the most muscle I'm aware of (leaving out dedicated hardware like the HDAccel) is the SoundBlaster X-Fi. That can handle 128 different sound sources in hardware, do SRC on all of them, place them in space, and perform effects on them. Compare that to a MOTU HD192 which does little more than deal with audio input and output, and mix/route it as you specify.
The money/hardware in pro cards is in high quality circuitry, mostly in the analogue phase, not any special processing.
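(The parent's point that the sample rate "is just a number stored in the file" is easy to see in the bytes: in a canonical PCM WAV file it is a little-endian 32-bit integer at byte offset 24 of the header. A minimal sketch; real files can carry extra chunks before "fmt ", which this ignores:)

#include <cstdint>
#include <cstdio>

// Read the sample rate field from a canonical RIFF/WAVE header:
// "RIFF" size "WAVE" "fmt " size format channels rate ...
//  0     4    8      12     16   20     22       24
uint32_t wav_sample_rate(const char* path) {
    std::FILE* f = std::fopen(path, "rb");
    if (!f) return 0;
    unsigned char hdr[28] = {0};
    size_t n = std::fread(hdr, 1, sizeof hdr, f);
    std::fclose(f);
    if (n < sizeof hdr) return 0;
    // little-endian 32-bit value at offset 24
    return hdr[24] | (hdr[25] << 8) | (hdr[26] << 16)
                   | (uint32_t(hdr[27]) << 24);
}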
As an engineer... (Score:3, Interesting)
the brain-dead architecture of the x86 PC strikes me as funny.
Here, you've got 2, 4, what - now 8 cores which can't compete with a decent FPGA?! The problem isn't the CPU speed. The problem is that CPUs make very poor DSPs. A TI DSP can encode/decode HD video in realtime using only a few percent of the power required by the desktop CPU. A large part of that GPU's performance comes from the fact that it has hardware optimized for video, which, of course, Intel has steadfastly refused to add to their p
Great news then... (Score:4, Funny)
Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics.
So they compared one unusable (and dirt cheap) setup to another, super-expensive and still unusable one, and then they brag about sucking 20% less?
This is typical for MS. They are mostly a software company, and there are too many people who advocate software-only solutions that make no sense, just because that's the only thing they know how to do (maybe.)
For server use, I guess? (Score:2)
Many server motherboards come with some chintzy onboard video, yet have plenty of CPU and RAM to throw around.
But who is going to be running D3D10.1 apps on a server? Is MS going to rewrite their GUI layers on top of their 3d API a la Apple?
Re:For server use, I guess? (Score:4, Insightful)
Is MS going to rewrite their GUI layers on top of their 3d API a la Apple?
They did that in Vista. They did it so poorly that customers sued over being sold "Vista-capable" machines which weren't -- including machines with Intel video that wasn't up to the job.
Meanwhile, Ubuntu runs Compiz, which does just fine on Intel -- and Apple has been so far ahead that someone took the audio from one of the original Vista presentations and combined it with video from Tiger, showing that everything "new" about Vista was just playing catch-up with Tiger, while Leopard was just around the corner.
More to the point: I believe it's now possible to run a Windows Server without a video card -- or, indeed, any GUI at all, depending on what apps you need.
Well...I think it's kinda cool. (Score:3, Insightful)
Say you get a new computer with a decent CPU, but no graphics card for work. You guys remember that thing, right? Work? Spreadsheets and documents and...yeah. That stuff.
Anyway, now you can play Tomb Raider on it. The original one. Sweet.
...and kills their own argument / lie... (Score:4, Interesting)
If you can run it in software, you'll be able to run it on any OS version. Gee... that was another lie from Redmond. Why am I not surprised... maybe 'cause I do run the DirectX 10 hack on my XP, and no, it didn't raise the CPU usage (as claimed by the union of MS Windoze Vista Fanboyz)... it lowered it.
Re: (Score:3, Insightful)
How does that even make sense? Not to defend Microsoft's bullshit, but how does coding a software renderer on one OS suddenly mean it should work with every OS? There's no possible logical leap there. Hell, why not DOS?
maybe 'cause I do run the DirectX 10 hack on my XP, and no, it didn't raise the CPU usage (as claimed by the union of MS Windoze Vista Fanboyz)... it lowered it.
What? There is no way to use DX10 on XP at this time; the only "hacks" are game-specific, allowing you to use DX10 games on DX9, or bump up the graphics detail on games when in DX9 mode to something closer to what they do in DX10 mode. All that proves is that these particular games don't actu
Re: (Score:3, Informative)
With XP you're never going to have full DX10 support. The kernel can't physically do a lot of the functions itself.
Re: (Score:3, Informative)
I guess you failed CS 101. A more sophisticated API is always going to be slower than "poke xxx".
Graphics hardware is moving from being a specialised device which can handle basic primitive drawing to a full fledged massive vector processor. Doing this makes it less efficient at the original task.
Your time might be worthless, but in the real world developer time costs money - and at the rate of hardware impr
Re: (Score:3, Insightful)
DirectX is middleware between the hardware and software, there's no reason you couldn't implement the frontend side of things, regardless of how it's actually handled on the back end... Just look at wine.
lol (Score:5, Insightful)
/. is silly
they made this to run the desktop effects
not crysis xD
Re: (Score:2)
To prove that their implementation is complete, and doesn't completely suck, even if it mostly sucks.
And, if you think about it, this could be good for Larrabee, which is supposed to be just a bunch of x86 CPUs on a card.
Re: (Score:2)
Why not?
It certainly shows that their software renderer is fast and feature-complete enough to run fairly recent games.
Re:lol (Score:4, Insightful)
DirectX 10 on CPU is _NOT_ intended for games.
It'll be used for rendering the Aero interface. And that requires several orders of magnitude less computing power. Hell, even my 4-year-old ATI Radeon 9600 can render Aero just fine.
Games make a useful test-case, though.
Re: (Score:3, Interesting)
But then, isn't the whole point of Aero (excusing the prettiness) to get load OFF the CPU and onto something else?
Re:lol (Score:5, Insightful)
Sure. But you also need good-quality 3D drivers. This way Microsoft will be able to run Aero even on plain VESA framebuffer.
Also, consider this: the upcoming Intel Larrabee graphics card will consist of 64 independent programmable x86-compatible cores. NVIDIA CUDA also allows direct GPU programming.
I bet this renderer will be adapted to run directly on such GPUs, bypassing their 'native' rendering pipelines. That'll give Microsoft freedom to experiment with new features such as ray tracing without any help from hardware vendors.
Re: (Score:2)
Do you run benchmarks copying one file from home to tmp and back again?
Crysis is a good place to test DX10 performance in a way people will understand. Would you rather they said they could do x thousand polygon operations per second vs. n thousand?
Re: (Score:2)
Argh, the first insightful post, but in such a stupid form!
On the one hand I want you to get modded up, as on a modern system being able to use software to render desktop effects will be useful (I wonder what the FPS on Aero is, though). However, I also wish you to die a horrible and painful death for using the subject "lol" and ending an unpunctuated post with an emoticon.
Imagine a DX 10 game on an 800MHz CPU without SSE/MMX (Score:5, Insightful)
Re: (Score:3, Funny)
So you're saying the next Office will require eight cores to run? (and only be as fast as on an Intel IGP...)
It's truer than ever (Score:5, Funny)
"Every time Andy gives us more power, Bill takes it away".
Oww it hurts! (Score:4, Funny)
Running Crysis at 800 x 600 with the lowest quality settings, an eight-core Core i7 system managed an average frame rate of 7.36fps, compared with 5.17fps from Intel's DirectX 10 integrated graphics.
and this is ball-slapping good news?
Re: (Score:3, Insightful)
Man, that's like 2 whole fps more. With further optimization they might even crank it up to 15fps, which would get it close to the framerate I got from Crysis on medium settings with my laptop. And the best part is, you can run it on your enterprise-class server when you aren't busy serving up hundreds of thousands of SQL searches! Why pay $400 for a lousy video card when you can buy a $20K server instead?
Yay! (Score:3, Insightful)
Hurrah! In the future, when I switch off the pointless Aero crap, it will free up lots more CPU cycles for the annoying Microsoft apps I need to run to see simple 2D spreadsheet data sent to me by retards who use proprietary Microsoft file formats. Microsoft FTW!!
A good feature and still the endless bashing (Score:3, Interesting)
Improving performance over a dedicated graphics chip (albeit a weak one) is still a respectable achievement, especially when you consider games typically use ~100% CPU anyway. Whilst it may be unplayable for Crysis, I can see it giving a solid frame rate on things like WoW.
How does the performance compare to Mesa? (Score:4, Interesting)
The news here is not the existence of a software renderer, but one with good performance (such that a high-end CPU is competitive with a low-end GPU for some tasks). I wonder how the trusty Mesa GL renderer compares to Microsoft's latest offering? (They implement different APIs, but Wine provides DirectX 10 on top of OpenGL, so you can get an apples to apples comparison.)
Xbox 4 ~ Computer in every house! (Score:3, Interesting)
I doubt big multicore chips will be cheap enough for the Xbox 3, but win7 probably won't be released in time for it anyway.
This isn't for Aero on GMAs. This is so you can target both Xbox4 and Win7/GFW without even bothering to think.
One set-top box, one platform. They've had a hard-on for it for a decade+, it's coming.
Re: (Score:2, Insightful)
Well in all fairness it's a pretty dumb idea. An 8 core CPU managed 7fps? Whoooopeeee!
How about instead of wasting time on this, they work with vendors and get properly working drivers for the stand-alone graphics cards?
Re:Quickly, bash microsoft. (Score:5, Insightful)
How about the vendors learn to code and stop writing shitty drivers! I mean, they have the full spec on the cards and still can't produce a driver as stable as some guy's reverse engineering! Vista had a driver model ready for how long? It's not even like the change was unexpected.
Re:Quickly, bash microsoft. (Score:5, Insightful)
And it's not just the GPU companies. Creative took their sweet time releasing Vista drivers for their previous generation of audio cards. I believe they were actually released after Vista was, and they're still just dreadful.
My Audigy 2 is not that old, but after much fighting I still couldn't get 4.1 sound and EAX to work in any capacity. Part of it was Creative insisting on their own competing implementation of speaker configuration, which does not play nicely with the one included with Vista. Other issues are due to the generally crummy nature of the drivers. Still other issues apparently only occur on Vista64 with 4 or more GB of RAM. Just awful. Eventually, I had to stop using the Audigy and use the onboard RealTek-branded Intel HDA chip, which seems to work fine, though the sound is less clean than what I got with my Audigy.
Another piece of hardware, a Playstation/Gamecube/Dreamcast to USB controller adapter, from EMS Production (http://www.hkems.com) won't work with Vista64 either. Two years in and the company, still alive, has yet to release any Vista64 drivers and the Vista32 drivers are still listed as "beta".
The annoying thing here is that the damn thing shouldn't even *need* a driver. In Linux it is simply recognized as a HID gaming device and works fine. Vista actually recognizes it as such, and the DirectX controller diagnostic program can properly read values from the controller, but Vista steadfastly refuses to list the device in the "Game Controllers" control panel dialog, making it pretty useless for anything.
Sigh... at least both these pieces of hardware work perfectly well in Linux...
Re:Quickly, bash microsoft. (Score:5, Informative)
I'm sorry, but Peter Gutmann is not a reputable source for accurate information on Vista graphics, or anything related to Vista at all. Several of his claims have been widely proven to be exaggerated or downright false, and when asked to provide proof, he has refused. His claims have been picked apart on numerous sites both directly and indirectly through the sourcing of benchmarks.
I suggest you read these articles for instance, which provide a good overview:
http://blogs.zdnet.com/Ou/?p=673 [zdnet.com]
http://blogs.zdnet.com/Ou/?p=718 [zdnet.com]
Some of his points are admittedly valid; there are genuine flaws in the new graphics device driver spec, but he's clearly most concerned with pushing an anti-Vista agenda, even if that requires resorting to FUD.
Choose your "experts" carefully.
Re: (Score:3, Insightful)
Well, I re-read Gutmann and the blogs, and I feel a bit ignorant. When I read it a few years ago, I was impressed by his thoroughness, especially with the driver issues.
How things change.
Facts hurt Microsoft, get over it (Score:3, Insightful)
Choose your "experts" carefully.
I'll take an expert over a pay-for-say MS "expert" any day. Facts happen to run against MS, get over it. That's why the marketing firms they hire come down so hard on reviewers, evaluators and benchmarkers.
If you want to get down to the bottom of some of the many, many problems with MS Vista, as well as the OpenGL imitation, then see Peter Gutmann's analysis, A Cost Analysis of Windows Vista Content Protection [auckland.ac.nz].
Running a smear campaign [boycottnovell.com] may or may not annoy the author, but it is the facts he is reportin
Re:Quickly, bash microsoft. (Score:4, Informative)
I presume you're referring to the article in which he described Vista as something like "quite possibly the longest suicide note in history"? I read it back in '06, shortly after it was initially published. I didn't know Gutmann's work terribly well before reading that, but he came highly recommended.
However, that article cost him about 98% of his credibility with me. Some of it - even some of the really bad stuff - might in fact be true. However, there were trivially verifiable claims he made which were blatantly untrue (an example being that ATI, nVidia, and other graphics companies were going to need to switch away from unified drivers and provide a different driver for each card model - which by the time the article went public was an obvious falsehood, since you could download and install the beta Vista drivers for any card in a given family and they would work fine).
If the man can't be bothered to do even that minimal amount of research (it also didn't help that he refused to disclose any of his sources), then he has no business publishing in anything but tabloids, nor does he have any place in academic circles. I am a student, not a professor, but if I had written such tripe and submitted it to anybody who knew what I was talking about, I'd have been laughed out of the department.
Incidentally, the article has been edited at least three times since its initial publication. While I have no objection to revising, it is usually done prior to publication, not afterwards. Furthermore, while some of the more blatantly false claims are missing from the latest version, Gutmann neither addresses nor explicitly retracts those statements. It is as though he wishes to remove the original statements entirely, though nothing controversial on the Internet ever vanishes so thoroughly as that.
Re:Quickly, bash microsoft. (Score:5, Interesting)
Isn't this the point of openGL? An API to dedicated graphics hardware with a backup software renderer if the hardware isn't supported?
Whose idea is this again? It doesn't look like much of an idea, more like a step backwards.
Re: (Score:3, Informative)
Actually, the whole point of OpenGL was to provide a software- and hardware-vendor-agnostic API for writing applications that perform 3D rendering. You've clearly been living in a monoculture too long if you can't see that.
Software fallback is nice to have, but it's certainly not the reason OGL exists.
/Mike
Re:WOW! Someone buy microsoft a clue. (Score:5, Informative)
Running Crysis isn't the point of the demo. The point was that it was a DX 10 application running entirely in software. In the end, this means that systems without higher end 3D cards would be able to run Aero. THAT's the point.
They are trying to address the main complaint of the "Vista Capable" debacle. Running Crysis was just a way of demonstrating the capability.
Re:WOW! Someone buy microsoft a clue. (Score:4, Informative)
Because, quite frankly, people were upset that their 'Vista Capable' computers couldn't run Vista with Aero enabled. The integrated cards don't have the 'oomph' for Aero's glassy transparency effects, but Microsoft had tooted the horn of 'Look! Shiny!' loud and long, so people expected that functionality. In addition, there are other places extended graphics capabilities are used (the Vista DVD maker program, for instance), where if your card isn't up to snuff, you can't use those programs.
By showing 'we can make this work in software, slowly, but work,' they're trying to address that. This isn't for gaming, despite the demo. This is an attempt to solve the problem out of the gate in Windows 7 so that they don't have another Vista Capable type class action suit.
Re: (Score:3, Insightful)
Which part of "eight core machine" is cheap and low end?
I'm sure a $50 graphics card is cheaper (and would whip this thing's ass).
Re: (Score:3, Informative)
The client-server model of OpenGL works well because GPU programming is a client-server model - the application is the client, running on the CPU, and the server is on the GPU. You need to transfer data to a coprocessor and process it remotely. Direct3D does exactly the same thing.
If you'd looked at the OpenGL 3 spec, instead of reading tabloid reports, then you'd see that it has some pretty major changes. The entire fixed-function pipeline is basically gone (although it can be emulated easily in shader
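(For readers who haven't followed OpenGL 3: "emulated easily in shaders" means the old fixed-function transform and per-vertex color state collapses into a few lines of GLSL. An illustrative sketch, with made-up names, embedded the way a C++ application would carry it:)

// The fixed-function modelview/projection transform and per-vertex
// color, rewritten as a GLSL 1.30 vertex shader (illustrative names).
const char* vertex_src = R"(
    #version 130
    uniform mat4 mvp;       // replaces glMatrixMode/glLoadMatrix state
    in vec4 position;       // replaces glVertex*
    in vec4 color;          // replaces glColor*
    out vec4 frag_color;
    void main() {
        gl_Position = mvp * position;  // the fixed-function transform, by hand
        frag_color  = color;
    }
)";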