AMD Previews DirectX 11 Gaming Performance
An anonymous reader writes "AMD invited 100 people up to their private suite at the hotel hosting QuakeCon 2009 for a first look at gaming on one of their upcoming DirectX 11 graphics cards. The card has not been officially named yet, but it carries the internal code name 'Evergreen' and was first shown to the media at Computex in Taiwan earlier this year. The guys from Legit Reviews were shown two different systems running DX11 hardware: one was set up running a number of DX11 SDK demos, and the other was running a demo of the upcoming shooter Wolfenstein. The video card appears to be on schedule for its launch next month."
Except (Score:1, Insightful)
Problem with DirectX 11: it requires Windows Vista or 7.
Re: (Score:2, Insightful)
Bigger problem: it probably runs worse than DirectX 9, with its only "advantages" being one or two minor shader effects (geometry shaders...) and a lot of games that arbitrarily lock things to DX11 mode when they could run just fine in DX9 mode.
Re:Except (Score:5, Informative)
It also has a lot of awesome smaller features that make what are known as deferred shading/lighting pipelines more feasible. This is a good thing because it reduces the work needed to implement a game's material system while offering great performance, at the cost of more GPU memory.
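For anyone who hasn't run into the term, here is a minimal CPU-side sketch of the deferred idea in plain C++. It's purely illustrative (a real pipeline does this on the GPU with multiple render targets, and the G-buffer layout and values below are made up): geometry is rasterized once into a per-pixel attribute buffer, and lighting runs as a separate pass over that buffer, which is why the material system gets simpler and the memory cost goes up.

    // Minimal deferred shading sketch: pass 1 fills a "G-buffer" with surface
    // attributes; pass 2 lights every pixel from it. Illustrative CPU code only.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    struct Vec3 { float x, y, z; };
    static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    struct GBufferTexel {   // the extra GPU memory lives here
        Vec3 albedo;        // material color
        Vec3 normal;        // surface normal
        float depth;        // used to reconstruct position
    };

    struct Light { Vec3 dir; Vec3 color; };

    int main() {
        const int W = 4, H = 4;
        std::vector<GBufferTexel> gbuf(W * H);

        // Geometry pass: rasterize once, store attributes, do no lighting yet.
        for (auto& t : gbuf)
            t = {{0.8f, 0.4f, 0.2f}, {0.0f, 0.0f, 1.0f}, 1.0f};

        // Lighting pass: shade each pixel against each light, independent of
        // how many materials or meshes produced the G-buffer.
        std::vector<Light> lights = {{{0.0f, 0.0f, 1.0f}, {1.0f, 1.0f, 1.0f}},
                                     {{0.0f, 1.0f, 0.0f}, {0.2f, 0.2f, 0.6f}}};
        for (int i = 0; i < W * H; ++i) {
            Vec3 c = {0.0f, 0.0f, 0.0f};
            for (const Light& l : lights) {
                float ndotl = std::max(0.0f, dot(gbuf[i].normal, l.dir));
                c.x += gbuf[i].albedo.x * l.color.x * ndotl;
                c.y += gbuf[i].albedo.y * l.color.y * ndotl;
                c.z += gbuf[i].albedo.z * l.color.z * ndotl;
            }
            if (i == 0) std::printf("pixel 0: %.2f %.2f %.2f\n", c.x, c.y, c.z);
        }
        return 0;
    }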
Re: (Score:1, Troll)
I remember the marketspeak about DX10; nobody did anything particularly impressive with THAT either.
Re: (Score:2)
Certainly, there are a few features that speed rendering up while making it look better.
But those same features were probably available on older cards, just not exposed in the API. I saw an OpenGL parallax occlusion demo years back, running on a GeForce 6 or 7 (can't recall which).
HDAO looks nice, but the performance hit is massive. I just know there are going to be a few games that force it on, completely tanking your framerate for a few shadows. :P
And yes, a framerate dropping to 10% counts as completely tanking.
Re: (Score:2)
I'm not arguing for or against Microsoft platforms, but the fact remains that DirectX is currently the de facto standard for creating games. And even though it's a COM-based technology, it's still kinda fun to play with.
XP FTW. (Score:2)
I am not planning to go to Vista or 7 any time soon. Maybe in a few years when MS and other companies drop support for it. I am still happy and fine with my old XP Pro SP3 (IE6).
Re: (Score:2)
What's wrong with IE6? Do you use it? I rarely use it except for Windows/MS Update and Web sites that don't work with Mozilla's Web browsers. :(
Re: (Score:1)
I'd still use Win2k if MS pumped out more security updates for it. To me that OS was the best MS ever produced. However, if I want to use an MS OS then it is XP now. How long until MS starts pushing for 7 and XP users are forced to upgrade for security reasons, just like Win2k users were?
Re: (Score:2)
Um, MS still supplies security updates for W2K and its IE6. The hard part is that a lot of recent and upcoming software doesn't work on it. :( But for old stuff it rocks (I still use it for that).
Re: (Score:2)
Since when? The standards only exist as a way of describing what hardware can do. Hardware has often preceded standards in many areas, including 3D gaming.
OpenGL has an extension mechanism for exactly that reason: it lets hardware vendors expose features that aren't standardized yet.
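To make that concrete, this is roughly what the extension dance looks like from C++. A sketch only: it assumes an active GL context, uses the old pre-3.0 space-separated extension string, and the extension name checked is just an example.

    // Check the driver's extension string for an EXT/vendor feature before use.
    // (On Windows, include <windows.h> before <GL/gl.h>.)
    #include <GL/gl.h>
    #include <cstring>

    bool hasExtension(const char* name) {
        const char* exts =
            reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        // Crude substring match; production code should tokenize on spaces so
        // "GL_EXT_foo" doesn't also match "GL_EXT_foobar".
        return exts != nullptr && std::strstr(exts, name) != nullptr;
    }

    // Usage sketch: if (hasExtension("GL_EXT_framebuffer_object")) { load the
    // entry points with wglGetProcAddress/glXGetProcAddress and call them. }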
Re: (Score:2)
Traditionally, it was very much the case that the spec preceded hardware support. OpenGL was around for years before it was ever (basically) fully implemented on consumer hardware. There are still some corner cases where some cards will have to fall back to partial software support, and some cards (like my cheap laptop chip) are specifically designed to run some things in software (my chip lacks vertex shaders).
More recently, it's become a bit more complicated, because the spec designers and the hardware vendors...
Re: (Score:2)
That's what people said when DX7 was released.
Re: (Score:2)
Since always?... What are you even talking about? Games are written against the API, and then that API is implemented in hardware. That's what video cards are: efficient implementations of DirectX.
Would you rather the card decide the API and require games to support hundreds of different interfaces?
Re: (Score:1)
Or, you know, the hardware could just come with documentation so everyone could implement their favorite API on top of it.
We will be seeing more of this once Intel's Larrabee platform gets released early next year.
Re: (Score:2)
"That's what video cards are: efficient implementations of DirectX."
Huh, I didn't know you could turn electrical 0s and 1s into a piece of silicon and PCB.
Re: (Score:2)
You have no idea how stupid you sound.
Re: (Score:2)
Definition of video card:
A circuit board, usually mounted inside the computer, that generates the signals necessary to drive or control a specific type of monitor.
You have no idea how stupid you ARE.
Re: (Score:2)
Perhaps he has misstated the case a bit, but the fact is that video cards these days have functions that correspond to DirectX functions, the way they used to have functions corresponding to OpenGL functions. Some of this is of course implemented in software; the idea, however, is to implement as much as possible on the card itself, leaving the CPU free to do other things. That's why we're seeing physics functions creep into GPUs: they can sell us more transistors if they do more of the work.
Re: (Score:2)
They do currently make the best-selling current-generation real games console, and the only serious platform for enthusiast/hardcore/multiplayer gaming.
The PC is still an open platform, so there is hope that other OSes might improve things. OpenGL was the standard for a while, but they messed that up.
I'd like to see someone make a stripped-down Linux distribution just for games and media... no X Windows and all the other crap.
Re: (Score:3, Informative)
>>Since when did we build hardware around APIs, rather than the other way around?
Always.
There's always a dialogue between software and hardware people about what needs to be implemented, and whether it should be done in hardware or software. The RISC/CISC days were full of stories like that in the CPU design world.
Re: (Score:3, Informative)
Maybe you should also mention in your rant that it doesn't matter whether OpenGL 3.x implements a feature, because every hardware developer can just add an extension to expose that feature. This means that new features usually get into the standard after they have been deployed in new hardware.
This is not possible in Direct3D, so in that case the new versions have to be developed before the hardware for them gets deployed. That's why it always appears that OpenGL is lagging behind, when in reality the hardware features were usually already available as extensions.
Re: (Score:2)
Uh, [ ] you have read and understood my reasoning about why Direct3D supported geometry shaders before OpenGL.
AMD not supporting geometry shaders in OpenGL is bad, but they're notorious for that. One more reason not to buy their cards.
Re:"DirectX 11" Hardware? (Score:5, Insightful)
I wouldn't be surprised if you were all Microsoft-paid trolls and marketers, placing your twisted spin on things and making people continue to believe in your garbage.
The hardware manufacturer talks to Microsoft. Microsoft talks to the hardware manufacturer.
This - surprisingly enough - turns out to be mutually beneficial.
Re: (Score:2)
What you are suggesting is that ATI and NVIDIA compete on features in such a way that their hardware isn't interchangeable. Further, that software makers themselves would need to pick one or the other, that consumers would then need to be mindful of which vendor the software targets, and so on and on and on...
The fact is that when things are done as you suggest, it sucks big-time for the companies making the hardware. When you have multiple large competitors in the market, neither...
Re: (Score:2)
>>That's called "competition". If FOO is worth creating, implementing, and using because it gives a significant advantage over the status quo, then it will be an advantage for the company that implemented it, and a loss for others. You can market card A with FOO, once programmers begin implementing it in games, as having something that isn't available on card B.
The reality is that programmers *DO NOT* implement it in games if it's not available on card B.
That's the error in your logic. A game company cannot afford to give the finger to half of the market. This leads right back to the hardware company being fucked for spending money on arbitrary innovation.
>>What you are proposing, and what is happening, is that graphics become a bland sameness across all cards where everything looks horribly generic and nothing exciting or revolutionary can occur.
Maybe you don't know this, but graphics cards render what the programmers tell them to render. They look exactly like what the programmers expect. If the programmer wanted it to look different, it would look different.
Finally! (Score:2)
What? Direct X 11? What's tha --
Oh hell, nevermind.
Drivers drivers... (Score:2, Insightful)
Re: (Score:2)
ATi, like nVidia, uses a universal driver architecture - the support should be included NATIVELY.
Re: (Score:2)
Contractual obligations to partners (e.g. a Microsoft game doesn't run) or to customers (e.g. anyone who got a batch of those GPUs with the die-bonding problem, shipped them out to customers, and hasn't stopped hearing about it since; say, HP).
Re: (Score:1)
This Sarbanes-Oxley is a pain. My company told me that once an employee is fired or resigns, under Sarbanes-Oxley there's no way to pay them severance, because that would be an illegal expense as the company is no longer receiving any benefit from that employee. Since they are worried someone who doesn't understand SOx might sue them, they have told the security guards to just beat people who are leaving to death with a monitor and bury them under the flowerbeds out back.
All this is due to SOx; the management wo...
Re: (Score:2)
They probably will once Windows 7 is out.
Re: (Score:2)
If you do not have the hardware to support Windows 7, then don't upgrade. If your graphics card is the only thing holding you back, then take a stroll over to Newegg [newegg.com] and start upgrading.
Complaining about hardware that was designed for Windows XP not working in Vista/Win7 is really akin to complaining about hardware that worked fine in Win95/98/ME not working, or not working well, in XP. Eventually you have to upgrade hardware to run modern software. If you think ATI is choosing to end support for a legacy product...
Re: (Score:1)
I'll buy an ATI card when they make usable Linux drivers with accelerated video like VDPAU. Right now the NVIDIA blob is so much better, and I don't really care that it is closed source. I have a couple of friends who use ATI, and the only reason they still have Windows on their computers is the crappy ATI Linux support.
Interesting tidbit: 4 display output connectors (Score:5, Informative)
The demo system they set up had one of those new DirectX 11 cards, and that card is a dual-slot solution, as all the high-end graphics cards are now. But ATI did use the space from those two slots quite nicely by including dual DVI ports AND an HDMI AND a DisplayPort connector, meaning you have all the different types of digital display connectors available on a single card, which would be a first, I think.
No word yet on whether you can use all four ports simultaneously, but if you could, it looks like a nice new way of hooking up multiple displays.
Re: (Score:2)
But ATI did use the space from those two slots quite nicely by including dual DVI ports AND an HDMI AND a DisplayPort connector, meaning you have all the different types of digital display connectors available on a single card, which would be a first, I think.
I would like to see multiple HDMI outputs. The one-cable, one-connector solution for audio and video.
Re: (Score:2)
See, this is what I don't get - why does everyone think HDMI is so awesome? It's just DVI with a couple extra pins for audio. It's not inherently higher-quality; does it have a sufficiently higher bandwidth capacity than DVI + TOSLINK that it makes an impact in real-world environments (24fps 1080p video/5.1 surround sound)? And how is having your video card double as a sound card a good idea? Isn't that just asking for aural interference from the video components?
--- Mr. DOS
HDMI 1.4 (Score:2)
See, this is what I don't get - why does everyone think HDMI is so awesome? It's just DVI with a couple extra pins for audio.
It's become far more than that:
HDMI 1.4 was released on May 28, 2009. HDMI 1.4 increases the maximum resolution to 4K × 2K (3840×2160p at 24Hz/25Hz/30Hz and 4096×2160p at 24Hz, which is a resolution used in digital theaters); adds an HDMI Ethernet Channel, which allows for a 100 Mb/s Ethernet connection between the two HDMI-connected devices; and introduces an Audio Return Channel.
Re: (Score:3, Informative)
See, this is what I don't get - why does everyone think HDMI is so awesome? It's just DVI with a couple extra pins for audio. It's not inherently higher-quality; does it have a sufficiently higher bandwidth capacity than DVI + TOSLINK that it makes an impact in real-world environments (24fps 1080p video/5.1 surround sound)? And how is having your video card double as a sound card a good idea? Isn't that just asking for aural interference from the video components?
First point: HDMI is all-digital, so you don't get "aural interference from the video components". It's actually a pretty cool feature of the current batch of HD 4xx0 cards that you can run the output of an HTPC on one cable.
Second point: HDMI, in the later revisions of the spec (1.3+ or so), actually does have improved features over DVI, like deeper color support and higher bandwidth to support higher-resolution displays. (It also supports 7.1 sound, not merely 5.1. Not that you actually need any of this...)
From Wikipedia, (Score:4, Informative)
Since most of you other fucks just make some sort of quip with no facts (yeah yeah, I know, it's Slashdot), here is the Wikipedia entry for DX11.
"Microsoft unveiled Direct3D 11 at the Gamefest 08 event in Seattle, with the major scheduled features including GPGPU support, tessellation[11][12] support, and improved multi-threading support to assist video game developers in developing games that better utilize multi-core processors.[13] Direct3D 11 will run on Windows Vista, Windows 7, and all future Windows operating systems. Parts of the new API such as multi-threaded resource handling can be supported on Direct3D 9/10/10.1-class hardware. Hardware tessellation and Shader Model 5.0 will require Direct3D 11 supporting hardware.[14] Microsoft has since released the Direct3D 11 Technical Preview.[15] Direct3D 11 is a strict superset of Direct3D 10.1 - all hardware and API features of version 10.1 are retained, and new features are added only when necessary for exposing new functionality. Microsoft have stated that Direct3D 11 is scheduled to be released to manufacturing in July 2009,[16] with the retail release coming in October '09"
Seems pretty big to me. The thing I see being the biggest is the work on improving multithreading/multicore support, and the whole GPGPU thing. Not to mention that the API will be very compatible with older cards (read: no real need to upgrade cards just yet).
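To put the multithreading bit in concrete terms: D3D11 lets worker threads record draw calls into deferred contexts and replay them on the immediate context. A rough C++ sketch, assuming 'device' and 'immediateCtx' came from an earlier D3D11CreateDevice call, with all error handling omitted:

    #include <d3d11.h>

    void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediateCtx) {
        // Each worker thread gets its own deferred context to record into.
        ID3D11DeviceContext* deferredCtx = nullptr;
        device->CreateDeferredContext(0, &deferredCtx);

        // ... issue state changes and draw calls on deferredCtx as usual ...

        // Close the recording into a command list.
        ID3D11CommandList* cmdList = nullptr;
        deferredCtx->FinishCommandList(FALSE, &cmdList);

        // Back on the render thread: replay the recorded commands in order.
        immediateCtx->ExecuteCommandList(cmdList, FALSE);

        cmdList->Release();
        deferredCtx->Release();
    }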
I know it's bad form to reply to myself but (Score:1)
I just read up on tessellation and it looks freaking badass.
Re: (Score:1, Insightful)
One of OpenGL's advantages was that the code would work on a number of platforms. Originally it ran on IRIX; IBM licensed it so it worked on AIX machines. Then it moved to other platforms, surpassing 3dfx's Glide interface. OpenGL is still being worked on; 3.2 was released not long ago.
DirectX 11 offers the GPGPU support, but it also offers multithreading (some games chew CPU cores up like they're going out of style, so having threads split up among multiple cores will help performance).
Best thing would be if...
Why not OpenGL? (Score:1)
Re: (Score:2)
...and Macs, PS3, Wii, PSP, iPhone...
What this means to you and me (Score:1, Insightful)
Microsoft spat in NVIDIA's eye when they went with ATI for the Xbox 360, and now they're spitting in ATI's eye by introducing an incompatible standard. This is just great.
Why? (Score:2, Informative)
Re: (Score:1, Flamebait)
DX10 was a flop because Vista was a flop. If MS had let XP users grab DX10, it would have caught on in games, but it was Vista-only, and no game maker was about to (or is about to) invest a ton of money in a game that's Vista-only, or in the work to make a DX9 game actually take full advantage of DX10 features. DX10 was simply irrelevant because the great majority of the market wasn't able to use it.
Console Ate Our Graphics (Score:2)
Might be part of it, but I think the real issue here is that the kind of high-end games that used to push the envelope hardware-wise now more often than not end up on the consoles instead. Since the PC gaming platform is now like three hardware generations ahead of the consoles, console games act like a cushion on PC gaming... I was going to say progress, but let's be specific and say graphics progress. We'll get the occasional (late) port with DX10.1 or, in the future, DX11 added -- developer...
Re: (Score:2)
For me... it's been the fact that there are just no games coming out that make me go, "Oooh! I need that!"
I've cut my gaming time from over 20 hours a week to about 3 simply because the games that are coming out are simple rehashes. Graphics aren't making the old tricks worth doing anymore.
Re: (Score:2)
Sweet, random Flamebait moderation! Did some kind of die-hard DX10/Vista enthusiast take offense?
Re: (Score:1)
DX10 was more about altering the hardware to remove variation. It had little to do with how the user sees the game visually and more to do with keeping the cards up to a standard and with how the games are programmed. Honestly, I feel DX11 will be somewhat like that as well, with the GPGPU support, threading, and so on.
bump mapped fun! (Score:1)
Neat! The idea of drawing a 2D picture and then having an engine that automatically adds the wireframe and all that fun stuff seems to remove a lot of work for the developer.
I honestly expected DX11 to be more like DX10, where most of the changes (like threading) would go unnoticed by the gamer, so I'm glad they're adding something visual to make people want to push for DX11.
I'm an OS X user, so don't get me wrong: I'm not exactly a fan of DirectX per se, but any kind of innovation that pushes the market forward...
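If you're curious what that actually amounts to, here's the gist of tessellation plus displacement mapping in a few lines of plain C++. Purely illustrative: on DX11 hardware this runs in the new hull/domain shader stages, and the height function below is a made-up stand-in for sampling a texture.

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    // Stand-in for sampling a 2D height map (the "2D picture").
    static float sampleHeight(float u, float v) {
        return 0.1f * std::sin(6.28318f * u) * std::cos(6.28318f * v);
    }

    int main() {
        // Tessellate a flat patch (normal = +Z) into an N x N grid of new
        // vertices, then displace each one along the normal by the height.
        const int N = 8;
        for (int j = 0; j <= N; ++j) {
            for (int i = 0; i <= N; ++i) {
                float u = float(i) / N, v = float(j) / N;
                Vec3 p = {u, v, 0.0f};     // generated vertex on the base patch
                p.z += sampleHeight(u, v); // displacement along the normal
                if (i == 0 && j == 0)
                    std::printf("first vertex: %.3f %.3f %.3f\n", p.x, p.y, p.z);
            }
        }
        return 0;
    }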
Upcoming? (Score:1)
If the title of this "upcoming" game is any indication, there will be little creative movement on the DX11 front either...
I'm just saying, Wolfenstein?@! Upcoming?@!
Can they come up with some new ideas already?
Why does DirectX still look like shit? (Score:1)
The screenshots look ugly. It's 2009 and they can't make a pretty demo?! The textures are fuzzy; look at the sand. It's extra ultra mega HD but still looks like 1998.
Now the SLI-in-one seems desperate. But we won't know before it's revealed.
Btw, when do we get the GPU as a core next to the CPU?
DX11 SDK Videos (Score:2)
PCPer has another preview of the same content that includes video of the DX11 SDK demos as well.
http://www.pcper.com/comments.php?nid=7640 [pcper.com]