AMD Previews DirectX 11 Gaming Performance

An anonymous reader writes "AMD invited 100 people up to their private suite in the hotel that Quakecon 2009 is being hosted at for a first look at gaming on one of their upcoming DirectX 11 graphics cards. This card has not been officially named yet, but it has the internal code name of 'Evergreen,' and was first shown to the media back at Computex over in Taiwan earlier this year. The guys from Legit Reviews were shown two different systems running DX11 hardware. One system was set up running a bunch of DX11 SDKs and the other was running a demo for the upcoming shooter Wolfenstein. The video card appears to be on schedule for its launch next month."
  • Except (Score:1, Insightful)

    by emkyooess ( 1551693 ) on Saturday August 15, 2009 @01:51PM (#29077031)

    Problem with DirectX11: Requires Windows Vista or 7.

  • by Anonymous Coward on Saturday August 15, 2009 @01:51PM (#29077037)

    Since when did we build hardware around APIs, rather than the other way around?

    Boy, we have entered a really topsy-turvy world with Microsoft thinking they are the fucking god of computers, haven't we?

  • by afidel ( 530433 ) on Saturday August 15, 2009 @01:59PM (#29077093)
    What?!? The standard always comes before the hardware, DX11 is an API and a (defacto) standard. We could go back to the OpenGL model with ARB extensions for new features that are implemented differently by each party until the standard catches up, but that was tough on everyone. It was tough on the hardware guys because they inevitably implemented features that didn't make it into the standard, it was hard on the standards body because they had to arbitrate between the different implementations to pick a winner, and it was hard on the software guys because they had to support the whole mess. It's a primary reason that DX won over OpenGL in the marketplace.
  • Re:Except (Score:2, Insightful)

    by Shadow of Eternity ( 795165 ) on Saturday August 15, 2009 @01:59PM (#29077097)

    Bigger problem: It probably runs worse than DirectX 9, with its only "advantages" being one or two minor shader effects (geometry shaders...) and a lot of games that arbitrarily lock things to DX11 mode when they could run just fine in DX9 mode.

  • by Anonymous Coward on Saturday August 15, 2009 @02:05PM (#29077149)

    Or you know, the hardware could just come with documentation so everyone could implement their favorite API on top of it.

  • Drivers drivers... (Score:2, Insightful)

    by Anonymous Coward on Saturday August 15, 2009 @02:41PM (#29077457)
    How about they fix their Win7 drivers for not-so-old but still great-performing cards like the X1800? Nvidia customers are having a great time with Win7 atm, and even Intel integrated graphics are performing better, but I've got several friends with ATI cards less than two years old that perform great yet have no real driver support, just trashy, even BSOD-causing drivers from ATI for Win7.
  • by TheRaven64 ( 641858 ) on Saturday August 15, 2009 @02:42PM (#29077463) Journal
    Always? This is generally how it works for GPUs:
    1. OpenGL provides an API.
    2. Vendors implement some subset of OpenGL in hardware and the rest in software.
    3. Vendor adds Shiny New Feature™ to their hardware.
    4. Vendor publishes VENDOR_SHINY_NEW_FEATURE OpenGL extension.
    5. Another vendor implements it, and it becomes EXT_SHINY_NEW_FEATURE.
    6. The feature is proposed for inclusion into the next standard and becomes ARB_SHINY_NEW_FEATURE.
    7. The next version of OpenGL is released, with this feature as standard.
    8. Vendors provide either a hardware or software implementation of the feature in their drivers (ideally, the OS vendor provides a software implementation which the driver API uses if the hardware doesn't support it).

    At some point during this process, Microsoft takes a selection of OpenGL extensions - often including ones that vendors have proposed but not yet implemented - and says that the next version of Direct3D will require these. Vendors then implement whichever ones they didn't provide in their next generation hardware and stick a DirectX n+1 label on it.
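    The extension ladder described above (steps 4-6) is visible to applications through the extension string: before using VENDOR_, EXT_, or ARB_ features, a program probes for them at runtime. A minimal sketch of that probing logic is below; the string literal is a made-up example, since a real list comes from glGetString(GL_EXTENSIONS) on a live context (the pre-OpenGL-3.0 interface), and the helper name is illustrative.

    ```c
    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    /* Check whether `name` appears as a complete token in a
     * space-separated extension string, the format returned by
     * glGetString(GL_EXTENSIONS) before OpenGL 3.0. A plain strstr()
     * is not enough, because "GL_EXT_foo" is a substring of
     * "GL_EXT_foo2"; we must match whole tokens only. */
    static bool has_extension(const char *ext_list, const char *name)
    {
        size_t len = strlen(name);
        const char *p = ext_list;
        while ((p = strstr(p, name)) != NULL) {
            bool starts = (p == ext_list) || (p[-1] == ' ');
            bool ends   = (p[len] == ' ') || (p[len] == '\0');
            if (starts && ends)
                return true;
            p += len; /* partial match; keep scanning */
        }
        return false;
    }

    int main(void)
    {
        /* Hypothetical extension string for illustration only. */
        const char *exts = "GL_ARB_vertex_buffer_object "
                           "GL_EXT_geometry_shader4 GL_NV_fence";

        printf("%d\n", has_extension(exts, "GL_EXT_geometry_shader4"));
        printf("%d\n", has_extension(exts, "GL_EXT_geometry_shader"));
        return 0;
    }
    ```

    The same check is what lets a renderer fall back to a software path, or an older code path, when a vendor has not yet shipped the feature, which is exactly the situation steps 4-8 describe.
    
    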

  • by Suiggy ( 1544213 ) on Saturday August 15, 2009 @02:53PM (#29077561)
    Actually, it hasn't been this way since around 2003/2004. Essentially nVidia, ATI/AMD, Intel and a few other lesser-known vendors sit down with Microsoft and decide what kind of features they will be able to implement in the next graphics hardware cycle. They then come up with the API, get feedback from the hardware vendors, and work towards a final workable API. This is what we saw with Direct3D 9.0c, Direct3D 10, and Direct3D 11. OpenGL and the ARB lagged way behind Microsoft and its partners, which is why the ARB was eventually disbanded and replaced by the Khronos Group. The Khronos Group kind of messed up OpenGL 3.0; they didn't implement half of the things they said they were going to do. As such, OpenGL 3.0 lagged quite a ways behind Direct3D 10. Fortunately, they've caught up, and OpenGL 3.2 is on par with Direct3D 10, but still a big step behind the new stuff in Direct3D 11. So Microsoft and its partners are leading the pack here, and Khronos (most of Microsoft's Direct3D partners are also Khronos Group members) is now playing the role of follower. You can be guaranteed that the next major revision of OpenGL will match Direct3D 11 almost exactly in features, as nVidia, ATI/AMD, et al. don't want to deviate radically in their underlying hardware.
  • Re:From Wikipedia, (Score:3, Insightful)

    by Suiggy ( 1544213 ) on Saturday August 15, 2009 @03:04PM (#29077659)
    Yeah, there's a lot of idiots here who still think OpenGL is better than Direct3D. I doubt they'll ever change their opinions despite the fact that some of us are trying to force the facts down their throats. I'm by no means a Microsoft fanboy, I also use OS X and a couple of Linux distros at home and work, but you just can't argue with the fact that Direct3D 11 is better than anything else out there. Hands down. It's just a better API all around. I'm looking forward to moving towards implementing a Direct3D 11 renderer in our code base in the future. Currently our Direct3D 9 rendering code path is almost half the number of lines of code for our OpenGL ES 1.1 and 2.0 renderer implementations.
  • Re:From Wikipedia, (Score:1, Insightful)

    by Anonymous Coward on Saturday August 15, 2009 @03:26PM (#29077831)

    One of OpenGL's advantages was that the code would work on a number of platforms. Originally on IRIX, IBM licensed it so it worked on AIX machines. Then it moved to other platforms, surpassing 3DFX's Glide interface. OpenGL is still being worked on, 3.2 was released not so long ago.

    DirectX 11 offers GPGPU support, but it also offers multithreading (some games chew CPU cores up like they're going out of style, so having threads split up among multiple cores will help performance).

    Best thing would be if cards supported both at similar performance, and drivers supported both at the same level. However, because most Windows based games tend to use DX, card makers go to where the money is and put their effort into DX support. In reality, neither has a significant advantage over the other, although the advances in DX10 and 11 are getting that rendering language ahead.

  • by westlake ( 615356 ) on Saturday August 15, 2009 @03:47PM (#29077973)

    I wouldn't be surprised if you were all Microsoft-paid trolls and marketers that are placing your twisted spin on things and making people continue to believe in your garbage.

    The hardware manufacturer talks to Microsoft. Microsoft talks to the hardware manufacturer.

    This - surprisingly enough - turns out to be mutually beneficial.

  • by Anonymous Coward on Saturday August 15, 2009 @04:28PM (#29078271)
    Let's see.... It has hardware tessellation, which ATI cards have had... forever. Oh wait, Microsoft has made it specifically so that ATI's proven implementation is incompatible. What a surprise! Now what's this.... They're implementing nVidia's current shader model? It must be incompatible. Wait, it isn't?
    Microsoft spat in NVidia's eyes when they went with ATI for the Xbox 360, and now they're spitting in ATI's eyes by introducing an incompatible standard. This is just great.
  • Re:XP FTW. (Score:3, Insightful)

    by Attaturk ( 695988 ) on Saturday August 15, 2009 @08:12PM (#29079683) Homepage
    I was totally with you right up to the mention of IE6. :P
  • by Suiggy ( 1544213 ) on Saturday August 15, 2009 @08:46PM (#29079851)
    Just because Windows XP can't run Direct3D 10/11 doesn't mean that Direct3D didn't support geometry shaders before OpenGL did. Direct3D 10 had geometry shader support back in 2006, and it's what spurred the development of actual hardware that supported that feature set. It's true that nVidia had their GL_EXT_geometry_shader4 extension working back in 2007, but ATI/AMD NEVER supported it. It wasn't until OpenGL 3.2 was announced in August of this year that we actually got standardized support for geometry shaders, and the OpenGL 3.2 drivers from nVidia and ATI are still in beta.