Graphics Games Technology

AMD Previews DirectX 11 Gaming Performance 103

Posted by Soulskill
from the more-and-better dept.
An anonymous reader writes "AMD invited 100 people up to their private suite in the hotel that Quakecon 2009 is being hosted at for a first look at gaming on one of their upcoming DirectX 11 graphics cards. This card has not been officially named yet, but it has the internal code name of 'Evergreen,' and was first shown to the media back at Computex over in Taiwan earlier this year. The guys from Legit Reviews were shown two different systems running DX11 hardware. One system was set up running a bunch of DX11 SDKs and the other was running a demo for the upcoming shooter Wolfenstein. The video card appears to be on schedule for its launch next month."
This discussion has been archived. No new comments can be posted.

  • by ShakaUVM (157947) on Saturday August 15, 2009 @02:16PM (#29077229) Homepage Journal

    >>Since when did we build hardware around APIs, rather than the other way around?

    Always.

    There's always a dialogue between software and hardware people on what needs to be implemented, and whether it should be done in hardware and software. The RISC/CISC days were full of stories like that in the CPU design world.

  • Re:Except (Score:5, Informative)

    by Suiggy (1544213) on Saturday August 15, 2009 @02:37PM (#29077417)
    You are dead wrong. Direct3D 11 and Shader Model 5.0 are quite the step up from Direct3D 9 and SM 3.0. If you were a graphics developer, you would know this. From Wikipedia:
    • Tessellation to increase at runtime the number of visible polygons from a low detail polygonal model.
    • Multithreaded rendering to render to the same Direct3D device object from different threads for multi core CPUs.
    • Compute shaders, which expose the shader pipeline for non-graphical tasks such as stream processing and physics acceleration (similar in spirit to what NVIDIA CUDA achieves), and HLSL Shader Model 5, among others.

    It also has a lot of awesome smaller features that make what are known as deferred shading/lighting pipelines more feasible. This is a good thing because it reduces the work needed to implement a game's material system while offering great performance, at the cost of more GPU memory being used.
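    To make the tessellation bullet above concrete, here's a toy CPU-side sketch of what one subdivision step does to polygon counts (purely illustrative; real Direct3D 11 tessellation runs on the GPU in the new hull/domain shader stages, not in application code like this):

    ```python
    # Toy sketch of one tessellation step: split each triangle into four
    # by inserting edge midpoints. This only illustrates how the visible
    # polygon count grows at runtime from a low-detail model; the real
    # D3D11 pipeline does the subdivision in dedicated GPU stages.

    def midpoint(a, b):
        return tuple((a[i] + b[i]) / 2.0 for i in range(3))

    def subdivide(triangles):
        out = []
        for a, b, c in triangles:
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            # One input triangle becomes four smaller ones.
            out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
        return out

    mesh = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
    for level in range(3):
        mesh = subdivide(mesh)
    print(len(mesh))  # 64 triangles from a single source triangle
    ```

    Each level quadruples the triangle count, which is exactly why you can ship a low-detail mesh and crank detail up at runtime.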

  • by bomanbot (980297) on Saturday August 15, 2009 @02:42PM (#29077469)
    Well, I read TFA, and besides the new capabilities of DirectX 11 (which look nice, but not exactly earth-shattering to me, and which will need some time to get implemented into games anyway), what I found interesting was what ATI actually did with the display output connectors.

    The demo system they set up had one of those new DirectX 11 cards, a dual-slot solution like all the high-end graphics cards these days. But ATI used the space from those two slots quite nicely, including dual DVI ports AND an HDMI AND a DisplayPort connector, meaning you have all the different types of digital display connectors available on a single card, which would be a first, I think.

    No word yet whether you can use all four ports simultaneously, but if you could, it looks like a nice new way of hooking up multiple displays :)
  • From Wikipedia, (Score:4, Informative)

    by carp3_noct3m (1185697) <slashdotNO@SPAMwarriors-shade.net> on Saturday August 15, 2009 @02:52PM (#29077547)

    Since most of you other fucks just make some sort of quip with no facts (yeah yeah, I know, it's Slashdot), here is the Wikipedia entry for DX11.

    "Microsoft unveiled Direct3D 11 at the Gamefest 08 event in Seattle, with the major scheduled features including GPGPU support, tessellation[11][12] support, and improved multi-threading support to assist video game developers in developing games that better utilize multi-core processors.[13] Direct3D 11 will run on Windows Vista, Windows 7, and all future Windows operating systems. Parts of the new API such as multi-threaded resource handling can be supported on Direct3D 9/10/10.1-class hardware. Hardware tessellation and Shader Model 5.0 will require Direct3D 11 supporting hardware.[14] Microsoft has since released the Direct3D 11 Technical Preview.[15] Direct3D 11 is a strict superset of Direct3D 10.1 - all hardware and API features of version 10.1 are retained, and new features are added only when necessary for exposing new functionality. Microsoft have stated that Direct3D 11 is scheduled to be released to manufacturing in July 2009,[16] with the retail release coming in October '09"

    Seems pretty big to me. The things I see being the biggest are the work on improving multithreading/multicore support, and the whole GPGPU thing. Not to mention that the API will be very compatible with older cards (read: no real need to upgrade cards just yet).
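    The multithreading improvement mentioned above boils down to letting worker threads record rendering work in parallel while one thread submits it. Here's a hypothetical sketch of that deferred-context idea in plain Python; the names (CommandList, record) are illustrative stand-ins, not the real Direct3D API:

    ```python
    # Sketch of the D3D11 deferred-context pattern: several threads each
    # record their own command list with no shared locks, then a single
    # thread submits everything to the device (the part that must stay
    # serialized). Class/function names here are made up for illustration.
    import threading

    class CommandList:
        def __init__(self):
            self.commands = []
        def record(self, cmd):
            self.commands.append(cmd)

    def record_scene_chunk(chunk_id, command_list):
        # Each worker records draw calls for its slice of the scene.
        for batch in range(3):
            command_list.record(f"draw(chunk={chunk_id}, batch={batch})")

    lists = [CommandList() for _ in range(4)]
    threads = [threading.Thread(target=record_scene_chunk, args=(i, cl))
               for i, cl in enumerate(lists)]
    for t in threads: t.start()
    for t in threads: t.join()

    # Single-threaded submission, analogous to replaying recorded command
    # lists on the immediate context.
    submitted = [cmd for cl in lists for cmd in cl.commands]
    print(len(submitted))  # 12 commands, recorded by 4 threads
    ```

    The win is that the expensive recording work scales across cores while submission order stays deterministic.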

  • Re:Except (Score:4, Informative)

    by Suiggy (1544213) on Saturday August 15, 2009 @02:54PM (#29077583)
    Most game developers are skipping Direct3D 10 because it's explicitly tied to Vista, which has poor market share compared to Windows XP/Direct3D 9.0c. The hope is that most current gamers on Windows XP will eventually move to Windows 7, and that Direct3D 11 enjoys the same long life span as Direct3D 9.0, perhaps ending up in the next console from Microsoft.
  • Why? (Score:2, Informative)

    by TheDarkMaster (1292526) on Saturday August 15, 2009 @04:29PM (#29078273)
    Well... we're still on DX9, and only two or three games (really) need DX10... and now we're moving to DX11? By the time a really good game actually uses DX10, will we be on DX14? That's too fast for me.
  • by am 2k (217885) on Saturday August 15, 2009 @05:12PM (#29078545) Homepage

    Maybe you should also mention in your rant that it doesn't matter whether OpenGL 3.x implements a feature, because every hardware developer can just add an extension to it to implement that feature. This means that new features usually get into the standard after they have been deployed in new hardware.

    This is not possible in Direct3D, and so in this case the new versions have to be developed before the hardware for it gets deployed. That's why it always appears that OpenGL is lagging behind, when in reality it's actually moving faster. For example, OpenGL geometry shaders are supported in Windows XP, where no Direct3D 10 is available.

  • Re:Why not OpenGL? (Score:1, Informative)

    by Suiggy (1544213) on Saturday August 15, 2009 @08:35PM (#29079789)
    They do support OpenGL, in fact ATI's Direct3D 11 cards will support the latest version, OpenGL 3.2. However, it should be noted that the OpenGL 3.2 feature set is the same as Direct3D 10, which doesn't really bring anything new to the table. Direct3D 11 is where all of the new features are.
  • by GleeBot (1301227) on Sunday August 16, 2009 @03:35AM (#29081523)

    >>See, this is what I don't get - why does everyone think HDMI is so awesome? It's just DVI with a couple extra pins for audio. It's not inherently higher-quality; does it have a sufficiently higher bandwidth capacity than DVI + TOSLINK that it makes an impact in real-world environments (24fps 1080p video/5.1 surround sound)? And how is having your video card double as a sound card a good idea? Isn't that just asking for aural interference from the video components?

    First point: HDMI is all-digital, so you don't get "aural interference from the video components". It's actually a pretty cool feature of the current batch of HD 4xx0 cards that you can run the output of an HTPC on one cable.

    Second point: HDMI, in the later revisions of the spec (1.3+ or so), actually does have improved features over DVI, like deeper color support, and higher bandwidth to support higher resolution displays. (It also supports 7.1 sound, not merely 5.1. Not that you actually need any of this, but saying it's just DVI is misleading.) It doesn't hurt that the connectors are a lot smaller and easier to work with, too.

    As an aside, the audio from HDMI isn't carried on separate pins. HDMI is digital signaling; it's all just bits. The reason to have so many pins is to enable more bandwidth by spreading the signal across more wire pairs, not because you need extra wires to carry different parts of the signal.
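    You can ballpark the bandwidth difference with the spec numbers (as I understand them, so treat these as approximate): both links use TMDS with 3 data channels carrying 8 bits of video data per 10-bit symbol, single-link DVI tops out at a 165 MHz pixel clock, and HDMI 1.3 raised the clock limit to 340 MHz (the oft-quoted 10.2 Gbit/s is the raw 10-bits-per-symbol rate).

    ```python
    # Back-of-the-envelope TMDS video data rates, in Gbit/s.
    # 3 channels x 8 data bits per channel per pixel clock.
    def video_data_rate_gbps(pixel_clock_mhz, channels=3, bits_per_channel=8):
        return pixel_clock_mhz * 1e6 * channels * bits_per_channel / 1e9

    dvi_single_link = video_data_rate_gbps(165)  # ~3.96 Gbit/s of video data
    hdmi_1_3 = video_data_rate_gbps(340)         # ~8.16 Gbit/s of video data
    print(round(dvi_single_link, 2), round(hdmi_1_3, 2))
    ```

    So HDMI 1.3 carries roughly twice the video data of single-link DVI on the same three pairs, which is where the deep-color and high-resolution headroom comes from.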

    Now, I'm not all rah-rah-rah HDMI (the only thing I'm using it for right now is to plug a Blu-ray player into a TV), but for home theater applications, it does seem pretty attractive.

    I'm also not convinced all those connectors and slots belong to a single card, particularly in the final product. I'm more inclined to believe that it's an engineering sample designed with extra outputs for experimentation, perhaps even a dual card solution with some sort of extra bus. It'd be nice to have all the connectors you could want on one back panel, but I think it's ridiculous to believe most cards are going to have the space for them all.

"Now this is a totally brain damaged algorithm. Gag me with a smurfette." -- P. Buhr, Computer Science 354
