
DX11 Tested Against DX9 With Dirt 2 Demo

MojoKid writes "The PC demo for Codemasters' upcoming DirectX 11 racing title, Dirt 2, has just hit the web and is available for download. Dirt 2 is a highly anticipated racing sim that also happens to feature leading-edge graphics effects. In addition to a DirectX 9 code path, Dirt 2 also utilizes a number of DirectX 11 features, like hardware-tessellated dynamic water, an animated crowd and dynamic cloth effects, in addition to DirectCompute 11-accelerated high-definition ambient occlusion (HDAO), full floating-point high dynamic range (HDR) lighting, and full-screen resolution post-processing. Performance-wise, DX11 didn't take its toll as much as you'd expect this early in its adoption cycle." Bit-tech also took a look at the graphical differences, arriving at this conclusion: "You'd need a seriously keen eye and a brown paper envelope full of cash from one of the creators of Dirt 2 to notice any real difference between textures in the two versions of DirectX."
  • ehh (Score:5, Insightful)

    by Dyinobal ( 1427207 ) on Thursday December 03, 2009 @01:25AM (#30307966)
    I personally view DX11 the way I do Sony's push from DVD to Blu-ray. Sure, Blu-ray has some nice features, but I'm still enjoying my DVDs, and I don't really need uncompressed audio tracks for every language on my discs. Same thing with DX11: I've not even properly settled in with many DX10 games and now they're pushing DX11 (well, pushing as in mostly tech demos), and there's barely any dust on my latest graphics card. I'll upgrade in a few years, perhaps when I see DX9 vanish, or at least become increasingly uncommon.
    • Re:ehh (Score:5, Funny)

      by White Flame ( 1074973 ) on Thursday December 03, 2009 @01:41AM (#30308022)

      But these go to 11!

    • Re: (Score:3, Funny)

      by LOLLinux ( 1682094 )

      I personally view DX11 the way I do Sony's push from DVD to Blu-ray. Sure, Blu-ray has some nice features, but I'm still enjoying my DVDs, and I don't really need uncompressed audio tracks for every language on my discs.

      I still watch VHS tapes you insensitive clod!

    • Re:ehh (Score:4, Insightful)

      by ShooterNeo ( 555040 ) on Thursday December 03, 2009 @01:47AM (#30308052)

      Are you blind? It's one thing to compare DirectX 9 versus 11 video games, where either API lets you create highly detailed, high-performance graphics.

      It's another to compare the gigantic difference in picture quality between 1080p/720p and craptacular 480p (at most).

      The difference between high def and standard is pretty darn immediate and obvious for new content such as TV shows that were made using the right digital cameras. Film, not so much, because the darn cameras and lenses in movies are often set to blur hard edges and details, and of course it's a craptacular 24fps.

      • You mean 23.976 fps.

        Yeah, NTSC is retarded in some ways.

        • Re: (Score:2, Informative)

          by mrmeval ( 662166 )

          Film is 24fps; NTSC is 23.976fps.

          Film can go higher in some formats. HDTV can use a variety of frame rates.

      • Re:ehh (Score:4, Informative)

        by Jeremy Erwin ( 2054 ) on Thursday December 03, 2009 @03:54AM (#30308460) Journal

        On a DVD, if something's out of focus, it could be because of the cinematography, or it could be because the DVD doesn't have enough bits. On a bluray, if something's out of focus, it's probably because the director of photography intended it to be out of focus.
        Water looks a bit more realistic. Animation looks a bit sharper.

        On a smaller screen, these are all subtleties, and don't jump out at the viewer unless edge enhancement is added -- which tends to bother viewers with larger screens. Too much processing can also make skin look like plastic.

        Yet every Blu-ray movie I've seen just looks like a blocky, compressed mess, containing FAR more severe compression and movement artifacts than were typical on DVDs. But then again, I notice such things immediately where others don't.

      • Re: (Score:3, Informative)

        Blur hard edges with film lenses? It does depend on which lenses are used (that's the choice of the DP), but most are incredibly sharp. Are you thinking of depth of field? Most HD cameras have 2/3 or 1/2 inch sensors, compared to film's 16 or 35 mm, so film has greater magnification, focal length, aperture, circle of confusion, etc.

        Film cameras (35mm for example) can resolve detail far beyond 1080p; more like 6 to 12 thousand pixels horizontal can be scanned from the negatives with no worry about resolution.

      • by Ma8thew ( 861741 )
        Has it not occurred to you that sometimes good enough is Good Enough? Standard definition is a high enough resolution that you can enjoy the content. Sure, high definition is better, but the HD version sometimes costs twice as much at the moment. The GP didn't say he couldn't tell the difference, just that he didn't see the need for an increase in quality.
        • Sure, sometimes good enough is good enough for most. Then there are some who care about quality and are willing to pay more for it. I happen to enjoy watching films more when they are on Blu Ray, and the higher quality image and sound further immerse me into the experience. If you don't happen to find the difference worth the cost, that's fine, but I see no reason to complain. Blu Ray is coexisting quite nicely with DVD at the moment, and I don't see anyone shoving the upgrade down your throat.
        I don't understand retailers trying to charge $35 for a Blu-ray movie and then complaining that Blu-ray sales aren't that great. I get all my Blu-ray movies on Amazon and usually pay in the $15-$20 range.

        • "Good enough" is the enemy of great.
      • Re: (Score:2, Insightful)

        by n3tcat ( 664243 )

        Are you blind? It's one thing to compare DirectX 9 versus 11 video games, where either API lets you create highly detailed, high-performance graphics.

        It's another to compare the gigantic difference in picture quality between 1080p/720p and craptacular 480p (at most).

        The difference between high def and standard is pretty darn immediate and obvious for new content such as TV shows that were made using the right digital cameras. Film, not so much, because the darn cameras and lenses in movies are often set to blur hard edges and details, and of course it's a craptacular 24fps.

        You must work for Sony, have stock in Sony, or have spent thousands of dollars on the equipment you're talking about.

        • Re:ehh (Score:4, Insightful)

          by thisnamestoolong ( 1584383 ) on Thursday December 03, 2009 @08:24AM (#30309412)
          Yes. You need high quality equipment for the difference between DVD and Blu Ray to be worthwhile. What's your point? There are some of us who care about quality and have thousands of dollars of home theater equipment. There are some who don't. I feel that it makes the experience of watching a film far more engrossing and worth the cost. You have chosen to spend your money differently, so it's not worth upgrading to Blu Ray. On my set-up, however, the difference is immediate, obvious, and clearly worth the money for those who care about such things.
        • by CaseM ( 746707 )

          +4 Insightful? The GP post is correct - on the proper equipment (not sure what Sony has to do with it) there is a significant difference between the image quality from a DVD and the image quality from a Blu-ray disc.

          But you're right, in the past you had to spend thousands of dollars on equipment (mostly owing to the cost of the TV) to reap the benefits of Blu-ray. These days? Prices have fallen and the barrier to entry is significantly lower.

      • by yoyhed ( 651244 )

        Film ... is a craptacular 24fps.

        Have you ever watched a movie on a new 120Hz or 240Hz TV, with the frame-interpolation feature on? Try watching a whole movie like that and really getting into it.

        Film is 24fps because it makes it easy to get absorbed into the movie, rather than being jerked back to reality by home-video-amateur-too-real 30 or 60fps.

        Alright, so I don't know if that's WHY it's 24fps, but I'm very happy with 24. Unless you've been sitting there playing a 60fps game for hours immediately before

        • by CaseM ( 746707 )

          Regarding films being at 24fps - I think it's all about what you're used to. I have a neighbor who LOVES his 120Hz LCD with interpolation and all that. To me, it just looks way too awkward to ever consider investing in that technology. I think it's because I'm used to my movies being at 24fps and my TV at 30 (more or less), but I don't know if one is necessarily better or worse.

      • by nxtw ( 866177 )

        The difference between high def and standard is pretty darn immediate and obvious for new content such as TV shows that were made using the right digital cameras. Film, not so much, because the darn cameras and lenses in movies are often set to blur hard edges and details, and of course it's a craptacular 24fps.

        Are you unaware that many TV shows are recorded and produced at ~24 fps?
        Here are a few: Lost, Heroes, House, 30 Rock, The Office, Burn Notice, Star Trek TOS.
        My HTPC is set to output 24 Hz almost all the time

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Thursday December 03, 2009 @02:56AM (#30308276)
      Comment removed based on user account deletion
      • Bioshock, FEAR, L4D: I was too busy playing the game to actually spend much time looking at the pretty.

        The fact that you don't remember the graphics being ugly means that they were, in fact, pretty. Or at least, the artists and developers successfully collaborated to provide a game world [predominantly] without offensive glitches. When it comes right down to it, that is a massively difficult task, and amounts to a truly massive and complex work of art. Not that I've actually played any of those games. I'm getting ready to try out Lost Planet on the 360.

        And doesn't the X360 use DX9? Considering how many PC games are nothing but shitty X360 ports these days, DX11 will probably be waiting until the X720 before getting adopted.

        DirectX would be dead now if not for Xbox. Bas

    • Personally I see DX11 just like Blu-ray as well: what good are better graphics if all you get is a better view of how much the content sucks?

      Movies without scripts don't get better just with more eye candy. Likewise, games with no replay value don't get more interesting with more particle effects.

      • by yoyhed ( 651244 )
        Exactly. I'd like to note that all of Valve's games are still DX9, and they still rule, and they also look pretty damn nice in my opinion.
      • No Country for Old Men had a fine script. It looks pretty spectacular on Blu-ray.

    • The size of your TV makes a huge difference. My first HDTV ever was a 32" TV, and the difference between HD and standard TV didn't seem that great. Then I got a 56" TV, and the differences were jarring. I can't stand to watch normal TV now.

      Likewise, the difference in quality with Blu-ray is also suddenly more apparent on a larger screen.

      Once you go Blu, you'll never go back.

  • OpenGL (Score:3, Interesting)

    by some_guy_88 ( 1306769 ) on Thursday December 03, 2009 @01:28AM (#30307974) Homepage

    Not even sure if I knew there was a DirectX 11. Does anyone know how OpenGL compares to Direct3D 11?

    • OpenGL Development (Score:4, Informative)

      by bazald ( 886779 ) <bazald@@@zenipex...com> on Thursday December 03, 2009 @01:46AM (#30308044) Homepage

      Most of the "important" features of Direct3D 11 will be exposed immediately as OpenGL extensions.
      The next version of OpenGL will officially support those features.
      As usual, it will be a nightmare to take advantage of those features without requiring their presence. (GLEW and GLEE help only so much.)
      If there are any features of Direct3D that would require architectural changes to OpenGL, they won't appear until the next major version, at the earliest. I'd be surprised if virtualization of texture memory were supported soon, but I'm not really expert in these developments. (For all I know, it is already supported...)

      In summary, OpenGL will remain competitive with Direct3D with the usual caveats.
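
      To make that extension-checking dance concrete, here is a minimal sketch. The fallback logic is hypothetical, and it assumes the tessellation feature is exposed as GL_ARB_tessellation_shader and that GLEW has already been initialized with a current GL context:

        #include <GL/glew.h>
        #include <cstdio>

        // After glewInit(), GLEW exposes each extension's availability
        // as a boolean flag, so the renderer can branch at runtime
        // instead of hard-requiring the feature.
        void choose_water_path() {
            if (GLEW_ARB_tessellation_shader) {
                // DX11-class hardware: build tessellation shaders.
                std::printf("hardware tessellation available\n");
            } else {
                // Older hardware: fall back to pre-tessellated meshes.
                std::printf("falling back to static geometry\n");
            }
        }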

      • Most of the "important" features of Direct3D 11 will be exposed immediately as OpenGL extensions.

        Well, given that the most important feature of DirectX 11 isn't Direct3D but DirectCompute, OpenCL would be the better-suited equivalent.

        And OpenCL integrates nicely with OpenGL, just like DirectCompute with Direct3D.

        But besides a bit of simulation eye candy like water surfaces or debris during an explosion, it won't make much difference in games (because otherwise the physics would influence the gameplay too much).
        That is, until monstrous power hogs like Crysis 3 and Windows 8 are out.

    • Re: (Score:3, Interesting)

      by noname444 ( 1182107 )

      Cards are lazily called "DX11" or "DX10", but the features are not DirectX-specific. The terms "shader model" or "pixel shader version" can be used to describe GPU hardware generations correctly and in an API-neutral fashion.

      Since these are hardware features, they are available to any API that implements them. OpenGL is usually implemented by the graphics driver, which is written by (or under contract to) the graphics card manufacturers, so they usually expose any new hardware features to an OpenGL application
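
      A sketch of the same idea from the Direct3D side (illustrative only, not from the article): a DX11 application can hand D3D11CreateDevice a list of acceptable feature levels and inspect which hardware generation it actually got, which is why "DX11" and "DX10" are really labels for hardware generations rather than hard API requirements:

        #include <d3d11.h>

        // Ask the runtime for the best feature level the card supports;
        // it reports the one actually granted through 'obtained'.
        bool create_device(ID3D11Device** dev, ID3D11DeviceContext** ctx)
        {
            const D3D_FEATURE_LEVEL wanted[] = {
                D3D_FEATURE_LEVEL_11_0,  // "DX11"-class hardware
                D3D_FEATURE_LEVEL_10_1,  // "DX10.1"-class hardware
                D3D_FEATURE_LEVEL_10_0,  // "DX10"-class hardware
                D3D_FEATURE_LEVEL_9_3,   // "DX9"-class hardware
            };
            D3D_FEATURE_LEVEL obtained;
            HRESULT hr = D3D11CreateDevice(
                NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
                wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
                dev, &obtained, ctx);
            return SUCCEEDED(hr);
        }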

    • by DrYak ( 748999 ) on Thursday December 03, 2009 @08:09AM (#30309358) Homepage

      Not even sure if I knew there was a DirectX 11

      Well, of course.
      Most of the new features of DX 11 have nothing to do with graphics. There are few additions to Direct*3D*; the biggest new piece of DX11 is Direct*Compute*.
      It's for general-purpose computing on the GPU.

      Therefore it's no surprise that players and game companies hardly give a damn about it.
      It has few advantages to offer most current games.
      It also explains why the testers saw almost no *visual* difference between the DX9 and DX11 versions. (It's not the same as between DX9 and DX10, where most differences were on the graphics side, Direct3D, and thus translated into more eye candy.)

      DX11 is not used for the visuals. It is used for the computations going on under the hood. It will be useful for physics simulations, etc.
      The main problem in such situations, just like a few years ago with the PhysX accelerator, is that you can't have different levels of physics support that won't affect the gameplay.
      With differences in graphics capability, you can just have differences in detail level: one configuration will look prettier than the other, but the game will always play the same.
      But you can't have more-or-less realistic physics, because the game won't play the same if objects don't react the same way under different levels of physics simulation. Therefore the gameplay uses the same simulation no matter what the configuration is (the same rigid-body physics for all player-driveable vehicles), and GPGPU (CUDA, OpenCL, or in this case DirectCompute) will only be used for a few small details: water surfaces, cloth simulation, debris displayed on screen during an explosion animation, perhaps ragdoll physics for NPC deaths (in games where it doesn't matter where the body lands).

      Thus the differences are virtually invisible in screenshots. It's only while playing that some players will say: "Hey look, the monster fell down the stairs in a funny way!"

      Does anyone know how OpenGL compares to direct3d 11?

      Given the above, the most apt comparison would be Open*CL* against DirectX 11.

      And OpenCL does very well. It looks like a genericised version of CUDA, with a slightly lower-level API on the host setup side (the same level of verbosity as OpenGL; see the sketch below).
      Also, OpenCL integrates well with OpenGL (just like DirectCompute integrates well with Direct3D).
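
      As a rough illustration of that host-side verbosity (the kernel name, source string, and buffer are made up for the example; error checking omitted), launching a single OpenCL kernel takes roughly this much ceremony:

        #include <CL/cl.h>

        // Minimal host-side setup to run one (hypothetical) kernel
        // named "scale" over a float buffer; error checks omitted.
        void run_scale_kernel(const char* src, float* data, size_t n)
        {
            cl_platform_id platform;
            cl_device_id device;
            cl_int err;
            clGetPlatformIDs(1, &platform, NULL);
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
            cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
            cl_command_queue q = clCreateCommandQueue(ctx, device, 0, &err);
            cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                        n * sizeof(float), data, &err);
            cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
            clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
            cl_kernel k = clCreateKernel(prog, "scale", &err);
            clSetKernelArg(k, 0, sizeof(cl_mem), &buf);
            clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
            clEnqueueReadBuffer(q, buf, CL_TRUE, 0, n * sizeof(float), data,
                                0, NULL, NULL);
        }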

      Last but not least, OpenCL will be supported much more widely in its target market (scientific computing), with implementations for most OSes (including Linux and Mac OS X), support from major hardware producers (ATI, Nvidia, Intel) including embedded ones (ImaginationTech PowerVR, ARM, etc.), and even an open-source implementation (the Gallium3D framework for the next-gen Mesa3D).

      Whereas DirectCompute is only available in Windows 7, and probably soon on the current or the next Xbox.

      In conclusion:
      In most cases, game developers won't bother (except for some simulators requiring as much realism as possible, and thus advanced physics support).
      They'll rely on third-party middleware for physics (like Havok).

      And middleware makers will probably target several platforms anyway, in order to be attractive to non-Microsoft consoles too.

  • Power efficiency (Score:3, Interesting)

    by afidel ( 530433 ) on Thursday December 03, 2009 @01:36AM (#30307998)
    It would be interesting to know whether the DX9 or the DX11 code path has a significantly higher power requirement on DX11-capable hardware.
    • It's not really that interesting, unless and until there are enough low-power video cards on the market supporting the standard to make a difference. I just bought an allegedly low-power GT 240 card with 1GB during Black Friday (free shipping, whee), and though I don't own an ammeter at the moment, it at least doesn't require an auxiliary power connector like my 9600 GT does. However, it only provides DX10.1 support, so it's not eligible to participate in this discussion.

      • by afidel ( 530433 )
        The HD5750 is pretty darn low-power: a couple of watts at idle and 86W max (though some sites have measured peak consumption as low as 75W).
  • by crazybit ( 918023 ) on Thursday December 03, 2009 @02:01AM (#30308102)
    I haven't seen DX11, but from what I've seen of DX9 vs. DX10 while playing Crysis, the only way you couldn't tell the difference is if the game's graphics are poorly programmed. I am sure anyone who has seen Crysis on super-high in DX10 at 30+ fps could tell the difference.

    Is it worth it? Well, it depends on how much the gamer values graphics quality, so it's really very subjective. But don't say there is no visible difference.
    • by sznupi ( 719324 )

      I haven't seen DX11, but from what I've seen of DX9 vs. DX10 while playing Crysis, the only way you couldn't tell the difference is if the game's graphics are poorly programmed.

      Why do you seem to exclude the possibility that the DX9 path is excellently programmed?...

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Ignorance is bliss, isn't it? The real difference between DX9 and DX10 in Crysis is barely noticeable. [extremetech.com]

      How does it feel to be duped?

    • I haven't seen DX11, but from what I've seen of DX9 vs. DX10

      It's worse:
      - DX9 vs. DX10 was about graphics capabilities. At least those could translate into some visuals.
      - DX11 is about DirectCompute, i.e. GPGPU (like CUDA, OpenCL, etc.). Hardly anything besides a few discrete bits of eye candy (like water surfaces, debris in an explosion, etc.).

    • by thisnamestoolong ( 1584383 ) on Thursday December 03, 2009 @08:38AM (#30309496)
    No. There really is no appreciable difference at all between Crysis in DX9 and DX10 -- the Very High setting is locked to DX10 only, but this is a totally artificial limitation, probably to try to drum up support for DX10. Even then, the difference between High and Very High is not earth-shattering. The Internet quickly figured out how to enable all of the Very High graphics settings in DX9 through .INI tweaks, even before Crysis was on store shelves. Having been called out on their bullshit, Crytek then released Crysis: Warhead with the Enthusiast (Very High) graphics setting unlocked in DX9. Here is a great article with screenshots:

      http://www.gamespot.com/features/6182140/index.html [gamespot.com]
  • Dx11 vs 9 (Score:2, Insightful)

    by Anonymous Coward
    Just for the record, re: "...notice any real difference between textures in the two versions of DirectX": DirectX has nothing to do with textures. (Textures are created by the artist and are bound by engine limitations.) The textures would not change unless the game specifically shipped with higher-resolution textures, i.e. 4096 vs. 2048, etc. Now that that's over: the engine is the limiting factor in the benchmark. Remember how games became "DX10" when DX10 came out? It's not really using the framework t
    • Just for the record, re: "...notice any real difference between textures in the two versions of DirectX": DirectX has nothing to do with textures. (Textures are created by the artist and are bound by engine limitations.) The textures would not change unless the game specifically shipped with higher-resolution textures, i.e. 4096 vs. 2048, etc.

      In theory "Yes", but not in practice.
      What a tester calls "texture" is what he/she sees on the surface of objects.
      Which could be a simple bitmap texture mapped on the object.
      Or could be the result of a complex shader written to combine several source of data (textures, other parameters) to creat a nice-looking surface (as for example in procedural textures).
      For example, parallax mapping can be used go give impression of depth and surface details to something which is only a flat triangle with a texture paint
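
      A tiny sketch of the parallax-mapping idea mentioned above (the simple offset variant; the names are illustrative, and in a real game this math runs in a pixel shader): the texture coordinate is nudged along the tangent-space view direction in proportion to a sampled height, so a flat triangle appears to have depth:

        struct Vec2 { float x, y; };

        // Simple parallax offset: shift the UV along the tangent-space
        // view direction by the sampled height times a small scale.
        Vec2 parallax_uv(Vec2 uv, Vec2 view_ts, float height, float scale)
        {
            Vec2 shifted = { uv.x + view_ts.x * height * scale,
                             uv.y + view_ts.y * height * scale };
            return shifted;
        }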

  • HotHardware Test (Score:5, Interesting)

    by DeadPixels ( 1391907 ) on Thursday December 03, 2009 @02:37AM (#30308212)
    From the HotHardware test:

    The DirectX 11 performance numbers were recorded with the game set to its "Ultra" quality mode, while the DirectX 9 numbers were recorded with the game set to its "High" quality mode. ... As you can see, performance dropped off significantly in DirectX 11 mode.

    Now, is it just me, or does that seem a little biased or inaccurate? Of course you're going to see lower performance when you set the graphics higher. Wouldn't it make much more sense (and be a fairer comparison) to compare the FPS with both cards set on either High or Ultra, instead of each on a different level?

    • Re: (Score:2, Informative)

      by psyph3r ( 785014 )
      DX9 systems can only go to High; the Ultra mode is only for the latest DX hardware.
    • Re:HotHardware Test (Score:5, Informative)

      by darthflo ( 1095225 ) * on Thursday December 03, 2009 @05:56AM (#30308898)

      If I'm not mistaken, High sets the game to use the highest-quality rendering it can get using only DirectX 9 features, while Ultra is the only setting that actually enables stuff specific to DirectX 11. The article doesn't mention there being two cards or different installs or anything, so they probably just ran the game twice on the same box, first with DirectX-9-style rendering (done through DirectX 11) and only then switched on DirectX 11's full visual splendor (Ultra quality).

      • If I'm not mistaken, High sets the game to use the highest quality rendering it can get using only DirectX 9 features while Ultra is the only setting that actually enables stuff specific to DirectX 11.

        Pssst... What about DirectX 10?

          They test DirectX 9 against DirectX 11. Version 9 is the latest available on XP and can be perfectly emulated by 10, which is what Vista ships with, and of course by 11, which is what Windows 7 ships with.
          I'm guessing (educated, but still a guess) there's a "compatibility mode" that runs features available in DirectX 9 (directly or through 10 or 11, it doesn't really matter), which covers everything from "Low" to "High", and a "quality mode" that'll max out the details with DirectX 11's features. The form

    • I remember from Crysis that "High" was DX9 and "Ultra" was DX10. The "Ultra" setting may have nothing to do with increasing graphics quality above "High" on DX9, and may just enable the DX11 code path.
  • A shadow under a rock? Anything that has its own acronym should affect more than 0.5% of the pixels in a screenshot.

  • Bad summary (Score:5, Insightful)

    by julesh ( 229690 ) on Thursday December 03, 2009 @03:40AM (#30308416)

    The summary picks out one point where the article states that the graphics haven't improved, but the article goes on to discuss improvements in other areas. The pictures speak for themselves; the shadows are much more realistic and the water effects are much more realistic. The textures were fine to start with -- who cares if they improved?

    • the water effects are much more realistic. The textures were fine to start with -- who cares if they improved?

      Water effects = physics simulation = not directly to do with graphics = done by DirectCompute (the big new piece of DX11), or by CUDA, OpenCL, or other GPGPU.
      Texture effects & shaders & procedural textures = graphics rendering = entirely dependent on Direct3D = where DX11 doesn't feature as many new gizmos.

  • I took a look at the video in TFA, and for a game named "Dirt" the cars looked very clean... I mean, in the video the cars drive on a dirt track (with a nice dirt cloud behind the cars), with small pools of water, and still the cars look like they've just been through a cleaning and waxing session. While using so much extra power for more realistic flags, crowds and water, which you really have no time to see, why not use some of it to make the cars, which you do see quite a lot of, get dirty?

    • They may not have been allowed to dirty up the cars, although you'd have to strangle that out of an employee, as they're bound to be under NDAs regarding the exact contracts.

      Compare that to car manufacturers not allowing damage models on 'their' cars, or dictating exactly what damage models would be allowed (e.g. a bent fender.. sure. an engine dying.. hell no.)

      Ahhh, to remember the early '90s, when every racing game just tossed in whatever car they da*n well liked at a splendid 50 polygons and car manufacturers

  • I've played Dirt 2 on the PS3, back when it was released a few months ago. I can see why the graphical improvements in the PC version might attract attention, but I have another question...

    Does DX11 have any kind of feature that lets you take that complete and utter XTREME moron who does the voice-overs for the game and kill him slowly in imaginative ways? Any enjoyment in the game was killed for me by XTREME SURFER DUDE RAD TO THE MAX guy screaming his head off every time I tried to do anything. Seriously.

  • One thing to keep in mind is that we are at the point where textures themselves are of high enough quality for things to look very good. So the focus then shifts to motion and to the details. Things like physics and how objects interact are where a lot of the focus is moving, along with the really TINY details that you may not notice when things are in motion but are still there (like the fuzz and frayed edges on worn clothes or sweaters). These sorts of things may not

  • Not worth 20fps on an 80fps game, I'm afraid.

    The price you pay for slightly more realistic water and other minor changes that could easily be "faked" without DirectX 11 (if they bothered) isn't worth it: the hardware, the driver support, the forced OS upgrade for DirectX 11, the increased power draw of the cards, etc. If that were my PC, I'd be playing it in DirectX 9 mode.

    • by ProppaT ( 557551 )

      It's not worth it, no, but it'll sell a whole bunch of video cards. There's a whole sect of people out there who constantly buy the newest/fastest video card just so they can wank off to the graphics. I don't even think these guys play the games; they might as well just look at screenshots.

  • The days of big changes from a DX API bump are gone. We had ridiculous hype for DX10, which turned out to be a negligible improvement, and even faked "improvement", as in Crysis.

    DX11 here is more of the same: screenshots from both flipped back and forth to point out that this flag has more flap in it.

    DX11 is the last reason to buy new graphics hardware (just as DX10 was). Buy new graphics hardware when you need the performance boost a new-generation card will bring, or some new feature like Eyefinity if you want
