DX11 Tested Against DX9 With Dirt 2 Demo

MojoKid writes "The PC demo for Codemasters' upcoming DirectX 11 racing title, Dirt 2, has just hit the web and is available for download. Dirt 2 is a highly anticipated racing sim that also happens to feature leading-edge graphics effects. In addition to a DirectX 9 code path, Dirt 2 also utilizes a number of DirectX 11 features, like hardware-tessellated dynamic water, an animated crowd and dynamic cloth effects, in addition to DirectCompute 11-accelerated high-definition ambient occlusion (HDAO), full floating-point high dynamic range (HDR) lighting, and full-screen-resolution post-processing. Performance-wise, DX11 didn't take as much of a toll as you'd expect this early in its adoption cycle." Bit-tech also took a look at the graphical differences, arriving at this conclusion: "You'd need a seriously keen eye and a brown paper envelope full of cash from one of the creators of Dirt 2 to notice any real difference between textures in the two versions of DirectX."
  • OpenGL Development (Score:4, Informative)

    by bazald ( 886779 ) <bazald@z[ ]pex.com ['eni' in gap]> on Thursday December 03, 2009 @02:46AM (#30308044) Homepage

    Most of the "important" features of Direct3D 11 will be exposed immediately as OpenGL extensions.
    The next version of OpenGL will officially support those features.
    As usual, it will be a pain to take advantage of those features without making them a hard requirement: you have to probe for each extension at runtime and keep a fallback path. (GLEW and GLee only help so much; a minimal sketch follows this comment.)
    If there are any features of Direct3D that would require architectural changes to OpenGL, they won't appear until the next major version, at the earliest. I'd be surprised if virtualization of texture memory were supported soon, but I'm not really an expert in these developments. (For all I know, it is already supported...)

    In summary, OpenGL will remain competitive with Direct3D with the usual caveats.
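    For reference, a minimal sketch of the runtime probing described above, using GLEW with GL_ARB_tessellation_shader as a stand-in for a D3D11-class feature (the extension choice and the fallback flag are illustrative, not from the comment; it assumes a GL context already exists):

        /* Hedged sketch: choose a render path based on whether the tessellation
         * extension is present, instead of hard-requiring it. */
        #include <GL/glew.h>
        #include <cstdio>

        static bool use_tessellation = false;

        // Call once, after the OpenGL context has been created (e.g. via SDL/GLUT).
        void pick_render_path()
        {
            if (glewInit() != GLEW_OK) {
                std::fprintf(stderr, "glewInit failed\n");
                return;
            }
            // GLEW exposes one boolean per known extension after glewInit().
            if (GLEW_ARB_tessellation_shader) {
                use_tessellation = true;    // D3D11-class path: hardware tessellation
            } else {
                use_tessellation = false;   // fall back to the GL2/D3D9-class path
            }
        }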

  • Re:OpenGL (Score:5, Informative)

    by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Thursday December 03, 2009 @03:37AM (#30308214)

    This is the most ill-informed comment I've ever seen.

    You don't have a "direct path" to the hardware on modern computers at all. After all, you're not filling DMA command buffers and programming memory-mapped registers yourself, and you don't want to be: the details would drive you to madness. That's what we have drivers for.

    OpenGL and Direct3D are both abstraction layers for the hardware. Neither is intrinsically more "direct", but both were certainly designed for real-time 3D rendering (although OpenGL was initially more used for CAD applications than games).

  • Re:HotHardware Test (Score:2, Informative)

    by psyph3r ( 785014 ) on Thursday December 03, 2009 @04:32AM (#30308378)
    DX9 systems can only go up to High; the Ultra mode is only for the latest DX11 hardware.
  • Re:ehh (Score:4, Informative)

    by Jeremy Erwin ( 2054 ) on Thursday December 03, 2009 @04:54AM (#30308460) Journal

    On a DVD, if something's out of focus, it could be because of the cinematography, or it could be because the DVD doesn't have enough bits. On a Blu-ray, if something's out of focus, it's probably because the director of photography intended it to be out of focus.
    Water looks a bit more realistic. Animation looks a bit sharper.

    On a smaller screen, these are all subtleties, and they don't jump out at the viewer unless edge enhancement is added, which tends to bother viewers with larger screens. Too much processing can also make skin look like plastic.

  • Re:ehh (Score:2, Informative)

    by mrmeval ( 662166 ) <.moc.oohay. .ta. .lavemcj.> on Thursday December 03, 2009 @05:16AM (#30308538) Journal

    Film is 24fps, NTSC is 23.976fps

    Film can go higher in some formats. HDTV can run at a variety of frame rates.

  • Re:ehh (Score:3, Informative)

    by SimonTheSoundMan ( 1012395 ) on Thursday December 03, 2009 @06:39AM (#30308838)

    Blur hard edges with film lenses? That depends on which lenses are used (it's the DP's choice), but most are incredibly sharp. Are you thinking of depth of field? Most HD cameras use 2/3-inch or 1/2-inch sensors, compared to film's 16 mm or 35 mm frames; the larger format means longer focal lengths, larger apertures, and a larger circle of confusion for the same framing, hence shallower depth of field.

    Film cameras (35 mm, for example) can resolve detail far beyond 1080p; more like 6,000 to 12,000 horizontal pixels can be scanned from the negative without worrying about resolution.

  • Re:HotHardware Test (Score:5, Informative)

    by darthflo ( 1095225 ) * on Thursday December 03, 2009 @06:56AM (#30308898)

    If I'm not mistaken, High sets the game to use the highest-quality rendering it can get using only DirectX 9 features, while Ultra is the only setting that actually enables DirectX 11-specific effects. The article doesn't mention two cards or separate installs or anything, so they probably just ran the game twice on the same box: first with DirectX 9-style rendering (done through DirectX 11) and only then with DirectX 11's full visual splendor switched on (Ultra quality).
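    For illustration, a hedged sketch (not Codemasters' actual code) of how one D3D11 code path can cover both cases: D3D11CreateDevice is asked for the best available feature level, DX9-class hardware still runs through the same API at a 9_x level, and the "Ultra"-only effects are enabled only when feature level 11_0 is granted. The function name and globals are made up.

        #include <d3d11.h>
        #pragma comment(lib, "d3d11.lib")

        ID3D11Device*        g_device  = nullptr;
        ID3D11DeviceContext* g_context = nullptr;

        // Returns true on success; *dx11Path tells the renderer whether the
        // DX11-only ("Ultra") effects may be switched on.
        bool CreateBestDevice(bool* dx11Path)
        {
            const D3D_FEATURE_LEVEL wanted[] = {
                D3D_FEATURE_LEVEL_11_0,   // full DX11: tessellation, DirectCompute
                D3D_FEATURE_LEVEL_10_1,
                D3D_FEATURE_LEVEL_10_0,
                D3D_FEATURE_LEVEL_9_3,    // DX9-class hardware, same API
                D3D_FEATURE_LEVEL_9_1,
            };
            D3D_FEATURE_LEVEL got;

            HRESULT hr = D3D11CreateDevice(
                nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                wanted, sizeof(wanted) / sizeof(wanted[0]), D3D11_SDK_VERSION,
                &g_device, &got, &g_context);
            if (FAILED(hr))
                return false;

            *dx11Path = (got == D3D_FEATURE_LEVEL_11_0);
            return true;
        }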

  • Re:ehh (Score:1, Informative)

    by Anonymous Coward on Thursday December 03, 2009 @07:39AM (#30309036)

    Uh, NTSC is not 23.976. It's 29.97 (a.k.a. 30).

    http://en.wikipedia.org/wiki/Telecine#2:3_pulldown

    Film converted to NTSC may be slowed to 23.976 fps, but it's still broadcast at 29.97 via 2:3 pulldown. No one ever "sees" NTSC running at 23.976.
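    For the curious, the arithmetic behind those numbers (a hedged sketch assuming the standard 1000/1001 slowdown and 2:3 pulldown; it only prints the rates, it isn't real telecine code):

        #include <cstdio>

        int main()
        {
            double film      = 24.0;               // native film rate
            double slowed    = 24000.0 / 1001.0;   // ~23.976 fps after the 0.1% slowdown
            double pulled_up = slowed * 5.0 / 4.0; // 2:3 pulldown: 4 film frames -> 5 video frames
            double ntsc      = 30000.0 / 1001.0;   // ~29.97 fps broadcast rate

            std::printf("film %.3f  slowed %.3f  after pulldown %.3f  NTSC %.3f\n",
                        film, slowed, pulled_up, ntsc);
            return 0;
        }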

  • by DrYak ( 748999 ) on Thursday December 03, 2009 @09:09AM (#30309358) Homepage

    Not even sure if I knew there was a DirectX 11

    Well, of course.
    Most of the new features of DX11 have nothing to do with graphics. There are few additions to Direct*3D*; the biggest new piece of DX11 is Direct*Compute*.
    It's for general-purpose computing on the GPU.

    Therefore it's no surprise that neither players nor game companies give much of a damn about it.
    It has few advantages to offer most current games.
    It also explains why the testers saw almost no *visual* difference between the DX9 and DX11 versions. (It's not the same as the jump from DX9 to DX10, where most of the differences were on the graphics side - Direct3D - and thus translated into more eye candy.)

    DX11 is not used for the visuals. It is used for computations going on under the hood. It will be useful for physics simulations, etc.
    The main problem in such situations - just like a few years ago with the PhysX accelerator - is that you can't offer different levels of physics support without affecting the gameplay.
    With differences in graphics capability, you can just vary the detail level: one configuration will look prettier than another, but the game will always play the same.
    But you can't have more-or-less realistic physics, because the game won't play the same if objects don't react the same way under different levels of physics simulation. Therefore, the gameplay uses the same simulation no matter what the configuration is (the same rigid-body physics for all player-driveable vehicles), and GPGPU (CUDA, OpenCL, or in this case DirectCompute) is only used for a few small details: water surfaces, cloth simulation, debris displayed on screen during an explosion, perhaps ragdoll physics for NPC deaths (in games where it doesn't matter where the body lands).

    Thus the differences are virtually invisible in screenshots. It's only while playing that some players will say: "Hey look, the monster fell down the stairs in a funny way!"

    Does anyone know how OpenGL compares to direct3d 11?

    Given the above, the more apt comparison would be Open*CL* versus DirectX 11.

    And OpenCL does very well. It looks like a genericized version of CUDA, with a slightly lower-level API on the host setup side (about the same level of verbosity as OpenGL; see the sketch after this comment).
    Also, OpenCL integrates well with OpenGL (just as DirectCompute integrates well with Direct3D).

    Last but not least, OpenCL will be supported much more widely in its target market (scientific computing): there are implementations for most OSes (including Linux and Mac OS X), support from the major hardware producers (ATI, Nvidia, Intel) including embedded ones (Imagination Technologies' PowerVR, ARM, etc.), and even an open-source implementation (the Gallium3D framework for the next-gen Mesa3D).

    Whereas DirectCompute is only available on Windows 7, and probably soon on the current or next Xbox.

    In conclusion:
    In most cases, game developers won't bother (except for some simulators requiring as much realism as possible, and thus advanced physics support).
    They'll rely on third-party middleware for physics (like Havok).

    And middleware makers will probably target several platforms anyway, in order to be attractive to non-Microsoft consoles too.
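    To make the "same level of verbosity as OpenGL" remark concrete, here is a hedged sketch of the OpenCL host-side setup (platform, device, context, queue, program, kernel, buffers). The kernel is a made-up, gameplay-neutral particle step in the spirit of the cosmetic effects discussed above (not anything from Dirt 2), and error handling is trimmed for brevity:

        #include <CL/cl.h>
        #include <vector>
        #include <cstdio>

        // Made-up kernel: advance purely cosmetic particles (e.g. debris) by one step.
        static const char* kSrc =
            "__kernel void step_particles(__global float4* pos,              \n"
            "                             __global const float4* vel,        \n"
            "                             const float dt)                    \n"
            "{                                                               \n"
            "    size_t i = get_global_id(0);                                \n"
            "    pos[i] += vel[i] * dt;  /* visual only, gameplay-neutral */ \n"
            "}                                                               \n";

        int main()
        {
            const size_t n = 4096;
            std::vector<cl_float4> pos(n), vel(n);   // zero-initialised demo data

            cl_platform_id platform; cl_device_id device;
            clGetPlatformIDs(1, &platform, nullptr);
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

            cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
            cl_command_queue q = clCreateCommandQueue(ctx, device, 0, nullptr);

            cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
            clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
            cl_kernel kern = clCreateKernel(prog, "step_particles", nullptr);

            cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                         n * sizeof(cl_float4), pos.data(), nullptr);
            cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                         n * sizeof(cl_float4), vel.data(), nullptr);
            cl_float dt = 1.0f / 60.0f;

            clSetKernelArg(kern, 0, sizeof(dPos), &dPos);
            clSetKernelArg(kern, 1, sizeof(dVel), &dVel);
            clSetKernelArg(kern, 2, sizeof(dt), &dt);

            clEnqueueNDRangeKernel(q, kern, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
            clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, n * sizeof(cl_float4), pos.data(),
                                0, nullptr, nullptr);

            std::printf("first particle x after one step: %f\n", pos[0].s[0]);

            clReleaseMemObject(dPos); clReleaseMemObject(dVel);
            clReleaseKernel(kern); clReleaseProgram(prog);
            clReleaseCommandQueue(q); clReleaseContext(ctx);
            return 0;
        }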

  • by thisnamestoolong ( 1584383 ) on Thursday December 03, 2009 @09:38AM (#30309496)
    No. Crysis DX9 vs. DX10 really shows no appreciable difference at all -- in Crysis, the Very High setting is locked to DX10 only, but this is a totally artificial limitation, probably to try to drum up support for DX10. Even then, the difference between High and Very High is not earth-shattering. The Internet quickly figured out how to enable all of the Very High graphics settings in DX9 through .INI tweaks, even before Crysis was on store shelves. Having been called out on their bullshit, Crytek then released Crysis: Warhead with the Enthusiast (Very High) graphics setting unlocked in DX9. Here is a great article with screenshots:

    http://www.gamespot.com/features/6182140/index.html [gamespot.com]
