Intel Drops DirectX 9 Support On Xe, Arc GPUs, Switches To DirectX 12 Emulation (tomshardware.com) 45
An anonymous reader quotes a report from Ars Technica: Native DX9 hardware support is officially gone from Intel's Xe integrated graphics solutions on 12th Gen CPUs and A-Series Arc Alchemist discrete GPUs. To replace it, all DirectX 9 support will be transferred to DirectX 12 in the form of emulation. Emulation will run on an open-source conversion layer known as "D3D9On12" from Microsoft. Conversion works by sending 3D DirectX 9 graphics commands to the D3D9On12 layer instead of the D3D9 graphics driver directly. Once the D3D9On12 layer receives commands from the D3D9 API, it will convert all commands into D3D12 API calls. So basically, D3D9On12 will act as a GPU driver all on its own instead of the actual GPU driver from Intel. Microsoft says this emulation process has become a relatively performant implementation of DirectX 9. As a result, performance should be nearly as good, if not just as good, as native DirectX 9 hardware support.
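The "conversion layer" idea in the summary can be sketched in miniature: the application talks to a shim that exposes the old API's surface and re-records each call as a command for the new API, which is then handed to the real driver. This is a toy illustration only; every class and method name below is invented, and the real D3D9On12 is a C++ implementation of the D3D9 driver interface, not a Python object.

```python
# Toy sketch of an API translation layer in the style of D3D9On12:
# the app issues "D3D9-style" calls, and the shim converts each one
# into an equivalent "D3D12-style" command. All names are invented.

class FakeD3D12Device:
    """Stands in for the real D3D12 driver: just records commands."""
    def __init__(self):
        self.command_list = []

    def record(self, cmd):
        self.command_list.append(cmd)


class D3D9On12Shim:
    """Receives D3D9-style calls and re-issues them as D3D12 commands."""
    def __init__(self, d3d12_device):
        self.dev = d3d12_device

    # D3D9 binds state piecewise (SetTexture etc.); D3D12 expresses the
    # same state as descriptor/pipeline commands on a command list.
    def set_texture(self, stage, texture):
        self.dev.record(("SetDescriptorTable", stage, texture))

    def draw_primitive(self, prim_type, start, count):
        self.dev.record(("DrawInstanced", prim_type, start, count))


d3d12 = FakeD3D12Device()
shim = D3D9On12Shim(d3d12)   # the app thinks this is its D3D9 driver
shim.set_texture(0, "brick.dds")
shim.draw_primitive("TRIANGLELIST", 0, 36)
print(d3d12.command_list)
```

The point of the pattern is that the GPU vendor's driver only ever sees D3D12 commands, which is why Intel can drop native DX9 paths entirely.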
I've seen this movie (Score:1)
Re: (Score:1)
Re:I've seen this movie (Score:4, Informative)
Re: (Score:1)
On that note, DX9 has some serious backwards compatibility issues with modern hardware. The same thing happened with OpenGL, so history is repeating itself. DX was created to avoid backwards compatibility issues. I wonder if Vulkan will take over, lol. One of the big problems with OpenGL was extension support, but Vulkan seems to have eliminated that. DX now needs to catch up - maybe DX12 with DX9 backwards compatibility does catch up, I don't know. Still a lot of features that came after DX9
Re: (Score:1)
One of the big problems with OpenGL was extension support, but Vulkan seems to have eliminated that.
No it hasn't. Vulkan absolutely supports extensions just like OpenGL does.
Still a lot of features that came after DX9 like Geometry Shaders and (Pseudo) Ray Tracing.
Nobody uses geometry shaders; they were long ago replaced by compute shaders because the variability of their output stalls the geometry processing pipeline. Metal doesn't have them at all. They're a legacy feature.
Sorry nVidia, I won't call your RTX ray tracing
Why not? That's exactly what it is.
I wrote a ray tracer 20+ years ago and you can't even come close to emulating that in real time
Ray tracing is easy, path tracing is the one that is computationally complex but even then much of that depends on your material complexity. RTX provides hardware-accelerated BVH creation a
Re:I've seen this movie (Score:4, Informative)
One of the big problems with OpenGL was extension support
No, the main problem with OpenGL is that every major vendor wants to kill it.
Microsoft has been trying to kill it since "Fahrenheit":
https://en.wikipedia.org/wiki/... [wikipedia.org]
Apple hasn't updated their OpenGL since 2010 and recently deprecated it.
The other major problem is that everybody thinks it's "old" when it isn't. OpenGL is absolutely a modern API (I'm programming in it right now), but the move from (e.g.) version 4.4 to version 4.5 doesn't sound very exciting even though it lets OpenGL compete with Vulkan on speed while being a lot easier to program.
https://www.youtube.com/watch?... [youtube.com]
Dege (Score:4, Interesting)
There's been dgVoodoo for quite a while. It's a drop-in replacement for d3d9.dll that replaces DX9 calls with DX10/11/12 compatible calls, with built-in fixes for specific games.
Not sure which is better, but I've been using it to play Unreal Tournament 2004 for quite a while with no problems.
Re: (Score:2)
Oh man, I loved UT2k3 back in the day. One of the few "big" games that supported Linux back then.
Emulation (Score:3)
Who cares so long as it works?
I can't imagine that games will run slower under a DX12 emulator than they used to run when they were released under DX9.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Well quite.
I don't even get it. It's not like any GPU now supports DX9 natively whatever that means. They're basically a wide array of programmable barrelled SIMD cores with a somewhat exotic memory architecture, an interesting array of synchronisation primitives and hardware support for a number of operations such as texture sampling.
Re:Emulation (Score:4, Informative)
Emulation often means input lag. Microsoft has replaced parts of their graphics APIs in the past with emulation, such that old win9x stuff that requests exclusive access simply gets lied to. Stuff (mostly) still works, but now there's a noticeable delay. Because this delay was undeniable and the lost functionality was actually necessary, they gave you new ways to request a real exclusive context. And then they moved that to emulation as well. It's good that old stuff still works, but sadly, while you may get 200x the FPS you got in 1999, the actual responsiveness of the games has largely gone down.
GPU driver writers have repeatedly introduced quite noticeable frame delay by reworking the command streams received for better rendering performance. Chasing higher avg. FPS at the expense of user enjoyment. Given past efforts it's hard not to assume this will be more of the same and so another 1ms here, 2ms there and all of a sudden your 120fps Half Life feels as sluggish as the day you played it at too high a resolution on a TNT2.
Re: (Score:3)
a) Citation needed. I'm a game programmer and none of what you're saying passes the sniff test.
and
b) You'd rather it didn't work at all...?
Re: (Score:3)
He's talking about when Windows Vista came out with a compositing window manager.
You used to be able to acquire a part of Windows' display context and draw directly to the global screen buffer. Starting with Vista, that now draws to an off-screen texture which gets composited with everything else.
It was a direct improvement to visual quality and UX for 99% of apps because it significantly reduced repainting artifacts and gave some features like live previews in thumbnails and alt+tab.
I never encountered
Re: (Score:2)
It is known to be 3 frames of latency compared with directly accessing the framebuffer.
But if you run in exclusive fullscreen mode, compositing latency is gone.
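To put the parent's "3 frames of latency" figure in concrete terms (taking the 3-frame claim as given rather than as a measured value), the added delay in milliseconds depends on the refresh rate:

```python
def compositing_latency_ms(frames, refresh_hz):
    """Added latency if the compositor delays presentation by N frames."""
    return frames * 1000.0 / refresh_hz

# The same 3-frame delay shrinks as refresh rate rises.
for hz in (60, 120, 240):
    print(f"{hz} Hz: {compositing_latency_ms(3, hz):.1f} ms")
```

At 60 Hz that works out to 50 ms, which is well into "noticeable" territory for fast games; at 240 Hz the same 3 frames is only 12.5 ms.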
Re: (Score:3, Interesting)
I can't imagine people running Xe integrated graphics care about gaming performance.
Re: (Score:2)
Supposedly, Iris Xe is equivalent in performance to a GTX 750 Ti (2014). And that was a popular value graphics card in its day; people were playing games on those.
Re: (Score:2)
people were playing games on those.
I'm sure they were. I just don't think in 2022 people are as interested in playing Goat Simulator, Titanfall, or Far Cry 4 with graphics turned to "performance mode".
Let me repeat with context: I can't imagine people running an 8 year old graphics card which was low end even in its day care about gaming performance.
Obligatory XKCD https://xkcd.com/606/ [xkcd.com]
Vulkan/DX12 are easier (Score:4, Informative)
It's a lot less work on the software side and probably on the hardware side as well.
But Intel should also evaluate DXVK for its DirectX 9 support before just committing to the Microsoft stuff.
Re: (Score:2)
It's a lot less work on the software side and probably on the hardware side as well. But Intel should also evaluate DXVK for its DirectX 9 support before just committing to the Microsoft stuff.
Possibly, but letting Microsoft do the heavy lifting on supporting DX9 is a win for Intel today.
Neither option is letting M$ do any lifting at all (Score:2)
He's talking about Intel should be using Vulkan instead of DX12. Especially since there is an existing code base for Vulkan.
Re: (Score:2)
There's an existing code base for DX12 too, D3D9On12 is on GitHub.
The difference is that Intel can push responsibility off to Microsoft for DX9 stuff, while there's nobody they can really do that with for DXVK.
Re: (Score:2)
It's a lot less work on the software side and probably on the hardware side as well. But Intel should also evaluate DXVK for its DirectX 9 support before just committing to the Microsoft stuff.
Both are MIT-style open source, but dxvk is Linux-only so they'd have to port it first; moreover, D3D9On12 has corporate backing, AND it's made by the company that controls both the API and the OS. They'd have to be crazy to go with dxvk.
Re: (Score:2)
D3D9On12 has corporate backing, AND it's made by the company that controls both the API and the OS. They'd have to be crazy to go with dxvk.
That statement makes a lot more sense if you pretend that the API isn't Direct3D, and the company isn't Microsoft. Until recently D3D was something of a shit show, and Microsoft still is.
DXVK is only supported on Linux, but it does work on Windows [reddit.com]. If they wanted to contribute additional Windows support for DXVK, they might well wind up with a better solution than depending on something from Microsoft.
You make more than one API call (Score:1)
Re:You make more than one API call (Score:4, Informative)
Win32 programs have a 1500 cycle penalty for executing system calls. Yes, that's absurdly large. So minimizing system calls has a bigger effect than anything micro like an extra indirect jump on every API call. All that matters is fewer system calls.
Re: (Score:3)
"In most existing systems, switching from user mode to kernel mode has an associated high cost in performance. It has been measured, on the basic request getpid, to cost 1000–1500 cycles on most machines. Of these just around 100 are for the actual switch (70 from user to kernel space, and 40 back), the rest is "kernel overhead".[11][12]"
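Those figures are why modern APIs batch work in user space: record many commands, then cross into the kernel once to submit them all, which is exactly the D3D12 command-list model. A back-of-the-envelope cost model using the cycle counts quoted above (the per-command recording cost is an assumption for illustration):

```python
SYSCALL_COST = 1500   # cycles per kernel transition (figure quoted above)
RECORD_COST = 10      # assumed user-space cost to record one command

def cost_per_call_submission(n_draws):
    # Old-style immediate mode: one kernel transition per draw call.
    return n_draws * (RECORD_COST + SYSCALL_COST)

def cost_batched_submission(n_draws):
    # Command-list style: record all draws in user space,
    # then one kernel transition to submit the whole batch.
    return n_draws * RECORD_COST + SYSCALL_COST

print(cost_per_call_submission(1000))
print(cost_batched_submission(1000))
```

Under these assumptions, 1000 draws cost roughly 1.5M cycles submitted one-by-one versus about 11.5K cycles batched, which is why "fewer system calls" dominates any per-call overhead an extra translation layer adds.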
Re: (Score:2)
tell me again how performance is "just as good"
On a modern PC it's probably faster than when the DX9 game was originally released.
Re: You make more than one API call (Score:2)
Probably makes no odds (Score:3)
Re: (Score:3)
Re: (Score:2)
They usually don't *rely* on bugs, they have to work around them. When I still used to do low level graphics programming for a living, some of the worst days were when I came to the office and a PC or laptop was waiting for me showing newly discovered and entirely unexpected driver bugs.
For some reason we were not allowed to touch the drivers (some certification issues for devices used in medical environments) but we were allowed to patch our software and litter them with "if driverVersionIs(some_crappy_and
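The version-gated workaround pattern described above reads roughly like this sketch; the driver version strings and the bug itself are invented purely for illustration:

```python
# Hypothetical sketch of gating a workaround on known-bad driver builds.
# Version strings and the NPOT-texture bug are made up for this example.
BUGGY_VERSIONS = {"26.20.100.7158"}

def pad_to_power_of_two(size):
    """Round a texture dimension up to the next power of two."""
    n = 1
    while n < size:
        n *= 2
    return n

def choose_texture_size(driver_version, requested):
    # Pretend these driver builds corrupt non-power-of-two textures,
    # so the application pads the allocation to route around the bug.
    if driver_version in BUGGY_VERSIONS:
        return pad_to_power_of_two(requested)
    return requested

print(choose_texture_size("26.20.100.7158", 300))  # padded to 512
print(choose_texture_size("30.0.100.9999", 300))   # used as-is: 300
```

The ugliness the commenter describes comes from these checks accumulating: each newly discovered driver bug adds another version test somewhere in the rendering path.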
Re: (Score:3)
If I were intel I'd emulate DX10 and 11 as well (Score:3)
DX9 owes its longevity to being the baseline for Win Vista/7/8.x/10, (and being heavily used in XP) but DX9 is the past.
DX12 (with WDDM 2.0) is the new baseline for Win11
DX 10 and 11 are the past as well...
If I were Intel, I'd laser-focus all my resources (HW designers, programmers, time) on current APIs (DX12, Vulkan, OpenCL, Metal2), proprietary or not, and leave older APIs (DX9/10/11, OpenGL (ES), et al.) to emulation/compatibility layers whenever possible...
Yes, in the short term this will lead to some growing pains (especially for gaming applications), but in the long run it will lead to cleaner drivers and better-performing hardware.
More so when one takes into account that this new architecture (Xe) will be used in datacenter applications (machine learning, CFD, and seismographic analysis are some examples), as well as in workstation applications (publishing, video editing, etc.), and not only in gaming...
Re: (Score:2)
Playing new software is fine and dandy, but the reality is that most people running modern applications that are GPU intensive are not going to be using Intel hardware. People that care are going to pony up for hardware from Nvidia or AMD.
Personally, I don't care about modern games, and so I don't opt for the added expense. Over the years I have made do with quite a bit of Intel hardware. Intel chipsets tend to run my display just fine, and I don't have to worry about the driver issues and hardware pro
Re: (Score:2)
Playing new software is fine and dandy, but the reality is that most people running modern applications that are GPU intensive are not going to be using Intel hardware. People that care are going to pony up for hardware from Nvidia or AMD.
"Gamers" that care are going to pony up for AMD or NVidia, but people that use GPU-intensive applications will use hardware that does the work needed AND has good price/performance (and power consumption, and form factor, and cooling). If Intel can meet those demands, so be it.
Personally, I don't care about modern games, and so I don't opt for the added expense. Over the years I have made do with quite a bit of Intel hardware. Intel chipsets tend to run my display just fine, and I don't have to worry about the driver issues and hardware problems that my friends that use their PCs for gaming have to live with.
I am a heavy player of Portal (2) and HL2 mods myself, but I also care about modern titles... One has to look forward, not backwards, and beyond games.
Nowadays, not only our games use the GPU, our browsers are GPU accelerated, MSWord
Re: (Score:2)
DirectX9 was the last version of DirectX to support fixed-function 3D hardware, which did not support pixel and vertex shaders. So having the older API around made sense, because older hardware still existed. And so many games were written to use the older API.
What really happened to make DX9 live so long was Vista being unpopular, which extended the lifespan of Windows XP, and DX10 not getting ported back to Windows XP.