
Titanfall Dev Claims Xbox One Doesn't Need DX12 To Improve Performance

MojoKid writes: "One of the hot topics in the wake of Titanfall's launch has been whether or not DirectX 12 would make a difference to the game's sometimes jerky framerate and lower-than-expected 792p resolution. According to Titanfall developer Jon Shiring, the new Microsoft API isn't needed to improve the game's performance, and updates coming down the pipe should improve Xbox One play in the near future. This confirms what many expected when DX12 was announced: the API may offer performance improvements in certain scenarios, but it isn't a panacea for the Xbox One's lackluster performance compared to the PS4. It's an API that appears mostly to address scenarios where the CPU can't keep the GPU fed due to draw call bottlenecks."
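A rough way to picture a draw call bottleneck is as a race between the CPU submitting commands and the GPU consuming them: the frame takes as long as the slower side. Here's a minimal Python sketch of that model (both constants are invented for illustration, not measurements from any real console):

    # Illustrative model of a CPU-side draw call bottleneck.
    # Both constants are invented assumptions, not measurements.
    CPU_COST_PER_DRAW_MS = 0.01   # driver/API overhead per draw call
    GPU_RENDER_TIME_MS   = 12.0   # time the GPU needs for the frame

    def frame_time_ms(draw_calls, cpu_cost_per_draw=CPU_COST_PER_DRAW_MS):
        cpu_time = draw_calls * cpu_cost_per_draw
        # The frame is done only when the CPU has submitted every draw
        # AND the GPU has finished rendering them.
        return max(cpu_time, GPU_RENDER_TIME_MS)

    for draws in (500, 1000, 2000, 4000):
        print(draws, frame_time_ms(draws))   # GPU-bound at first, CPU-bound later

A lower-overhead API (the improvement claimed for DX12) effectively shrinks the per-draw cost, which only helps once the CPU side is the one setting the max().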
  • by CTU ( 1844100 ) on Sunday April 13, 2014 @02:23AM (#46738397) Journal

    Xbone just sucks compared to the PS4, so it's no wonder the system can't run the game well.

    Well, I can't say I'm upset about not having an Xbone. If I really wanted this game, the PC version would be better anyway, at least with a decent video card :)

  • by JavaBear ( 9872 ) on Sunday April 13, 2014 @03:52AM (#46738663)

    MS pulled a fast one at E3, where they used high-end PCs to demo the Xbox One.
    IIRC, MS later claimed that these were "representative" and also used for development. However, if those were the machines the devs were using to develop their game, it's no wonder they exceeded the available resources on the console.
    http://www.techpowerup.com/185... [techpowerup.com]

  • by Emetophobe ( 878584 ) on Sunday April 13, 2014 @04:50AM (#46738769)

    That's BS. Microsoft and Sony fanboys mocked the Wii for targeting 720p. According to them, they had all their games in glorious 1080p while Wii peasants didn't have real HD.

    Correction: The Wii was 480p [wikipedia.org], not 720p.

  • The PS3 plays a lot of games at 1080p native...

    There is nothing wrong with the PS4/XB1, other than for $400/$500, they don't really offer anything new.

    The PS1 was the first major 3D console; it was a massive improvement over the SNES.

    The PS2 offered DVD, vastly upgraded graphics, etc.

    The PS3 offered Blu-Ray, 1080p, and the first serious online console (from Sony).

    The PS4? Meh. It's a faster PS3, but otherwise it doesn't offer anything new.

    Um...The PS3 renders very few games at 1080p native. Maybe a dozen titles out of the entire catalog.

    Don't forget the other dimension. 1080 is only 360 more than 720, but 1920 is 640 more than 1280. IMO, that's the dimension we should be talking about, since it's the more significant increase. However, per-pixel calculation load scales with area, not with either single dimension. If we look at total pixels, 1280x720 is 921,600 pixels and 1920x1080 is 2,073,600, a difference of 1,152,000. A lot of people don't understand that going from 720p to 1080p is MORE THAN TWICE the pixels (2.25x); in pixel shader costs you might as well be rendering a full secondary screen.
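    The arithmetic is easy to check with a quick Python sketch (assuming the commonly reported 1408x792 for Titanfall's 792p mode; the rest are standard 16:9 resolutions):

        # Pixel counts for the resolutions under discussion.
        resolutions = {
            "720p":  (1280, 720),
            "792p":  (1408, 792),    # Titanfall's reported Xbox One resolution
            "1080p": (1920, 1080),
            "4K":    (3840, 2160),
        }
        pixels = {name: w * h for name, (w, h) in resolutions.items()}

        print(pixels["720p"])                    # 921600
        print(pixels["1080p"])                   # 2073600
        print(pixels["1080p"] / pixels["720p"])  # 2.25, i.e. more than twice
        print(pixels["4K"] / pixels["1080p"])    # 4.0, another full leap beyond 1080p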

    Now, that's not to say the total cost of rendering will necessarily increase more than twofold. Full-screen effects like bloom or HDR will come in at about twice the cost. Interpolating a texture coordinate to look up pixel values is cheap compared to almost any shader program, even something like cube-mapped specular highlights/reflections, bump mapping (I prefer parallax mapping), or shadow mapping. However, the cost of geometry calculations can be the same at both resolutions. In a ported/cross-platform game the geometry assets are rarely changed (too expensive in terms of re-rigging, redoing all the animations, testing, etc.), so given slightly better hardware, a game at the same resolution will mostly differ in added particle effects, increased draw distance, maybe even a few whole extra pixel shaders (perhaps the water looks way more realistic, flesh looks fleshier, blood is bloodier, reflections are more accurate, etc.).
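    To make that split concrete, here's a toy cost model in Python (the constants are invented for illustration, not measured from any real hardware): per-pixel work scales with the pixel count, while the per-frame geometry work stays flat, so the total grows by less than the 2.25x pixel ratio.

        # Toy frame-cost model: fragment (per-pixel) work scales with the
        # pixel count; vertex/geometry work does not. Constants are invented.
        FRAG_COST_NS_PER_PIXEL = 5.0   # assumed average shader cost per pixel
        GEOMETRY_COST_MS       = 4.0   # assumed fixed per-frame vertex work

        def frame_cost_ms(width, height):
            frag_ms = width * height * FRAG_COST_NS_PER_PIXEL / 1e6
            return GEOMETRY_COST_MS + frag_ms

        print(frame_cost_ms(1280, 720))    # ~8.6 ms
        print(frame_cost_ms(1920, 1080))   # ~14.4 ms: pixel work rose 2.25x,
                                           # but the total rose only ~1.7x
                                           # because the geometry term stayed flat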

    Jumping up to 1080p makes your pixel shaders cost a lot more frame time. Developing for 1080p vs. 720p would optimally mean completely reworking the graphics, assets, and shaders to adapt to the higher per-pixel cost, maybe cutting down on pixel shader effects and adding more detailed geometry. I run into folks who think "1080p isn't 'next gen', 4K would have been next gen" -- no, that's ridiculous. 1080p is the next-gen resolution, but the new consoles are barely capable of it while carrying a significant increase in shader complexity over last gen, and we're seeing diminishing returns on resolution increases anyway. So I wouldn't call these consoles quite 'next-gen' in all areas. IMO, next-gen console graphics would handle significantly more shaders while running everything smoothly at 1080p, just like the above-average gaming PC I got my younger brother for his birthday, which kicks both the PS4's and the Xbone's asses on those fronts. That would be the sort of leap in graphics we saw between the PS1 and PS2, or the Xbox and the 360. 4K would be a generation beyond 'next-gen' because of the way shader cost must scale with resolution.

    One of the main advances this new console generation brings is in the way memory is managed. Most people don't understand this, including many gamedevs. Traditionally we have had to keep two copies of everything in RAM: one texture loaded from storage into main memory, and another copy resident on the GPU. The same goes for geometry, and sometimes a third, lower-detail copy of the geometry is kept in RAM for the physics engine to work on. The copy in main RAM is kept ready to shove down the GPU pipeline, and the resource manager tracks which assets can be retired and which will be needed, to prevent cache misses. That's a HUGE cost in total RAM, and traditionally that bus bandwidth has been a prime limitation on interactivity.
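    A back-of-the-envelope Python sketch of the duplication described above (the asset sizes are invented; the point is the shape of the accounting, not the numbers):

        # Invented asset sizes in MB, comparing a split CPU/GPU memory
        # design against a single unified pool.
        assets = {"textures": 1200, "geometry": 300, "audio": 150}

        gpu_copy    = assets["textures"] + assets["geometry"]  # resident on the GPU
        cpu_copy    = sum(assets.values())                     # staging copies in main RAM
        physics_lod = assets["geometry"] * 0.25                # low-detail collision mesh

        split_total   = gpu_copy + cpu_copy + physics_lod      # two copies of GPU assets
        unified_total = cpu_copy + physics_lod                 # one shared pool, no duplicate
        print(split_total, unified_total)                      # 3225.0 vs 1725.0 (MB)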
