
Titanfall Dev Claims Xbox One Doesn't Need DX12 To Improve Performance

MojoKid writes: "One of the hot topics in the wake of Titanfall's launch has been whether or not DirectX 12 would make a difference to the game's sometimes jerky framerate and lower-than-expected 792p resolution. According to Titanfall developer Jon Shirling, the new Microsoft API isn't needed to improve the game's performance, and updates coming down the pipe should improve Xbox One play in the near future. This confirms what many expected since DX12 was announced — the API may offer performance improvements in certain scenarios, but DX12 isn't a panacea for the Xbox One's lackluster performance compared to the PS4. It's an API that appears to mostly address scenarios where the CPU isn't able to keep the GPU fed due to draw call bottlenecks."
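To make the draw-call bottleneck point concrete, here is a minimal C++ sketch (not from the article; every number in it is an illustrative assumption) of how per-draw-call CPU overhead can blow a 60 fps frame budget even when the GPU itself has headroom. Cutting that per-call cost, whether through batching, instancing, or a lower-overhead API, is what moves such a frame from CPU-bound back to GPU-bound.

    // Toy model of a CPU-bound frame: all numbers are illustrative assumptions.
    #include <cstdio>

    int main() {
        const double frame_budget_ms = 1000.0 / 60.0;  // 60 fps target
        const int draw_calls         = 3000;           // assumed draws per frame
        const double gpu_time_ms     = 12.0;           // assumed GPU render time

        // Assumed CPU cost of submitting one draw call through the driver,
        // before (high-overhead) and after (low-overhead) an API/batching change.
        const double cost_before_us = 6.0;
        const double cost_after_us  = 1.5;

        auto frame_time = [&](double per_call_us) {
            double cpu_ms = draw_calls * per_call_us / 1000.0;
            // CPU submission and GPU work overlap imperfectly; take the max
            // of the two as a rough lower bound on frame time.
            return cpu_ms > gpu_time_ms ? cpu_ms : gpu_time_ms;
        };

        std::printf("frame budget: %.2f ms\n", frame_budget_ms);
        std::printf("high per-call overhead: %.2f ms/frame (CPU-bound: %s)\n",
                    frame_time(cost_before_us),
                    frame_time(cost_before_us) > gpu_time_ms ? "yes" : "no");
        std::printf("low  per-call overhead: %.2f ms/frame (CPU-bound: %s)\n",
                    frame_time(cost_after_us),
                    frame_time(cost_after_us) > gpu_time_ms ? "yes" : "no");
        return 0;
    }

With the assumed numbers, 3,000 draws at 6 µs each cost 18 ms of CPU time per frame, past the 16.7 ms budget; at 1.5 µs they cost 4.5 ms and the assumed 12 ms of GPU work becomes the limit again.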
  • by FlyHelicopters ( 1540845 ) on Sunday April 13, 2014 @02:57AM (#46738491)

    Ok, fair enough, the XBox One is a vast improvement over the XBox 360 in many ways...

    But it isn't SO much of an improvement that it is drop dead obvious.

    The PS1 to PS2? Clear as day, just compare FF7 to FFX.

    The PS2 to PS3? Likewise, clear as day, compare FFX to FF13.

    How about before the PS1? SNES? Really, do I have to compare FF2 to FF7? :)

    The XBox (original) to XBox 360, night and day...

    The XBox One? Meh... it is nice, but it can't even run most games at native 1080p, ten years after 1080p displays started shipping in any real numbers.

    The PS4 is better, being roughly 50% faster (thanks to 50% more GPU resources), but it isn't THAT much better (rough numbers in the sketch after this comment). Neither console is really "next-gen"; that would have been 4K resolution.

    Both are "fine", but fine just isn't going to cut it.

  • by FlyHelicopters ( 1540845 ) on Sunday April 13, 2014 @03:45AM (#46738641)

    Yes... In fairness, not all 1080p is equal; the PS4/XB1 can of course show more detail at the same resolution than the older consoles, but to the average person just looking at them, they all look "fine".

    I showed my wife the PS4 when it came out, side by side with the PS3 (we own two of those), and she said, "Yeah, the PS4 looks nicer, but are the games any better?"

    Eh, they are of course more of the same, nothing has really changed.

    This is of course a problem... :)

  • by FlyHelicopters ( 1540845 ) on Sunday April 13, 2014 @03:53AM (#46738667)

    The Wii U is nice in many ways, and we own one. My 8-year-old son and 5-year-old daughter love Super Mario 3D World.

    It shows that graphics are nice, but not everything, great games are great games, on any console.

    The problem with the Wii U is that it is WAY overpriced for what it is. It just isn't selling, and the window to get it selling has probably passed; there's nothing Nintendo can do about it at this point.

    I recently bought an Amazon Fire TV, and frankly, it has some really nice games on it that look just as nice as most of what is on our PS3. My son has been playing the tower defense game that comes with it and has been having just as much fun with it as with anything else.

    For a $99 device that is really meant for watching TV, that may be the real threat to the PS4/XB1: if a $99 device is "good enough", how much demand is there for $500 game consoles?

    Some, to be sure... but the price needs to come down.

  • by aliquis ( 678370 ) on Sunday April 13, 2014 @04:20AM (#46738725)

    FWIW, 1080p games, possibly with more detail and AA, would of course still look nicer than 720p.

    Xbox, 2001-2002. 64 MB 200 MHz DDR shared graphics memory, 733 MHz PIII-ish, 233 MHz NV2A.
    Geometry engine: 115 million vertices/second, 125 million particles/second (peak)
    932 megapixels/second (233 MHz × 4 pipelines), 1,864 megatexels/second (932 MP × 2 texture units) (peak)
    (CPU roughly 3 GFLOPS per a random page; GPU? Nvidia supposedly claims 80, some Xbox book says 22 in total.)

    Xbox 360, 2005-2006, 512 MB 700 MHz GDDR3, 3.2 GHz Tri-Core PowerPC, 500 MHz Xenos, 500 MHz 10 MiB eDRAM.
    Maximum vertex count: 6 billion vertices per second, 240 GFLOPS
    Maximum pixel fillrate: 16 gigasamples per second using 4X multisample anti-aliasing. Maximum texel fillrate: 8 gigatexels per second (16 textures × 500 MHz)

    Xbox One, 2013-2014, 8 GB DDR3, 1.75 GHz Octo-core AMD APU, 853 MHz AMD GCN, 32 MB ESRAM.
    1.31 TFLOPS.
    "Xbox One supports 4K resolution (3840Ã--2160) (2160p) video" (So for something like "New super mario bros" I guess 4K wouldn't had been impossible.)

    I don't know how much you can trust the numbers, but going by the claimed GFLOPS the Xbox One would be about 5.5× the Xbox 360, which in turn would be about 3× the original Xbox (see the arithmetic sketch after this comment).

    But it took 4 years to get to Xbox 360 and 8 years to get to Xbox One.

    Still obviously better.
    Previously my impression was that consoles used close to top-of-the-line hardware when released, and I don't see this AMD APU as such; still, it's in GTX 650-650 Ti territory, and more like a GK106 GTX 660 for the PS4 (looking at GFLOPS alone).

    That isn't the best you can get, but until recently it was the "reasonable budget high-end" or some such; isn't the 760 still the same GPU, just higher clocked? Sure, going all the way to a 770/780/R9 290X may be worth it from a price/performance perspective, but the console GPUs are still up there.

    People have enough problems running QHD games with one graphics card. 4K gaming (with advanced-looking games) isn't something that would happen on current-gen console graphics, so that's totally out of the question.
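    Taking the claimed GFLOPS figures above at face value, the generation-to-generation ratios pencil out as follows (a quick sketch; the original Xbox figure swings widely depending on whether you trust Nvidia's 80 GFLOPS claim or the "22 in total" book figure):

        // Ratios between the GFLOPS figures quoted in the comment above.
        #include <cstdio>

        int main() {
            const double xbox_nvidia = 80.0;    // Nvidia's claimed figure
            const double xbox_book   = 22.0;    // the "some Xbox book" total
            const double xbox360     = 240.0;
            const double xbox_one    = 1310.0;  // 1.31 TFLOPS

            std::printf("Xbox One / Xbox 360:         %.1fx\n", xbox_one / xbox360);
            std::printf("Xbox 360 / Xbox (80 GFLOPS): %.1fx\n", xbox360 / xbox_nvidia);
            std::printf("Xbox 360 / Xbox (22 GFLOPS): %.1fx\n", xbox360 / xbox_book);
            return 0;
        }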

  • by Anonymous Coward on Sunday April 13, 2014 @05:17AM (#46738835)

    Those machines were representative of what dev teams - even internal MS dev teams - were told the console would be capable of at launch.

    XB1 Platform team MASSIVELY screwed the pooch on those promises and what the teams got was a gaming console that was startlingly bad at drawing verts and triangles. Some titles had to smash practically all non-hero asset shaders down to early DX9 era diff / spec / norm, and cut scene vert budgets to levels on par with Far Cry (the first one, with no plot and bad voice acting) and RtCW.

    So, yeah. Don't blame the game dev and art teams on this one. Blame the platform team and upper management that promised hemorrhaging-edge next-gen capability and handed the game teams a TiVo with a graphics card soldered to it.

  • It wasn't profit (Score:5, Interesting)

    by rsilvergun ( 571051 ) on Sunday April 13, 2014 @12:29PM (#46740421)
    They overestimated the cost of GDDR5. You can only lose so much money on your console, and Microsoft has lost massive amounts for two generations.

    They thought the price of GDDR5 was going to be so high that the console would have to sell for more than people could pay. Remember the $799 3DO? No. There's your answer.

    They tried to make up for it by putting 32 MB of high-speed on-die cache (the ESRAM) on the chip, but again screwed up. The cache was expensive and took up die space that Sony instead used for more GPU compute units (rough size arithmetic in the sketch after this comment).

    So yeah, it was a money decision, but it wasn't about profit, it was about making a console people could afford. Both companies guessed, and Microsoft guessed wrong.
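    As a rough illustration of the ESRAM squeeze (a back-of-the-envelope sketch only; real engines use varied target formats and tiling, so the target count and pixel size here are assumptions): a handful of 32-bit render targets at 1080p already overflows 32 MB, while at Titanfall's reported 1408x792 they fit.

        // Back-of-the-envelope: do a few 32-bit render targets fit in 32 MB of ESRAM?
        #include <cstdio>

        int main() {
            const double esram_mib    = 32.0;
            const int bytes_per_pixel = 4;  // assumed 32-bit render targets
            const int targets         = 5;  // assumed: a few G-buffer targets plus depth

            struct Res { const char* name; int w, h; };
            const Res resolutions[] = { {"1080p", 1920, 1080}, {"792p", 1408, 792} };

            for (const Res& r : resolutions) {
                double per_target_mib = double(r.w) * r.h * bytes_per_pixel / (1024.0 * 1024.0);
                double total_mib = per_target_mib * targets;
                std::printf("%s: %.1f MiB per target, %.1f MiB for %d targets (%s 32 MiB ESRAM)\n",
                            r.name, per_target_mib, total_mib, targets,
                            total_mib > esram_mib ? "exceeds" : "fits in");
            }
            return 0;
        }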
