
Titanfall Dev Claims Xbox One Doesn't Need DX12 To Improve Performance 117

Posted by Soulskill
from the ask-carmack-for-some-tips dept.
MojoKid writes: "One of the hot topics in the wake of Titanfall's launch has been whether or not DirectX 12 would make a difference to the game's sometimes jerky framerate and lower-than-expected 792p resolution. According to Titanfall developer Jon Shirling, the new Microsoft API isn't needed to improve the game's performance, and updates coming down the pipe should improve Xbox One play in the near future. This confirms what many expected since DX12 was announced — the API may offer performance improvements in certain scenarios, but DX12 isn't a panacea for the Xbox One's lackluster performance compared to the PS4. It's an API that appears to mostly address scenarios where the CPU isn't able to keep the GPU fed due to draw call bottlenecks."
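
To make the summary's last point concrete, here is a back-of-the-envelope sketch in Python of a draw-call-bound frame. The per-call costs are purely illustrative assumptions, not measurements of any real engine or API; the point is only that thinning CPU-side submission overhead (what DX12 and Mantle target) helps when the CPU, not the GPU, is the limiting factor.

    # Toy model of a frame that is limited by draw-call submission cost.
    # All numbers are illustrative assumptions, not real measurements.
    CPU_MS_PER_CALL_THICK_API = 0.010   # assumed CPU overhead per call, old-style API
    CPU_MS_PER_CALL_THIN_API = 0.002    # assumed lower overhead with a thinner API
    GPU_MS_PER_CALL = 0.004             # assumed GPU time to render each call

    def frame_time_ms(calls, cpu_ms_per_call, gpu_ms_per_call):
        """Frame time is set by whichever processor finishes last."""
        cpu_ms = calls * cpu_ms_per_call
        gpu_ms = calls * gpu_ms_per_call
        return max(cpu_ms, gpu_ms), cpu_ms, gpu_ms

    for calls in (1000, 5000):
        thick = frame_time_ms(calls, CPU_MS_PER_CALL_THICK_API, GPU_MS_PER_CALL)
        thin = frame_time_ms(calls, CPU_MS_PER_CALL_THIN_API, GPU_MS_PER_CALL)
        print(f"{calls} calls: thick API {thick[0]:.1f} ms (CPU {thick[1]:.1f}, GPU {thick[2]:.1f}); "
              f"thin API {thin[0]:.1f} ms (CPU {thin[1]:.1f}, GPU {thin[2]:.1f})")
    # At 5000 calls the thick API is CPU-bound (50 ms of CPU vs 20 ms of GPU work),
    # so a thinner API helps a lot; when the GPU itself is the bottleneck, it buys little.
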
  • by Anonymous Coward

    Titanfall is a splendid game that made its debut on the Xbox One; however, the jerky framerate and lower-than-expected 792p resolution mentioned above have caused this exclusive Xbox One game to further lower gamers' opinions of the console. While exclusive games usually promote a console's release, this one fails to do so...

    • by Joce640k (829181) on Sunday April 13, 2014 @01:26AM (#46738407) Homepage

      If the framerate is jerky then they didn't plan the game properly.

      There's no excuse on a console where you know the exact resources available, right down to individual clock cycles.

      • I have to agree with you. Did they not playtest it in anything resembling a "real" situation?
        Good grief.
        • they had to wait in line like everybody else for Microsoft to actually ship the things.

          • by JavaBear (9872) on Sunday April 13, 2014 @02:52AM (#46738663)

            MS pulled a fast one at E3, where they used high-end PCs to demo the Xbox One.
            IIRC MS later claimed that these were "representative" and also used for development. However, if these were the machines the devs were using to develop their game, it's no wonder they exceeded the available resources on the console.
            http://www.techpowerup.com/185... [techpowerup.com]

            • by Anonymous Coward on Sunday April 13, 2014 @04:17AM (#46738835)

              Those machines were representative of what dev teams - even internal MS dev teams - were told the console would be capable of at launch.

              XB1 Platform team MASSIVELY screwed the pooch on those promises and what the teams got was a gaming console that was startlingly bad at drawing verts and triangles. Some titles had to smash practically all non-hero asset shaders down to early DX9 era diff / spec / norm, and cut scene vert budgets to levels on par with Far Cry (the first one, with no plot and bad voice acting) and RtCW.

              So, yeah. Don't blame the game dev and art teams on this one. Blame the platform team and upper management that promised hemorrhaging-edge next-gen capability and handed the game teams a TiVo with a graphics card soldered to it.

              • by Lumpy (12016)

                "XB1 Platform team MASSIVELY screwed the pooch"
                No, the XB1 executives who neutered the platform to increase profit margins are at fault. There is NO EXCUSE for the platform not to be 1080p with enough horsepower to easily handle everything at that resolution. The platform team had their hands tied by a bunch of idiots in suits telling them to make it cheaper so they could make more profit off the device.

                • It wasn't profit (Score:5, Interesting)

                  by rsilvergun (571051) on Sunday April 13, 2014 @11:29AM (#46740421)
                  They overestimated the cost of GDDR5. You can only lose so much money on your console, and Microsoft has lost massive amounts for two generations.

                  They thought the price of GDDR5 was going to be so high that the console would sell for more than people could pay. Remember the $799 3DO? No. There's your answer.

                  They tried to make up for it by putting 32MB of high-speed on-die cache in, but again screwed up. The cache was expensive and took up die space that Sony used for more GPU compute units.

                  So yeah, it was a money decision, but it wasn't about profit, it was about making a console people could afford. Both companies guessed, and Microsoft guessed wrong.
                  • From what I have heard it was not a money decision, it was one of availability. They didn't think enough GDDR5 would exist to supply production of one console, let alone two, so they stuck with a more mature technology. So yeah, they both guessed, but they were both also playing chicken, and MS flinched. Even today, if BOTH companies used GDDR5, are you certain that it would not delay console production?

            • I had totally missed that. In my defense, I only have an (original) XBox, and a PS3; I own neither a PS4 nor an XBone and have paid only passing attention to either.

              But yeah, that's terrible. This isn't why the console market is brutal, but these shenanigans don't help.
              Also, Microsoft's shooting-self-in-foot technique presents such a fragmented approach to putting out a serious product.
              Just....words.....you know what I mean?
      • If the framerate is jerky then they didn't plan the game properly.

        I was going to say "or test it properly", but with the number of glaring bugs I see in games these days I'm starting to think that publishers are taking the sell-it-anyway approach.

      • by mjwx (966435)

        If the framerate is jerky then they didn't plan the game properly.

        There's no excuse on a console where you know the exact resources available, right down to individual clock cycles.

        Well if you plan your game based on specifications that were a complete lie, then yes, yes you have an excuse.

        The Xbox One and PS4 were overhyped and under-delivered. They've handed this generation to the lacklustre Wii U, which, despite not being a big improvement, actually does what Nintendo said it would.

    • Titanfail perhaps?
    • Then it's a good thing this isn't an exclusive release.

    • by B33rNinj4 (666756)
      Exclusive to the Xbox One... which is why I prefer to play it on my PC.
  • by CTU (1844100)

    Xbone just sucks compared to the PS4 so it is no wonder the system can't run the game well.

    Well, I can't say I am upset about not having an Xbone. If I really wanted this game, I think the PC would be better anyway, with a decent video card at least :)

  • by FlyHelicopters (1540845) on Sunday April 13, 2014 @01:57AM (#46738491)

    Ok, fair enough, the XBox One is a vast improvement over the XBox 360 in many ways...

    But it isn't SO much of an improvement that it is drop dead obvious.

    The PS1 to PS2? Clear as day, just compare FF7 to FFX.

    The PS2 to PS3? Likewise, clear as day, compare FFX to FF13.

    How about before the PS1? SNES? Really, do I have to compare FF2 to FF7? :)

    The XBox (original) to XBox 360, night and day...

    The XBox One? Meh... it is nice, but it can't even play 1080p games, 10 years after 1080p really started to come out in any numbers.

    The PS4 is better, being 50% faster (thanks to 50% more GPU resources), but it isn't THAT much better. Neither console is really "next-gen", that would have been 4K resolution.

    Both are "fine", but fine just isn't going to cut it.

    • by Saffaya (702234)

      The last gen (X360 & PS3) aimed at 720p native.
      The current gen (Xbone & PS4) aims at 1080p native.

      It seems a bit exaggerated to expect them to aim at UHD 2160p or 4k.
      Just have a look at the electrical power and transistor count that a PC actually needs to achieve that.
      Of course, they can still output 2160p, but only with simpler games. In the same vein, the original Xbox had several 1080i games.

      • by Anonymous Coward

        The last gen (X360 & PS3) aimed at 720p native.

        That's BS. Microsoft and Sony fanboys mocked the Wii for targeting 720p. According to them they had all the games in glorious 1080p while Wii peasants didn't have real HD.

        1080p was the selling point of both the 360 and PS3.

        • by FlyHelicopters (1540845) on Sunday April 13, 2014 @02:45AM (#46738641)

          Yes... In fairness, not all 1080p is equal. The PS4/XB1 can of course have more detail at the same resolution as the older consoles, but to the average person just looking at them, they are all "fine".

          I showed my wife the PS4 when it came out, side by side with the PS3 (we own two of those), and she said "yeah, the PS4 looks nicer, but are the games any better?"

          Eh, they are of course more of the same, nothing has really changed.

          This is of course a problem... :)

        • by Emetophobe (878584) on Sunday April 13, 2014 @03:50AM (#46738769)

          That's BS. Microsoft and Sony fanboys mocked the Wii for targeting 720p. According to them they had all the games in glorious 1080p while Wii peasants didn't have real HD.

          Correction: The Wii was 480p [wikipedia.org], not 720p.

          • by donaldm (919619)

            That's BS. Microsoft and Sony fanboys mocked the Wii for targeting 720p. According to them they had all the games in glorious 1080p while Wii peasants didn't have real HD.

            Correction: The Wii was 480p [wikipedia.org], not 720p.

            Well, if you live in the USA or another country that uses the NTSC standard then "yes"; however, there are other countries that use the PAL standard, which is 576i/p (768×576), so the Wii can output at a higher resolution there.

        • by Narishma (822073)

          What? The Wii can't even output at 720p. Its max is 480p.

        • by firex726 (1188453)

          Which is all the more odd when so many games still come out in only 720p.
          Seems like it takes a re-release to ensure 1080p happens; otherwise it'll be 720p for the release copy.

      • The PS3 plays a lot of games at 1080p native...

        There is nothing wrong with the PS4/XB1, other than for $400/$500, they don't really offer anything new.

        PS1 was the first major 3D console, it was a massive improvement over the SNES.

        The PS2 offered DVD, vastly upgraded graphics, etc.

        The PS3 offered Blu-Ray, 1080p, and the first serious online console (from Sony).

        The PS4? Meh, it is a faster PS3, but otherwise, it doesn't offer anything new.

        • Um...The PS3 renders very few games at 1080p native. Maybe a dozen titles out of the entire catalog.

            More than that. One list [playstation.com] I found has 33 native 1080p games, and it may not be complete (it lists upscaled games as well, which I edited out, but doesn't include any of the Final Fantasy titles, all of which I know are 1080p, although FF13, at least, is apparently upscaled). Here they are:

            Bioshock
            The Bourne Conspiracy
            Call of Duty: World at War
            Civilization Revolution
            College Hoops 2K7
            Fantastic Four: Rise of the Silver Surfer
            Ferrari Challenge Trofeo Pirelli
            FIFA 09
            FIFA Street 3
            Gran Turismo 5: Prologue

          • The PS3 plays a lot of games at 1080p native...

            There is nothing wrong with the PS4/XB1, other than for $400/$500, they don't really offer anything new.

            PS1 was the first major 3D console, it was a massive improvement over the SNES.

            The PS2 offered DVD, vastly upgraded graphics, etc.

            The PS3 offered Blu-Ray, 1080p, and the first serious online console (from Sony).

            The PS4? Meh, it is a faster PS3, but otherwise, it doesn't offer anything new.

            Um...The PS3 renders very few games at 1080p native. Maybe a dozen titles out of the entire catalog.

            Don't forget the other dimension. 1080 is only 360 more than 720, but 1920 is 640 more pixels than 1280. IMO, that's the dimension we should be talking about, since it's more significant. However, per-pixel calculation load scales with area, not half the perimeter. So, if we look at total pixels: 1280x720 = 921,600 pixels and 1920x1080 = 2,073,600 pixels, a difference of 1,152,000. A lot of people don't understand that going from 720p to 1080p is MORE THAN TWICE the pixels; in pixel shader costs you might as well be rendering a full secondary screen.
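
            That arithmetic is easy to check; a quick sketch in Python (nothing assumed beyond the resolutions themselves):

                # Pixel counts for common render resolutions. Per-pixel (fill and
                # pixel-shader) work scales roughly with this count, which is why
                # 720p -> 1080p more than doubles it even though each axis only grows 1.5x.
                resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
                pixels = {name: w * h for name, (w, h) in resolutions.items()}
                for name, count in pixels.items():
                    print(f"{name}: {count:,} pixels")
                print(f"1080p / 720p = {pixels['1080p'] / pixels['720p']:.2f}x")
                # 720p: 921,600   1080p: 2,073,600   ratio: 2.25x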

            Now, that's not to say the total rendering cost will necessarily increase more than twofold. Full-screen effects like bloom or HDR are going to come in at about twice the cost. Interpolating a texture coordinate to look up pixel values is cheap compared to most any shader program, even one doing something like cube-map specular highlights/reflections, bump mapping (I prefer parallax mapping), shadow mapping, etc. However, the complexity of geometry calculations can be the same at both resolutions. In a ported / cross-platform game the geometry assets are rarely changed (too expensive in terms of re-rigging, all the animations, testing, etc.), so given slightly better hardware, a game at the same resolution will differ mainly in adding more particle effects, increased draw distance, maybe even a few whole extra pixel shaders (perhaps the water looks way more realistic, or flesh looks fleshier, blood is bloodier, reflections are more realistic, etc.)

            Jumping up to 1080p makes your pixel shaders cost a lot more frame time. Developing for 1080p vs 720p would optimally mean completely reworking the graphics, assets, and shaders to adapt to the higher shader cost, maybe cutting down on pixel shader effects and adding more detailed geometry. I encounter folks who think "1080p isn't 'next gen', 4K would have been next gen" -- no, that's ridiculous. 1080p is a "next-gen resolution", but the new consoles are barely capable of it while carrying a significant increase in shader complexity over last gen, and we're seeing diminishing returns on increasing resolution anyway. So I wouldn't call the consoles quite 'next-gen' in all areas. IMO, next-gen console graphics would handle significantly more shaders while running everything smoothly at 1080p, just like the above-average gaming PC I got my younger brother for his birthday, which kicks both the PS4's and Xbone's asses on those fronts. That would be the sort of leap in graphics scale we saw between the PS1 and PS2, or the Xbox and the 360. 4K would be a generation beyond 'next-gen' because of the way shader costs must scale with resolution.

            One of the main advances this new console generation brings is in the way memory is managed. Most people don't even understand this, including many gamedevs. Traditionally we have had to keep two copies of everything in RAM: one texture loaded from storage into main memory, and another copy stored on the GPU; the same goes for geometry, and sometimes even a third, lower-detail copy of the geometry is kept in RAM for the physics engine to work on. The copy in main RAM is kept ready to shove down the GPU pipeline, and the resource manager tracks which assets can be retired and which will be needed, to prevent cache misses. That's a HUGE cost in total RAM. Traditionally this bus bandwidth has been a prime limitation on interactivity. Shader programs exist because we couldn't manipulate video RAM directly (they were the first step on the return to software rasterizer days, where the physics, logic and graphics could all interact freely). Shoving updates to the GPU is costly, but reading any data back from the GPU is insanely expensive. With a shared memory architecture we don't have to keep that extra copy of the assets, so even without an increase in CPU/GPU speed, fully shared memory by itself would practically double the amount of geometry and detail the GPU could handle. The GPU can directly use what's in memory and the CPU can manipulate some GPU memory directly. It means we can compute stuff on the GPU and then readily use it to influence game logic, or vice versa, without paying a heavy penalty in frame time. The advance in heterogeneous computing should be amazing, if anyone knew what to do with it.
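
            The "no second copy" point can be illustrated with a toy analogy in Python: two views of one buffer instead of a buffer plus an uploaded copy. This is only an analogy for the unified-memory idea, not console code:

                # Unified memory analogy: two "views" aliasing one buffer, no upload step.
                asset = bytearray(16)             # pretend this is a texture in main RAM
                cpu_view = memoryview(asset)      # "CPU side" access
                gpu_view = memoryview(asset)      # "GPU side" access: same bytes, zero copies
                cpu_view[0] = 0xFF                # one side writes...
                print(gpu_view[0])                # ...the other view sees it immediately: 255

                # Split-memory model analogy: the same data costs 2x RAM and every
                # update must be copied ("uploaded") before the other side sees it.
                gpu_copy = bytes(asset)           # the explicit upload
                cpu_view[1] = 0xAA
                print(gpu_copy[1])                # still 0: the copy is stale until re-uploaded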

            Ultimately I'd like to put the whole damn game in the GPU; it's not too hard on traditional memory model hardware (well, it's insane but not impossible): you can keep all the gamestate and logic in buffers on the GPU and bounce between two state buffer objects, using shaders to compute physics and update one buffer as input for the next physics and render pass, passing in a few vectors to the programs for control / input. I've even done this with render-to-texture, but debugging VMs made of rainbow colored noise is a bitch. The problem is that controller input, drives, and the NIC aren't available to the GPU directly, so I can't really make a networked game that streams assets from storage completely on the GPU alone; there has to be an interface, and that means the CPU feeding data in and reading data out across the bus, and that's slow for any moderate size of state I'd want to sync. At least with everything GPU-bound I can make particle physics interact with not just static geometry, but dynamic geometry, AND even game logic: I can have each fire particle be able to spawn more fire emitters if they touch a burnable thing, right on the GPU, and make that fire damage the players and dynamic entities; I can even have enemy AI reacting to the state updates without a round trip to the CPU if their logic runs completely on the GPU... With CPU-side logic that's not possible; the traditional route of read-back is too slow, so we have particles going through walls and use something like "tracer rounds", a few particles (if any) on the CPU, to interact with the physics and game logic. With the shared memory architecture more of this becomes possible. The GPU can do calculations on memory that the CPU can read and apply to game logic without the bus bottleneck; the CPU can change some memory to provide input into the GPU without shoving it across a bus. The XBone and PS4 stand to bring a whole new type of interaction to games, but it will require a whole new type of engine to leverage the new memory model. It may even require new types of game. "New memory architecture! New types of games are possible!" Compared with GP: "Meh, it is a faster PS3, but otherwise it doesn't offer anything new." . . . wat?
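
            The ping-pong scheme described above has a simple shape; here is a minimal sketch of the double-buffered update in plain Python. A real version would keep both buffers on the GPU and run the update step in a shader; this only shows the pattern:

                # Ping-pong state update: read one buffer, write the other, then swap.
                state_a = [0.0, 1.0, 2.0, 3.0]    # e.g. particle positions
                state_b = [0.0] * len(state_a)
                read_buf, write_buf = state_a, state_b

                def update(src, dst, dt, velocity=1.0):
                    # "Physics pass": next state is a pure function of the previous state.
                    for i, x in enumerate(src):
                        dst[i] = x + velocity * dt

                for tick in range(3):
                    update(read_buf, write_buf, dt=0.016)
                    read_buf, write_buf = write_buf, read_buf   # last output becomes next input

                print(read_buf)   # positions after 3 ticks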

            As a cyberneticist, all these folks wanking over graphics make me cry. The AI team is allowed 1%, or maybe 2%, of the budget. All those parallel FLOPS! And they're just going to PISS THEM AWAY instead of putting in actual machine intelligence that can yield more dynamism or even learn and adapt as the game is played? You return to town and the lady pushing the wheelbarrow is pushing that SAME wheelbarrow the same way. The guy chopping wood just keeps chopping that wood forever: beat the boss, come back, still chopping that damn wood! WHAT WAS THE POINT OF WINNING? The games are all lying to you! They tell you, "Hero! Come and change the world!", and then you win and proceed straight to game over. Where's the bloody change!? Everything just stays pretty much the same!? Start it up again, you get the same game world? Game "AI" has long been a joke; it's nothing like actual machine learning. It's an indication of a noob gamedev when they claim their AI will learn using neural networks, and we'd all just laugh or nod our heads knowingly, but I can actually do that now, for real, on the current and this new generation of existing hardware... if the AI team is allowed the budget.

            A game is not graphics. A game is primarily rules of interaction; without them you have a movie. Today's AAA games are closer to being movies than games. Look at board games or card games like Magic the Gathering -- it's a basic set of rules plus some cards that add a massive variety of completely new rules to the game mechanics, so the game is different every time you play. I'm not saying make card games. I'm saying that mechanics (interaction between players, the simulation and the rules) are what a game is. Physics is a rule set for simulating, fine; you can make physics games and play within simulations, but a simulation itself isn't really a game -- at the very least a world's geometry dictates how you can interact with the physics. Weapons, some spells, item effects, and other things might futz with the physics system, but it is very rare to see a game that layers on rules dynamically during the course of play in a real-time 3D game the way that paper-and-dice RPGs or even simple card games do. League of Legends does a very good job of adding new abilities that have game-changing ramifications, and the dynamic is great because of it, but that's a rare example and is still not as deep as simple card games like MtG. It's such a waste, because we have the RAM and processing power to do such things, but we're just not using it.

            I love great stories, but it looks like all the big-time studios are fixated on making these interactive movies to the exclusion of what even makes a game a game: the interaction with various rule logic. AAA games are stagnant in my opinion; it's like I'm playing the same game with a different skin, maybe a different combination of the same tired mechanics. The asset costs and casting, scripts, etc. prevent studios from really leveraging the amazing new dynamics and logic detail that are available in this generation of hardware, let alone next-gen hardware with shared memory architectures. IMO, most AAA studios don't need truly next-gen hardware because they don't know what the fuck to do with it -- mostly because they've been using other people's engines for decades. These 'next-gen' consoles ARE next gen in terms of the game advancement they enable, even rivaling PCs in that regard, but no one is showing them off. I hope that changes. Most folks are scratching their heads and asking, "How do I push more pixels with all this low latency RAM?" and forgetting that pixels make movie effects, not games. I mean, I can run my embarrassingly parallel n.net hive on this hardware, and give every enemy and NPC its own varied personality where the interactions with and between them become deeper and more nuanced than Dwarf Fortress, and the towns and scenarios and physics interactions more realistic, or whimsical, or yield cascades of chaotic complexity... but... Dem not nxtGen, cuz MUH PIXZELS!!1!!1

            The enemies and NPCs in your games are fucking idiots because "AI" and rules are what games are made of, and the AI team is starving to death while watching everyone else gorge themselves at the graphics feast. It's ridiculous. It's also pointless. So what if you can play Generic Army Shooter v42 with more realistic grass? Yeah, it's nice to have new shooters to play, but you're not getting the massive leap in gameplay. You could be protecting the guys who are rigging a building to crush the enemies as you retreat and cut off their supply lines. No, the level of dynamism in an FPS today is barely above that of a team of self-interested sharpshooters honing their bullseye ability. It's boring to me. Great, I'm awesome at shooting while running now. So fucking what. Protip: that's why adding vehicles was such a big deal in FPSs -- that was a leap in game mechanics and rules. I'm picking on FPSs, but I can level the same criticism at any genre: there's little in the way of basic cooperative strategy (cooperative doesn't have to mean with other players: instead of re-spawning, why not switch between the bodies of a team, having them intuitively carry on the task you started when you're no longer in that body?). We barely have any moderate complexity available in strategy itself, let alone the manipulation of new game rules on the fly for tactical, logistical, or psychological warfare. How many pixels does it take to cut off a supply line, or flank your enemies?

            • by AmiMoJo (196126) *

              The reason people don't think 1080p is "next gen" is because PC gaming moved on from it years ago. If you look at most of the hardware review sites they test cards at 4k or 1440p with everything on maximum detail, way beyond the current crop of consoles.

              I think people expected 1080p as the absolute minimum, since that's what PC gamers expect. Even low end cards can run most stuff on medium detail in full HD.

              • by donaldm (919619)

                The reason people don't think 1080p is "next gen" is because PC gaming moved on from it years ago. If you look at most of the hardware review sites they test cards at 4k or 1440p with everything on maximum detail, way beyond the current crop of consoles.

                I think people expected 1080p as the absolute minimum, since that's what PC gamers expect. Even low end cards can run most stuff on medium detail in full HD.

                Most HDTVs have an aspect ratio of 16:9 and support 720p (1280x720 pixels) and 1080i/p (1920x1080 pixels). Now there are what are commonly called 4K HDTVs, and most of these also have an aspect ratio of 16:9, so a little bit of arithmetic will give you 3840x2160 pixels, or 2160p. At the moment 4K HDTVs are more expensive than 1080p HDTVs, although the price is dropping; however, while the difference between standard definition and HDTV is very obvious, especially when screen sizes can be over 80 inches (16:9 a

            • Funny, my desktop PC runs games at 2560x1600 = 4,096,000 pixels, effectively double the 1080p pixel count again. The performance hit is obvious going to that rez, but considering this hardware cost about $700 three years ago... Admittedly I haven't tried Titanfall yet; it just doesn't look that interesting to me. I can tell you that Battlefield 4 runs at High settings and looks absolutely beautiful at high resolution.
      • by Lumpy (12016)

        Current gen does NOT aim at 1080p. The XB1 is not capable of 1080p.

      • PCs have been capable of 2048x1536 at 100Hz since the Xbox 360 came out. Expecting consoles to be "just" a decade behind PCs seems reasonable.

    • by guises (2423402) on Sunday April 13, 2014 @02:23AM (#46738551)
      It's not about releasing an underpowered console, it's about focusing on performance as a selling point. The Wii U can't do what either of them can graphically, but it's the only one I actually want. No DRM bullshit, no ads, no camera in my living room, the games are actually fun, off screen play... I'm getting a little sick of people treating this like it's a two horse race.
      • by FlyHelicopters (1540845) on Sunday April 13, 2014 @02:53AM (#46738667)

        The Wii U is nice in many ways; we own one. My 8-year-old son and 5-year-old daughter love Super Mario 3D World.

        It shows that graphics are nice, but not everything, great games are great games, on any console.

        The problem with the Wii U is that it is WAY overpriced for what it is. It just isn't selling, and the time to get it selling has probably passed; there's nothing Nintendo can do about it at this point.

        I recently bought an Amazon Fire TV, and frankly, it has some really nice games on it that look just as nice as most of what is on our PS3. My son has been playing the tower defense game that comes with it and has been having just as much fun with it as with anything else.

        For a $99 device that is really meant for watching TV, that may be the real threat to the PS4/XB1: if a $99 device is "good enough", how much demand is there for $500 game consoles?

        Some, to be sure... but the price needs to come down.

        • by aliquis (678370)

          This one just in:

          ARM is starting to become more competitive and some people think their portables are adequate ;D

          • Yep... Take a look at "Real Racing 3" on the iPad.

            One of my son's favorite iPad games, it looks just as good as anything on the PS3 and it runs on a tablet.

            Until we get something "new" in games, there will be a limit to how much "better graphics" can sell new systems. They are approaching "good enough" for most people, at least until something changes such as the world becoming bigger, or something else about the actual game play being new.

    • by naff89 (716141)

      This is my major sticking point, too. I upgraded to a PS2 for DVD and component video, and I upgraded to a PS3 for BR and HDMI. So I could get a PS4 and have... BR and HDMI?

      Not to mention that my PS2 played PS1 games and my PS3 played PS1/PS2 games, meaning that each time I could just swap the console out and keep my current library -- I always had great games to play on them.

      • Don't care about running PS3 games on a PS4; I have a PS3 for that, and it's upgraded to 500 GB too. So with PS+ and the free games, there's loads of games for the PS3. The PS4's days are early, so patience, it's not even been 6 months.

        • Nothing wrong with that, you may be happy to have a PS3 next to a PS4.

          Rest assured not everyone is ok with that, many parents who are suffering from gadget overload (raises hand) don't want another box.

          As it stands, we have too many, we recently canceled DirecTV to cut down on the boxes and devices, using our Roku 3 boxes (now Amazon Fire TV boxes for the parental controls) to watch TV because they are faster than the PS3.

          We keep the PS3s because the kids have a game library they play and because they are o

      • Thank you for bringing up the backwards compatibility issue...

        Our two main TVs each have a PS3 on them; they serve the dual purpose of being a BR player and a game console.

        We are NOT going to have a PS4 sitting next to a PS3, we just aren't... there is already too much in front of our TVs.

        If the PS4 had the ability to play PS3 games, I'd have bought one already (I had one on preorder with Amazon and canceled a few weeks before launch).

        The PS4 simply doesn't offer enough to add a completely new game console. Be

    • by Dutch Gun (899105)

      Neither console is really "next-gen", that would have been 4K resolution.

      I would have been happy with true 1080p resolution. How many people actually have 4K TVs at this point? Not nearly enough to support that in a console designed for the masses, at least. 4K is pretty demanding even for PC videocards. That would have pushed the cost up by several hundred bucks with absolutely no benefit for the majority of customers.

      Still, it's not like we could have expected the same massive leaps in visual quality from previous generations. After all, the 360/PS3 generation was already

      • How many people actually have 4K TVs at this point?

        Not many, but that will change. Within 3 years the price difference between a 4K and a 1080p TV will be pretty small.

        Last year they were $10K plus, today you can get one for about $1K, give or take, and a big huge 65" one for $3K.

        The prices are dropping fast, simply because they don't really cost that much more to make than current TVs.

        The switch from tube TVs to LCD TVs did indeed have a MASSIVE cost; whole factories had to be thrown out and completely new factories built.

        But a 84" 4K TV is really just 4 42

        • by drinkypoo (153816)

          Do we need another optical disc format for 4K? No, not really, streaming video will probably be enough and frankly BR has enough room that if they want to do it with H.265, they could.

          We don't need a new optical disc, but we will at minimum need extensions to the format. Many consumers still want to buy a physical thing, and also, putting physical things in front of people still makes them buy shit they don't even really want, so they're going to keep doing that into the 4K era.

          On the other hand, it's easy to see lots of companies just not bothering to release 4K content, because making 1080P is enough of a PITA compared to 480P

    • Re: (Score:3, Interesting)

      by aliquis (678370)

      FWIW, 1080p games, possibly with more detail and AA, would still of course be nicer than 720p.

      Xbox, 2001-2002. 64 MB 200 MHz DDR shared graphics memory, 733 MHz PIII-ish CPU, 233 MHz NV2A GPU.
      Geometry engine: 115 million vertices/second, 125 million particles/second (peak)
      932 megapixels/second (233 MHz × 4 pipelines), 1,864 megatexels/second (932 MP × 2 texture units) (peak)
      (CPU: one random page says 3 GFLOPS; GPU? Nvidia supposedly claimed 80, some Xbox book says 22 in total.)

      Xbox 360, 2005-2006, 512 MB 7

      • by aliquis (678370)

        I guess one thing that may have changed is what counts as the average "PC gamer". With so many people having moved to more portable hardware, a "serious PC rig" now gets compared to the best out there, whereas previously the consoles were compared to the kind of PC more people actually had.

        AKA compare the Xbox One to a tablet and you won't be disappointed by the performance...

        EA made their claim how the PCs couldn't run the new FIFA 14 eng

    • by batkiwi (137781)

      Ignore 4k. That's a long ways off.

      EVERY game on the ps4/xbone should have been REQUIRED to be 1080p/60fps.

      If you can't achieve that, your game doesn't pass QC. If you can't achieve that, then you turn down some details.

      Due to THAT I don't consider either a "nextgen" (eg 'current-gen') console.

      For the same price I bought a PC (in Australia where we pay more for everything!) that can play titanfall at 1080p at 60fps with everything reasonable turned on. As well as every other game. No I don't run 16xFSAA,

    • by drinkypoo (153816)

      The PS4 is better, being 50% faster (thanks to 50% more GPU resources), but it isn't THAT much better. Neither console is really "next-gen", that would have been 4K resolution.

      Except that only a vanishing few former-gen titles were actually 1080p, and basically none of them ran at a smooth 1080p60. I'd have settled for 1080p60, but they're not even providing that. My guess is that next generation they'll support 4K... badly.

      • by AmiMoJo (196126) *

        NHK are aiming to start 8K broadcasts in 2020 (in time for the Olympics), so I expect TV manufacturers will have models out by then. Judging by how long we had to wait for the current generation of consoles, it seems likely that by the time the next generation comes, 4K will be standard and 8K will be where PC gaming is at.

    • by firex726 (1188453)

      Also of note...

      The PS1 to PS2 jump added DVD support at a time when TVs had only one input.
      Then PS2 to PS3 added Blu-ray support at a time when a standalone player could cost as much as the console itself.

      • Yes, and when the PS2 came out, it also wasn't that much more than a DVD player, and you got a console for free. :)

        The PS3 was an easy choice, we bought ours at launch in 2006 because it indeed was the cheapest BR player out there. For $600 we got the best BR player on the market, a next-gen console that was clearly better than the PS2, AND it played ALL our PS2 and PS1 games.

        If the PS4 was $499 instead of $399 and it included full PS3 compatibility, I'd buy one, as it stands, I'm not interested in another

    • by Shinobi (19308)

      "The PS4 is better, being 50% faster (thanks to 50% more GPU resources),"

      Actually, what makes the PS4 better is not the extra GPU resources, it's that they use GDDR5, so the system won't be starved for memory bandwidth. If the PS4 had used the same type of RAM as the XB1, it would have been as starved, with the same drop in performance. It's one of the well-known drawbacks of Unified Memory designs.

      • The GDDR5 indeed does help... much more memory bandwidth...

        That being said, the PS4 also has 50% more shaders than the XB1, so it is faster all the way around.

        In general, the XB1 can run at 720p while the PS4 can run at 900p, given all the same graphics factors. It isn't fast enough to run the same games at 1080p, but it is close.

    • by whiplashx (837931)

      What you're complaining about is that the growth isn't linear. But all of the improvements you've pointed out have seemed "smaller" than the last. Imagine if we could get the kind of improvement the SNES had over the NES again. But that sort of thing just isn't possible in modern games; the required complexity of the art grows way faster than the required complexity of the hardware.

      • The irony is that the SNES and the NES have the same CPU in them, just faster...

        If Nintendo had wanted to, it would have been easy to make the SNES play NES games...

  • "According to Titanfall developer Jon Shirling, the new Microsoft API isn't needed to improve the game's performance, and updates coming down the pipe should improve Xbox One play in the near future. This confirms what many expected since DX12 was announced — the API may offer performance improvements in certain scenarios, but DX12 isn't a panacea for the Xbox One's lackluster performance compared to the PS4." How is the ability of devs to improve their product through convential optimizations a CONF
    • by Zxern (766543)

      All this really tells you is that Titanfall isn't really a next-gen title at all, which should be obvious since it was originally planned for the 360 to start with and will be released on it shortly. It runs about as well as your typical console-to-PC port, which is basically what it is: an Xbox 360 to Xbox One port.

  • by _Shorty-dammit (555739) on Sunday April 13, 2014 @02:04AM (#46738519)

    Only they're also known targets, and should be easy to program for as a result. Performance at 1920x1080 shouldn't be an issue for any title on the hardware available. It boggles the mind how poor these developers must be if they can't even target known hardware, console-style, and get good performance out of the thing. Average PC game devs don't seem to have any problem doing so on the PC, and that's a moving target. Why would any competent devs have a problem with a fixed target? They've got decent CPUs. They've got decent GPUs. They've got a decent amount of RAM. Yet they found a way to get horrible performance out of it. Send in the firing squad.

    • Yes and no (Score:4, Insightful)

      by Sycraft-fu (314770) on Sunday April 13, 2014 @03:52AM (#46738779)

      So they are a bit different, hardware-wise. A big difference is unified memory. There is only one pool of memory which both the CPU and GPU access. That makes sense since the CPU and GPU are also on the same silicon, but it is a difference in the way you program. Also, in the case of the Xbone they decided to use DDR3 RAM instead of GDDR5, which is a little slow for graphics operations, but the APU (what AMD calls its CPU/GPU combo chips) has 32MB of high-speed embedded RAM on it to try and buffer for that.

      Ok, so there are some differences. However, that aside, why the problem with the target? Visual quality. Basically, a video card can only do so much in a given time period. It can only push so many pixels/texels, only run so many shaders, etc. So any time you add more visual flair, it takes up available power. There's no hard limit, no amount where it stops working; rather, you have to choose what kind of performance you want.

      For example if I can render a scene with X polygons in 16ms then I can output that at 60fps. However it also means that I can render a scene of 2X polygons in about 33ms, or 30fps.
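
      To put rough numbers on that budget (the per-scene cost below is a made-up figure purely for illustration):

          # Frame-time budgets: 60 fps leaves ~16.7 ms per frame, 30 fps ~33.3 ms.
          def budget_ms(fps):
              return 1000.0 / fps

          scene_cost_ms = 25.0   # assumed cost to render the scene once
          for fps in (60, 30):
              b = budget_ms(fps)
              verdict = "fits" if scene_cost_ms <= b else "too heavy"
              print(f"{fps} fps budget = {b:.1f} ms; a {scene_cost_ms} ms scene {verdict}")
          # A scene that misses the 16.7 ms budget can still ship at 30 fps,
          # or be made cheap enough (less detail, fewer pixels) to fit.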

      So FPS is one tradeoff you can make. You don't have to render at 60fps, you can go lower and indeed console games often do 30fps. That means each frame can have more in it, because the hardware has longer to generate it.

      Another tradeoff is resolution. Particularly when you are talking texture related things, lowering the output resolution lowers the demand on the hardware and thus allows you to do more.

      So it is a tradeoff in what you think looks best. Ya, you can design a game that runs at 1080p60 solid. However it may not look as good overall as a game that runs at 720p30 because that game, despite being lower FPS and rez, has more detail in the scenes. It is a choice you have to make with limited hardware.

      On the PC, we often solve it by throwing more hardware at the problem, but you can't do that on a console.

      • On the PC, we often solve it by throwing more hardware at the problem, but you can't do that on a console.

        I think the OP's point was that they should have been starting with this extra hardware to begin with.

        • Hardware costs money. If you want cheap consoles, you have to trade things off. For example my PC has no problems rendering games like Titanfall at 60fps, even at resolutions beyond 1080 (2560x1600 in my case). So, just put that kind of hardware in a console right? Ya well, my GPU alone costs near double what a current console does, never mind the supporting hardware. It isn't feasible to throw that level of hardware at a console, it just costs too much.

          That kind of thing has been tried in the past and it n

    • They have really anemic CPUs. The PS4 and Xbox One are each using something pretty similar to the Athlon 5150 (except with 4 modules/8 cores instead of 2 modules/4 cores).
      • by Narishma (822073)

        The consoles have two modules, with each module consisting of 4 cores and a shared 2 MiB L2 cache. The Athlon 5150 only has one module.

        • AMD defines a module as a set of 1 FPU and 2 integer cores. The Athlon 5150 has two modules/four integer cores. The consoles have two of these two module/four integer core things for four modules/eight cores.
          • by Narishma (822073)

            You are incorrect. The consoles use Jaguar [wikipedia.org] modules, as opposed to the Bulldozer [wikipedia.org] family, which is what you describe. The Athlon 5150 is also Jaguar BTW.

            • You're right, total brainfart on my part. I knew they were Jaguars (hence anemic), but I was thinking jags were put together the same way as the Bulldozers. Still--my point was that it's an Athlon 5150 with more cores (same speed, architecture), which really isn't enough to feed modern games at 1080p.
              • by Narishma (822073)

                It may not be fine with the bloated APIs (OpenGL and D3D) and unoptimized games on PC but on a console with low level access to the hardware, it's more than enough. The lack of 1080p games on Xbox One (I believe the only non-1080p game on PS4 is BF4) is mostly due to its middling GPU.

    • Optimising an engine like that is a non-trivial exercise, especially with newish hardware. So no, they're not crap developers, they're developers with time and financial constraints who can only achieve so much before release.
    • by Nemyst (1383049)
      There's another reason that Sycraft-fu's reply didn't mention: the development of those games started before the consoles were out. This means that the targets, while known, were also moving. Specs changed a few times and I'll bet the APIs changed significantly over the course of the past few years. That makes it quite hard to properly implement the graphics engine for the console. This is why, as developers get more familiar with the API and hardware, we see graphics quality keep improving on the same hard
    • by Sir_Sri (199544)

      It boggles the mind at how poor these developers must be if they can't even target known hardware, console-style, and get good performance out of the thing.

      It boggles the mind why Microsoft put shitty laptop CPU RAM in a gaming device.

      The devs are trying to find a balance point between visual quality (memory used) and performance (memory bandwidth), but the 68 GB/s memory bandwidth on the XB3 is way too low. IMO the 175-ish GB/s on the PS4 is too low too. For 30 FPS, remember, that only means you can touch about 2 GB of data per frame; for 60... well, 1 GB. (That's not counting AI and audio.)

      Yes, sure, the dev's need to make a game for it, but that's re
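
      Those per-frame figures are just bandwidth divided by frame rate; a quick sketch using the numbers quoted above (real frames re-read the same data many times, so the practical budget is far smaller than this upper bound):

          # Upper bound on data the GPU can touch per frame, given memory bandwidth.
          bandwidth_gb_s = {"Xbox One (DDR3)": 68.0, "PS4 (GDDR5)": 175.0}
          for name, bw in bandwidth_gb_s.items():
              for fps in (30, 60):
                  print(f"{name}: about {bw / fps:.1f} GB of traffic per frame at {fps} fps")
          # 68 GB/s works out to roughly 2.3 GB per frame at 30 fps and 1.1 GB at 60 fps.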

  • by Anonymous Coward
    The point of Mantle and DX12 is that they bring the kind of APIs to the desktop that have always been available on consoles. Therefore the whole premise that DX12 would somehow improve Xbone games is faulty.
  • Shouldn't that be prefaced with 'Former'? I'm sure Microsoft has seen to it that this person isn't involved in any more Xbox development for knocking their 'OMG turn it up to 12!' rehash of D3D.

    • by Nemyst (1383049)
      Titanfall was developed by Respawn and published by EA. Respawn are an independent developer, not owned by either EA or Microsoft. Microsoft have absolutely no control over those developers apart from having an exclusivity deal with Respawn/EA.
  • I thought I read here on /. that M$ would not make any version of DirectX after DX11?
  • by Sir_Sri (199544) on Sunday April 13, 2014 @02:06PM (#46741525)

    If you look at the Mantle benchmarks for various games, it's pretty clear that it doesn't get you much on half-decent systems, and on high-end systems you're looking at a negligible effect. I would think the same is true of DX12, which does the same basic thing.

    For all the complaining about the Xb3, it's not terrible hardware; it makes some odd choices compared to the PS4, and it's slow compared to a high-end PC. But it's not, in an absolute sense, bad hardware.

  • Maybe I'm just old (26, lol) but I remember when games that were designed for static resources on a console worked just fine. You had an N64 and your game ran on an N64 and all N64s were similar. Now they're building Xbox One games like it's a computer except there's nothing you can do to raise the performance. So the console and things like DX11 and 12 are complicated...so what? Make your game run on the damn hardware before releasing it. On my own gaming PC I ALWAYS put gameplay, speed, and high frame
    • As someone noted in another comment [slashdot.org], developers were shown hardware that was much faster than the actual Xbox One, creating a rather awkward situation where they were making games that couldn't run on the actual machine without some cuts and/or issues. Interestingly enough, the N64 apparently had similar issues. [youtube.com]
