Titanfall Dev Claims Xbox One Doesn't Need DX12 To Improve Performance
MojoKid writes: "One of the hot topics in the wake of Titanfall's launch has been whether or not DirectX 12 would make a difference to the game's sometimes jerky framerate and lower-than-expected 792p resolution. According to Titanfall developer Jon Shiring, the new Microsoft API isn't needed to improve the game's performance, and updates coming down the pipe should improve Xbox One play in the near future. This confirms what many expected since DX12 was announced — the API may offer performance improvements in certain scenarios, but DX12 isn't a panacea for the Xbox One's lackluster performance compared to the PS4. It's an API that appears to mostly address scenarios where the CPU isn't able to keep the GPU fed due to draw call bottlenecks."
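Back-of-the-envelope, the draw-call point looks like this; the per-call cost and frame budget below are assumed, illustrative numbers, not measurements from any real driver:

```python
# Rough sketch: why draw-call (CPU) overhead can starve the GPU.
# The per-call cost below is an assumed, illustrative figure.

FRAME_BUDGET_MS = 1000.0 / 60          # ~16.7 ms per frame at 60 fps
CPU_COST_PER_DRAW_MS = 0.005           # assumed driver/validation cost per call

def cpu_ms_for_draws(draw_calls: int) -> float:
    """CPU time per frame spent just issuing draw calls."""
    return draw_calls * CPU_COST_PER_DRAW_MS

for calls in (1_000, 3_000, 10_000):
    ms = cpu_ms_for_draws(calls)
    print(f"{calls:>6} draw calls: {ms:5.2f} ms "
          f"({100 * ms / FRAME_BUDGET_MS:5.1f}% of a 60 fps frame)")
# Past some call count the CPU burns the whole frame on submission and the
# GPU sits idle: exactly the case a thinner API like DX12 is meant to help.
```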
Titanfall's pros and cons (Score:1)
Titanfall is a splendid game making its debut on the Xbox One; however, the jerky framerate and lower-than-expected 792p resolution mentioned above have caused this Xbox One exclusive to further lower gamers' opinion of the console. While exclusive games usually promote a console's launch, this one fails to do so...
Re:Titanfall's pros and cons (Score:5, Insightful)
If the framerate is jerky then they didn't plan the game properly.
There's no excuse on a console where you know the exact resources available, right down to individual clock cycles.
Re: (Score:3)
Good grief.
Re: (Score:1)
They had to wait in line like everybody else for Microsoft to actually ship the things.
Re:Titanfall's pros and cons (Score:5, Informative)
MS pulled a fast one at E3, where they used high-end PCs to demo the Xbox One.
IIRC MS later claimed that these were "representative" and also used for development. However, if these were the machines the devs were using to develop their game, it's no wonder they exceeded the available resources on the console.
http://www.techpowerup.com/185... [techpowerup.com]
Re:Titanfall's pros and cons (Score:5, Interesting)
Those machines were representative of what dev teams - even internal MS dev teams - were told the console would be capable of at launch.
XB1 Platform team MASSIVELY screwed the pooch on those promises and what the teams got was a gaming console that was startlingly bad at drawing verts and triangles. Some titles had to smash practically all non-hero asset shaders down to early DX9 era diff / spec / norm, and cut scene vert budgets to levels on par with Far Cry (the first one, with no plot and bad voice acting) and RtCW.
So, yeah. Don't blame the game dev and art teams on this one. Blame the platform team and upper management that promised hemorrhaging-edge next-gen capability and handed the game teams a TiVo with a graphics card soldered to it.
Re: (Score:3)
"XB1 Platform team MASSIVELY screwed the pooch"
No, the XB1 executives who neutered the platform to increase profit margins are at fault. There is NO EXCUSE for the platform not to do 1080p, with enough horsepower to easily handle everything at that resolution. The platform team had their hands tied by a bunch of idiots in suits telling them to make it cheaper so they could make more profit off the device.
It wasn't profit (Score:5, Interesting)
They thought the price of GDDR5 was going to be so high the console would sell for more than people could pay. Remember the $699 3DO? No. There's your answer.
They tried to make up for it by putting 32 megs of high-speed on-die cache on the chip, but again screwed up. The cache was expensive and took up die space that Sony used for more GPU compute units.
So yeah, it was a money decision, but it wasn't about profit, it was about making a console people could afford. Both companies guessed, and Microsoft guessed wrong.
Nope (Score:3)
From what I have heard it was not a money decision but one of availability. They didn't think enough GDDR5 would exist to supply production of one console, let alone two, so they stuck with a more mature technology. So yeah, they both guessed, but they were both also playing chicken, and MS flinched. Even today, if BOTH companies used GDDR5, are you certain that it would not delay console production?
Re: (Score:2)
But yeah, that's terrible. This isn't why the console market is brutal, but these shenanigans don't help.
Also, Microsoft's shooting-self-in-foot technique presents such a fragmented approach to putting out a serious product.
Just....words.....you know what I mean?
Re: (Score:3)
If the framerate is jerky then they didn't plan the game properly.
I was going to say "or test it properly", but with the number of glaring bugs I see in games these days I'm starting to think that publishers are taking the sell-it-anyway approach.
Re: (Score:2)
If the framerate is jerky then they didn't plan the game properly.
There's no excuse on a console where you know the exact resources available, right down to individual clock cycles.
Well if you plan your game based on specifications that were a complete lie, then yes, yes you have an excuse.
The Xbox One and PS4 were overhyped and under-delivered. They've handed this generation to the lacklustre Wii U, which, despite not being a big improvement, actually does what Nintendo said it would.
Re: (Score:3, Funny)
I've played both and I'm actually partial to the XOne version. The game "feels" better designed for a gamepad vs mouse/keyboard, and it has pacing that's better suited to relaxing on the couch. With the titans, the fast twitch you get with the mouse isn't as big a deal, and the wall-running stuff lends itself better to a controller. I think a lot of FPS stuff fails to translate to console, but some of them can be quite good if the devs think about it beyond "right stick == mouse look".
Re: (Score:2)
Then it's a good thing this isn't an exclusive release.
The trolls shall say.... (Score:1, Informative)
Xbone just sucks compared to the PS4 so it is no wonder the system can't run the game well.
Well, I can't say I'm upset about not having an Xbone; if I really wanted this game, I think the PC version would be better anyway, with a decent video card at least :)
Problem with releasing an underpowered console (Score:5, Interesting)
Ok, fair enough, the XBox One is a vast improvement over the XBox 360 in many ways...
But it isn't SO much of an improvement that it is drop dead obvious.
The PS1 to PS2? Clear as day, just compare FF7 to FFX.
The PS2 to PS3? Likewise, clear as day, compare FFX to FF13.
How about before the PS1? SNES? Really, do I have to compare FF2 to FF7? :)
The XBox (original) to XBox 360, night and day...
The XBox One? Meh... it is nice, but it can't even play 1080p games, 10 years after 1080p really started to come out in any numbers.
The PS4 is better, being 50% faster (thanks to 50% more GPU resources), but it isn't THAT much better. Neither console is really "next-gen", that would have been 4K resolution.
Both are "fine", but fine just isn't going to cut it.
Re: (Score:3)
The last gen (X360 & PS3) aimed at 720p native.
The current gen (Xbone & PS4) aims at 1080p native.
It seems a bit exaggerated to expect them to aim at UHD 2160p or 4k.
Just have a look at the electrical and transistor power that a PC actually needs to achieve that.
Of course, they can still output 2160p, but only with simpler games. In the same vein, the original Xbox had several 1080i games.
Re: (Score:1)
The last gen (X360 & PS3) aimed at 720p native.
That's BS. Microsoft and Sony fanboys mocked the Wii for targeting 720p. According to them they had all the games in glorious 1080p while Wii peasants didn't have real HD.
1080p was the selling point of both 360 and ps3.
Re:Problem with releasing an underpowered console (Score:5, Interesting)
Yes... In fairness, not all 1080P is equal, the PS4/XB1 can of course have more detail at the same resolution as the older consoles, but to the average person just looking at them, they are all "fine".
I showed my wife the PS4 when it came out, side by side with the PS3 (we own 2 of them), and yes, she said, "Yeah, the PS4 looks nicer, but are the games any better?"
Eh, they are of course more of the same, nothing has really changed.
This is of course a problem... :)
Re:Problem with releasing an underpowered console (Score:5, Informative)
Correction: The Wii was 480p [wikipedia.org], not 720p.
Re: (Score:2)
Correction: The Wii was 480p [wikipedia.org], not 720p.
Well, if you live in the USA or another country that uses the NTSC standard, then "yes"; however, other countries use the PAL standard, which is 576i/p (768×576), so the Wii can output a higher resolution there.
Re: (Score:2)
What? The Wii can't even output at 720p. It's max is 480p.
Hey Yankee, the world beats you again (Score:2)
The rest of the planet got 576p, sucks to be you at 480p.
Re: (Score:1)
By upscaling the 480p signal. Woooooo
Re: (Score:2)
Which was all the more odd when so many games still come out in only 720.
Seems like it takes a re-release to ensure it happens, otherwise it'll be 720 for the release copy.
Re: (Score:1)
Judging by the summary you surely mean: The last gen (X360 & PS3) aimed at 720p native. The current gen (Xbone & PS4) aims at 792p native.
Wrong.
Most PS4 games aim for (and achieve) 1080p, e.g. Knack, COD Ghosts, Assassin's Creed 4, Killzone SF SP, Infamous SS, Outlast, Tomb Raider DE, Metal Gear Solid: GZ.
Not sure what XB1 is aiming for, as most of the games listed above - the ones that were MP anyway - vary between 720p, 900p &, in a rare case, 1080p. Titanfall is the only XB1 game that has that odd-man-out resolution of 792p.
I would say, respectfully, please club things together only when they make sense.
Re: (Score:2)
The PS3 plays a lot of games at 1080p native...
There is nothing wrong with the PS4/XB1, other than for $400/$500, they don't really offer anything new.
PS1 was the first major 3D console, it was a massive improvement over the SNES.
The PS2 offered DVD, vastly upgraded graphics, etc.
The PS3 offered Blu-Ray, 1080p, and the first serious online console (from Sony).
The PS4? Meh, it is a faster PS3, but otherwise, it doesn't offer anything new.
Re: (Score:2)
Um...The PS3 renders very few games at 1080p native. Maybe a dozen titles out of the entire catalog.
Re: (Score:3)
More than that. One list [playstation.com] I found lists 33 1080p native games, and the list may not be complete (It lists upscaled games as well, which I edited out, but doesn't include any of the Final Fantasy titles, all of which I know are 1080p, although FF13, at least, is apparently upscaled). Here they are:
Bioshock
The Bourne Conspiracy
Call of Duty: World at War
Civilization Revolution
College Hoops 2K7
Fantastic Four: Rise of the Silver Surfer
Ferrari Challenge Trofeo Pirelli
FIFA 09
FIFA Street 3
Gran Turismo 5: Prologue
Re:Problem with releasing an underpowered console (Score:5, Informative)
Um...The PS3 renders very few games at 1080p native. Maybe a dozen titles out of the entire catalog.
Don't forget the other dimension. 1080 is only 360 more than 720, but 1920 is 640 more than 1280. IMO, that's the dimension we should be talking about, since it's more significant. However, per-pixel calculation load scales with area, not 1/2 the perimeter. So if we look at total pixels: 1280x720 = 921,600 pixels and 1920x1080 = 2,073,600 pixels, a difference of 1,152,000. A lot of people don't understand that going from 720p to 1080p is MORE THAN TWICE the pixels; in pixel shader costs you might as well be rendering a full secondary screen.
Now, that's not to say the total rendering cost will necessarily increase more than twofold. Full-screen effects like bloom or HDR are going to come in at about twice the cost. Interpolating a texture coordinate to look up pixel values is cheap compared to most any shader program, even something like cube-map specular highlights/reflections, bump mapping (I prefer parallax mapping), shadow mapping, etc. However, the complexity of geometry calculations can be the same at both resolutions. In a ported/cross-platform game the geometry assets are rarely changed (too expensive in terms of re-rigging, redoing all the animations, testing, etc.), so given slightly better hardware, a game at the same resolution will see its prime difference in adding more particle effects, increased draw distance, maybe even a few whole extra pixel shaders (perhaps the water looks way more realistic, or flesh looks fleshier, blood is bloodier, reflections are more realistic, etc.)
Jumping up to 1080p makes your pixel shader cost a lot more frame time. Developing for 1080p vs 720p would optimally mean completely reworking the graphics and assets and shaders to adapt to the higher shader cost, maybe cut down on pixel shader effects and add more detailed geometry. I encounter folks who think "1080 isn't 'next gen', 4K would have been next gen" -- No, that's ridiculous. 1080p is "next gen resolution", but the new consoles are barely capable of it while having a significant degree of increase in shader complexity from last gen, and we're seeing diminishing returns on increasing the resolution anyway. So, I wouldn't call the consoles quite 'next-gen' in all areas. IMO, next gen console graphics would handle significantly more shaders while running everything smoothly at 1080p, just like the above average gaming PC I got my younger brother for his birthday which kicks both PS4 and Xbone's ass on those fronts. That would be the sort of leap in graphics scale between PS1 and PS2 or Xbox and the 360. 4K would be a generation beyond 'next-gen' because of the way shaders must scale with resolution.
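Putting numbers on the above (the 16.7 ms is just an assumed 60 fps frame budget; the pixel counts themselves are exact):

```python
# Exact pixel counts for the two resolutions discussed above.
px_720p = 1280 * 720      # 921,600 pixels
px_1080p = 1920 * 1080    # 2,073,600 pixels

print(px_1080p - px_720p)            # 1,152,000 extra pixels
print(px_1080p / px_720p)            # 2.25x the per-pixel shading work

# If per-pixel cost dominates, an assumed 16.7 ms (60 fps) frame at 720p
# balloons to roughly 37.5 ms (~27 fps) at 1080p, all else being equal.
print(16.7 * px_1080p / px_720p)
```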
One of the main advances this new console generation brings is in the way memory is managed. Most people don't even understand this, including many gamedevs. Traditionally we have had to keep two copies of everything in RAM: one texture loaded from storage into main memory, and another copy stored on the GPU. The same goes for geometry, and sometimes a third, lower-detail copy of the geometry is stored in RAM for the physics engine to work on. The copy in main RAM is kept ready to shove down the GPU pipeline, and the resource manager tracks which assets can be retired and which will be needed, to prevent cache misses. That's a HUGE cost in total RAM. Traditionally this bus bandwidth has been a prime limitation in interactivit
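A toy model of that double-copy overhead; the asset sizes are invented purely for illustration:

```python
# Toy model of the "two copies of everything" overhead in a split-memory
# design versus a unified pool. Asset sizes are made-up placeholders.

textures_gb = 1.2        # assumed streamed texture working set
geometry_gb = 0.4        # assumed render meshes
physics_geo_gb = 0.1     # assumed low-detail collision copy for physics

split = (textures_gb + geometry_gb) * 2 + physics_geo_gb  # RAM copy + VRAM copy
unified = textures_gb + geometry_gb + physics_geo_gb      # one shared pool

print(f"split memory:   {split:.1f} GB")    # 3.3 GB
print(f"unified memory: {unified:.1f} GB")  # 1.7 GB
# With one address space the CPU and GPU touch the same bytes, so the
# duplicate copy (and the bus traffic keeping it in sync) goes away.
```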
Re: (Score:2)
The reason people don't think 1080p is "next gen" is because PC gaming moved on from it years ago. If you look at most of the hardware review sites they test cards at 4k or 1440p with everything on maximum detail, way beyond the current crop of consoles.
I think people expected 1080p as the absolute minimum, since that's what PC gamers expect. Even low end cards can run most stuff on medium detail in full HD.
Re: (Score:2)
The reason people don't think 1080p is "next gen" is because PC gaming moved on from it years ago. If you look at most of the hardware review sites they test cards at 4k or 1440p with everything on maximum detail, way beyond the current crop of consoles.
I think people expected 1080p as the absolute minimum, since that's what PC gamers expect. Even low end cards can run most stuff on medium detail in full HD.
Most HDTVs have an aspect ratio of 16:9 and support 720p (1280x720 pixels) and 1080i/p (1920x1080 pixels). Now there are what are commonly called 4K HDTVs, and most of these also have an aspect ratio of 16:9, so a little bit of arithmetic will give you 3840x2160 pixels, or 2160p. At the moment 4K HDTVs are more expensive than 1080p HDTVs, although the price is dropping; however, while the difference between Standard Definition and HDTV is very obvious, especially when screen sizes can be over 80 inches (16:9 a
Re: (Score:2)
Current gen does NOT aim at 1080p. The XB1 is not capable of 1080p.
Re: (Score:3)
PCs have been capable of 2048x1536 at 100 Hz since the Xbox 360 came out. Expecting consoles to be "just" a decade behind PCs seems reasonable.
Re:Problem with releasing an underpowered console (Score:4, Interesting)
The Wii U is nice in many ways, we own one. My 8-year-old son and 5-year-old daughter love Super Mario 3D World.
It shows that graphics are nice, but not everything, great games are great games, on any console.
The problem with the Wii U is that it is WAY overpriced for what it is. It just isn't selling, and the time to get it selling has probably passed; there's nothing Nintendo can do about it at this point.
I recently bought an Amazon Fire TV, and frankly, it has some really nice games on it that look just as nice as most of what is on our PS3. My son has been playing the tower defense game that comes with it and has been having just as much fun with it as with anything else.
For a $99 device that really is meant to watch TV with, that may be the real threat to PS4/XB1, if a $99 device is "good enough", how much demand is there for $500 game consoles?
Some, to be sure... but the price needs to come down.
Re: (Score:1)
This one just in:
ARM is starting to become more competitive and some people think their portables are adequate ;D
Re: (Score:3)
Yep... Take a look at "Real Racing 3" on the iPad.
One of my son's favorite iPad games, it looks just as good as anything on the PS3 and it runs on a tablet.
Until we get something "new" in games, there will be a limit to how much "better graphics" can sell new systems. They are approaching "good enough" for most people, at least until something changes such as the world becoming bigger, or something else about the actual game play being new.
Re: (Score:1)
This is my major sticking point, too. I upgraded to a PS2 for DVD and component video, and I upgraded to a PS3 for BR and HDMI. So I could get a PS4 and have... BR and HDMI?
Not to mention that my PS2 played PS1 games and my PS3 played PS1/PS2 games, meaning that each time I could just swap the console out and keep my current library -- I always had great games to play on them.
i like keeping both ps3/ps4 (Score:2)
Don't care about running PS3 games on a PS4, I have a PS3 for that, and it's upgraded to 500 gigs too. So with PS+ and the free games, there are loads of games for the PS3. The PS4's days are early, so patience, it's not even been 6 months.
Re: (Score:2)
Nothing wrong with that, you may be happy to have a PS3 next to a PS4.
Rest assured not everyone is ok with that, many parents who are suffering from gadget overload (raises hand) don't want another box.
As it stands, we have too many, we recently canceled DirecTV to cut down on the boxes and devices, using our Roku 3 boxes (now Amazon Fire TV boxes for the parental controls) to watch TV because they are faster than the PS3.
We keep the PS3s because the kids have a game library they play and because they are o
Re: (Score:2)
Thank you for bringing up the backwards compatibility issue...
Our two main TVs each have a PS3 on them; they serve the dual purpose of being a BR player and a game console.
We are NOT going to have a PS4 sitting next to a PS3, we just aren't... there is already too much in front of our TVs.
If the PS4 had the ability to play PS3 games, I'd have bought one already (I had one on preorder with Amazon and canceled a few weeks before launch).
The PS4 simply doesn't offer enough to add a completely new game console. Be
Re: (Score:2)
Neither console is really "next-gen", that would have been 4K resolution.
I would have been happy with true 1080p resolution. How many people actually have 4K TVs at this point? Not nearly enough to support that in a console designed for the masses, at least. 4K is pretty demanding even for PC videocards. That would have pushed the cost up by several hundred bucks with absolutely no benefit for the majority of customers.
Still, it's not like we could have expected the same massive leaps in visual quality from previous generations. After all, the 360/PS3 generation was already
Re: (Score:2)
How many people actually have 4K TVs at this point?
Not many, but that will change. Within 3 years the prices of a 4K and a 1080P TV will be pretty close.
Last year they were $10K plus, today you can get one for about $1K, give or take, and a big huge 65" one for $3K.
The prices are dropping fast, simply because they don't really cost that much more to make than current TVs.
The switch from tube TVs to LCD TVs did indeed have a MASSIVE cost; whole factories had to be thrown out and completely new ones built.
But a 84" 4K TV is really just 4 42
Re: (Score:2)
Do we need another optical disc format for 4K? No, not really, streaming video will probably be enough and frankly BR has enough room that if they want to do it with H.265, they could.
We don't need a new optical disc, but we will at minimum need extensions to the format. Many consumers still want to buy a physical thing, and also, putting physical things in front of people still makes them buy shit they don't even really want, so they're going to keep doing that into the 4K era.
On the other hand, it's easy to see lots of companies just not bothering to release 4K content, because making 1080P is enough of a PITA compared to 480P
Re: (Score:3, Interesting)
FWIW - 1080p games, possibly with more detail and AA, would still of course be nicer than 720p.
Xbox, 2001-2002. 64 MB 200 MHz DDR shared graphics memory, 733 MHz PIII-ish, 233 MHz NV2A.
Geometry engine: 115 million vertices/second, 125 million particles/second (peak)
932 megapixels/second (233 MHz × 4 pipelines), 1,864 megatexels/second (932 MP × 2 texture units) (peak)
(CPU, per some random page, 3 GFLOPS; GPU? Nvidia supposedly claims 80, some Xbox books say 22 in total.)
Xbox 360, 2005-2006, 512 MB 7
Re: (Score:1)
I guess one thing that may have changed is what the average "PC gamer" even is; so many people have moved to more portable stuff that a "serious PC rig" now means about the best hardware out there, whereas previously the consoles were compared to a PC more people actually had.
AKA compare the Xbox One to a tablet and you won't be disappointed by the performance ..
EA made their claim how the PCs couldn't run the new FIFA 14 eng
Re: (Score:2)
Don't be a cheap ass, it's only 10 hours' wages, go buy one. It's not like it's $4000.
Re: (Score:2)
Ignore 4k. That's a long ways off.
EVERY game on the ps4/xbone should have been REQUIRED to be 1080p/60fps.
If you can't achieve that, your game doesn't pass QC. If you can't achieve that, then you turn down some details.
Due to THAT I don't consider either a "nextgen" (eg 'current-gen') console.
For the same price I bought a PC (in Australia, where we pay more for everything!) that can play Titanfall at 1080p at 60fps with everything reasonable turned on, as well as every other game. No I don't run 16xFSAA,
Re: (Score:2)
Your comment about:
There's an awful lot of people out there that can't tell the difference between 720 and 1080 when it comes what they're seeing on a 40-something inch TV screen, which is what most people seem to have.
Brings up a great point that too often gets left out.
What people consider "great" or "good enough" depends largely on what they have to look at.
My primary TV is a 70" Sony LED, 1080P is ok, but frankly it would be well served being 4K, but that is just too expensive right now and the content isn't there.
At 40", as you say, 4K may well be overkill unless you're right in front of it.
I sit 8' in front of the 70" TV and I can see the grain, further away it would be ok... it works for no
Re: (Score:2)
In which case, the PS4 and XBox One are "fine", except that they are a bit expensive for the hardware you get. In years past consoles were sold at a loss; this time around they are making a profit on each one.
MMOs proved that gamers would pay for the razor and pay again for the blades. (Even the fuck-it-five-blades razors of today can be had on sale such that you get the stupid plastic handle for free, and you only get robbed recurrently when you buy new cartridges.)
I've had an Xbox Live subscription a couple of times now. I wanted to play some Xbox 360 exclusives. Might well still play Titanfall. But I'm not even considering buying any of the current consoles. Had an Ouya for a moment, but it was garbage. I've
Re: (Score:3)
The PS4 is better, being 50% faster (thanks to 50% more GPU resources), but it isn't THAT much better. Neither console is really "next-gen", that would have been 4K resolution.
Except that vanishingly few last-gen titles were actually 1080p, and basically none of them ran at a smooth 1080p60. I'd have settled for 1080p60, but they're not even providing that. My guess is that next generation they'll support 4K... badly.
Re: (Score:2)
NHK is aiming to start 8K broadcasts in 2020 (in time for the Olympics), so I expect TV manufacturers will have models out by then. Judging by how long we had to wait for the current generation of consoles, it seems likely that by the time the next generation comes, 4K will be standard and 8K will be where PC gaming is at.
Re: (Score:2)
Also of note...
PS1 to PS2 had DVD support at a time when TVs had only one input.
Then PS2 to PS3 had Bluray support at a time when a player could cost as much as the console itself.
Re: (Score:2)
Yes, and when the PS2 came out, it also wasn't that much more than a DVD player, and you got a console for free. :)
The PS3 was an easy choice, we bought ours at launch in 2006 because it indeed was the cheapest BR player out there. For $600 we got the best BR player on the market, a next-gen console that was clearly better than the PS2, AND it played ALL our PS2 and PS1 games.
If the PS4 was $499 instead of $399 and it included full PS3 compatibility, I'd buy one, as it stands, I'm not interested in another
Re: (Score:2)
"The PS4 is better, being 50% faster (thanks to 50% more GPU resources),"
Actually, what makes the PS4 better is not the extra GPU resources, it's that they use GDDR5, so the system won't be starved for memory bandwidth. If the PS4 had used the same type of RAM as the XB1, it would have been as starved, with the same drop in performance. It's one of the well-known drawbacks of Unified Memory designs.
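For what it's worth, the bandwidth gap follows straight from the memory specs. A quick check using the widely reported figures for both consoles (peak bandwidth = effective transfer rate × bus width):

```python
# Peak memory bandwidth = effective rate (MT/s) x bus width (bytes).
# Specs below are the widely reported figures for each console.

def peak_gb_per_s(mt_per_s: float, bus_bits: int) -> float:
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

print(peak_gb_per_s(2133, 256))   # Xbox One DDR3-2133:   ~68.3 GB/s
print(peak_gb_per_s(5500, 256))   # PS4 GDDR5 @ 5.5 GT/s: ~176.0 GB/s
# Same bus width; the ~2.6x gap is purely the DDR3 vs GDDR5 transfer rate.
```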
Re: (Score:2)
The GDDR5 indeed does help... much more memory bandwidth...
That being said, the PS4 also has 50% more shaders than the XB1, so it is faster all the way around.
In general, the XB1 can run at 720p while the PS4 can run at 900p, given all the same graphics factors. It isn't fast enough to run the same games at 1080p, but it is close.
Re: (Score:1)
What you're complaining about is that the growth isn't linear. All of the improvements you've pointed out have seemed "smaller" than the last. Imagine if we could get the kind of improvement the SNES had over the NES again. But that sort of thing just isn't possible in modern games; the required complexity of the art grows way faster than the capability of the hardware.
Re: (Score:3)
The irony is that the SNES and the NES have the same CPU in them, just faster...
If Nintendo had wanted to, it would have been easy to make the SNES play NES games...
nonsensical summary (Score:2)
Re: (Score:1)
All this really tells you is that Titanfall isn't really a next-gen title at all, which should be obvious as it was originally planned for the 360 to start with and will be released on it shortly. It runs about as well as your typical console-to-PC port, which is basically what it is: an Xbox 360 to Xbox One port.
I thought current consoles were like current PCs (Score:5, Insightful)
Only they're also known targets, and should be easy to program for as a result. Performance at 1920x1080 shouldn't be an issue for any title on the hardware available. It boggles the mind how poor these developers must be if they can't even target known hardware, console-style, and get good performance out of the thing. Average PC game devs don't seem to have any problem doing so on the PC, and that's a moving target. Why would any competent devs have a problem with a fixed target? They've got decent CPUs. They've got decent GPUs. They've got a decent amount of RAM. Yet they found a way to get horrible performance out of it. Send in the firing squad.
Yes and no (Score:4, Insightful)
So they are a bit different, hardware-wise. A big difference is unified memory: there is only one pool of memory, which both the CPU and GPU access. That makes sense since the CPU and GPU are also on the same silicon, but it is a difference in the way you program. Also, in the case of the Xbone they decided to use DDR3 RAM instead of GDDR5, which is a little slow for graphics operations, but the APU (what AMD calls its CPU/GPU combo chips) has 32MB of high-speed embedded RAM on it to try and buffer for that.
Ok so there are some differences. However that aside, why the problem with the target? Visual quality. Basically, a video card can only do so much in a given time period. It only can push so many pixels/texels, only run so many shaders, etc. So any time you add more visual flair, it takes up available power. There's no hard limit, no amount where it stops working, rather you have to choose what kind of performance you want.
For example if I can render a scene with X polygons in 16ms then I can output that at 60fps. However it also means that I can render a scene of 2X polygons in about 33ms, or 30fps.
So FPS is one tradeoff you can make. You don't have to render at 60fps, you can go lower and indeed console games often do 30fps. That means each frame can have more in it, because the hardware has longer to generate it.
Another tradeoff is resolution. Particularly when you are talking texture related things, lowering the output resolution lowers the demand on the hardware and thus allows you to do more.
So it is a tradeoff in what you think looks best. Ya, you can design a game that runs at 1080p60 solid. However it may not look as good overall as a game that runs at 720p30 because that game, despite being lower FPS and rez, has more detail in the scenes. It is a choice you have to make with limited hardware.
On the PC, we often solve it by throwing more hardware at the problem, but you can't do that on a console.
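To put rough numbers on those tradeoffs, here's a sketch assuming render cost scales linearly with scene complexity and shading scales with pixel count (the 16 ms base cost is an assumed figure):

```python
# The frame-time tradeoffs above, assuming cost scales linearly with
# scene complexity and shading scales with pixel count.

def fps(frame_ms: float) -> float:
    return 1000.0 / frame_ms

scene_ms = 16.0                  # assumed cost of a scene with X polygons
print(fps(scene_ms))             # ~62 fps
print(fps(scene_ms * 2))         # ~31 fps for a 2X-polygon scene

# Resolution is the other lever: 1280x720 has only ~44% of 1080p's pixels,
# so per-pixel work drops accordingly at the lower resolution.
print((1280 * 720) / (1920 * 1080))   # ~0.444
```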
Re: (Score:2)
On the PC, we often solve it by throwing more hardware at the problem, but you can't do that on a console.
I think the OP's point was that they should have been starting with this extra hardware to begin with.
Can't do that and hit the price point (Score:2)
Hardware costs money. If you want cheap consoles, you have to trade things off. For example my PC has no problems rendering games like Titanfall at 60fps, even at resolutions beyond 1080 (2560x1600 in my case). So, just put that kind of hardware in a console right? Ya well, my GPU alone costs near double what a current console does, never mind the supporting hardware. It isn't feasible to throw that level of hardware at a console, it just costs too much.
That kind of thing has been tried in the past and it n
Re: (Score:2)
The consoles have two modules, with each module consisting of 4 cores and a shared 2 MiB L2 cache. The Athlon 5150 only has one module.
Re: (Score:3)
You are incorrect. The consoles use Jaguar [wikipedia.org] modules, as opposed to the Bulldozer [wikipedia.org] family, which is what you describe. The Athlon 5150 is also Jaguar BTW.
Re: (Score:2)
It may not be fine with the bloated APIs (OpenGL and D3D) and unoptimized games on PC, but on a console with low-level access to the hardware, it's more than enough. The lack of 1080p games on Xbox One (I believe the only non-1080p game on PS4 is BF4) is mostly due to its middling GPU.
Re: (Score:2)
You mean like a finished product?
Re: (Score:3)
It boggles the mind at how poor these developers must be if they can't even target known hardware, console-style, and get good performance out of the thing.
It boggles the mind why Microsoft put shitty laptop CPU ram in a gaming device.
The devs are trying to find a balance point between visual quality (memory taken) and performance (memory bandwidth), but the 68GB/s memory bandwidth on the XB3 is way too low. IMO the 175-ish GB/s on the PS4 is too low too. At 30 FPS, remember, that only means you can have 2GB of stuff on screen at a time; at 60... well, 1GB of stuff. (That's not counting AI and audio.)
Yes, sure, the devs need to make a game for it, but that's re
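The GB-per-frame numbers above are just bandwidth divided by frame rate, a crude ceiling that ignores caches, compression, and reusing data across frames:

```python
# Crude per-frame byte budget: bandwidth / frame rate. Real engines do
# better via caches, compression, and cross-frame data reuse.

for name, gb_s in (("XB1 DDR3", 68.0), ("PS4 GDDR5", 176.0)):
    for rate in (30, 60):
        print(f"{name} @ {rate} fps: ~{gb_s / rate:.2f} GB touched per frame")
# Matches the figures above: ~2.3 GB/frame at 30 fps on 68 GB/s, ~1.1 at 60.
```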
Re: (Score:2)
Thing is, and the biggest problem with any new console, is the devs have to literally remake an OS each time. They don't get the luxury of having an OS manage things for them. They get some hardware calls, specs and told to get on with it.
That hasn't been true for the last 2 console generations, in some cases, the last 3 console generations.
Former Titanfall devs? (Score:2)
Shouldn't that be prefaced with 'Former', as I'm sure Microsoft has seen to it that this person isn't involved in any more Xbox development for knocking their 'OMG turn it up to 12!' rehash of D3D?
Makes sense (Score:4)
If you look at the Mantle benchmarks for various games, it's pretty clear that it doesn't get you much on half-decent systems, and on high-end systems you're looking at a negligible effect. I would think the same is true of DX12, which does the same basic thing.
For all the complaining about the Xb3, it's not terrible hardware; it made some odd choices compared to the PS4, and it's slow compared to a high-end PC. But it's not bad hardware in an absolute sense.