Ubisoft Claims CPU Specs a Limiting Factor In Assassin's Creed Unity On Consoles
MojoKid (1002251) writes A new interview with Assassin's Creed Unity senior producer Vincent Pontbriand has some gamers seeing red and others crying "told you so," after the developer revealed that the game's 900p resolution and 30 fps target on consoles is a result of weak CPU performance rather than GPU compute. "Technically we're CPU-bound," Pontbriand said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU that has to process the AI, the number of NPCs we have on screen, all these systems running in parallel. We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise..." This has been read by many as a rather damning referendum on the capabilities of the AMD APU under the hood of Sony's and Microsoft's new consoles. To some extent, that's justified; the Jaguar CPU inside both the Sony PS4 and Xbox One is a modest chip with a relatively low clock speed. Both consoles may offer eight CPU threads on paper, but games can't access all that headroom. One thread is reserved for the OS and a few more cores will be used for processing the 3D pipeline. Between the two, Ubisoft may have only had 4-5 cores for AI and other calculations — scarcely more than last gen, and the Xbox 360 and PS3 CPUs were clocked much faster than the 1.6 / 1.73GHz frequencies of their replacements.
Linked? (Score:4, Interesting)
"We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."
Uh, have you guys tried running the AI calculations less frequently than graphics redraws? You don't have to keep them in sync, you know.
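Spelled out, the suggestion above is just a decimated tick: run the AI on every Nth frame (or on its own timer) while rendering every frame. A toy C++ sketch, with hypothetical updateAI/updatePhysics/render functions standing in for whatever Ubisoft's engine actually calls:

#include <cstdio>

void updateAI()      { /* pathfinding, crowd behaviour, ... */ }
void updatePhysics() { /* cheap per-frame work */ }
void render()        { /* submit draw calls */ }
bool isRunning()     { static int n = 0; return n++ < 600; }

int main() {
    constexpr int kAiEveryNFrames = 3;   // AI at ~20 Hz when rendering at 60 fps
    for (long frame = 0; isRunning(); ++frame) {
        if (frame % kAiEveryNFrames == 0)
            updateAI();                  // the heavy work runs at a fraction of the frame rate
        updatePhysics();
        render();                        // redraw every frame regardless
    }
    std::puts("done");
}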
Re: (Score:2)
Re:Linked? (Score:5, Insightful)
Some engines don't give you much choice. I'd hope modern games aren't still stuck in the single-threaded, hard-clocked world of yesteryear, but you never know.
It's also possible they're using some very slow high-level language for the AI, and/or that no one's ever done an algorithmic optimization pass on the AI code, and they just couldn't keep up with the pipeline of collision events and whatnot that are often tightly coupled with framerate.
I've been amazed in some MMOs how server performance will be totally trashed by some patch to the AI, and completely restored by the next. Poorly thought out AI code can certainly bring a CPU to its knees.
Re: (Score:2)
> I'd hope modern games aren't still stuck in the single-threaded, hard-clocked world of yesteryear, but you never know.
Nah, game devs were forced to multithread their engine back in the PS3 and Xbox360 days. (In the PS2 days you had to offload data to the VU, GS, IOP, and SPU.)
Current-gen consoles are basically dumbed-down x86 machines with 8 threads.
Re: (Score:2)
It's very difficult with complex algorithms or structures of course. Relatively easy with parallel primitives if your tasks are simple.
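For the "simple tasks" case, here's a minimal C++17 sketch (hypothetical NPC struct, nothing engine-specific) of a standard parallel algorithm spreading an independent per-NPC update across cores:

#include <algorithm>
#include <execution>
#include <vector>

struct NPC { float x = 0, y = 0, vx = 1, vy = 0; };

// Each NPC update touches only its own data, so the loop parallelizes trivially.
void integratePositions(std::vector<NPC>& npcs, float dt) {
    std::for_each(std::execution::par, npcs.begin(), npcs.end(), [dt](NPC& n) {
        n.x += n.vx * dt;
        n.y += n.vy * dt;
    });
}

The catch is that real game AI (pathfinding, crowd avoidance, combat decisions) is exactly the kind of branchy, interdependent work that doesn't decompose this cleanly.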
Re: (Score:2)
It's also possible they're using some very slow high-level language for the AI, and/or that no one's ever done an algorithmic optimization pass on the AI code, and they just couldn't keep up with the pipeline of collision events and whatnot that are often tightly coupled with framerate.
It's possible, but highly unlikely. Game AI doesn't have all that much in common with research AI, and the issue of optimisation has meant that game AI is generally coded very close to the metal (C++, quite often).
Re: (Score:3, Interesting)
Oh but every engine is still doing this.
CryEngine 3? Still locked to one CPU core
Unity? Still locked to one CPU core
Even though they may do "multithreaded rendering", this really only means the CPU side of the rendering splits passes across threads. The actual calculations are still done as though they were single-threaded.
In a single threaded game, I might do A,B,C,D,E in that order, but if I have more cores, I'll just stick all these on separate threads... and hope the damn OS schedules it properly. So instead of get
Re:Linked? (Score:5, Interesting)
I'm an engine programmer who has been lead on 2 published titles (PC not console).
I'm pretty sure they thought of that too and already did it. You have to write multi-threaded code for these consoles since they're multi-core; otherwise most of the resources are wasted. With multi-threaded code it's really easy to set tick frequencies arbitrarily, and lock contention on the rendering thread is actually lower when you set the tick frequency of things like physics and AI lower than your FPS, especially if your FPS is not an even multiple of that frequency. We run Havok at 50Hz and render at 60fps; it sounds counter-intuitive, but it looks and feels great.
The thing is, though, this game is obviously either GPU limited or close to becoming GPU limited. The key here is not the 30fps, which, without looking at profiling results, could be explained equally easily by CPU limiting or GPU limiting, but the resolution of 900p, which the CPU should have absolutely nothing to do with. So you cannot confidently say that it is GPU limited now, but it certainly would be at 1080p, otherwise they would have just upped the res without slowing the framerate.
The issue here is that Vincent Pontbriand is probably not a technical guy. Roles vary between companies and it's hard to know who, if anyone, reports to a "Senior Producer". If the engine programmers reported to him, it's possible he was lied to, since explaining exactly why framerates are the way they are is often tiresome and complex: bottlenecks can be in many different places in the GPU pipeline (geometry, shader, input, texture, ROP, framebuffer), the position of the bottleneck may shift while rendering a single frame, and the bottleneck can also shift between the CPU and the GPU during a single frame if the instruction buffer fills up. As it is, they probably don't report to him, so I would say that he probably just doesn't know the whole picture.
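For reference, the pattern described above — a fixed simulation tick (e.g. 50 Hz physics/AI) decoupled from the render rate, with interpolation to hide the mismatch — looks roughly like this. A self-contained C++ sketch with placeholder State/simulate/draw names, not the poster's actual engine code:

#include <chrono>

struct State { float x = 0; };

State simulate(const State& s, float dt) { State n = s; n.x += 10.0f * dt; return n; }
State lerp(const State& a, const State& b, float t) { return State{ a.x + (b.x - a.x) * t }; }
void draw(const State&) { /* submit to the GPU */ }
bool running() { static int n = 0; return n++ < 1000; }

int main() {
    using clock = std::chrono::steady_clock;
    constexpr float tick = 1.0f / 50.0f;   // simulation frequency, independent of FPS
    State previous{}, current{};
    float accumulator = 0.0f;
    auto lastTime = clock::now();

    while (running()) {
        auto now = clock::now();
        accumulator += std::chrono::duration<float>(now - lastTime).count();
        lastTime = now;

        while (accumulator >= tick) {      // advance the simulation in fixed 20 ms steps
            previous = current;
            current = simulate(current, tick);
            accumulator -= tick;
        }
        // Blend between the two newest states so a 50 Hz simulation still looks
        // smooth when frames are drawn at 60 fps (or any other rate).
        draw(lerp(previous, current, accumulator / tick));
    }
}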
Re:Linked? (Score:4, Interesting)
It's quite possible that he means they have artificially slowed down the graphics rendering to provide more cycles to the AI.
Re: (Score:2)
It's quite possible that he means they have artificially slowed down the graphics rendering to provide more cycles to the AI.
This is how I read it as well. Though pure rendering and lerping shouldn't eat up much CPU, especially on consoles. Unless they've got a really inefficient rendering pipeline. I'm curious exactly how much extra AI this would allow them to run.
Re:Linked? (Score:5, Insightful)
It's probably not the AI calculations related to gameplay, but Ubisoft's AI calculations related to their DRM that get highest priority in their games ...
clockspeed really? (Score:5, Insightful)
can we please stop pretending clockspeed has anything to do with performance?
as has been said a million times
CLOCKSPEED IS ONLY FOR MEASURING APPLES WITH APPLES.
different CPU generations are INCOMPARABLE using clock speed.
so for instance a Sandy Bridge at 2.0GHz is slower than a Haswell at 2.0GHz, even with the same GHz number.
Benchmarking is the only way.
Re:clockspeed really? (Score:4, Insightful)
Very true. We have 1GHz processors today that can outperform yesterday's 1.8GHz processors.
Re:clockspeed really? (Score:5, Interesting)
Oh, yea?
An Intel 1.7GHz i7 is TWENTY FIVE times faster than an Intel Pentium 4 4.0GHz
http://www.cpubenchmark.net/co... [cpubenchmark.net]
That settles it. Ignore clock speed across generations.
Re: (Score:3)
The Pentium 4 was a lemon. A 1.4 GHz PIII-S outperformed the first P4s running at 2 GHz. The pipeline had grown so much that "normal" code spent a good amount of time refilling the pipeline.
With the Core CPUs, they went back to the shorter pipeline, and real world speeds per clock jumped.
But I think the real problem here is designed bottlenecks combined with 50,000 feet programming and abstractions where the squeezes are. I'm sure they use a profiler, but it can't unravel bad design or remove unnecessary
Re: (Score:3, Insightful)
Not only that, but the architecture changed (PS3 was PPC instead of x86)
Re:clockspeed really? (Score:5, Informative)
Well... not really. The PS3 was the Cell, which features a PowerPC core to feed the SPEs. It's not helpful to just call it PowerPC.
The Xbox 360's Xenon CPU, on the other hand, was a 'conventional' tri-core PowerPC (6 logical cores).
clock speed is not the right comparison (Score:5, Insightful)
Ubisoft may have only had 4-5 cores for AI and other calculations — scarcely more than last gen, and the Xbox 360 and PS3 CPUs were clocked much faster than the 1.6 / 1.73GHz frequencies of their replacements.
Clock speed is not a good comparison. These processors should process data much faster than those in the Xbox 360 and PS3 despite the clock speed differences.
CPU that has to process the AI, the number of NPCs we have on screen
Maybe you didn't target your game properly.
Many Nintendo games run in 1080p and 60 frames/second on the Wii U which is much less powerful...because Nintendo makes that their target when deciding how much AI and graphics detail to put on the screen at once.
Re: (Score:3)
Which is why I usually prefer playing games on Nintendo systems. When I play with my friends on their Xbox 360, Xbox One or PS4, the games may look good but there's often a really huge drop in the frame rate, I don't know how they can enjoy games that way.
Re:clock speed is not the right comparison (Score:5, Insightful)
Maybe you didn't target your game properly.
I think what Ubisoft is trying to say here is that it's programmers are shit, and so is it's game engine (AnvilNext). Other publishers manage to do okay, but poor old Ubisoft are stuck with this turd and can't just switch to Unreal or something more competent, so sorry guys you only get 900p on consoles.
I mean, obviously if you are CPU bound the solution is to reduce the load on the GPU by making the game render at 900p. The problem is the AI for the large number of characters in the game, so clearly reducing the pixel count will help with that.
Re: (Score:2)
What if the consoles have shared system memory and they're hitting the memory bandwidth with the graphics?
Re: (Score:2)
They also might be running into a skinning bottleneck. Are they using 4 bones per vertex? How many AIs do they have onscreen at once? What is their poly count per character?
One of their old games, RB6:LV2, doesn't even draw more than 8 characters at any one time, even though the level clearly supports 50+ terrorists.
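For anyone wondering what "4 bones per vertex" actually costs: standard linear blend skinning does four matrix transforms plus a weighted sum for every vertex of every character, every frame. A rough C++ sketch with toy math types, assumed purely for illustration and not any engine's real code:

#include <array>

struct Vec3 { float x, y, z; };
struct Mat4 {
    float m[4][4];
    Vec3 transformPoint(const Vec3& p) const {
        return {
            m[0][0]*p.x + m[0][1]*p.y + m[0][2]*p.z + m[0][3],
            m[1][0]*p.x + m[1][1]*p.y + m[1][2]*p.z + m[1][3],
            m[2][0]*p.x + m[2][1]*p.y + m[2][2]*p.z + m[2][3],
        };
    }
};

struct SkinnedVertex {
    Vec3 position;
    std::array<int, 4>   boneIndex;   // up to 4 influencing bones
    std::array<float, 4> boneWeight;  // weights sum to 1
};

// 4 matrix transforms + a weighted sum, per vertex, per character, per frame.
Vec3 skin(const SkinnedVertex& v, const Mat4* bonePalette) {
    Vec3 out{0, 0, 0};
    for (int i = 0; i < 4; ++i) {
        Vec3 p = bonePalette[v.boneIndex[i]].transformPoint(v.position);
        out.x += v.boneWeight[i] * p.x;
        out.y += v.boneWeight[i] * p.y;
        out.z += v.boneWeight[i] * p.z;
    }
    return out;
}

Multiply that by the vertex count of each character and the number of characters on screen, and it's easy to see why crowd size gets capped.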
Re: (Score:2)
Gah... should be "its game engine". Forced to troll myself now.
Re: (Score:3)
Re: (Score:2)
"right about unified memory architectures" So, that pretty much settles it for me: their shaders push the memory subsystem to its knees and at 1080p the CPUs don't have enough memory bandwidth left to run at full speed.
Re: (Score:3)
Re: (Score:3)
In other words, "cripple your game so it'll run on proprietary pieces of shit instead of letting it be as good as it possibly could be on a real computer."
Re: (Score:2)
Nintendo has never made a 1080p/60FPS game (Score:3)
Hey Ubisoft, maybe you should stop shitting on PCs (Score:5, Insightful)
Just saying: if consoles aren't powerful enough to make you happy, well, there are these newfangled PC things with a shit ton of CPU, RAM, and other goodies, and gamers like me who spend way too much money on them to play games. Of course, if you keep taking the attitude that we are all pirates, releasing shitty ports and so on, don't be surprised if we aren't so interested in your products
http://www.escapistmagazine.co... [escapistmagazine.com].
Re:Hey Ubisoft, maybe you should stop shitting on (Score:5, Funny)
Re: (Score:2)
As someone pointed out a couple weeks ago in a Win8 thread, today's PCs are now so powerful that even Windows can't slow them down. Now that's impressive!
Actually, a better saying is that PCs are so fast these days that even JAVA can't slow them down....
Windows would run just fine on a Pentium computer with a decent amount of RAM. Try running a java applet on it, though, and you may as well go for a long coffee break...
Re: (Score:3)
Actually, a better saying is that PCs are so fast these days that even JAVA can't slow them down....
You obviously don't run Eclipse.
Re: (Score:2)
Today's JVM runs much better on an old Pentium than the JVM that was released concurrently with said Pentium.
Re:Hey Ubisoft, maybe you should stop shitting on (Score:5, Insightful)
Wait, what? (Score:2)
Do you mean to imply that chip manufacturers have started to lie on their spec sheets? Really? That's a new one; in the past they've been 100% accurate.
Re: (Score:3)
I would even say that in the past, they've been 104.92% accurate.
Cell (Score:4, Interesting)
I really hate that Sony dropped their Cell processors going from the PS3 to the PS4 in favour of an x86 based system. We didn't see a lot of devices using cell and because of that, a lot of cell super-computer clusters were even made using actual PS3s. Even the prior MIPS processors of the earlier PlayStations are used in computer architecture textbooks to this day (albeit overly simplified versions of MIPS's pipelining).
I really want to see more architecture options, not fewer. Intel bought Alpha, killed it, screwed up with their own VLIW attempt with the Itaniums (which use EPIC), and I haven't heard anything about Transmeta in years. Today everything is ARM or x86_64 (with MIPS still seen in some embedded systems, mostly home routers). IBM still produces new POWER systems, but they're limited to specific server niches.
Re:Cell (Score:5, Informative)
You haven't heard anything about Transmeta in years because they ceased operating in 2009. The patent portfolio went to Intellectual Ventures, LLC, and was licensed in whole or in part to Intel, Nvidia, Sony, Fujitsu, and NEC.
Re: (Score:2)
Re:Cell (Score:5, Insightful)
In the PS4, you don't have cell, but instead you have 4 GPU cores dedicated for compute tasks (they don't have the back end necessary to perform rendering, although they can participate in compute tasks that aid the rendering pipeline.) Like Cell, these cores work for the CPU, have generally the same programming model (load them up with code and data and set them to running), and also have the exact same theoretical throughput as Cell had.
Variety and competition are great, but Cell was nothing special in the sense that what was good and unique about it has been subsumed by GPU compute -- it was ahead of its time, but it hasn't aged nearly as well. Game consoles are a commodity business though; it's hard to justify custom or obscure hardware unless it's the only way to get the architecture you want, but then you have to teach everyone to use it effectively.
Re: (Score:2)
The SPARC processors are still doing some interesting things, and there's been some shakeups in the GPU architecture space.
Re:Cell (Score:5, Insightful)
The problem with the PS3's Cell, and the PS2's odd setup before it, is that they were both a bugger to get good performance from. It took developers years to get the best from them, and it cost a lot of money to do so. Compare that with systems like the Dreamcast and the two Xbox consoles, which were relatively easy to get on with and which had excellent-looking games from the start.
With this generation both Sony and Microsoft realized that games are now so expensive to produce that anything which reduces that cost will be attractive. Most are cross platform too, so if you make your system hard to work with it's going to suffer from lame ports. The PS3 often had that problem, with its versions being inferior to the PC and 360 ones.
Thus both new systems are basically PCs. Familiar hardware, familiar and fairly easy to work with CPUs, mature tools.
Re:Cell (Score:5, Interesting)
Disclaimer: I work for Ubisoft. I did not work on the game in question and I won't comment on it.
Now, the PS3. I have a friend that's made a very good living for the last few years doing nothing but PS3 optimisation. He'd go in 3 days a week and make more than I would in a year. The PS3 setup was fiendishly complicated and difficult to wring real performance out of. Even by the end of the cycle, I'd say there were only a few games that significantly made use of the potential power that was available in the PS3. On paper, it was impressive. In practice, it was a mild nightmare. You had completely different tools than when you were making a 360 game. The compiler was different. You had to be a lot more meticulous about where data was and how you were moving it around.
I worked on the PS4 earlier this year, and it's dead easy to use. The tools integrate well into the environment, and you don't have nearly the same optimisation headaches that you did on the PS3. It's trivially faster than the XBone, and there's virtually no platform specific code (except for the obvious stuff, like connecting to the respective online services, etc.)
From a developer perspective, the PS4 is a lot nicer than the PS3. That'll mean more simultaneous releases on the PS4 and XBone, and this time there's no delay before the PS4 is at or past parity with its competition (which is more important for Sony and Sony fans, really).
That's just my opinion on the matter, but Sony really listened to the developer community when it came to tools and ease of use. It may be less interesting, but interesting generally means 'troublesome', not 'exciting' when you're writing software.
Re: (Score:2)
Re: (Score:3)
OK, so you work for Ubisoft, on to the topic at hand - Unity is not going to be on PS3 or Xbox 360. When we are talking about the current consoles (PS4 and Xbox One), they are basically the same - the difference being one has a slightly higher clocked CPU and a massively inferior GPU, along with much less memory bandwidth (yes, there's the "massive" ESRAM totaling a whopping 32MB), versus a slightly lower clocked CPU, much better GPU and a memory bandwidth in a totally different ballpark.
Given these specs,
Re: (Score:2)
We didn't see a lot of devices using cell and because of that, a lot of cell super-computer clusters were even made using actual PS3s
Sony sold and priced Cell based systems for commercial use.
The PS3 could be purchased in wholesale lots --- taken out of consumer distribution channels where the true cost of the hardware would be recouped by future video game sales --- at a substantial net loss to Sony.
Exit the "Other OS."
Re: (Score:2)
My LSI SAS controller card has an 800MHz PowerPC processor. Lots of car ECUs use PowerPC. It's anything but dead.
Re: (Score:2)
They're pretty common to see running BMCs as well.
I have an i5 4690K (Score:3)
quad core. Is it better than the PS4 CPU?
Re: (Score:3)
Newegg has your processor at $240 - the entire PS4, including controller & game is only $400. I'd be curious to know what you shelled out for the full computer (please include input devices).
You've always been able to buy more powerful machines than the consoles. Just not at console prices.
Re: (Score:2)
For the Xbox One...
The most comparable processor is probably the AMD FX-8320, under-clocked. It'll cost you $150.
The most comparable video card is probably the AMD 7770, also scaled back a touch, and it'll cost you $100.
8GB of DDR3 will cost you $75
The Blu-ray reader, another $40.
500GB hard drive is another $40.
That leaves you negative $5 for a motherboard, HTPC case, power supply and remote joypad.
Re: (Score:2)
It's rumored that PC's can do more than play games, though.
When was the last time you posted on Slashdot through your Xbone?
Re: (Score:2)
My HTPC could post to Slashdot too, but it doesn't. It just records CATV and plays it (and downloaded videos) back for me with a high Wife-Acceptance-Factor interface.
My current setup is an Xbox One with the "TV" input running to a Windows Media Center device with a Ceton CableCARD tuner in it. I'm fairly angry that the One isn't a Media Center Extender, but I see Microsoft's vision of turning the One into that "one" device in the living room, and it means slowly, silently killing Media Center Edition.
W
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
But you also need a computer, for school, work, Facebook, whatever else; everybody has one nowadays. So the choice is between an $800 computer and a $400 computer plus a $400 console.
Library size (Score:2)
Re: (Score:2)
That would explain frame rate (Score:2)
Re: (Score:2)
They might be running into skinning performance. See my previous post [slashdot.org]
boo hoo hoo (Score:5, Insightful)
Boo fricking hoo. Learn to develop a game with what you have and quit yer bitching.
The best damn video game console ever was 8 bits, ran on a single core (usually), at 1.79 MHz IIRC. Gamers then logged just as many hours saving princesses, shooting aliens, and stacking blocks as gamers do today. And guess what... they loved it. They fucking ate it up and went back for seconds and thirds. No, it wasn't photo-realistic 3D video with dolby-i-don't-give-a-crap sound and 87-button LCD-screen force feedback controllers. We didn't need it because we could have fun with what we had and didn't worry about frame rates or pixel resolutions or how many cores an OS management thread ran on vs graphics cores vs whatever.
Game designers these days, spoiled rotten little twerps that whine about everything.
Re: (Score:3)
Get off my lawn!!!
Re:boo hoo hoo (Score:5, Informative)
And back in our day, when we wanted to play with our friends we had to trudge 15 miles through 10 feet of snow, uphill both ways!
Re:boo hoo hoo (Score:4, Funny)
Was your Nintendo 64 on the North Pole?
Re:boo hoo hoo (Score:5, Insightful)
More to the point - When you have the luxury of coding for a very specific platform (ie, a gaming console with a known hardware configuration and known performance profile), you have no excuse for failing to adjust your resource demands accordingly. And if you just can't physically dial down the load enough to run well on platform X - You don't release the goddamned game for platform X.
Re-read that last point, because it nicely translates Pontbriand's whining into plain English: "We promise not to turn down any chance to grab your cash, no matter how shitty the experience for our loyal customers".
Re: (Score:2)
I have a radical idea. If we accept that game developers are going to have the moral equivalency of a used car salesman with a crack addict to support, how about we ( the customers ) stop giving said addicts OUR GOD DAMNED MONEY FOR SHIT.
Anyone who gives these greedy SOBs a single cent is part of the problem.
NES limits (Score:3)
Where is this interview itself? (Score:5, Interesting)
TFA just mentions the interview without a clear reference to it. Looking for it, I found two other articles that suggest the 900p resolution and 30fps targets came from other factors. http://www.gamespot.com/articl... [gamespot.com] says that 30fps is "more cinematic" and 60fps "looked really weird." http://www.gamespot.com/articl... [gamespot.com] suggests that some non-graphics computation is going on the GPU, but also has a quote that mentions "technically CPU bound."
What we don't know from these articles is why some or all of the AI computation can't be done on the GPU.
Re: (Score:2)
What we don't know from these articles is why some or all of the AI computation can't be done on the GPU.
Because modern GPUs don't have a 1:1 mapping between their "cores" and general purpose logic units. Once you use a branch, you effectively collapse your "cores" down to the general purpose logic associated with those "cores." (I think my GTX 660 Ti has like ~1400 cores and ... 8 general purpose logic units?) For graphics you're essentially just doing vector/matrix math with no branching, so you can use all of the "cores" in parallel. This is not to mention that there are costs in streaming d
Re: (Score:2)
What makes you certain that the AI code can't be vectorized across the GPU cores, for example one core per NPC? Yes, we understand that CUDA/GPUs don't run vanilla C code.
Re: (Score:2)
They claim to have crowds of 30k NPCs: http://www.gamespot.com/articl... [gamespot.com]
Re: (Score:2)
Re: (Score:3)
Yep, this is totally right. The main thing keeping an algorithm from running well on GCN cores is being branch-heavy. While I haven't kept up-to-date on the terminology, only one instruction at a time can be executing across a set of inputs.
The following code is pretty understandable and quick on a CPU. But on a GPU, performance suffers.
if (unitOnFire) {
    flailAround();     // lanes where unitOnFire is true execute this...
} else {
    doFightRoutine();  // ...while the others sit idle, then the roles swap: the two paths serialize
}
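A common mitigation, sketched here with made-up numbers and function names: evaluate both sides for every element and pick one with a select, so the lanes stay in lockstep and the cost becomes extra arithmetic rather than serialized branches.

// Sketch only: both "branches" are computed for every unit, then one result is chosen.
float fightResponse(bool unitOnFire, float panic, float aggression) {
    float flail = panic * 2.0f;          // stand-in for what flailAround() would contribute
    float fight = aggression * 1.5f;     // stand-in for what doFightRoutine() would contribute
    return unitOnFire ? flail : fight;   // with both operands already computed, this is typically a select
}

That only pays off when both paths are cheap; if one branch is genuinely heavy, sorting or bucketing units by state before dispatch is the usual alternative.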
Sacrifice visual quality (Score:2)
I'd go for reducing overall visual quality rather than the AI quality. Having good AI, and supposedly better gameplay as a result, should be at the top of the priority list. I'd go so far as to reduce the visual quality so much that the framerate is high and the game responsiveness is super smooth.
F Ubisoft (Score:2, Insightful)
Why should we believe anything he has to say? He is a known employee of Ubisoft, after all.
What's the PC Processor Usage Then (Score:5, Insightful)
Re: (Score:2)
Re: (Score:3)
I suspect BS, but I'll hear out the argument if there is actual evidence. Sliding the resolution down from 1080p to 900p would mostly save you on GPU and graphics memory usage. In a more detailed article it was stated that they picked 900p because they didn't want to fight with the differences between the XBone and PS4. The main reason the PS4 keeps getting 1080p and the XBone does not is that its graphics memory is so much faster. So if someone actually has the PC version and can show that "AI" is burning every processor at 100%, then I'll buy their argument as plausible. Until then the story has changed from day to day, and isn't believable. If the AI has something to do with it, then the AI coder has probably deadlocked the system when they use multiple cores, and has been cheating by using only one core.
Well, if Watch_Dogs (another Ubisoft title) is any indication, then I would believe it. I bought the PC version (yea yea, I know, Watch_Dogs!? Fool me once...) and ran it on a 2nd gen i5. Don't recall the model exactly, but it was clocked north of 3GHz. Not a beast by today's standards but not horrible either. With that and a 780 Ti, the game was horribly CPU bound. Constantly at 100% CPU. Upgrading to an i7 4790K and it still runs at around 60% CPU. So yes, I can completely believe that their games are CP
Crappy programming? (Score:2)
How do we know that the real issue isn't scripting silliness and otherwise inefficient code design? Not saying this is the case, just pointing out that the possibility is glossed over.
Re: (Score:2)
That's much more likely to be the case than whatever nonsense some PR person is feeding these journalists. Ubisoft isn't known for their optimization prowess.
design for the box you have (Score:2)
As an embedded architect/programmer, I deal with this all of the time. You have to design for the platform, not what you fantasize about having. Doing the dev work on a dual-4-core-hyperthreaded box with 48 GBytes of memory, then whining that the CPU/GPU/memory space/... isn't enough is just exposing your stupidity. Once had a developer build and test a database on his quad-core Apple-thingy, then whine to management that the hardware engineers and materials people had just not done their job, when, eve
This is Ubisoft (Score:4, Funny)
... i thought they were going to blame it on piracy?
what no relly (Score:2)
To be fair though (Score:3)
*re-reads*
Oh..
For the last couple of gens it's usually been possible to get a PC that 'looked better' - but you ended up paying a whole wedge more for the privilege. This is the first gen of consoles that has come out that I've immediately written off (and I'm reasonably sure I could build a better PC for near enough the same money).
PC monitors have got better, and it's never been easier to plug your PC into a TV
Re:To be fair though (Score:5, Interesting)
You pay more, and you get more. You want a game machine that's capable of being a full-blown music recording studio? Or a video editing suite? Or can run Matlab, Mathematica and fluid dynamics simulations? That's a PC.
Consoles became less interesting to me as I grew up. Games didn't become less interesting to me, but the notion that I would sit in my family's living room with a controller in my hand just became an artifact of childhood.
Of course, some dedicated game box is going to be less powerful than a PC. The only reason they exist is so that game companies can manage licenses. They're not meant to be for your benefit. They're consumption machines, designed to tie you into a corporate "ecosystem". If I was 13, I would love one. Now, they just seem like evidence of the failure of the gaming market to mature.
More childhood (Score:2)
You want a game machine that's capable of being a full-blown music recording studio? Or a video editing suite? Or can run Matlab, Mathematica and fluid dynamics simulations? That's a PC.
I imagine that a lot of people would prefer to have a Matlab/Mathematica/music/video/whatever machine on a desk and an easy-to-use machine to play major label video games in the living room.
the notion that I would sit in my family's living room with a controller in my hand just became an artifact of childhood.
You know what happens when you settle down and get married? More childhood in your house. If you have three kids, is it cheaper to buy one console and two extra controllers or two extra gaming PCs? That's not even counting games that never come to PC at all, including Red Dead Redemption and almost any game involving a c
Re: (Score:3)
Playing games in the living room is for children and childish men.
You had to mention Red Dead Redemption, damn you. I hope you're happy that you made me cry.
I'll give you that one. My daughter moved into her own place this year when she started grad school. I suppose I would play games in the living room but my wife hates to see me in my underwear with a controller in my hand and my tongue hanging out of my mouth like Michael
Re: (Score:2)
That's not really the point... this is against AMD making it difficult for a developer to put an unoptimized game out as fast as possible. I blame MS and Sony for hogging resources. I remember in PSX/PS2 and Dreamcast that the "OS" was nearly non-existent when the game was running. All I know is that in 2015, AMD needs something big to be a viable option - they are far from being a "competitor" to intel as they stand right now.
Re: (Score:2)
But we're talking about modern games, not Doom.
Re: (Score:2)
The problem is continually trying to achieve photo-realistic 3D graphics. It's the uncanny valley, and the bulk of hardware spec increases come from trying to push that envelope.
My favorite example of this is Warcraft 2. Ignoring the tiny resolution; the game doesn't really look 'dated' -- due to well done, albeit cartoonish graphics and animation. Compare that to Everquest (1998); which looks like someone stuffed a bunch of polygons in a blender.
Similarly the AAA mega 3d graphics fests will follow the sa
Re: (Score:2)
due to well done, albeit cartoonish graphics and animation.
There's no "albeit" about it. Cartoonishness is the only effective answer we have to the uncanny valley problem -- because it keeps us on the other side of the valley. Consider that old episodes of Scooby Doo will be repeated from now until kingdom come, but when did you last see Reboot on TV? (Although Reboot might get away with the dated computer animation due to the "inside the computer" angle -- it works on an abstract plane I suppose.)
Re:Completely full of shit (Score:4, Insightful)
If there's one thing the new consoles have right, it's a mind-numbing amount of CPU power.
Uh, no, they don't. These are low-clocked AMD cores, which have much lower performance than Intel cores at similar clock rates.
They're equivalent to a PC CPU from several years ago. They sure as heck ain't equivalent to a 3GHz, eight-core, sixteen-thread Xeon (or whatever Xeons are up to these days).
Re: (Score:3)
which have much lower performance than Intel cores at similar clock rates.
Depends... it's true that the Jaguar cores have something like 400 single-thread Passmarks per GHz and Intel is somewhere around 600 with its big cores, but at least you have eight of them. That should bring you somewhere in the vicinity of an i3-4330, or an overclocked G3258, which is considered good enough by gamers to drive a ~150W graphics card even today. (Unless you can't deal with eight cores, which of course would be rather shameful for professional programmers of the kind that consoles need. :-p)
Re: (Score:2)
That I3 is only considered 'good enough' because most modern PC games are ports of console games with even worse CPUs than the new ones. And I don't think anyone in their right mind would say it had a 'mind numbing amount of CPU power'.
Re: (Score:2)
What happens when you try to make a cutting edge pretty 2014 game run 60FPS at 1080p on a four year old graphics card?
This [youtube.com]. But granted, that's a remake of a 2013 game.
There's a lot of horse shit in this summary (Score:2, Interesting)
You're missing the fact that AI code is typically branch-heavy, which kills pipelining and makes all of what you say about "instructions per clock" moot. In branch-heavy code, a high clock speed is very important. Metrics of "N instructions per clock" only apply to the theoretical best-case of non-branching code.
Re: (Score:2)
Re: (Score:2)
They're throwing lots of cores at the problem *because* of their very low IPC. The fact that they took chips with poor IPC and then gave them very low clockspeeds just made matters worse.
Sony/Microsoft went with AMD because they were willing to do custom designs, with single-chip solutions, at a good price point. Intel is unwilling to do custom designs, and they would have had to do a custom design with multiple times more graphics EUs to satisfy Sony/Microsoft.