
Sony Ditching Cell Architecture For Next PlayStation?

RogueyWon writes "According to reports in Kotaku and Forbes, Sony is planning to ditch the Cell processor that powered the PlayStation 3 and may be planning to power the console's successor using a more conventional PC-like architecture provided by AMD. In the PS3's early years, Sony was keen to promote the benefits of its Cell processor, but the console's complicated architecture led to many studios complaining that it was difficult to develop for."
  • POWER7 baby. (Score:5, Interesting)

    by RyuuzakiTetsuya ( 195424 ) <taiki@c o x .net> on Thursday March 01, 2012 @01:50PM (#39211241)

    Probably. But I'd bet they'll use a POWER7-based CPU instead of an AMD x86 CPU. Given how much Cell influenced POWER7, I'd say there's a strong likelihood they go POWER instead of x86.

  • It is a pain (Score:5, Interesting)

    by DoofusOfDeath ( 636671 ) on Thursday March 01, 2012 @01:55PM (#39211311)

    I programmed a Cell processor (for HPC, not gaming) a few years ago, and it was definitely a pain in the butt compared to just targeting a multi-core x86.

    The problem, at least back then, was that you had to write explicit code to have the various cores communicate with each other (DMA transfers, etc.).

    I imagine compilers/libraries/SDKs have improved the situation since then, but really the modest performance premium offered by the chip just wasn't worth the hassle.
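
    To make "explicit code" concrete, here is a minimal sketch of what a single blocking transfer looked like on the SPU side. It is written from memory of the Cell SDK's spu_mfcio.h interface, and the buffer name, size, and tag choice are illustrative:

        #include <spu_mfcio.h>   /* MFC DMA interface from the Cell SDK */

        #define TAG 1

        /* Local-store buffer; DMA addresses and sizes must be at least
           16-byte aligned (128-byte alignment performs best). */
        static char buf[16384] __attribute__((aligned(128)));

        /* Pull one block of main memory into local store and wait for it. */
        void fetch_block(unsigned long long effective_addr, unsigned int size)
        {
            mfc_get(buf, effective_addr, size, TAG, 0, 0); /* start the DMA */
            mfc_write_tag_mask(1 << TAG);                  /* select our tag group */
            mfc_read_tag_status_all();                     /* stall until it lands */
        }

    Every piece of data an SPE touched had to be staged in and out of its local store this way; nothing was simply "in memory" the way it is on a multi-core x86.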

  • Re:Doesn't matter (Score:5, Interesting)

    by Anonymous Coward on Thursday March 01, 2012 @02:03PM (#39211419)

    Seriously, fuck those guys! DRM included on audio CDs, fraudulently advertising their product as able to use Linux and then disabling that feature ex post facto, fake astroturf blog ad campaigns that insult human intelligence, spending money to purchase censorship laws and immoral copyright extensions, suing tinkerers playing with products they legally own.

    Fuck Sony! They are an icon of much that is wrong with the world right now.

    Sony is what you get when you allow companies to grow too large in scope. I try my best (imperfectly, of course) not to give money to companies that are large. There are almost always smaller alternatives that won't fuck you 8 ways from Sunday with the corruption, greed, and control that a company of Sony's size can't help but exercise.

    Please help kill companies like Sony by decentralizing your purchasing power! Next time you're thinking about buying a game licensed by Sony, check out what smaller, independent alternatives like the Humble Bundle guys are doing!

  • NIH Syndrome (Score:5, Interesting)

    by UnknownSoldier ( 67820 ) on Thursday March 01, 2012 @02:04PM (#39211445)

    I've shipped PS2 games and worked with numerous developers that have shipped PS3 games.

    Sony's problem is Not-Invented-Here syndrome. They have yet to learn the lesson that Apple mastered back in the '80s: use off-the-shelf commodity parts!! Why? Because they become DIRT cheap within a few years. Why waste millions of dollars on R&D for new hardware when in 5 years somebody else will have a no-name version of it at a fraction of the price??

    e.g.
    Sony is _slowly_ learning this lesson. After how many man-years of a buggy PS2 GS (Graphics Synthesizer) that couldn't even properly do z-testing (!?!), the PS3's RSX is (mostly) a GeForce 7800 GTX:
    http://en.wikipedia.org/wiki/RSX_'Reality_Synthesizer' [wikipedia.org]

    When the PS2 first came out everyone bitched about how difficult it was, yet it was a beautiful thing to see all 7 of its CPUs working at full speed, load-balancing the system. It laid the foundation for the idea that multi-core programming was the future. When the PS3 came out everyone bitched about how much more difficult it would be. Developers just sucked it up, and now we are even seeing A.I. running on the SPE/SPUs in second- and third-gen PS3 games! It's pretty cool to see a modern game engine utilizing every core it can.

    Using stock parts (CPU + GPU) is a great way to minimize costs. You don't get the same performance benefits as a true dedicated design, but the commodity parts are cheap enough that the pricing curve naturally takes care of that. Kind of a no-brainer if Sony decides to use an AMD or Intel CPU for the PS4.

    References:

    See: PS3 games list & SPE usages
    http://www.neogaf.com/forum/showthread.php?t=184843 [neogaf.com]

    i.e.

    Killzone 2 utilizes only roughly 60 per cent of the SPUs:
    "It's incredible to see huge levels and see the deferred rendering and note that on all the SPU's, even on the heaviest load were coming up to about 60%," Haynes said. "They weren't coming close to maxing out. They had about 40% of space before they started tripping or saw slow down on some of the processes."

    and

    Killzone 3 uses 100% of the SPUs:
    "We're having a footprint of a level that's ten times bigger than the average Killzone 2 level. Killzone 2 was not a small game, but that was as far as we could push it back then."

  • Re:Doesn't matter (Score:5, Interesting)

    by AngryDeuce ( 2205124 ) on Thursday March 01, 2012 @02:21PM (#39211787)

    Maybe not, but they also gave the finger to universities using PS3 clusters [wikipedia.org]. The fact that Sony actively assisted said universities with setting up these clusters speaks volumes about how ridiculously contradictory Sony's response was when they blocked OtherOS.

    These types of applications are what attracted me to the PS3; not because I necessarily wanted to build a cluster myself, but because a console powerful and flexible enough to be used that way was very attractive to me. Most people prefer having an option to having it taken away out of nowhere.

    It's as if Sony gets a list of options and always picks the one that will most piss off their customers. They're sabotaging themselves...

  • Re:Doesn't matter (Score:5, Interesting)

    by billcopc ( 196330 ) <vrillco@yahoo.com> on Thursday March 01, 2012 @02:31PM (#39211919) Homepage

    There's nothing shameful about the /. masses agreeing that Sony abuses its customer base. Perhaps what is truly insightful is how quickly the comment leapt up to +5 and stayed there, implying that far more people agree than disagree.

    If you look to /. for balanced, impartial fact-based discourse... keep looking! And if you ever find such an impossible thing, do let us know.

  • by Animats ( 122034 ) on Thursday March 01, 2012 @02:38PM (#39212007) Homepage

    The trouble with the Cell processor is that there's not enough memory per processor. Each of the little processors (the "SPE" units) in the PS3 has only 256KB of RAM. That's not enough to store a frame (a 720p frame buffer alone runs about 3.5MB). It's not enough to store a game level, or a significant amount of geometry. It's more like having a number of DSPs available.

    This forces redesigning the program to work in batch mode. A batch job is one frame, but it's still a batch job. Data for one frame cycle is sequentially pumped through one or more SPEs. There's not much random access, because access to main memory from an SPE is in big blocks, transferred in the background.

    This is both limiting and a huge pain. Especially when the competition is selling shared-memory multiprocessors. I used to do game physics engines, and when the PS3 came out, my reaction was "I'm glad I sold off that technology and got out of the business." I knew some people at Sony's SCEA R&D center, and they basically threw all their smart people at trying to figure out how to use the Cell effectively. Many of the early games really ran in the main CPU, with the SPEs managing things that didn't affect gameplay, like particles for fire, explosions, smoke, and such.

    If each SPE came with a few megabytes of RAM, instead of only 256K, it wouldn't be so bad. Then you could probably have the physics engine in one CPU, the AI in another, the background object management in a third, and so on. But each of those things needs more state than whatever fraction of 256K is left over after the code is loaded.

    There's a long history of Cell-like architectures in the supercomputer field. The BBN Butterfly, the NCube Hypercube, and the Connection Machine also consisted of a large number of processors, each with a small memory. None were successful. One of the lessons of multiprocessing computer architecture to date is that the two extremes - shared memory multiprocessors and networked clusters of separate computers - are useful. None of the partially-shared machines have been successful. The Cell is the only one ever to be mass-produced.

    Great for audio, though. The audio guys like having their own processor, and audio processing really is a streaming process of tight loops without much state.

  • Re:Why not PC + 360? (Score:5, Interesting)

    by Anonymous Coward on Thursday March 01, 2012 @02:43PM (#39212081)

    I'll join you on the Sony SDKs being horrible. I still think the SN debugger is the best debugger I've used for multithreaded debugging. I'd also venture that you weren't a particularly serious PS3 dev house if you were using Sony's GL implementation; we ditched that shit the second GCM became available.

    The Cell architecture itself isn't difficult to program for; Sony just screwed themselves by coming out a year later than the 360. The big issue is that developing parallel software on the 360 happens in a homogeneous environment. Game devs (myself included) started building engines around those constraints. After we had 360 devkits for a year or so, Sony comes by with PS3s, and they are different at a fundamental level. We already had over a year of engine design and development into the 360, and we had commitments on both consoles. Now what? You can't afford the time to throw it all out and redesign from the ground up.

    It also didn't help that Sony's SDK was completely in flux before the launch, and for some time after. The end result was that any game that wasn't first party was a horrible compromise on the PS3 at first. As time went on we changed large parts of our engine to be more PS3-friendly, and it helped quite a bit. Nor did it help that the PS3's GPU is about 15%-25% slower on average and that the OS takes up a bunch more memory than the 360's does.

    All in all, the PS3 was a clusterfuck for the first few years and still hasn't recovered.

  • Re:Doesn't matter (Score:4, Interesting)

    by cbhacking ( 979169 ) <been_out_cruisin ... m ['hoo' in gap]> on Thursday March 01, 2012 @03:04PM (#39212453) Homepage Journal

    So, where were the universities going to get replacement hardware when their machines started breaking down? Newer consoles that ship with the firmware update blocking Linux and can't be downgraded? PS3 Slim consoles that never had Linux at all (officially speaking; in reality they run it just fine)?

    The only thing that stops me from hoping that Sony dies in a fire is the thought of what level of unethical behavior that would permit their direct competitors to stoop to, once there's one less alternative for people to switch to. I'm under no delusion that any megacorp is going to behave any more ethically than its bottom line dictates. The disgusting thing is that Sony can't even measure up to that.

  • Re:Cell Failed (Score:5, Interesting)

    by DigitalDreg ( 206095 ) on Thursday March 01, 2012 @03:09PM (#39212547)

    Disclaimer: I used to teach Cell programming classes for people who were looking to do HPC on the blades.

    Cell failed. But the reasons behind the failure are more interesting.

    The obvious answer is that it was hard to program. On a single chip you had the PowerPC processor and 8 SPUs. Communication was through mailboxes for small messages and DMA transfers for larger messages. To get the most out of a chip you had to juggle all 9 processor elements at the same time, try to vectorize all of your ops, and keep the memory moving while you were doing computation. That is the recipe for success for most architectures - keeping everything as utilized as possible. But it is also hard to do on most architectures, and the embedded nature of Cell made it that much more difficult.
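
    As a rough illustration of that mailbox/DMA split, here is a sketch from memory of the libspe2 and spu_mfcio.h interfaces (treat the names and signatures as approximate; the two halves are separate compilation units):

        /* --- PPU side (libspe2): hand a 32-bit work descriptor to an SPE --- */
        #include <libspe2.h>

        void send_work(spe_context_ptr_t spe, unsigned int msg)
        {
            /* Blocks until there is room in the SPE's inbound mailbox. */
            spe_in_mbox_write(spe, &msg, 1, SPE_MBOX_ALL_BLOCKING);
        }

        /* --- SPU side: pick the descriptor up --- */
        #include <spu_mfcio.h>

        unsigned int receive_work(void)
        {
            return spu_read_in_mbox();   /* stalls until a message arrives */
        }

    Mailboxes carried 32-bit values and DMA carried everything else; keeping all 9 elements fed through those two channels at once was exactly the juggling act described above.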

    There were better software tools in the works for people who didn't want to drop down to the SPU intrinsic level to program. There were better chips in the works too; more SPUs, stronger PowerPC cores, and better communications with main memory. Those things did not come to fruition because IBM was looking to cut expenses to keep profits high (instead of boosting revenue). The Cell project was killed when a new VP known for cost cutting came in. We finally had a good Cell blade to sell (QS22 - two chips, 32GB RAM, fully pipelined double precision, etc.) and that lasted four months before the project got whacked. And we lost a lot of good people as a result. (That VP, Bob Moffat, was part of the Galleon insider trading scandal.)

    So yes, Cell failed. But not necessarily for the obvious reasons. IBM has been on a great cost cutting binge the past few years - it lets them meet their earnings per share targets. But it causes collateral damage.

  • by Animats ( 122034 ) on Thursday March 01, 2012 @03:43PM (#39213065) Homepage

    A better approach is to break your engine down into a large number of small, more or less self-contained tasks, then implement a jobs system that takes those tasks and runs them on whatever processor is free at that moment.

    That works fine on a shared-memory multiprocessor. On a Cell processor with 256K, switching a processor from one task to another requires moving in new code and data, not just CPU dispatching. That's not something you can do many times per frame cycle.
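
    For contrast, the shared-memory version of such a jobs system is almost trivial. A minimal sketch in C with pthreads; the names are illustrative, not from any particular engine:

        #include <pthread.h>
        #include <stdlib.h>

        typedef struct job {
            void (*run)(void *);   /* the task itself */
            void *arg;
            struct job *next;
        } job_t;

        static job_t *queue_head = NULL;
        static pthread_mutex_t queue_lock  = PTHREAD_MUTEX_INITIALIZER;
        static pthread_cond_t  queue_ready = PTHREAD_COND_INITIALIZER;

        /* Any thread may push a task; whichever worker is free runs it. */
        void push_job(void (*run)(void *), void *arg)
        {
            job_t *j = malloc(sizeof *j);
            j->run = run;
            j->arg = arg;
            pthread_mutex_lock(&queue_lock);
            j->next = queue_head;
            queue_head = j;
            pthread_cond_signal(&queue_ready);
            pthread_mutex_unlock(&queue_lock);
        }

        /* One of these loops runs per core. */
        void *worker(void *unused)
        {
            (void)unused;
            for (;;) {
                pthread_mutex_lock(&queue_lock);
                while (queue_head == NULL)
                    pthread_cond_wait(&queue_ready, &queue_lock);
                job_t *j = queue_head;
                queue_head = j->next;
                pthread_mutex_unlock(&queue_lock);
                j->run(j->arg);  /* code and data already visible to every core */
                free(j);
            }
            return NULL;
        }

    On a shared-memory machine, handing a task to a free core is just a pointer handoff. On the Cell, the equivalent step means DMAing both the code and its working set into a 256K local store first, which is exactly why this pattern breaks down there.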

  • Re:Doesn't matter (Score:5, Interesting)

    by Sir_Sri ( 199544 ) on Thursday March 01, 2012 @04:41PM (#39213821)

    As someone who worked with a PS3 cluster, the removal of the OtherOS functionality did not impact me in the slightest. If you're using them for a cluster, you aren't using them for gaming. And if you're using them for a cluster, you don't download the updates, since none of them have any impact on a console that's just crunching numbers.

    What did hurt the PS3 cluster business was that they took away the OtherOS option in future consoles. That makes sense from their side, since it was a waste of money for them anyway, but it means there's no way to replace broken parts of a cluster. Though as it turns out, it wouldn't be worthwhile anyway, since GPUs do the number crunching better, and for less money.

    On clusters the Cell suffers the same problem it has in a console: it isn't enough better than a CPU to justify the extra time needed to learn to use it properly, and it isn't good enough to compete with a GPU for pure computing needs. It was an amusing project, and sure, once the cluster is running you want to churn through some data with it, but by the time they ditched the OtherOS feature in software, new clusters were no longer viable to build (since you couldn't get consoles that would run it).

    None of that means it wasn't illegal to remove the OtherOS feature after the fact; on principle, it probably was. But don't misrepresent who it mattered to. The fraction of a percent of people who actually used the OtherOS feature *and* games did get screwed, no doubt. But if you seriously used the OtherOS functionality, you didn't use the same machines for gaming. Remember that a lot of people 'used' the OtherOS feature the same way 90 million people 'use' Google Plus. And yes, that small collection of power users, and that larger but still small collection of pirates, got screwed on the deal. That's why these things are illegal.

  • Re:Doesn't matter (Score:5, Interesting)

    by SadButTrue ( 848439 ) on Thursday March 01, 2012 @06:42PM (#39215175) Homepage

    While you are probably correct that the DRM & Linux people have little to do with it, you are way off base about the real problem at Sony. Sony finds itself on the wrong side of pretty much every quickly evolving high-tech consumer device.

    They were very slow to move away from CRT production, where they were very strong. They ramped up their LCD production just as the bottom fell out of LCD pricing. They are now attempting to catch up in the OLED space, which ironically wouldn't exist in its current state without Sony R&D.

    They were slow to move from tape to CD to digital music and lost the entire market.

    Their once-dominant position in the console market is gone. They are actually being out-innovated by Microsoft. And this isn't even the Microsoft of 15 years ago, the one that was trying; they are losing to the Vista-era Microsoft. It boggles the mind.

    tl;dr: They are losing tons of money because they have become really slow in the fastest-moving consumer markets.

  • Re:It is a pain (Score:5, Interesting)

    by Anonymous Coward on Thursday March 01, 2012 @06:49PM (#39215269)

    Yes, writing code for the Cell is much harder than for multithreaded x86. However, when you do the DMA the right way (16-byte boundary aligned, fetching the next work batch while computing the previous one), use the predicated instructions instead of conditional jumps (the compiler wasn't very good at generating these when I used it, so I had to use C intrinsics), and schedule your instructions so you get dual issue going, the Cell is an absolute monster in terms of raw computational power. A local-store memory access always completes in 6 cycles (you never get cache misses), which at the Cell's 3.2GHz is around 6-10 times faster than the fastest DDR3 memory. And you rarely get branch misses if you use the correct instructions for loops and predication where possible. That effectively eliminates the two biggest performance bottlenecks in modern microprocessor design, at the cost of having to manually shuffle data and make sure the machine performs at its max.
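
    Roughly what "doing the DMA the right way" looks like in code. This is a sketch from memory of the Cell SDK intrinsics; compute(), the chunk size, and the tag choices are placeholders:

        #include <spu_mfcio.h>
        #include <spu_intrinsics.h>

        #define CHUNK 16384

        static char buf[2][CHUNK] __attribute__((aligned(128)));

        void compute(char *chunk);   /* whatever the real work is */

        /* Double-buffered streaming: start the DMA for chunk i+1, then
           crunch chunk i while that transfer runs in the background. */
        void process_stream(unsigned long long ea, unsigned int nchunks)
        {
            unsigned int i, cur = 0;

            mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);  /* prime the pump */
            for (i = 0; i < nchunks; i++) {
                unsigned int nxt = cur ^ 1;
                if (i + 1 < nchunks)                  /* prefetch the next batch */
                    mfc_get(buf[nxt], ea + (unsigned long long)(i + 1) * CHUNK,
                            CHUNK, nxt, 0, 0);
                mfc_write_tag_mask(1 << cur);
                mfc_read_tag_status_all();            /* wait for current chunk only */
                compute(buf[cur]);                    /* overlaps with the prefetch */
                cur = nxt;
            }
        }

        /* Predication instead of a branch:
           per lane, pick a[i] > b[i] ? a[i] : b[i], with no jump to mispredict. */
        vec_float4 vmax(vec_float4 a, vec_float4 b)
        {
            return spu_sel(b, a, spu_cmpgt(a, b));
        }

    The mfc_get for the next chunk runs in the background while compute() chews on the current one, which is how you keep the SPU from ever waiting on main memory.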

    This is a well-known trade-off in microprocessor design: make a chip that runs excellent code at breakneck speed and poor code like porridge, or make a chip that runs excellent code at an okay speed and poor code at a decent speed. The Cell is designed as the former (as is practically all of Sony's hardware), while x86 is designed as the latter. One can argue which is better; I'd say it depends on the application and on who is going to program it. Most programmers are not proficient enough to program a machine as strict as the Cell properly, because it demands a deeper understanding of computer architecture than most programmers have.

    In closing: yes, it is difficult, but it is by no means a slow chip if you program it the way it was intended to be programmed. And it might not be the best chip for all applications.


"Money is the root of all money." -- the moving finger

Working...