
IEEE Spectrum On The PS3 Learning Curve 88

Posted by Zonk
from the see-the-cells-divide dept.
An anonymous reader writes "The Insomniacs is the cover article in the December issue of IEEE Spectrum, discussing developers ramping up to the PS3 hardware. The article features Insomniac Games, who developed the PS3 launch title Resistance: Fall of Man. Despite mixed reports in the press, the Insomniac folks are delighted to be working with Sony's technology, and describe the process of helping to make or break a console launch." From the article: "Despite the delays, there's something inside the PS3 that burnished Sony's reputation as a hardware company. The heart of the machine is the powerful new Cell Broadband Engine microprocessor. Developed over the last five years by Sony, IBM, and Toshiba on a reported budget of $400 million, the Cell is not just another chip: it is a giant leap beyond the current generation of computer processors into a nextgen muscle machine optimized for multimedia tasks."


  • Oh man.... (Score:3, Funny)

    by Pojut (1027544) on Thursday December 07, 2006 @04:53PM (#17153002) Homepage
    "...the Cell is not just another chip: it is a giant leap beyond the current generation of computer processors into a nextgen muscle machine optimized for multimedia tasks."

    Anyone else react the same way I did?

    Fox News is now spinning CPU development?
    • Re: (Score:3, Insightful)

      by Broken scope (973885)
      If that was not a canned response I don't know what is. OMG IT TEH CELL!!!! IT SO FUCKING AMAZING LOLZ!!!! HAX!!!!!!
      Cellular processing. DO something with it on the console that could only be done on the console then TOUT IT DAMNIT.
      • I entered:

        "the Cell is not just another chip: it is a giant leap beyond the current generation of computer processors into a nextgen muscle machine optimized for multimedia tasks."

        and the fish sayeth:

        "Cell is not just another chip, it is a big buzzword buzzword buzzword processor buzzword buzzword buzzword buzzword."

        So it's not just another chip...it's a processor! :D
        • I have an IBM system journal about the cell. They mention some cool things. They admit to some limitations.. Then they think up ways to make games nickel and dime the hell out of you. Want to join a server. $.05, want a gun ohhhh $.20, ohhhhhh want to move... $1.00!!!!
    • naw. Then it would be a "nextgen freedom machine" [cue rousing patriotic music]...
    • Re: (Score:2, Funny)

      by steveo777 (183629)
      That's okay. They said the same thing about the Emotion engine years ago and everyone found out exactly how life-like all those images were when they fired up Evergrace [wikipedia.org] the first time. (oh, and if you haven't played it, save yourself the trouble and light firecrackers in both ears so you'll never hear again, next gouge your eyes out with rusty scalpels... trust me it would be a lot more fun)
    • Re: (Score:3, Insightful)

      by Quantam (870027)
      I think you're overreacting. While they may be exaggerating a bit, the Cell is a pretty insane piece of hardware. I'm a professional programmer (though not on the Cell), and I've been reading various architecture specs for the Cell. Its peculiar architecture means that it's difficult to make use of its full power for many types of tasks (don't ask me why they're selling Cell-based blade servers; it doesn't make much sense to me); but if you have an application that fits with what the Cell is optimized for,
      • Re: (Score:2, Insightful)

        by Pojut (1027544)
        noted, but something that raised the red flag for me:

        If it is truly as powerful as they say (for gaming purposes, of course) they wouldn't need to talk it up. They would simply say "hey, you will see...what you will experience will be beyond what I could convey to you here today"

        I THINK Nintendo did something along those lines, if I remember correctly...and now they have the little console that could going
        • Re: (Score:3, Interesting)

          by be-fan (61476)
          The article doesn't even involve Sony talking up the chip. It's an IEEE article.

          A lot of the Cell press has nothing to do with Sony, actually. There are a lot of EE/CompE types who get a hard-on over Cell for the same reason they do for Itanium (simple, fast hardware driven by complex compilers).
          • by imroy (755)

            Are you seriously trying to compare the Cell with the Itanium? The problem with Itanium (and probably all IA64) is that the CPU core cannot adapt to changing conditions. Everything's encoded into the instructions by the compiler and, AFAIK, cannot be rescheduled by the CPU. So scientific and engineering workloads written in FORTRAN run great, but nothing else does. It's pretty mediocre at common server tasks, e.g. web, email, etc.

            Now, the Cell is basically an under-performing PowerPC G5 core with 7 or 8 SP

            • by be-fan (61476)
              Way to miss the point COMPLETELY.

              Both Cell and Itanium remove complexity from the hardware, in order to fit more hardware into the available space, at the expense of more complexity in the compiler. The Itanium does this by using VLIW and eliminating OOO, and the Cell does this by eliminating OOO in the PPE and SPE, and eliminating dynamic branch prediction and hardware-managed cache in the SPE.

              Hardware guys jack off to this sort of thing, because they don't have to write software for the damn things. That's
              • by imroy (755)
                Ok, so both the Itanium and Cell have simplified but highly parallel hardware. But where is the complexity in the Cell compiler? Everything I've read about the Cell says that programmers have to break up their code themselves for the SPEs, not the compiler.
                • by be-fan (61476)
                  IBM is working on a compiler to split that stuff up, but that's sort of beside the point. Writing a compiler for Cell is more traditional than writing one for Itanium, but it's still really hard. The SPE and PPEs are in-order, and unlike Itanium, don't have any special latency hiding mechanisms like the advanced-load-table. They also have long pipelines (~18 stages), high cache latencies (5 cycle L1 on the PPE, 6-cycle LS load on the SPE), and poor (PPE) to no (SPE) branch prediction. All the technology exi
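                  As a small aside on what "poor to no branch prediction" pushes onto compilers and programmers: one common workaround is branch-free selection, roughly the job the SPE's hardware select instruction does. A minimal sketch in plain C (the function name is mine, used only for illustration):

```c
#include <stdint.h>

/* Branchless max: build an all-ones/all-zeros mask from the comparison
 * and blend the two values -- the kind of idiom compilers lean on for
 * cores with weak or no branch prediction, instead of a conditional jump. */
int32_t branchless_max(int32_t a, int32_t b) {
    int32_t mask = -(int32_t)(a > b);   /* 0xFFFFFFFF if a > b, else 0 */
    return (a & mask) | (b & ~mask);
}
```

                  No branch means no mispredict penalty: the cost is the same fixed handful of ALU operations on every call.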
                • by Retric (704075)
                  The programmer needs to find some use for 7 SPEs, but the compiler lets you use each SPE to its fullest.

                  Basically, the Cell is an 8-way chip designed around having smart programmers and smart compilers that optimise for it. You can't make a desktop CPU based on a design like this because you need to run old software fast, but when starting over you can go for it.
              • The Itanium looks great to hardware engineers until you realize that it executes BOTH paths from a branch (until it can verify the correct branch direction), then throws out the incorrect path. When you combine this with the fact that the compiler cannot always schedule each VLIW with the full 3 instructions (due to dependencies between instructions), you've got a mediocre architecture for general-purpose code. Still, at least the Itanium is a scalar processor with multiple execution pipes - this is much
          • (simple, fast hardware driven by complex compilers)

            Unfortunately, the only compiler that can produce decent code for the Cell is the most complex, expensive, and undependable compiler on the market today.

        • by tbannist (230135)
          Actually, "they" don't "need" to talk it up. But, marketing people get bored too, you know.
          • by Pojut (1027544)
            Marketing is like a bar with a four drink minimum, what are you talking about?
            • by tbannist (230135)
              Exactly, nobody havers like marketing!

              And occasionally they have to put out, you know, stuff so that the people who hired them think they're actually working.
              • by Pojut (1027544)
                you mean like a consultant, dogbert style?

                "Here is how it works. I will walk around with large empty binders for a couple weeks. At the end of the couple of weeks, I will call your buisness a failure and give you the bill."
      • Re: (Score:3, Insightful)

        by RzUpAnmsCwrds (262647)

        Its peculiar architecture means that it's difficult to make use of its full power for many types of tasks (don't ask me why they're selling Cell-based blade servers; it doesn't make much sense to me); but if you have an application that fits with what the Cell is optimized for, that thing is ungodly fast.

        So is a GPU. So is a DSP. So is an FPGA. So is an ASIC.

        There have always been ICs that are "insane" compared to CPUs - the CPU's power comes not from its raw performance, but in its ease of programmability

        • by ClamIAm (926466)
          My GeForce 6200 can do more GFLOPS than the fastest Core 2 Duo, but you're not going to run Linux on it.

          I see your 6200 and raise you a Tamagotchi.
        • My GeForce 6200 can do more GFLOPS than the fastest Core 2 Duo, but you're not going to run Linux on it.

          O rly? [nvidia.com]

          (Yes, I'm aware that the 6200 isn't CUDA-compatible. And yes, I'm aware that CUDA doesn't compile all the code to run on the GPU. It is getting closer, however...)

        • by adam31 (817930)
          But calling it a "revolution" is just plain wrong. It's just a better, more integrated version of what we have been doing for years.

          See, it's such an integrated version of what we've been doing for years, that I would call it a revolution... not to mention the education possibilities it opens. Technologically, it takes these different facets of computing-- Supercomputers using high-bandwidth interconnects (turns into 25/30 GB/s IOIF and 300 GB/s EIB), GPUs using separate superscalar vector processors +

      • by 7Prime (871679)
        But the problem is, for all the power increase you get, you have to spend that much more time and money to actually take advantage of it. To make bigger, more immersive worlds, you have to invest in more programmers, more art designers, more architects, etc. The question is, are game companies ready and willing to take advantage of this power to its full extent? Or will it simply be a distraction from what's really important: making a good game.
        • Re: (Score:3, Insightful)

          by xero314 (722674)

          But the problem is, for all the power increase you get, you have to spend that much more time and money to actually take advantage of it.

          This is only true in the sense that unified libraries or processes need to be conceived and/or developed before it becomes easy to take advantage of the more complex architectures. This can be seen in the history of GPUs where originally it took specialized knowledge of the specific GPU to get the most out of it, but now standardized APIs have been developed which allow

          • In the end developing for the Cell will be as easy as developing for any processor

            No it won't, because to take advantage of the Cell you have to write parallel code. Any way you slice it, that's always going to be inherently harder than writing single-threaded code.

            • by xero314 (722674)

              No it won't, because to take advantage of the Cell you have to write parallel code.

              Incorrect. Your program must compile to parallel instructions; an important and not very subtle difference. This can be hidden by a good library. The most you should be expected to do is manage threading, which exists in most architectures out there. The fact that a thread would be run on an SPE rather than a general purpose CPU should be transparent with the right APIs. The SPEs are Turing complete and so should
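              A hedged sketch of that claim: in the plain C below (names are mine, and a Cell-aware runtime that maps threads to SPEs is hypothetical), the caller only manages threading through the portable pthread API and never says where a task runs; a library could place the worker on an SPE or a general-purpose core without the calling code changing. Compile with -pthread.

```c
#include <pthread.h>
#include <stddef.h>

/* A task handed to the threading API; the caller never specifies *where*
 * it runs -- a Cell-aware runtime could dispatch it to an SPE instead of
 * a general-purpose core without the calling code changing. */
typedef struct { const int *data; size_t n; long sum; } sum_task_t;

static void *sum_task(void *arg) {
    sum_task_t *t = (sum_task_t *)arg;
    t->sum = 0;
    for (size_t i = 0; i < t->n; i++)
        t->sum += t->data[i];
    return NULL;
}

/* Sum an array on two threads via the portable pthread API. */
long parallel_sum(const int *data, size_t n) {
    sum_task_t lo = { data, n / 2, 0 };
    sum_task_t hi = { data + n / 2, n - n / 2, 0 };
    pthread_t tid;
    pthread_create(&tid, NULL, sum_task, &hi);  /* second half on another thread */
    sum_task(&lo);                              /* first half on the calling thread */
    pthread_join(tid, NULL);
    return lo.sum + hi.sum;
}
```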

          • No disrespect, but have you even done any console or graphics development??

            The graphics pipeline is much more rigid and simple, compared to trying to best make use of a multi-threaded architecture. Not all algorithms are parallelizable; most are serial. Sure, you split the general execution of your game loop using N cores up into streaming, render, physics, audio, input, and AI, but each of the sub-components is still relatively single-threaded without adding further complications about dealing with multi-t
            • by xero314 (722674)
              Thanks for the insightful information. I like what you have to say and you hit the nail on the head in a roundabout way. Working effectively with architectures such as the PS2 and PS3 does require a change in thought, but interestingly enough it's a reversion to an older thought process rather than a newer one. I am a software engineer, though I have not worked on any game development in many years, but these newer architectures do get me interested again. If you look at the early PCs you will see tha
        • by be-fan (61476)
          That's not really true. More powerful hardware allows you to use higher-level programming techniques, pre-made game engines, etc, all of which reduce cost and reduce the time required. Also, a lot of game art is scalable, or can be made scalable. Shaders, for example, don't really care what the polygon count is, or what the output resolution is. Indeed, more power can help reduce art development time as well. Game models, for example, are usually developed at a high level, then the polygon count is pared do
      • Re: (Score:3, Informative)

        by xero314 (722674)

        don't ask me why they're selling Cell-based blade servers

        They are selling Cell-based servers, blade or otherwise, because the Cell processor was designed with Scientific Computing in mind. For those that don't know, this is the category of computing that is done on all Super Computers at this time. IBM is hoping to replace the current generation of x86/Power-based super computers, and super clusters, with Cell-based clusters. The current top-ranked Supercomputer is capable of 367 teraflops peak using 131072

      • don't ask me why they're selling Cell-based blade servers; it doesn't make much sense to me

        It's pretty clear that you are not involved in any projects requiring real-time voice recognition of hundreds of datastreams simultaneously, then. This is the only realistic architecture for that kind of task. It's also pretty good for many kinds of massive database processing, such as might be required for real-world artificial intelligence, such as face recognition from poor CCTV pictures, and problems such as computi

    • by big4ared (1029122)
      They should at least get their facts straight. The article continually states that the Cell has 1 PPU and 8 SPUs, which is WRONG. It has 1 PPU and 7 SPUs because one of the SPUs is disabled to increase yields.
      • by be-fan (61476)
        The article is right and you are wrong. Cell is a joint development of Sony and IBM. The ones this article is talking about, the ones that will be used in scientific computing and are manufactured by IBM, have 8 SPUs, because IBM throws away the ones that have a defective SPU. Sony also makes Cell processors, for use in the PS3, and that Cell is specced to have 7 active SPUs.
  • by Control Group (105494) * on Thursday December 07, 2006 @05:05PM (#17153266) Homepage
    See any serious problems with this story? Email our on-duty editor.

    Yeah, I sure do - it's about good news for the PS3.

    That can't be right.
  • Sony Hype Machine (Score:2, Informative)

    by Cadallin (863437)
    The only thing I've seen for the PS3 that looks even remotely impressive is White Knight, and I won't believe that would be impossible on the 360 for a second. If the PS3 has any edge at all over the 360, it's the one-year-newer Nvidia graphics chip. The PS3 MAY have more memory speed than the 360, but the 360 has twice the capacity (512 vs. 256), so I'd say that's a wash, especially since I'm not sure how the RAMBUS components in the PS3 compare latency-wise to the GDDR3 in the 360. I'm sick of the Sony
    • Actually, the nVidia chip probably isn't anything special. Given the date that the PS3 was supposed to ship and the specs that have been released, it looks like the RSX is basically a 7900. While a 7900 is a great card, it's nothing more than what the 360 can do (which is something of a modified ATi 1900).

      http://en.wikipedia.org/wiki/RSX_'Reality_Synthesizer' [wikipedia.org]
    • by cptgrudge (177113)

      Both Sony and Squeenix need to be taken down a notch or two.

      To be fair, Square Enix has titles for both the PS2 and Gamecube, though not the same ones. They've said within the past couple years that they'd like to go more multiplatform. There are new Crystal Chronicles games in development for the DS and Wii. And I don't think that they've definitively chosen a console for the next Kingdom Hearts game. They have a lot more titles other than just Final Fantasy.

      But I get your point. I'll take my ti

      • by be-fan (61476)
        The problem is that it's not just FF that's only on the PS2. So is Star Ocean, Xenosaga, Shadow Hearts, etc. The XBox and GC had a few RPGs here and there, but that was it. If you're an RPG fan (and there are a lot, the FF series is one of the best-selling of all time), then buying a non-Sony console just doesn't make any sense.

        This might change a bit, with the RPG scene looking better on the 360 (with Blue Dragon and whatnot), but Microsoft's presence in Japan is still minimal, and Japan is still where nea
    • Re: (Score:3, Insightful)

      by androvsky (974733)
      Last I checked, both systems had the same amount of memory. 512 MB. The Xbox 360 cpu and gpu have to share it, while in the ps3 the Cell gets 256 MB, and the gpu gets 256 MB. As simple as the math here is, this is the second time today I've seen someone post the "fact" that the 360 has twice the memory. Where is the 360 supposed to store geometry and textures? The ps3 nvidia chip really isn't anything special compared to the 360's ATI chip... the only really interesting thing is that the Cell has easil
      • The PS3 does have a competitor, one who will do much better than them for at least a couple of years. Nintendo is going to have a much stronger showing this time around than it did before. It's easily going to wipe the floor with the 360. Apart from GoW the 360 does not have anything really riveting, and the problem with GoW as a system seller is that it has a narrow focus for its audience. As time goes by I'm betting the PS3 will slowly overtake the 360 and near the end of their respective lifespans th
      • by Cadallin (863437)
        You are correct, and I apologize for my error. I'm so used to consoles being UMA systems I didn't read any further in the wikipedia article. Although, I think my general point, that I think the PS3 and 360 are largely equal in terms of performance and capability, stands.

        I remain EXTREMELY skeptical about Cell's vector performance being leveraged in the ways Sony keeps claiming. Your statement about Microsoft is very true. Personally I'm rooting for Nintendo to come back in a big way, although I don'

    • by big4ared (1029122)
      Actually, they both have 512 megs of RAM. The difference is that on the Xbox 360, the GPU and the CPU share the same bank of 512 megs (plus an additional 10 megs of EDRAM for the framebuffer) whereas on the PS3, 256 megs are dedicated to the CPU and the other 256 megs are dedicated to the GPU.
  • by Anonymous Coward on Thursday December 07, 2006 @05:24PM (#17153602)
    Back seven years ago I remember whiteboard discussions with other engineers, when we first started work on the first PS2 devkits, about where we hoped Sony would take the amazing technology we now had access to with the PS2 hardware. Cell is in essence exactly where we wanted to see Sony take the PS2 design philosophy.

    As game developers we spend a huge amount of our time 1) organizing data 2) feeding that data to someplace to operate on it 3) sending that data back to step one to repeat the process

    Cell's design makes our lives vastly simpler. It is an absolute dream to work with.

    The insanely high floating point power is what is talked about most with the Broadband Engine, but it is the memory architecture that is the best part of the architecture. The internal ring bus allows us to write code that hides memory latency.

    Writing for Cell is extremely straightforward. You have each SPU set up to operate on three regions of internal memory: 1) static data, 2&3) a double buffer of dynamic data. Data is being fed into one buffer while the SPU operates on the other. With this setup, optimal Cell code has all available SPUs plowing through data with very little latency from the memory subsystem.

    In many ways it is very similar to writing old-style code where you got your data into the chip's cache, operated on it, and then wrote that data back out to main memory or somewhere else. But with Cell you now have total control over how the data is loaded into your cache due to the SPU's ability to scatter DMA into local memory, and you have the internal ring bus to pass data around to other SPUs instead of having to go out to slow main memory, and of course you have 6-8 (depending on the hardware you are using) SPUs all running in parallel.
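    The double-buffer setup described here can be sketched in plain, portable C. This is a hedged simulation, not real SPU code: the "DMA" is just a memory copy and the names are mine; on real hardware the fetch of the next buffer is an asynchronous DMA that overlaps with the compute, which is what hides the memory latency.

```c
#include <stddef.h>

#define CHUNK 4  /* elements per local-store buffer */

/* Simulated DMA in: copy one chunk of "main memory" into a local buffer. */
static void fetch_chunk(const float *main_mem, size_t off, float *local) {
    for (size_t i = 0; i < CHUNK; i++)
        local[i] = main_mem[off + i];
}

/* The compute kernel: operate on a local buffer in place (here, scale by 2). */
static void process_chunk(float *local) {
    for (size_t i = 0; i < CHUNK; i++)
        local[i] *= 2.0f;
}

/* Double-buffered loop: buffer `nxt` is filled while buffer `cur` is
 * processed. On an SPU the fetch would run as async DMA concurrently
 * with the compute. Assumes n is a multiple of CHUNK. */
void run_double_buffered(float *data, size_t n) {
    float buf[2][CHUNK];
    size_t chunks = n / CHUNK;
    if (chunks == 0)
        return;
    fetch_chunk(data, 0, buf[0]);                     /* prime the first buffer */
    for (size_t c = 0; c < chunks; c++) {
        size_t cur = c & 1, nxt = cur ^ 1;
        if (c + 1 < chunks)
            fetch_chunk(data, (c + 1) * CHUNK, buf[nxt]);
        process_chunk(buf[cur]);
        for (size_t i = 0; i < CHUNK; i++)            /* simulated DMA out */
            data[c * CHUNK + i] = buf[cur][i];
    }
}
```

    The static-data region from the description above would simply be a third buffer that is loaded once and never swapped.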

    It is wonderful that every PS3 is set up to easily allow installing Linux and having access to the Cell devkit. There is a wonderful world beyond the archaic x86 architecture just waiting for you.

    • Cell is in essence exactly where we wanted to see Sony take the PS2 design philosophy.

      Yes, it's ideas introduced with the PS2, taken to the "next level". SPEs instead of the VUs
    • As game developers we spend a huge amount of our time 1) organizing data 2) feeding that data to someplace to operate on it 3) sending that data back to step one to repeat the process

      I assume that most of this "operating on it" involves floating point operations on triangles.

      And I suppose that most of the results are essentially triplets of eight bit RGB [red, green, blue] values [i.e. your results are expressed in 24-bit color], and I assume that you rarely venture much beyond screen sizes of about 16
      • Re: (Score:3, Interesting)

        by shplorb (24647)
        They can, but anyone making a game with maps large enough to cause such issues commonly splits up the map into chunks with their own co-ordinate space and re-centres the global co-ordinate space's origin to the origin of each chunk as the camera moves into it.

        Animating objects like characters and such have all of their calculations performed in their local co-ordinate space before the result is transformed into world space.

        Most also use the scale of 1.0f = 1 m, so you'll be going on for a few km before
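        The precision problem that motivates this per-chunk re-centring fits in a couple of lines of C (the function name is mine): a 32-bit float carries roughly 7 significant decimal digits, so at a 1.0f = 1 m scale, small position deltas are eventually swallowed entirely by rounding as the co-ordinates grow.

```c
/* Returns nonzero when adding `step` to `pos` is lost to float rounding,
 * i.e. the sum rounds right back to pos. This is why large worlds keep
 * co-ordinates small by working in per-chunk local space. */
int step_is_lost(float pos, float step) {
    return pos + step == pos;
}
```

        At 1 km from the origin, millimetre steps still register; past 2^24 metres (~16,777 km), even whole-metre steps vanish.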
  • Unbiased Source (Score:3, Insightful)

    by HappySqurriel (1010623) on Thursday December 07, 2006 @05:40PM (#17153894)
    " article features Insomniac Games, who developed the PS3 launch title Resistance: Fall of Man."

    Which is a game that is published by Sony and developed by a company that is owned by Sony ...

    What's next "Bungie, the Developers of the XBox 360's highly anticipated shooter Halo 3, have announced that the XBox 360 is Super Powerful and that Sony Rapes Babies!"

    I want to hear from EA, Ubisoft, Activision and Sega (i.e. companies which have little interest in the platform) on which is easy/hard to develop for; so far EA has said that next-gen development is insanely expensive.
    • This is the real problem with the article. It has nothing to do with the Cell per se, just that we're getting PR crap instead of engineering information. Ask someone whose salary is not tied to the success of the platform.
    • by DarkJC (810888)
      Insomniac is independent. They've so far chosen to release on the PlayStation and be published by Sony, but in an interview I read with Ted Price they are still completely independent from Sony.
      • by PingSpike (947548)
        So... what you're saying is that a company that releases PlayStation 3 exclusives should have no bias for the PlayStation 3 at all then. That makes sense; after all, it's not like they've tied their very success to that particular console or anything.
        • by DarkJC (810888)
          No, I'm correcting the parent. Bungie is owned by Microsoft, of course they'll say wonders about the 360. Insomniac is not owned by Sony, therefore the comparison wasn't fair. Stop sticking words in my mouth.
    • I couldn't have said it any better.

      Well, maybe I would have left out the "raping babies" comment, and that would have probably been at least slightly better.
    • As has now been said twice, Insomniac is not owned by Sony. They have extremely close relations with the Sony second party Naughty Dog, and tend to be published by Sony on Sony platforms (I think they've made a game or three not on a Sony platform though), but they're indie.

      Of course, they're so far up Sony's ass you'd have to dig in there with that drill made from unobtainium from that horrible movie "The Core" to find them.
