Supercomputing PlayStation (Games)

Eight PS3 'Supercomputer' Ponders Gravity Waves

Jamie found a story about an inexpensive supercomputer being used by an astrophysicist to research gravity waves. The interesting bit is that the system is built using 8 PS3s. Since nobody is actually playing games on the system, it makes sense to use them for research projects like this, but I really wonder now what defines 'Supercomputer'... I mean, a hundred PS3s, sure, but 8? I think we are de-valuing the meaning of the word 'super' :)
  • by zifferent ( 656342 ) on Wednesday October 17, 2007 @10:53AM (#21010501)
    64 CPUs. That seems supercomputerish enough for me.
  • Not surprising... (Score:3, Insightful)

    by grocer ( 718489 ) on Wednesday October 17, 2007 @10:55AM (#21010521)
    I believe people were clustering PS2s for research shortly after the release of the Linux kit...cheap processing power is cheap processing power.
  • Devalued super (Score:4, Insightful)

    by Teese ( 89081 ) <beezelNO@SPAMgmail.com> on Wednesday October 17, 2007 @10:56AM (#21010545)

    I think we are de-valuing the meaning of the word 'super'
    I'm pretty sure we devalued super when the PowerMac G4 [youtube.com] was claimed as a supercomputer all by its lonesome.

    Super is a relative term; what was a supercomputer is now the computer I handed down to my mom so she could check her email and browse the web.

  • devaluing super (Score:4, Insightful)

    by mihalis ( 28146 ) on Wednesday October 17, 2007 @11:00AM (#21010607) Homepage

    Well, the guy used to use a 200-node parallel supercomputer, but now he prefers to use 8 PS3s. That, to me, proves that 8 PS3s are like a supercomputer TO HIM.

    I'm sure there are faster setups available if he had the money, but having 100% of 8 PS3s indefinitely is, from what he says, preferable to the costly little slices of "real" supercomputers he tried to rent before.

    I wonder if Sony could offer an "HPC PS3" that provided a stripped-down processor board without the shiny case, graphics memory, etc. It would be interesting if the Cell processor could get better economies of scale.

  • Re:devaluing super (Score:4, Insightful)

    by TargetBoy ( 322020 ) on Wednesday October 17, 2007 @11:03AM (#21010665)
    Wouldn't it rather be IBM that might offer this, since they actually make the Cell?
  • by The13thSin ( 1092867 ) on Wednesday October 17, 2007 @11:03AM (#21010671)
    For a supercomputer, that's pretty cheap. Also, I find the statement in the summary that there are no games to be played on the PS3 a bit childish. The PS3 has not been out for a year yet, there are multiple great games for it right now, and even more are coming very soon. I expected more from the Taco.
  • by JamesRose ( 1062530 ) on Wednesday October 17, 2007 @11:17AM (#21010897)
    I could be wrong, but isn't it actually quite expensive? Within those 8 PS3s you're buying 8 very high-end graphics chips (possibly integrated, but still there), which surely bumps up the price by quite a large amount. Would it not have been easier to buy components?
  • by NeilTheStupidHead ( 963719 ) on Wednesday October 17, 2007 @11:18AM (#21010913) Journal
    I think the article implies that PS3s aren't flying off the shelves as fast as Sony might like and thus are sitting in a warehouse somewhere, otherwise going unused. Even the article claims that this was done mostly because of the open platform presented by Sony and the fact that this researcher was able to get the consoles free from Sony. This is great for Sony because a sold console is money in Sony's pocket regardless of who buys it and what they do with it. If they can convince researchers to buy PS3s, then it's probably a better deal than selling them to gamers. Few gamers would buy the equivalent of 7 PS3s (about $2800) worth of games and accessories. Some will, but most won't; even when they do, it's spread over the life of the console. A researcher goes and gets 8 consoles, cash up front, and there's $3200 for Sony (less taxes, mfg. costs, etc.).

    Maybe it's just me, but that sounds like a pretty good deal for Sony.
  • by flaming-opus ( 8186 ) on Wednesday October 17, 2007 @11:23AM (#21011007)
    64 cores does not make a supercomputer. There are database servers with more cores than this, and there have been for years. Technical computer, sure. Maybe even high-performance computer. Definitely NOT supercomputer. 8 systems, that's what? 4GB of RAM? There are laptops that can hold that much memory.

    If you went to a technical conference like, for example, Supercomputing '07, you would get laughed off the floor calling that a supercomputer. "Supercomputer" is a changing definition, but I don't think I'd call anything a supercomputer that didn't have at least 1TF of peak double-precision performance and at least 200GB of RAM.
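    For scale, a back-of-the-envelope sketch in C. The figures are my own assumptions, not from the article: 6 SPEs usable under PS3 Linux and roughly 1.8 GFLOPS of peak double precision per first-generation SPE.

        #include <stdio.h>

        int main(void)
        {
            /* Assumed figures, not from the article: PS3 Linux exposes 6 of the
             * Cell's 8 SPEs, and a first-generation SPE peaks at roughly
             * 1.8 GFLOPS in double precision. */
            const int    consoles          = 8;
            const int    spes_per_console  = 6;
            const double dp_gflops_per_spe = 1.8;

            double cluster_peak = consoles * spes_per_console * dp_gflops_per_spe;

            printf("Estimated cluster peak: %.1f GFLOPS DP (vs. a 1000 GFLOPS bar)\n",
                   cluster_peak);
            return 0;
        }

    Even with generous assumptions that lands around 86 GFLOPS, an order of magnitude under the 1TF double-precision bar suggested above, before you even get to the RAM.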
  • by QuantumFTL ( 197300 ) on Wednesday October 17, 2007 @11:26AM (#21011065)
    Disclaimer: I hate the PS3 (though I love the Cell, just not for gaming, because it's too complicated for most game programmers to handle). I love my Xbox 360 and Wii (as long as they both continue to function and don't break).

    Since nobody is actually playing games on the system, it makes sense to use them for research projects like this

    Yes, because ~4 million people count as "nobody". But seriously, am I the only one that's tired of troll article summaries around here? It's either a flippant comment like that, or some asinine, leading question at the end, like "Could [people who are professionals and therefore have a clue unlike submitter who only skimmed the article in question] finally be getting it right?"

    Slashdot is where I go for excellent commentary - I've tried reading comments on sites like Digg or Reddit, and neither can compete with whatever strange and wonderful force it is that guarantees at least some highly-moderated comments on this site are really worth reading (often more so than the article, which is probably why no one reads it anyway). But now that we have the firehose, etc., I say we should start punishing stories early for this kind of trolling, tag them as such, and maybe even put up some prepublication commentary. I've only submitted a few articles, but I know that, despite popular belief, the editors *do* edit what is written, and maybe, just maybe, we can reduce this annoyance.

    Of course I know there are many more important problems in the world than the submitter being an ass, but this is one I can do something about - and so can you.
  • by Jarjarthejedi ( 996957 ) <christianpinch@g ... om minus painter> on Wednesday October 17, 2007 @11:31AM (#21011143) Journal
    Yeah, but in this case it's not /. doing the PS3 bashing; it's the actual article from Wired. Just look at the first sentence:

    "Suffering from its exorbitant price point and a dearth of titles, Sony's PlayStation 3 isn't exactly the most popular gaming platform on the block."

    Looks like /. isn't the only PS3-hating news source out there, eh? :P
  • by chrysrobyn ( 106763 ) on Wednesday October 17, 2007 @11:39AM (#21011267)
    256k of local storage for each SPE is a problem, but you can code around it. (It's not trivial, but it can be done.) You can't code around having a small amount of RAM and still maintain high performance.

    Let me see if I get this straight: you can imagine a piece of code that doesn't mind churning on itself within 256KB, but you can't imagine having to keep 256MB of main memory fed from a network or disk? In my experience, any piece of code that can both benefit from extreme parallelism and fit both the code and enough data to be worth working on within 256KB can handle a few reads from a disk or the network once in a while. If it can't, then 256KB of memory isn't enough to keep the (sub)processor fed, and you need a machine with more on-die memory (many of which can be found).

    Cell is very good at integers and single precision floats for workloads that are parallelizable and fit within 256KB. If you stray from any of that, there are plenty of interesting competitors.
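    To make the "fed from a network or disk" point concrete, here's a minimal sketch of that kind of streaming: keep one modest buffer resident and feed the processor from disk in fixed-size chunks. Plain portable C, nothing SPE-specific; the file name and chunk size are placeholders.

        #include <stdio.h>
        #include <stdlib.h>

        #define CHUNK_DOUBLES (1 << 20)   /* ~8 MB of doubles per read, arbitrary */

        int main(void)
        {
            /* "data.bin" stands in for whatever the workload streams through. */
            FILE *f = fopen("data.bin", "rb");
            if (!f) { perror("fopen"); return 1; }

            double *buf = malloc(CHUNK_DOUBLES * sizeof *buf);
            if (!buf) { fclose(f); return 1; }

            double sum = 0.0;
            size_t n;
            /* Process one chunk at a time; memory use stays constant no matter
             * how large the data set is. */
            while ((n = fread(buf, sizeof *buf, CHUNK_DOUBLES, f)) > 0) {
                for (size_t i = 0; i < n; i++)
                    sum += buf[i] * buf[i];   /* stand-in for the real kernel */
            }

            printf("result: %g\n", sum);
            free(buf);
            fclose(f);
            return 0;
        }

    If the per-chunk work is heavy enough, the occasional fread disappears into the noise, which is the whole point.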

  • by tb()ne ( 625102 ) on Wednesday October 17, 2007 @11:45AM (#21011395)

    Oh, OK. Then I guess they've just been cutting costs for fun. And Microsoft didn't lose billions of dollars on the original XBox.

  • by Fross ( 83754 ) on Wednesday October 17, 2007 @11:48AM (#21011427)
    8 games? Yikes, either you play *everything* or you've got some real crud in there. Care to elaborate? (I hope one of them isn't Lair ;) )

    To give an idea, the top 8 games on PS3 get metacritic scores of 85 or more ( http://www.metacritic.com/games/ps3/scores/ [metacritic.com] ). Only one of those is over 90.

    To compare, the 360 has *27* games at 85 or more ( http://www.metacritic.com/games/xbox360/scores/ [metacritic.com] ), 9 of which rate 90 or more.

    For me, of those 8 games I'd be interested in 4, 2 of which are also available on PC.

    I'm glad you're enjoying your PS3 for gaming (hell, competition is what keeps things improving), but the general sentiment is that the PS3 needs a killer app (like a Halo, Gears of War, or some other really good exclusive title) to make it worth getting.
  • by lena_10326 ( 1100441 ) on Wednesday October 17, 2007 @12:02PM (#21011663) Homepage

    You can't code around having a small amount of RAM and still maintain high performance.
    I wouldn't agree with that. That's only true if the algorithm relies on access to the entire data set because it requires random access or multiple table scans. Lots of algorithms can operate on small independent chunks or can be rewritten to use sequential data access, which is chunk friendly. I think it's apparent his algorithm works on small chunks, given the relatively small amount of RAM, unless his entire data set fits within 256MB. Either way, the fact that it's working for him implies the answer.

    Now, a lot of it depends on whether records are accessed once or multiple times. If it's once, the overhead is the same as loading it all up in memory and running the computation on the entire data set, because there's 1 chunk read per access for N reads per N accesses. If the algorithm has to revisit chunks, then you've potentially got >N reads per N accesses (even assuming a caching scheme is used), which kills performance if you're swapping chunks in and out or rescanning the data set from the beginning.

    So in summary, high performance is possible with "smallish" amounts of RAM if the following is true (a rough sketch follows this list):
    1. Chunks are independent. Results are not passed as input for processing the next chunk.
    2. The algorithm is CPU bound, not IO bound.
    3. One-time sequential access, optimized by prefetching.
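    Here's a rough sketch of point 3, overlapping the read of the next chunk with computation on the current one via POSIX AIO. Generic C, nothing Cell-specific; the file name, chunk size, and crunch() kernel are placeholders.

        #include <aio.h>
        #include <errno.h>
        #include <fcntl.h>
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include <unistd.h>

        #define CHUNK (4 * 1024 * 1024)   /* 4 MB chunks, arbitrary */

        /* Stand-in for the CPU-bound work applied to each independent chunk. */
        static double crunch(const unsigned char *p, size_t n)
        {
            double acc = 0.0;
            for (size_t i = 0; i < n; i++)
                acc += p[i];
            return acc;
        }

        int main(void)
        {
            int fd = open("data.bin", O_RDONLY);   /* placeholder input file */
            if (fd < 0) { perror("open"); return 1; }

            unsigned char *buf[2] = { malloc(CHUNK), malloc(CHUNK) };
            if (!buf[0] || !buf[1]) { close(fd); return 1; }

            struct aiocb cb;
            memset(&cb, 0, sizeof cb);
            cb.aio_fildes = fd;
            cb.aio_nbytes = CHUNK;

            /* Kick off the first read. */
            int cur = 0;
            off_t offset = 0;
            cb.aio_buf = buf[cur];
            cb.aio_offset = offset;
            if (aio_read(&cb) != 0) { perror("aio_read"); return 1; }

            double total = 0.0;
            for (;;) {
                /* Wait for the outstanding read into buf[cur] to finish. */
                const struct aiocb *list[1] = { &cb };
                while (aio_error(&cb) == EINPROGRESS)
                    aio_suspend(list, 1, NULL);
                ssize_t got = aio_return(&cb);
                if (got <= 0)
                    break;

                /* Start prefetching the next chunk into the other buffer... */
                int nxt = 1 - cur;
                offset += got;
                cb.aio_buf = buf[nxt];
                cb.aio_offset = offset;
                if (aio_read(&cb) != 0) { perror("aio_read"); return 1; }

                /* ...while the CPU-bound kernel runs on the chunk we already have. */
                total += crunch(buf[cur], (size_t)got);
                cur = nxt;
            }

            printf("result: %g\n", total);
            free(buf[0]); free(buf[1]);
            close(fd);
            return 0;
        }

    (Link with -lrt on older glibc.) Each chunk is read exactly once and no result feeds back into the next chunk, so it stays within the three conditions above.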
  • by PitaBred ( 632671 ) <slashdot&pitabred,dyndns,org> on Wednesday October 17, 2007 @12:09PM (#21011807) Homepage
    Completely OT, and in reference to your sig:

    It was decided that way because we discovered font kerning and non-monospace fonts. You don't need two spaces when the font rendering properly separates the words for reading. The two spaces were a holdover from monospaced typewriters.
  • by zippthorne ( 748122 ) on Wednesday October 17, 2007 @12:13PM (#21011875) Journal
    Yeah, but is that real money or is that subdivision expensing. In other words, does it lose $240 because Sony must use $800 of resources to produce a $600 product, or because sony-chipfab charges sony-board-assembly $60 for a part that cost $5 to produce?
  • by Anonymous Coward on Wednesday October 17, 2007 @12:14PM (#21011895)
    Seriously speaking: given that they are sold at a loss and the economies of scale are large, the PS3, while expensive as a gaming platform, is probably the cheapest way to get that sort of computing power.
  • Good call (Score:3, Insightful)

    by styryx ( 952942 ) on Wednesday October 17, 2007 @12:25PM (#21012073)
    I had a look at using multiple PS3s for simulations a while ago. Purely based on the ass-rocking-ness of the CELL chip.

    There are servers that use the CELL chip from IBM; see the Blade server. [wikipedia.org] But the Blade server is quite expensive; 8 PS3s at the UK price were cheaper the last time I looked. On top of all that is the 'pooling' that the CELL chip does. While this won't be that good for simulation (with current, popular implementations, e.g. MPI2), it will be awesome for games: succinctly, any process that requires extra 'power' can request another node from the 'pool' and release it back when it is under less strain. The transport latency (often the biggest latency in parallel computing, even with fibre-optic switches, unless it's a purely Monte Carlo sim...) is much reduced by having all processors on a single die. The architecture is a mix, with vector-based operations as well.

    Prima facie it would be perfect to use multiple PS3s. After speaking to some HPC chaps at Edinburgh Uni, though, I learned that the memory on the PS3s is pretty low (512MB split between video and conventional), which can be a pain if you want to perform REALLY big simulations (which, when scaling is accounted for, is pretty much the point of using supercomputers... not _necessarily_ speed; let's not make this the point of debate, it is simulation dependent). I will also add that the memory, though small, is bloody fast. If you can code to keep bloat completely removed (so you don't need many background processes) and split the memory requirements between the PS3s, then it is a really, really nice system. It takes a bit of effort and a learning curve, but there are many resources online, and native Linux support is an uber bonus for Sony (though I am considering NOT buying a PS3, or many, due to their media department's behaviour!).
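    For the MPI side, a minimal sketch of how a run might be split one rank per box. This is generic MPI in C; the step count and the simulate_slice() work function are placeholders of mine, not anything from the Cell SDK or the article.

        #include <mpi.h>
        #include <stdio.h>

        /* Placeholder for the per-node piece of the simulation. */
        static double simulate_slice(int rank, int nranks, long nsteps)
        {
            double local = 0.0;
            /* Each rank takes an interleaved share of the work. */
            for (long i = rank; i < nsteps; i += nranks)
                local += 1.0 / (double)(i + 1);
            return local;
        }

        int main(int argc, char **argv)
        {
            MPI_Init(&argc, &argv);

            int rank, nranks;
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);
            MPI_Comm_size(MPI_COMM_WORLD, &nranks);   /* e.g. 8, one rank per PS3 */

            double local = simulate_slice(rank, nranks, 100000000L);

            /* Combine the partial results on rank 0. */
            double total = 0.0;
            MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

            if (rank == 0)
                printf("total = %g\n", total);

            MPI_Finalize();
            return 0;
        }

    Launched with something like mpirun -np 8 across the boxes; the small per-box memory the Edinburgh chaps mention is exactly why each rank's slice of the problem has to stay modest.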
  • Re:Obligatory (Score:3, Insightful)

    by Maller ( 21311 ) on Wednesday October 17, 2007 @12:28PM (#21012105)
    Please enlighten me. Who is stupid enough to buy a million+ dollar computer without factoring in facility costs?
  • by Anonymous Coward on Wednesday October 17, 2007 @12:41PM (#21012287)
    Yeah, it's a good thing the article never calls it a supercomputer. It's just the usual lack of decent editing at Slashdot letting a poorly written blurb slip by again.
  • by Intellectual Elitist ( 706889 ) on Wednesday October 17, 2007 @12:43PM (#21012315)
    > 8 games? Yikes, either you play *everything* or you've got some real crud in there. Care to elaborate?

    The four games I was referring to were Ninja Gaiden Sigma (88 [metacritic.com]), Skate (85 [metacritic.com]), Stuntman: Ignition (75 [metacritic.com]), and Warhawk (84 [metacritic.com]). The downloadable game was Super Stardust HD (84 [metacritic.com]). None of those games are even remotely close to "crud".

    The four games I referred to having an interest in purchasing before the end of the year are Ratchet & Clank Future: Tools Of Destruction, Uncharted: Drake's Fortune, Rock Band, and the collector's edition of Stranglehold. I might also consider Army Of Two, Assassin's Creed, and Call Of Duty 4, depending on the reviews.

    > To give an idea, the top 8 games on PS3 get metacritic scores of 85 or more [...] Only one of those is over 90. To compare, the 360 has *27* games at 85 or more [...] 9 of which rate 90 or more.

    The original post had nothing to do with the 360 -- it was about the insinuation that no one uses the PS3 for gaming, which is ridiculous.

    You're also making an apples to oranges comparison, because the 360 has been out longer and has a much larger base of titles. But if you want to compare, as of October 13th Metacritic's aggregated ratings for the 360 [metacritic.com], PS3 [metacritic.com], and Wii [metacritic.com] show that the 360 has 264 rated games, the PS3 has 82, and the Wii has 87. Since the PS3 and Wii came out later than the 360 and around the same time as each other, this makes sense.

    If you look at the percentage of each console's library that has a metascore of 75 (out of 100) or higher, the PS3 leads with 54%, followed by the 360 at 44%, then the Wii with only 16%. If you go with a metascore of 80+, the PS3 has 34%, the 360 has 27%, and the Wii has only 8% above that level. At 90+ the Wii has 3%, the 360 has 3%, and the PS3 trails with only 1% of its library at that level.

    Going by percentages, the PS3 and 360 libraries are of roughly equivalent quality, while the Wii's lags far behind.

    > the general sentiment is the PS3 needs a killer app (like a halo, gears of war, or some other really good exclusive title) to make it worth getting.

    The general sentiment is also that Iraq was involved in 9/11 and that Britney Spears's personal life is somehow newsworthy. I'll think for myself, thanks.

    That said, every console gets a "killer app" eventually. I'm sure the inevitable God Of War III will fill that void if nothing else does beforehand.
  • by Hotawa Hawk-eye ( 976755 ) on Wednesday October 17, 2007 @01:13PM (#21012813)
    Giving away 8 consoles that will not generate money from game license fees and getting an article in Wired that's linked to by Slashdot is a good deal for Sony marketing.
  • Re:devaluing super (Score:3, Insightful)

    by mihalis ( 28146 ) on Wednesday October 17, 2007 @01:52PM (#21013405) Homepage

    You mean like this?

    Thought I replied to this, but can't see it.

    Anyway, yes, that's just the ticket, except it's $19k!!

    All of a sudden, racking up actual Sony PS3s with their curved shiny cases, graphics chips, etc. seems eminently sensible.

  • Not even close (Score:2, Insightful)

    by jefreyisnotzen ( 1091273 ) on Wednesday October 17, 2007 @03:06PM (#21014493) Homepage
    The term "supercomputer" is relative to what sits in the top tier of all computation-intensive platforms by current standards. This is a supercomputer by yesterday's standards, but not today's. It's important not to discredit yesterday's standards, but that shouldn't be done at the expense of today's by leaving out the one sentence that could have avoided this whole thread: a sentence that included "by yesterday's standards." But of course, there wouldn't be as many hits on the site if it were kept in perspective like that.

    Ones and zeros have no bias. The fastest is still the fastest, and that will always be relative. You can't blame Steve Jobs for this breach; we all do this. We are all biased. Ones and zeros don't have emotions or reality distortion fields. Every RDF is the responsibility of the person who let it out, no matter who did it first. If I believe something that is false, that's my responsibility. If I pass it on after it came from someone else, that's my responsibility as well, not the responsibility of the person before me, even if I was duped or didn't have all the facts.

    I guess the only remaining questions are: How long ago is yesterday? When are standards considered current? Computers don't account for future shock, but we have to, given how fast technology compounds, if we want our definitions to keep up with that evolution.
