Distributed Computing on Next Gen Consoles
anonymous lion writes "Wired has a story on the need for Xbox 360 and PlayStation 3 to support distributed computing with a non-gaming purpose. The article goes on to discuss SETI@home, distributed.net, and Folding@Home." From the article: "The next generation of console gaming is going to see a huge increase in machine performance and overall computing power. Already planned for both the Xbox 360 and the PlayStation 3 are multiple 3.2-GHz PowerPC processors capable of handling advanced gaming and graphics simulations, along with out-of-the-box internet capabilities such as Xbox Live Silver. With all that horsepower in a machine that is used for only a fraction of a day, we should offer gamers a chance to put these unused resources to good use."
"Unused resources"? (Score:5, Insightful)
It's not as if a game console is like a desktop PC, running the whole day just to be quickly accessible....
Re:"Unused resources"? (Score:4, Insightful)
They aren't? [hardwarezone.com]
Re:"Unused resources"? (Score:2)
I wasn't talking about the hardware, but about the usage pattern (no long boot times, usually longer play sessions, no multitasking) that differentiates a game console from a desktop PC.
Re:"Unused resources"? (Score:3, Informative)
I've never been a big fan of game consoles for that reason. I modchipped a few Xboxes for friends and played with XBMC a bit, but it was very much a toy in my eyes too. It also seemed like Microsoft was fighting our attempts to turn it into a PC at every turn. This next generation is going to be different from the looks of things, though. I found this quote particularly int
Unused entirely (Score:2, Insightful)
Do the "longer play sessions" of a game console continue through the night? This isn't the era of the NES, where 5-hour games didn't have a save feature *cough*Super Mario 3*cough* and players would leave the console on pause overnight. Besides, a TV can be used for only one thing at a time, and if it's not playing games through a console or playing DVDs through a console or other DVD player, it's either off, showing cable TV, or showing satellite TV.
Re:"Unused resources"? (Score:3, Insightful)
Re:"Unused resources"? (Score:3, Funny)
I've always thought it was an incredible shame that there are all these electric baseboard heaters out there that do just that: heat. It seems to my (possibly demented) mind that it would make more sense to have those heaters consist of processors doing some type of useful calculation.
So, in houses heated by electricity, maybe it would make sense to leave the PS3s/Xbox 360s on
Um, no. Electric heaters are only 40% efficient. (Score:2)
"They are, after all, an almost 100% efficient heater."
Electricity has to be generated. Most generation plants are around 35-40% efficient; CCGT is around 60%.
Electricity is an extremely inefficient way of providing heat. In houses heated or cooled with electricity, the most efficient thing you could do is rip out the heating and air conditioning and replace them with district heating and district cooling.
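For anyone who wants to sanity-check that, here is a minimal sketch of the end-to-end arithmetic in Python. The plant and CCGT efficiencies are the figures quoted above; the ~7% transmission/distribution loss is my own assumption, not something from this thread.

# End-to-end efficiency of resistive electric heating, fuel to room.
plant_efficiency = 0.40      # conventional thermal plant (upper end of the 35-40% quoted above)
ccgt_efficiency = 0.60       # combined-cycle gas turbine, as quoted above
grid_delivery = 0.93         # assumed ~7% transmission/distribution loss
heater_efficiency = 1.00     # resistive heaters are ~100% efficient at the outlet

for name, eff in [("conventional plant", plant_efficiency), ("CCGT", ccgt_efficiency)]:
    end_to_end = eff * grid_delivery * heater_efficiency
    print(f"{name}: ~{end_to_end:.0%} of the fuel's energy ends up as room heat")
# conventional plant: ~37%; CCGT: ~56%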
Re:"Unused resources"? (Score:3, Informative)
What planet do you live on?
Re:"Unused resources"? (Score:2)
Re:"Unused resources"? (Score:2, Insightful)
Re:"Unused resources"? (Score:2)
Re:"Unused resources"? (Score:2)
Besides, electrical energy is the most valuable form of energy, as it's the most versatile and the hardest to create. Heat is a form of energy that is a byproduct of most other energy transformations. Heating with electricity is a humongous waste of effort, and even if electricity
Re:"Unused resources"? (Score:3, Funny)
Well, technically you are correct, but that's only because 1/2 is a fraction. Actually, we're at 55% [naturvardsverket.se] at the moment.
And while I'm personally a supporter of nuclear power, I would even think of putting it in the same group of energy producers as wind, solar, or water.
That depends on the context. If we're talking about renewables, then fission-based nuclear plants are out. If we're talking about cheap, then solar and wind ar
Because this is changing. Maybe... (Score:3, Interesting)
Well, it may not be much of an issue now, but this is quickly changing.
Both Microsoft and Sony are playing with the idea that these game consoles will do more than merely play games. If they also have DVR functionality, advanced DVD capabilities, etc., then the day will soon arrive when people DO leave them on 24x7.
I have a TiVo, which is just a special-purpose computer. I wouldn't mind at all if it had a "power down" mode that would run a grid application such as trying to help cure cancer while it's n
Re:Because this is changing. Maybe... (Score:2)
(If someone in the US could work out how much 150 kilowatt-hours actually costs, it would be appreciated)
However... doesn't the noise drive you nuts? I've woken up in the middle of the night, and
Re:Because this is changing. Maybe... (Score:2)
3.9219 cents per kWh delivery charge x 150 = 588.285 cents = $5.88
plus
6.8590 cents per kWh supply charge x 150 = 1028.85 cents = $10.29
For a total of $16.17 per month.
This is usage alone and does not include taxes, stupid little surcharges, etc., which probably bring the total closer to $25.
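For reference, the same math as a tiny Python sketch (the two per-kWh rates are the ones quoted above and will vary by utility; taxes and surcharges are left out):

delivery_rate = 0.039219   # dollars per kWh, delivery charge quoted above
supply_rate = 0.068590     # dollars per kWh, supply charge quoted above
kwh = 150

delivery = delivery_rate * kwh   # $5.88
supply = supply_rate * kwh       # $10.29
print(f"delivery ${delivery:.2f} + supply ${supply:.2f} = ${delivery + supply:.2f} per month")
# delivery $5.88 + supply $10.29 = $16.17 per month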
3.2 GHz PowerPC ? (Score:2)
Are these not the same PPC used with Apple? (PowerPC 970 FX for the game systems, not sure about the G5)
MHz for MHz, a 3.2GHz PPC should kick the crap out of a 3.2GHz Pentium 4, and shouldn't be far behind--if behind at all--the performance of a P4 or AMD 3.8GHz (or whatever they're u
Re:3.2 GHz PowerPC ? (Score:2, Informative)
It's actually not known what kind of chip Nintendo will use, not the clock speed, not the features, nothing except the cod
Re:"Unused resources"? (Score:2)
Well, other than the fact that I like running SETI and stuff from home, I plan on using the Xbox 360 (and PS3?) as an internet chat client. I hope the VoIP software will help me keep in touch with my gaming friends.
Re:"Unused resources"? (Score:3, Insightful)
Re:"Unused resources"? (Score:3, Interesting)
I seriously doubt that this is an issue. I have yet to actually see a single computer that breaks for this reason. Fans, hard disks, and the like all break years before your electronics go bye-bye. And a fan constantly rotating 24/7 surely gets more wear than one that only rotates for 40 hours a week.
> It doesn't have anything to do with constant access
It's *all* about constant access. If comput
Re:"Unused resources"? (Score:2)
Maybe you didn't recognize it as the problem, but it does shorten lifespan. Especially of things like hard drives. Dust buildup will kill the fan long before the temperature shock does though.
Re:"Unused resources"? (Score:2)
Of the computers I've used (systems at work) that were NOT turned off regularly, one had its PSU fail during a power cut, one had an HD die in the same power cut, and another's motherboard apparently spontaneously cracked in two.
While we're on the subject, and
Re:"Unused resources"? (Score:2)
Re:"Unused resources"? (Score:5, Informative)
Shed the myth! Hard disks for the most part are now better designed than back in the day, systems boot very fast, and there is no need to keep your computer on if you will not be using it for a long time.
Re:"Unused resources"? (Score:2)
You haven't booted XP in a while, have you? Every service pack seems to make the boot-up process longer and longer. Even my somewhat pristine company-provided laptop takes much longer than I would like.
Re:"Unused resources"? (Score:2)
Maybe it helps if you get used to turning the system on before you need it. For example, I get up, tap the power button, and go brush my teeth; by the time I get back it's ready. Or I get home, turn the computer on, then go get a drink.
*pause* Although I think I just proved I spend too much time in front of the computer. I'm going to go outside now...
Re:"Unused resources"? (Score:2)
My 3.2GHz Pentium 4 takes 60 seconds to boot up. You do the math.
Re:"Unused resources"? (Score:3, Insightful)
Shed the myth! Power-saving modes for the most part are now better designed than back in the day, systems use very little power in standby, and there is no need to turn your computer off if you will not be using it for a long time.
Sorry, couldn't resist... anyway, I agree with most of your statement, I just
Re:"Unused resources"? (Score:2)
There are other reasons for failure (Score:2)
That's one reason why manufacturers try to develop materials [principalmetals.com] with controlled expansion rates to minimize temperature related stresses.
OTOH, there are other failure modes. For instance, migration of atoms in semiconductors becomes faster with higher temperatures. A semiconductor that isn't used very much will probably fail faster if it's kept continuously powered.
Of course, this discussi
Re:"Unused resources"? (Score:2)
Most people won't do it (Score:5, Insightful)
You forget HIGH SCORE (Score:2)
Seriously, if this is like Folding@Home [stanford.edu] that gets out of the way when the CPU is being used, it would still get some crunching done in the game chat rooms and the in-between-the-levels limbo modes. If there's enough computing power left over for live TeamSpe
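For the curious, the "gets out of the way" behavior is easy to picture: a worker niced to the lowest priority so the scheduler hands the CPU to the game first. This is a rough illustrative sketch in Python (Unix-style priorities, made-up work unit), not how the real Folding@Home client is built.

import os
import time

def crunch_one_work_unit():
    # Stand-in for a real work unit (a folding step, an FFT batch, etc.).
    total = 0
    for i in range(10_000_000):
        total += i * i
    return total

def run_in_background():
    os.nice(19)            # lowest priority: foreground code (the game) always wins the CPU
    while True:
        crunch_one_work_unit()
        time.sleep(0.1)    # brief pause so the worker never saturates a busy system

if __name__ == "__main__":
    run_in_background()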
Re:Most people won't do it (Score:2)
Unspoken subtext: 'cause they're overpowered (Score:5, Insightful)
Also, will users have a choice about whether their consoles' spare cycles are used this way, or will it happen without their consent or even overt knowledge? Will they be able to decide which project gets the use of their machine's time? And what if someone comes up with an entertainment use for those cycles...?
Re:Unspoken subtext: 'cause they're overpowered (Score:4, Insightful)
It's the ass, you n00b!
But this is irrelevant. The most sensible choice, and the one Wired is advocating, is a distributed client that runs when the system is not being used for gaming.
> will users have a choice about whether their consoles' spare cycles are used this way, or will it happen without their consent or even overt knowledge?
Obviously the more control the user has, the better. But anything would be better than nothing.
> Will they be able to decide which project gets the use of their machine's time?
See above.
> And what if someone comes up with an entertainment use for those cycles...?
No doubt it will result in a story being submitted to Slashdot.
Re:Unspoken subtext: 'cause they're overpowered (Score:3, Funny)
Re:Unspoken subtext: 'cause they're overpowered (Score:2, Funny)
Run program? [Yes][No]
That should do it.
waste power (Score:5, Insightful)
The average consumer LOVES to waste power and bandwidth to search for aliens. Folding, SETI, and others are good projects, but if Wired thinks the average console owner wants his console to suck power and bandwidth and make huge fan noise while he's not doing something with it, they may be seriously mistaken.
I'm sure the same people who run Linux on their Xbox will run Folding on their console, but not the majority of users, even if the console ships with that functionality.
Re:waste power (Score:2)
Re:waste power (Score:2)
I don't play well with the other kids (Score:3, Funny)
Theres a need? (Score:5, Insightful)
Re:Theres a need? (Score:4, Informative)
A 650W PSU doesn't draw 650W if it's only under 100W load.
So my 350W Enermax is perfectly happy drawing 50W when the PC is idle. Its efficiency may be lower, but that's not THAT huge of a difference.
AND PLEASE, learn your units. Saying "drawns more voltage then needed" really makes you look stupid.
If you put an HD, a 50W CPU, 512MB of high-speed RAM, and a GPU in a console, it doesn't magically use that much less energy than it would in a PC.
And using DC on a console defeats the whole purpose: using idle cycles, mostly on little-used computers.
If you turn the computer on to run the DC client, you are doing something wrong (and if you BUY stuff to run DC clients, please die).
It's the money, stupid. (Score:4, Insightful)
If the console manufacturers provide software that somehow taps the raw horsepower of the new consoles, what would stop organizations, legal or not, from buying large quantities of game systems just to make a supercomputer on the cheap? Fuck that.
If I had not preordered my PS2 a year in advance, I would have had to wait NINE months to be able to get one in the States. The demand for the new systems is going to be even greater. The last thing consumers need to hear is that there is a shortage of their favorite game system because Nerd University bought 10,000 systems for their new supercomputer project.
Shared computing is all fine and good for PC/Mac users, but honestly, for a manufacturer to open the floodgates of their OS to satisfy the wants of
Re:It's the money, stupid. (Score:2)
Re:It's the money, stupid. (Score:2)
That's a hypothesis, not a fact. Right now, it looks like the Xbox 360 and PS3 will be rather expensive, probably expensive enough to cover the hardware.
Furthermore, both machines will be far more general-purpose computers than curren
Re:Just the opposite (Score:2)
I believe you aren't seeing the point. Usually, when a console is first released, it is sold at an actual _loss_, so NU buying 10,000 will do nothing to speed up a price cut. In fact, since the game companies plan to recoup their losses on consoles with accessory and game sales
Re:Just the opposite (Score:2)
When the next gen consoles come out, they'll be using some really fast (and expensive) memory, but as that stuff becomes more widespread and common, prices should drop.
Sony might start out having some yield problems with the Cell or something. Yield issues are expensive. As those technical diffi
Re:Just the opposite (Score:2)
The only way NU could subsidize the R&D cost is if they bought 10,000 games. If they buy 10,000 consoles, it's the game manufacturer that's subsidizing the university. The price slash will occur after the manufacturer has sold enough games to get the total project cost to break even. If they sell more consoles without selling games,
Not feasible (Score:5, Informative)
I know that SETI@home has been ported and tested at least on the Xbox, and it performs miserably. These console gaming systems are designed to play games, not do radio signal analysis or other scientific calculation. For example, there's little need for fast memory writing when you're mostly reading textures from RAM, but there's an extreme need when you do millions of in-place Fourier transforms. Unless Microsoft and Sony change their architectures for some inexplicable reason, I can't imagine the future consoles would perform much better.
This article smacks of ignorance on the part of the author, who clearly did no research into the actual performance of consoles in regard to standard scientific computing.
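To give a feel for the workload being described, here is a toy Python/NumPy sketch of the "many FFTs over chunks of sampled data" pattern a SETI-style client runs. It is purely illustrative of why memory bandwidth matters and says nothing about how any particular console would actually perform.

import time
import numpy as np

# Fake a buffer of complex radio samples and sweep small FFTs across it.
samples = np.random.standard_normal(2**20) + 1j * np.random.standard_normal(2**20)
chunk = 2**13   # 8K-point transforms

start = time.perf_counter()
for i in range(0, samples.size, chunk):
    spectrum = np.fft.fft(samples[i:i + chunk])
    power = (spectrum * spectrum.conj()).real   # power spectrum, as in signal searches
elapsed = time.perf_counter() - start
print(f"{samples.size // chunk} FFTs of {chunk} points in {elapsed:.3f}s")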
Re:Not feasible (Score:2, Insightful)
Signal processing (Score:2)
I don't know precisely what sort of algorithm SETI@home uses, but the 7 SPE vector units in the Cell chip would be near-ideal for many types of signal processing (far, far more so than the original Xbox), so I think it's likely it'd work very well indeed.
If the owner could be bothered leaving it on all day - and if Sony
Re:Signal processing (Score:2)
It would be trivial for a games publishing house to include a client running in the background in their games. Of course, the client should run only during gameplay, so you could argue "but mom, I'm fighting cancer!"
Re:Not feasible (Score:3, Informative)
The problem t
Re:Not feasible (Score:2)
Why is the parent modded as informative? (Score:2)
> The author seems to do plenty of research on current distributed computing projects, but does none on how the consoles perform.
Apparently, from what you posted, you don't know jack shit about those new consoles' architectures...
> For example, there's little need for fast memory writing when you're mostly reading textures from RAM, but there's an extreme need when you do millions of in-place Fourier transforms.
PS3 has XDR-DRAM which is
Re:Why is the parent modded as informative? (Score:2, Informative)
This is exactly my point. Individually the processor may perform well, but when it's placed in the actual system, performance will undoubtedly drop. Right now, I'm doing performance tests on FFTs performed on GPUs (graphics cards). Theoretically, these should perform at the same "incredible" speed as the Cell processor (10 GFLOPS or better), but in reality bandwidth and cache constrict pe
Re:Why is the parent modded as informative? (Score:2)
Large FFTs 100 times faster on Cell (Score:3, Informative)
The Cell processor (PS3) is made for those applications. At the Power.org convention in Barcelona [power.org], IBM presented a programming example of large FFTs on Cell. It turned out [beyond3d.com] that large FFT calculations are about 100 times faster than on a 3.2 GHz Xeon processor.
Keep in mind that this presentation was held in front of supercomputer professionals, and it's not that easy to trick them.
It's not exactly a free resource. (Score:5, Insightful)
Distributed computing advocates always seem to neglect this. They think that all those unused CPU cycles are a vast, untapped resource just waiting to accomplish fabulous things. Well, as a guy who used to have a few boxes crunching RC5-64 for Distributed.net, I can tell you that it's not a free resource when you're the one paying the electric bill.
Joe Consumer isn't necessarily going to think this technology is a great idea when he realizes that he's paying an extra $10 a month on his electricity bill for the "privilege" of crunching numbers for some dubious cause.
And let's face it: not all distributed projects are dubious, but many are. The fundamental problem is that a lot of compute-intensive projects simply aren't embarrassingly parallel like SETI or RC5-64. And a lot of other parallelizable applications require access to huge datasets that make them unsuitable for distributed work. For example, 3D rendering can be parallelized pretty well... but the datasets are huge. For your CPU to render a single frame of Pixar's latest movie, it would need access to anywhere from hundreds of MB to several GB of texture and geometry data. A lot of scientific applications are similarly constrained.
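To make the "embarrassingly parallel" distinction concrete, here is a minimal Python sketch: each work unit needs only a tiny, self-contained input, so it can be shipped to any idle machine - exactly what a render job needing gigabytes of shared scene data cannot do. The function and the work inside it are made up for illustration.

from multiprocessing import Pool

def search_chunk(chunk_id):
    # Stand-in for checking one small, independent slice of a keyspace or signal.
    best = 0
    for x in range(chunk_id * 100_000, (chunk_id + 1) * 100_000):
        best = max(best, (x * 2654435761) % 1_000_003)
    return chunk_id, best

if __name__ == "__main__":
    with Pool() as pool:
        # Each chunk is independent; no shared multi-gigabyte dataset required.
        results = pool.map(search_chunk, range(32))
    print(max(results, key=lambda r: r[1]))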
Re:It's not exactly a free resource. (Score:2)
Re:It's not exactly a free resource. (Score:2)
Re:It's not exactly a free resource. (Score:2)
To help who? (Score:2, Troll)
Re:To help who? (Score:5, Interesting)
(1) All of the distributed applications that you mention release the results of their research as public scientific publications. Any company can use the results, but so can anyone else. Only journal subscriptions cost money, and free "e-prints" are generally available. All of the distributed applications that you mention are non-profit.
(2) Even if they were patenting the results (which they aren't -- see 1), it is better to have a patented result that one has to pay for than to have nothing. If I had breast cancer, I would rather pay $1000 for a test than be unable to get one because no company wanted to invest in it.
As a side rant (somewhat related to (2)), you say patents are inhibiting progress. But without the financial incentive that the breast cancer patent generated, the medicine would never have been developed. I'm sorry that so many people only work out of greed, but that's reality at the moment. And it actually works pretty well.
Re:To help who? (Score:3, Insightful)
OK, I stand corrected here. I have seen distributed computing come up where things were not going to be released back to the public, though. Most universities, including the aforementioned Stanford, are doing research with corporations who get to monopolize the results when something useful comes out of them (and taxpayers subsidize university research departments). Although this article doesn't indicate that one way or the other. It does give a link to the project, but I don't really want my or
The Walmartization of pharmaceuticals (Score:2)
Re:The Walmartization of pharmaceuticals (Score:2)
Distributed Computing for Worthy Causes? (Score:2)
Want to get paranoid? You do? Cool!
Figure: There are plenty of distributed computing projects out there, and it may not be easy to tell from your console's behavior what project you're actually contributing to. Now consider who makes those consoles:
Now imagine y
Altruism is not the best motivator (Score:3, Interesting)
That would probably be enough to motivate a lot more people to turn their machines over to SETI.
The idea that people are going to let their machines crunch away, for free, for no benefit, is pretty stupid. The first distributed computing project to offer any sort of tchotchke is likely to get a lot more help.
Re:Altruism is not the best motivator (Score:2)
This isn't necessarily true. If you look at the hundreds of thousands of people running Folding@Home, GIMPS (the Mersenne prime search), SETI@home, etc., they are all dedicating their CPU usage to the greater cause of their particular project. 99.9% of them will never get a direct personal return for their contribution, except for the rare few who, like for GIMPS, find a new Mersenne prime
Re:Altruism is not the best motivator (Score:2)
Sure. But look at the hundreds of *millions* of people who *aren't* running SETI@home, etc.
/. and other tech sites, along with various science and tec
You literally have several orders of magnitude more people not running these apps than people who are.
Given that
And yet the stupidity persists... (Score:2)
Re:And yet the stupidity persists... (Score:2)
Secondly: Ageia are targeting the PC platform. They've joined up with ASUS to make a plug-in card for PCs. As far as I know, there are no plans for their chips to turn up in next-gen consoles. This means, in particular, that they'll have to show that their card actually improves things. If it doesn't,
The future of business computing? (Score:3, Informative)
Perhaps IBM doesn't just want to sell chips to these people. Perhaps it has a reason for selling the PC division to Lenovo. Perhaps it sees an opportunity to create a business architecture in which the virtual business world runs on the server farm, while the graphics and sound capability of the very cheap clients delivers a superior user experience that makes users happy not to have a "PC" on their desk. Meanwhile the data mining and compute-intensive activities are farmed out to those clients while they aren't being used. Fault tolerant. Cheap to extend. And round objects to Microsoft.
another solution (Score:2, Insightful)
a) companies wouldn't spend money on building this into the console
b) most consumers cannot be bothered
There will be people who'd be interested, though. I'd try it if I had a PS3... However, with more and more use of clusters of smaller machines in place of large supercomputers, clusters of consoles have been built in universities and research labs (for example here [uiuc.edu] or here [uh.edu]). There are a few ad
PS3 Linux (Score:2)
not distributed, but possibly racks of them (Score:2)
But given the relatively low cost and simple setup of these machines, labs could buy racks of them and use them as compute nodes. Perhaps Sony and Microsoft can view any small per-unit loss they may take on these machines as subsidizing research.
Not an easy article for me to read. (Score:3, Interesting)
2. I run ClimatePrediction.net on my Mac and Linux x86 systems. The program is huge, comes from a mainframe environment, and is married to an Intel compiler. The PPC version is, needless to say, not very fast. Single work units can take months to complete.
The other projects in the article would be on my plate, too, if they compared with my concern for climate change.
mobile phones... (Score:2, Interesting)
Re:mobile phones... (Score:3, Insightful)
So to make it happen, consumers would probably have to suffer with shorter battery life or larger batteries. Given how neat everyone thinks it is to have a cell phone which they can lose inside their own ears, I just don't see it happening.
Maybe something
Re:mobile phones... (Score:2)
But an idle processor generally uses less power than a busy processor, and on battery-powered devices such as mobile phones, power usage is a _huge_ issue.
Folding Flaws (Score:4, Insightful)
Now the Captain is wondering how many of us actually leave our consoles on when not in use? Show of hands... Now! Hmmm, not too many. Now how many of you would actually like to pay extra on your electric bill to do it? Ouch. Even less. And finally, how many are going to mod their PS3 and actually download the app to make it happen? That leaves just about... nobody.
Re:Folding Flaws (Score:2)
You're not from the US? I'm from Germany, but for me it was quite shocking when I was working in our US branch office and realized that no one turned off their desktop computer at the end of the day. It was even harder to understand that nobody turned off their monitor (CRT or LCD) when they went home. Just a few computers were configured to put the monitor into power-save mode. I visited some of my colleagues and friends, and when we arrived, their computers and monitors were turned on - and had been the whole day. So I
Re:Folding Flaws (Score:4, Insightful)
But consoles are different, probably because just leaving them on doesn't really accomplish anything useful for 90% of the people. They boot nearly instantaneously and will have to load the media from scratch anyway, regardless of whether you leave them on or off. There's just no point. Unless I missed a clue somewhere, I can't EVER remember walking into ANYBODY'S room to find the console just left on, unless it was purely by accident. It's just not the trend, and stuff like SETI and Folding can't easily piggyback off something that isn't already an ingrained habit. Not a lot of people are going to change their habits just so a project can use their system.
I would use it for my distributed computing needs (Score:2)
I would love to be able to let my PowerBook or Mac Mini send compute jobs to a PS3 or Xbox 360. I suppose the PS3's Cell architecture could really crank out the MPEG-2 video.
Hopefully the distribut
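If something like this ever existed, the plumbing could be as simple as shipping a job over the LAN and reading back the result. Everything in this Python sketch - the address, the port, the wire format, and the idea that a console would accept such jobs at all - is hypothetical.

import json
import socket

CONSOLE_ADDR = ("192.168.1.50", 9999)   # hypothetical PS3/Xbox 360 on the local network

def submit_job(payload: dict) -> dict:
    # Send one JSON job, read one JSON reply. A real system would need auth,
    # chunking, retries, etc.; this is only the shape of the idea.
    with socket.create_connection(CONSOLE_ADDR) as sock:
        sock.sendall(json.dumps(payload).encode() + b"\n")
        return json.loads(sock.makefile().readline())

print(submit_job({"task": "encode_mpeg2", "frames": 240}))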
Give bonus levels, stronger guns as rewards (Score:3, Insightful)
Bert
Bonus slogan: Save a life and you get an extra life.
The Real Issues... (Score:2)
First of all, don't hold your breath. Running distributed computing apps on a console == running arbitrary code. We update these programs all the time behind the scenes. So you will only see these apps on consoles if these companies let you run any code you want - not going to happen. Never.
And, if that happens, consoles will all be busy as spam zombies, not as helpers to us. Bad news - that's where all the serious black hat money is these days.
That s
Tough sell (Score:2)
EON ( was Re:Run something useful instead) (Score:2)
It allows them to make detailed predictions about the dynamics of materials, truly a vital task. If we could predict material properties (hardness, tensile strength, conductivity, surface roughness, etc.) by playing with composition on a computer -- much cheaper than experiments, and much more controllable -- then we would have an entirely new realm
Not to burst your bubble... (Score:4, Insightful)
Anything like what you described (or any compensation for your CPU cycles) is unlikely to ever happen. Reason? Most of the organizations asking for your CPU cycles are either too poor or too cheap to give you anything in return. They can't even afford to pay for the power usage that you incur, let alone put anything towards your hardware.
And for what it would cost to create and maintain an MMOG like what you're talking about, at least one that people would be interested in playing, they could just buy an assload of computers (think $100 to $200 a pop for barebones systems) and plug them in.
Not that it isn't a cool idea, just not feasible. You have to see the organizations asking you to run this software for what they really are... beggars. Not that that's necessarily a bad thing, as long as you realize they will probably never have anything to offer you (other than the warm and fuzzy feeling of geekiness).
Re:There's no chance of this happening (Score:2)
Re:This IS a big step. (Score:2)
Re:Yeah right (Score:2)
But think about the PS3. Its processor does 256 GFLOPS, something no current consumer processor can even dream of doing. In a year, when it is released, I am betting it will still be ahead of the dual-processor machines. An ahead-of-the-curve console would make more of a splash than a behind-the-curve one.
Re:Question (Score:2)
For any serious scientific research, both 32-bit and 64-bit floating-point units are designed to have extra precision that isn't immediately visible to the application. The idea of this is to prevent any precision errors from creepi
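A quick sketch of the precision point in Python: a quantity that a 64-bit float keeps is silently lost in a 32-bit one. This just contrasts the two widths with NumPy and isn't specific to any console's FPU.

import numpy as np

x32 = np.float32(1.0) + np.float32(1e-8)
x64 = np.float64(1.0) + np.float64(1e-8)
print(x32 == np.float32(1.0))   # True: the 1e-8 falls below 32-bit precision and vanishes
print(x64 == np.float64(1.0))   # False: 64-bit still resolves it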
Re:thats all i need (Score:2)
Economy will go into a depression (Score:2)
> So that by the time you're 65 and have made wise investment choices with your money
"Wise investment choices"? In 1929, what investment choice was "wise"? And if you think the end of oil [lifeaftertheoilcrash.net] won't cause a depression, think again.