Distributed Playstation 3
withinavoid writes "News.com has a story up about the next-generation Playstation 3. Apparently game developers are asking for a 1000-times performance increase, and that's just not possible, so Sony is looking at distributed computing as a possibility. "
Tekken 4 (Score:1, Funny)
Playstation 3? It's already out!! (Score:5, Funny)
Re:Playstation 3? It's already out!! (Score:2)
Distributed (Score:2, Interesting)
Yes it is possible... (Score:2)
By the time they even get the spec done, the technology will be there... (SMP for the Playstation 3... that would be a no-brainer to really jack up the performance in there.)
Re:Yes it is possible... (Score:3, Insightful)
Why does everyone want to turn game consoles into PC's? I enjoy the simplicity of the modern console game; just pop in a cartridge or CD, and play. That's it. No sysfiles to configure, no add-ons to buy (at least none necessary to play most games; the N64 had a memory upgrade needed for certain games, most notably the latest Zelda release).
I just wanna play dammit!
It's an early April Fools joke (Score:2)
Distributed consoles? No. I can believe we'd have a distributed OS, a distributed computer, but these computers could never be used for games.
Re:Yes it is possible... (Score:2)
Look at peripherals for consoles. The only highly successful add-on for a console is basically the memory card (and also probably the Sony Dual Shock controller for the PSOne). Other peripherals don't have anywhere near the market penetration to make it worth coding for. For examples of this, look how many 4-player games came out for the Dreamcast, N64 and GameCube (which had 4-player support built in) versus the PSOne/PS2 (which required an add-on). Because of this, modularity is not going to happen.
but if I had the ability to switch from the standard video chipset to the ultra-insane-fast $399.95 video upgrade option that adds the physics module for jiggly breasts to the fighters in virtua fighter 73, then dammit I'll buy it!
You might, but history shows that most people wouldn't. Even games that required a measly memory upgrade on the N64 didn't sell nearly as well as games that did not. People (read: parents) don't want to have to figure out what needs to go with a game to play it. They want to know what system the game is on, and if their kid has that system.
Atari (Score:3, Funny)
Just think how much faster PONG will run!
Fantastic! (Score:4, Funny)
distributed game playing? (Score:2, Funny)
i'm not so sure this is a great idea.
How the dialogue really went... (Score:5, Funny)
Developers: We want a 1000 times speed increase
Sony: Would you settle for a press release containing a bunch of buzzwords?
Developers: Which ones?
Sony: Let me think: "distributed computing", "biotechnology", "linux", "grid computing" and "Moore's Law"
Developers: OK, if you throw in some hookers at the next Comdex in Vegas
Sony: Deal
Re:How the dialogue really went... (Score:2, Funny)
Developers: OK, if you throw in some hookers at the next Comdex in Vegas
It would have been funnier if you said "more hookers", or "midget hookers", or "hookers with donkeys". The base hookers is a given.
Re:How the dialogue really went... (Score:5, Funny)
Re:How the dialogue SHOULD HAVE gone... (Score:5, Funny)
Hardware guys: *kick the developers in the nerts* Give us games 1000 times better and we'll think about it, you pathetic freaks.
Re:How the dialogue really went... (Score:2)
How about games that look like real life? The computation required to simulate reality (touch, taste, etc., as well as very high-res 3D graphics) would put that kind of power to good use.
perhaps... (Score:2)
So, what is all the extra performance for?
Kewl .... (Score:2, Interesting)
Previous experience rendering with an 8x P3 cluster running PVM over 100BaseT already shows some signs of message-passing problems, so I guess they won't be using it for rendering then.
But it's still interesting to think about what the internet could offer them.
Maybe the distributed net would be in the "neighbourhood" area rather than internet-based, hehe.
I really hope they'll lower their prices no matter what they come up with
Sony Hype Machine (Score:5, Informative)
It would appear to me that hyping a PS3 while the PS2 is still selling strong would be an attempt at keeping people from getting an Xbox or GameCube.
Re:Sony Hype Machine (Score:2)
You're just seeing the market in action. Calculated pre-release information helps keep people talking, and in some cases can't be helped.
Re:Sony Hype Machine (Score:2)
US companies release the game over there, and then six months later we might get it. That's BS.
Editorial math? (Score:4, Funny)
Maybe they're designing the next generation Playstation on a Pentium machine. Did they ever fix that bug?
-Jeff
Re:Editorial math? (Score:2, Insightful)
*"Moore's Interesting Trend" would be more technically correct.
Re:Troll math? (Score:2)
Moore's Law says performance doubles approximately every 18 months, and they are after a 1000-times increase, though I highly doubt they mean exactly 1000.
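For what it's worth, here's that math as a quick back-of-the-envelope sketch (assuming strict 18-month doubling and nothing else):

    #include <cmath>
    #include <cstdio>

    int main() {
        // Hypothetical back-of-the-envelope: how long would a 1000x
        // increase take on strict 18-month doubling alone?
        double doublings = std::log2(1000.0);    // ~9.97 doublings
        double years = doublings * 18.0 / 12.0;  // ~14.9 years
        std::printf("%.2f doublings -> %.1f years\n", doublings, years);
        return 0;
    }

So a literal 1000x out of Moore's Law alone is roughly a 15-year wait, which would explain why they're shopping for alternatives.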
How would this work? (Score:5, Interesting)
My other concern is how they would achieve the distributed network. The thing I like about my consoles is that I stick in the disc/cartridge and play, no pissing around. I hack on my PC, I play on my console, and that's the way I like it. If I have to start in on configuring and debugging (which, as it gets more complex, is bound to happen) then the whole reason for the console goes out the window.
Re:How would this work? (Score:2)
Yeah, some machine in Japan is going to render my video for me and get it back to my box quick enough so I have more than 5 FPS? I don't know what the hell this article is trying to achieve, but it isn't accuracy.
Distributed systems work on problems like SETI or cracking codes, not real-time game playing! 1000 times power? That would put the PS3 onto the supercomputer 500 list, wouldn't it? Or are they talking 1000 times graphics power? That would be enough for photorealistic VR, for God's sake. (Excuse me if I'm wrong, but my bullshit meter started hitting 100 around the 2nd paragraph and I couldn't make myself finish the article)
"We can't wait for Moore's Law"- then make it multiprocessor, jackass. If, for some stupid reason, they did make the thing distributed, you'll have millions of people/machines trying to do work on everyone else's machines, meaning it will be just as slow + all of the network latency. Either these people are dumb as shit (which I doubt) or they are just trying to bang the drum a bit for sales.
Re:How would this work? (Score:2)
What this power *could* be used for is shared world processing. In all current games, if something is offscreen it either doesn't exist, or it exists in a "frozen" state. If you have a thousand machines, each one can handle a little piece of the background processing and hand the information off to anyone who needs it. It can add quite a bit of realism when the rest of the world keeps working normally even when you aren't looking at it.
It can be a problem if anyone mods their game though. Not only could they cheat, but they could potentially wreak havoc for every connected player. Probably the best way to deal with this is to do all calculations on 2 or 3 different machines and make sure the results match. This also helps prevent issues when some machines disconnect without warning.
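A minimal sketch of that cross-checking idea, assuming hypothetical integer work-unit results (the function name is made up):

    #include <map>
    #include <optional>
    #include <vector>

    // Hypothetical majority vote over redundant results: the same work
    // unit runs on several consoles, and a value is only trusted when a
    // strict majority agree on it. No consensus (say, a console dropped
    // out mid-calculation) means the unit gets reissued.
    std::optional<long> majorityResult(const std::vector<long>& results) {
        std::map<long, int> votes;
        for (long r : results) ++votes[r];
        for (const auto& [value, count] : votes)
            if (2 * count > static_cast<int>(results.size()))
                return value;   // a strict majority agrees
        return std::nullopt;    // disagreement or too few results
    }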
-
it's not the net, dude. (Score:4, Interesting)
Think about it: the memory bandwidth on the PS2 is at least twice as fast as the FASTEST network connections available. And that's saying nothing of the pitiful 2-3Mb cable pipe that is available to most people.
The graphics are what the developers want to see the improvement in. I don't think you are going to see any improvement in performance in this area brought about by distributed computing. If it's possible, I'm really curious as to how.
Re:it's not the net, dude. (Score:2)
how are they going to sell these cycles to other companies? especially since these cycles would not be sony's to sell - but mine.
and - if the memory bandwidth is so great, yet the network bandwidth is so pitiful - how does that make it a suitable topology for distributed sellable bandwidth/cycles?
Re:it's not the net, dude. (Score:3, Interesting)
On the other hand, when someone is playing a multiplayer card game for example, 99.9% of the cycles and network connection are going to complete waste. It would make sense to use the excess to sell distributed processing. It could easily make the game network a free service rather than a pay-for-play service. Heck, they could potentially let you earn credit of some sort. Leave your PS3 hooked up to the network during your vacation and come home to a pre-release coupon. 10% off on a hot new game - and get to start playing it 2 weeks before it is even available in stores.
how are they going to sell these cycles to other companies? especially since these cycles would not be sony's to sell - but mine.
Whenever you play a game they have full control of your cycles. Whenever you connect to their network they have full control over all the data you upload and download. They could do anything they like without telling you, but it would probably be safer for them to include some wording in the licence about it - "by connecting to our game network you agree to receive, process, and transmit distributed network data".
if the memory bandwidth is so great, yet the network bandwidth is so pitiful - how does that make it a suitable topology for distributed sellable bandwidth/cycles?
For graphics you are moving huge amounts of data, and you have to update the screen many times a second. A distributed network is useless for this kind of data.
For some projects you only need to send/receive a few hundred or a few thousand bytes, but they can take an hour to process. SETI signal analysis or molecular protein folding problems, for example. Distributed networks are great for these kinds of problems.
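A sketch of the shape of such a work unit, with made-up sizes; the point is that only the tiny input and the even tinier result ever touch the network:

    #include <array>
    #include <cstdint>

    // A sketch of a SETI-style work unit with made-up sizes: roughly
    // 1 KB comes down the wire, an hour of local number-crunching
    // happens with zero network traffic, and a few bytes go back up.
    struct WorkUnit {
        std::uint64_t id;
        std::array<std::uint8_t, 1024> samples;  // ~1 KB downloaded once
    };

    struct WorkResult {
        std::uint64_t id;
        double score;  // a handful of bytes uploaded an hour later
    };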
-
Doesn't make sense... (Score:5, Insightful)
Unless I'm misunderstanding something about the article, this makes no sense at all. Rendering a video game isn't nearly the same kind of workload as rendering a movie. The former requires low latency, whereas the latter can be farmed out and done in batches.
There's no way you're going to get a 1000x performance boost by distributing a video game over the Internet.
I would bet that the real idea is to build in support for distributed multi-player games, and somewhere between the engineers and the marketroids things got horribly twisted.
Comes from current PS2 architecture (Score:5, Informative)
The reason is the Emotion Engine in the PS2: it is explicitly multithreaded, i.e. you have to make your program use all the threads yourself (unlike a PPro for example, where the CPU does it for you).
It's really a whole new way to program.
Now it seems that Sony convinced some developers to learn it; there's nothing stopping them from making more threads (there are 16 in the Emotion Engine if I am not mistaken).
Oh, and it has nothing to do with distributed computing over the Internet. The application architecture is similar, but that's it. And yeah, no batches here
As for IBM involvement, here is the article in Wired Magazine about their Cell computer [wired.com]
Oh, and another one about PS2 and PS3 [wired.com]; that one is quite old, but explains where Sony is going.
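For the curious, here's a minimal sketch of what "explicitly multithreaded" means in practice, using hypothetical updatePhysics/updateAI jobs in modern C++ rather than actual PS2 code:

    #include <functional>
    #include <thread>
    #include <vector>

    // A minimal sketch of the "explicit" part: nothing here is
    // scheduled for you. The program itself has to carve the frame
    // into units of work and hand each one to its own execution
    // context, which is the mindset shift being described above.
    void updatePhysics(std::vector<float>& bodies) { /* ... */ }
    void updateAI(std::vector<int>& agents) { /* ... */ }

    void runFrame(std::vector<float>& bodies, std::vector<int>& agents) {
        std::thread physics(updatePhysics, std::ref(bodies));
        std::thread ai(updateAI, std::ref(agents));
        physics.join();  // the frame isn't done until both finish
        ai.join();
    }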
Re:Comes from current PS2 architecture (Score:5, Informative)
1) Poor tools
2) Arcane DMA alignment issues
3) Misguided selection of VU integer instructions
(no imul, you have to jump through hoops to do xor, only 16bit, yet flags are in the upper 16bits)
4) Hard to singlestep the VUs
5) Very limited blending modes in the GS
6) Very limited iterator precision in the GS
7) No hardware clipping
8) Weird GS rules where rendering horizontal triangles is much slower than large vertical ones
9) Non-perspective-correct iterated vertex colour
10) Limited vram
11) 1.0 * 1.0 == 0.999999999 in the VUs
I.e. it's cheap and flawed (but hey that's a challenge and some people seem to like it)
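Re: point 11, here's the kind of defensive comparison you end up writing on hardware whose multiplier isn't quite IEEE (a sketch in plain C++, not actual VU code):

    #include <cmath>

    // Not actual VU code, just the shape of the workaround: on hardware
    // where 1.0 * 1.0 can come back as 0.999999999, you never compare
    // floats with == directly; you allow a small relative tolerance.
    bool nearlyEqual(float a, float b, float eps = 1e-5f) {
        float scale = std::fmax(1.0f, std::fmax(std::fabs(a), std::fabs(b)));
        return std::fabs(a - b) <= eps * scale;
    }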
Then you go
"It's really a whole new way to program."
You are not a programmer are you? Didn't think so.
"Now it seems that Sony convinced some developers to lean it there's nothing stopping them from making more threads (there are 16 in Emotion if I am not mistaken)."
What are you talking about? I can't possibly imagine why you would want 16 threads in PS2 game.
Generally CPU multithreading is quite costly. Most games are written to run at a solid 60fps*, so you can often get away without multithreading stuff like the AI, the renderer, or some trickle loader.
BTW, the only console that really had seriously multithreaded games was the N64.
* Due to field rendering you have 2x the fillrate (and more vram) at 60fps than at 30fps. Dropping to 30fps is bad!
Re:Fixed IBM article link (Score:2)
Re:Doesn't make sense... (Score:2, Insightful)
Don't just think distributed graphics processing, think distributed storage, and distributed AI.
This would allow for P2P massively multiplayer RPGs.
Worlds could span and grow endlessly, as you could download details of the landscape from the people who virtually hang out in said landscape.
This amounts to a nearly infinite amount of storage for creating huge, complex, and detailed worlds. Of course the problem would be syncing so everyone sees the same world, but some games might not require as much syncing as others.
These online worlds would be rapidly evolving.
AI NPCs could evolve by learning how other users play, and learning from other AIs they meet traveling from PlayStation 3 to PlayStation 3.
Basically we are talking about a gigantic computer on which to run genetic-type algorithms, allowing for worlds that might actually grow in depth and realism over time.
The possibilities are mind-boggling really.
Compared to today's MMORPGs I would say that such an advancement would open up the possibilities for video games at least 1000x.
I mostly agree with you... (Score:2)
Or of course they could also be talking of having more than one CPU like on an SMP box. A bit like the GPU is taking load off the CPU...
Who knows, it might even just be a marketing ploy.
PC based? (Score:2)
Remember, OpenGL only exists with the support that it has at this point because of a video game.
Hahahaha Latency! (Score:2)
Re:Hahahaha Latency! (Score:2)
Perhaps it'll make more bits travel across the line in a given time period, or allow longer distance runs, but it certainly won't reduce latency.
Re:Hahahaha Latency! (Score:2)
If somebody made a fully optical switch...
What about TAO? (Score:2)
Phillip.
Dumb (Score:5, Insightful)
I got as far as "maybe the Playstation 6 or 7 will be based on biotechnology", or some such garbage.
Please. This story is nothing more than a trumped-up press release targeted at the Xbox and GameCube in an attempt to either 1) slow their sales or 2) engender positive mindshare for the Playstation.
Distributed computing? In other words, "imagine a Beowulf cluster of these..."
Re:Dumb (Score:2)
Yeah, my reaction was "maybe the Playstation 8 or 9 will be based on warp field technology".
I'll be there in a minute mom! I'm remodulating my PS9!
-
don't laugh (Score:2)
Basically for Microsoft, they can pretty much put prices as far down as they like, and then it's a game of who can keep bleeding money longer.
Re:Dumb (Score:2)
What this reminds me of, more than anything else, is the early 80s, during the Great Video Game Console Crash.
I just don't know--Sony is big, MS is big, but MS has a habit of (and reputation for) stick-to-itiveness unmatched by any other company. Remember how god-awful Windows 1.0 was? Any other company would have given up immediately and filed Chapter 11--MS managed to trump and dominate a field that could have been Lotus's through a long campaign of shrewd marketing, crude marketing tricks, bull-headed stubbornness, and a little bit of real cleverness.
But then, what do I know... I have yet to see the game that makes me want either an XBox, PS2 or GameCube. (Well, the Star Wars game for Gamecube, maybe) All they have are poor ripoffs of 3D-FPS, sports games that I don't care about, driving games that aren't as much fun as Spy Hunter was back in the day, and franchise games (FF Eighteen Bajillion Million, Yet Another Goddamn Mario Game, etc)
I can see it now.. (Score:5, Funny)
Well, I guess if they're rack-mountable, I'm game. Bring it on.
Re:I can see it now.. (Score:2)
I never got an answer, but I never heard any solid argument against that idea. I was hoping GT3, with its link capabilities (6-player or 6-monitor solo games) would make this dream a reality, but alas!
GTRacer
- Now if I could just get my Japanese PS2 to link with the U.S. one...
But how? (Score:2, Interesting)
I mean, I can imagine (being NO expert) that distributing all the data, waiting for it to be processed, and getting it sent back takes more time than actually doing it yourself... in such a case.
or am I wrong...?
Distributed computing (Score:3, Funny)
That sounds like a practical solution. I'll just buy a beowulf cluster of PS3's and ...
(Do these guys think I can offload the processing for my games to someone else's PS3? Won't that PS3 be busy trying to run someone else's game?)
Actually, you're probably closer than you think. (Score:2)
With that thought in mind, maybe the idea is to have consumers buy more than one PS3 and install them in a rack. Or maybe have in-box rack space to add extra motherboards for multiple PS3s. With a custom bus/interconnect they could have fairly high bandwidth for distribution.
Then you have distributed right in your own home. Just add more PS3s until your performance reaches tolerable levels, different for each game. Sony sells many more PS3s, multiple to each customer. What a marketing plan!
Re:Actually, you're probably closer than you thin (Score:2)
Isn't it the case that Sony/MS/Nintendo sell the hardware at what amounts to a net loss? Don't they only begin to make money once the user buys 'X' number of games, revenue derived from licensing deals, etc?
If their plan is to have the user buy multiple platforms to be used in unison, then they had better figure out a way to manufacture these boxes at a dramatically lower cost per unit.
From the consumer perspective, I can justify spending $300 to get the latest and greatest console platform, but having to shell out an additional $300 to get decent performance for my $60 Game Title? Or having to buy 4 $300 units?
This is a business plan that is doomed to failure.
Sony and Nintendo would indeed be wise to borrow Microsoft's idea, and assemble platforms based on mostly commodity hardware. Farm out the R&D to people like NVidia or ATI for graphics accelerators and bus architectures.
This would allow the title developers to create games that could conceivably run on 3 platforms, PLUS PCs, with only minor differences.
Seems to me that would have a lot of appeal, for consumers and developers.
Increasing performance by a multiple of 1000 is ridiculous in the span of a single generation. Doubling, tripling, or quadrupling? Maybe.
NVidia now owns all of the 3dfx SLI patents... They could do some "distributed processing" inside a single chassis if they thought this was the way to go. Effectively doubling the number of graphics chips needed/used may allow them to gain some economies of scale to keep prices within reason, while boosting performance by factors of 2 or 4.
How about..... (Score:2, Interesting)
(And you can run GT3 when you're bored!)
Chip MultiProcessors? (Score:4, Insightful)
There are several problems with this. Memory bandwidth, power consumption, etc... but the main one is that most normal applications are written for a single thread.
Imagine how many MIPS 4K cores you can fit in 300mm^2 in 4-5 years. That's a lot of power. Sure, they might only run at 1-2GHz, but there will be 64 of them on a die. If you can harness that power, it might give your game developers much of that huge performance boost they want.
Think beowulf-cluster-on-a-chip. As with multiple-workstation distributed computing clusters, the trick is not in setting the thing up, but in figuring out how to distribute your work.
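As a toy example of "figuring out how to distribute your work", here's a sketch of the easy case: a data-parallel sum split across N workers (real game workloads have far messier data dependencies than this):

    #include <algorithm>
    #include <cstddef>
    #include <numeric>
    #include <thread>
    #include <vector>

    // The easy case of work distribution: each of `cores` workers sums
    // its own slice of the input, then the partial sums are combined.
    double parallelSum(const std::vector<double>& data, unsigned cores) {
        if (cores == 0) cores = 1;
        std::vector<double> partial(cores, 0.0);
        std::vector<std::thread> workers;
        const std::size_t chunk = (data.size() + cores - 1) / cores;
        for (unsigned c = 0; c < cores; ++c) {
            workers.emplace_back([&, c] {
                std::size_t lo = c * chunk;
                std::size_t hi = std::min(data.size(), lo + chunk);
                for (std::size_t i = lo; i < hi; ++i)
                    partial[c] += data[i];
            });
        }
        for (auto& w : workers) w.join();
        return std::accumulate(partial.begin(), partial.end(), 0.0);
    }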
Re:Chip MultiProcessors? (Score:3, Interesting)
Imagine how many MIPS 4K cores you can fit in 300mm^2 in 4-5 years.
AAAAAAAH! This is my livelihood. 300mm^2 makes me scream. If you think the average consumer will be able to afford a game console that has a CPU that's 17mm on a side, I want your credit rating. Are you related to the guy whose name appears to be "Object of Envy" [slashdot.org]?
Half that size isn't bad. If you had mentioned about cramming CPUs into 150mm^2 or even 100mm^2 (I think the Game Cube processor is below 50mm^2), that would have been more realistic.
Re:Chip MultiProcessors? (Score:2)
Anyway, regardless of actual chip area, the theory is the same... in general, several tiny processors can be better than a single big one.
Re:Chip MultiProcessors? (Score:2, Insightful)
Sorry, but current video games do not fit your definition of a 'normal application'. The PS2 is actually a highly parallel machine. It is also quite different from any platform that game developers had ever programmed before. In fact Sony's delays in getting out a good set of programming tools to developers so that the PS2 could be fully utilized is a large part of the reason why it took so long for games to start coming out for it. GT3 is a bit of an exception, but that one game had to carry the console for quite a while...
Perhaps a few years ago I would have accepted your argument, but not today...
Re:Chip MultiProcessors? (Score:2)
This parallelism is typically very different from thread-level parallelism, as it isn't as easy to communicate over a network to another processor as it is to just pass things from one instruction to another through a register.
However, there is interesting research into doing fun stuff with multiprocessors. So who knows what will happen...
Re:Chip MultiProcessors? (Score:2)
Two reasons why this doesn't work so well in practice:
Short version: 64 processors is never 64x faster than 1 processor. Things usually start to suck at 4 or 8.
Long version: As you try to parallelize to more processors, the time spent on serial (unparallelizable) parts of the task and on communication starts to limit your performance.
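That's Amdahl's Law: with serial fraction s, the best speedup on n processors is 1 / (s + (1 - s) / n). A quick sketch of how fast it bites:

    #include <cstdio>
    #include <initializer_list>

    // Amdahl's Law: with serial fraction s, the best possible speedup
    // on n processors is 1 / (s + (1 - s) / n).
    double amdahl(double s, int n) {
        return 1.0 / (s + (1.0 - s) / n);
    }

    int main() {
        // Even a 95%-parallel task scales badly: ~3.5x on 4 cores,
        // ~5.9x on 8, ~15.4x on 64, and never better than 20x total.
        for (int n : {1, 4, 8, 64})
            std::printf("n=%2d -> %4.1fx\n", n, amdahl(0.05, n));
        return 0;
    }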
Think about graphics cards for a minute. Remember how much arse two SLI-rigged Voodoo 2 cards kicked back in days of yore? Ever wonder why 3dfx didn't just put 16 of them on a chip a few years down the road?
Evolution of graphics cards' feature sets is only part of the answer. The other reason is memory bandwidth.
Most graphics cards are memory-bound under common conditions. Clocking the GPU twice as fast, or having two GPUs, would accomplish little if you still have to wait for texel fetches from card memory. This is especially bad if you're trying to build a 64-core rendering engine. To send all of that texture data and all of that triangle data you're going to need not just completely insane bandwidth, but 64 or *more* (if multitexturing) _independent_ memory channels, each with a silly amount of bandwidth.
Texel caches on-die don't help you. They'll be too small to do any good. What you need is 1000-way interleaved DRAM on your console's board, and a bus running fast enough to radiate leakage in far-infrared to transfer all of that data.
In summary, while multiple cores are a good idea, and help up to around 4 or 8 cores, a massively parallel on-chip solution won't help you for game rendering, because the working set is too large to be cached per-core.
The only approach I've heard of that even *might* reduce this problem is the PixelFusion approach, and I strongly suspect that that smacks into memory bandwidth and Amdahl's Law problems too (it renders many pixels in a tile in parallel, but you still need texel information for all pixels for blending; z filtering doesn't save you if you still need 4 texels per pixel and don't have very bad overdraw).
Re:Chip MultiProcessors? (Score:2)
Re:Chip MultiProcessors? (Score:2)
The idea behind chip multiprocessors is that instead of trying to execute a single execution thread really fast (which is what current chips do) you should try to execute many threads slower.
This has lots of advantages, but the big disadvantage is that it's very hard to do for most applications that demand lots of performance. And you need lots of memory bandwidth.
Think about 2- and 4-processor PCs. They don't help you very much for your video game, because it's typically one thread of execution. Where do 2-way and 4-way (and 64-way) systems really help out? When you are running many separate processes. If you can divide things up into many tasks you can get great benefit over a single core, but if you have a single thread that needs to go fast it (usually) doesn't go fast.
To a certain extent, this is done in game systems... there are different parts to do audio, video, and "general". They use "distributed processing" in that they distribute the tasks to different parts. My guess is that they are investigating how to do this type of thing on a much larger scale.
Where to donate your spare CPU cycles? (Score:2, Insightful)
(a) A search for extraterrestrial intelligence.
(b) A search for Mersenne Primes.
(c) A rendering engine allowing the geek next door to play Tekken with really, really good graphics.
Take your time.
I could live with it (Score:2)
Power & Simplicity ! (Score:2)
Ignoring power for the moment, less complexity would be even better. It's a real b!tch having to manage ** 6 ** CPUs in parallel !! (EE, VU0, VU1, GS, IOP, and SPU) Throw in DMA transfers on top of that and it's enough to make a person pull their hair out.
Re:Power & Simplicity ! (Score:2)
Re:Power & Simplicity ! (Score:2)
I don't know - I have very little problem with it... I put in the disc, and press the power button. Then I follow the instructions on the screen. Maybe you need to make sure your controller is plugged in.
(For the sarcasm impaired, my point is that internal complexity does not affect consumer use. Most game companies now say that, once you get the hang of it (and now that there are code libraries available), the PS2 isn't that hard to program for, and offers nice flexibility).
--
Evan
PS3? Wow! (Score:2)
For those of you who don't wanna click the link, the relevant quote is:
More intriguing however, when asked about the status of development on PlayStation 3, Kutaragi-san responded, "Nothing has been started yet."
Yet another wonderful CNet SNAFU.
distributed vapour (Score:2)
Let's just say, for the sake of argument, that the hardware they manage to scrounge up (allowing Sony to keep the PS3 price inline with the initial retail price of the PS2) is 4x as powerful. So they still need to get 250x more power to satisfy the game developers. OK, fine. But distributed computing??
The way I see it, some Sony brainchild figures that most PlayStations are left "off" most of the time. So why not use that time to let other people around the world use *their* PS at a higher level? While Japan is sleeping, US PS3s could be using the spare Japanese processing power to improve the gaming experience.
Two problems:
This article is just ridiculous (Score:2)
Looking further ahead, Okamoto saw even bigger changes for Sony's game business. "Maybe the PlayStation 6 or 7 will be based on biotechnology," he said.
Yeah, and maybe it will still suck compared to Nintendo. Listen, by the time they're at PS6 or 7, they better be marketing a damn Holodeck.
Yeah, right....this can't be accurate (Score:2, Funny)
"Sir, looking at our usage staticstics it seems to be that your Sony(tm) Playstation(tm) 3(tm) has not been doing its fair share of our distributed computing"
Nintendo to the rescue... (Score:2)
"It's not about the hardware you stupids, it's about the game."
BTW, as the end of the article reveals, M$ did an excellent job of localizing their HW and SW. Omitting to do international research on things like controller sizes and text size in dialogs: wow, no need to have expensive offices in Tokyo and Europe to end up making such stupid mistakes. Not to mention that they could have simply copied the way Sony did it, as they had done for the rest.
PPA. the girl next door.
Umm (Score:2)
What in the world could they do with that much computing power? Holodeck?
I think a 4x increase would be mind-blowing. Although, Bill Gates was once quoted as saying that 640k should be more than anybody will ever need.
PS2 only needs one improvement. (Score:2, Insightful)
Shorter load times couldn't hurt either...
Seriously, I wonder what the heck they would do with distributed computing. Obviously, it's not going to give you any better graphics at all. Maybe in multiplayer games you could split up collision detection/physics work. Maybe this means they want to make p2p massively multiplayer games. Maybe they want to make insanely cool new AI systems.
This could really kick ass...but it's probably just hype.
BUT FIX THE DAMN JAGGIES FIRST!!!! ; )
Playstation 8 Announced!! (Score:2, Funny)
Okamoto said the method also appears to hold the most promise for dramatically boosting the performance of the PlayStation. Instead of being reliant on a hardware "processor", all game computations would be performed in the user's own cerebrum. Unfortunately, this means that game developers can not work on a strictly "fixed platform" basis any more, considering performance will greatly improve with intelligence.
"I think we can easily overcome this barrier," Okamoto said, "Instead of hardware requirements as we see them now, we could instead have IQ requirements. Like, we would say that the minimum requirement for Gran Turismo XII is a high-school diploma, but we would recommend at least a college-level education to get any decent performance. But then for games like Resident Evil, well, any idiot could play that."
The gaming industry was reeling with excitement by this announcement, and Okamoto was further pressed for details on how this technology would actually be implemented. After a few minutes of uncomfortable shuffling and avoiding eye contact, he eventually admitted that he was merely "making shit up".
1000x performance? Yeah, sure. (Score:2)
Let's keep in mind that the PS3 is probably still some time away; 1000 times the performance is not as stupid as it sounds, only almost. There's also the question of what Mr. Okamoto (Sony's CTO in case you didn't read the article) really means by "performance" - CPU speed? HDD capacity? Screen resolution? Frames-per-sec? Or some mysterious combination of them all? Most likely, he was just trying to build up some hype - same as the fantasy that PS6/7 will be based on biotechnology. Yeah, right.
Also, I don't see distributed computing as something which will be very useful for playing games; sure, with a high-speed link between several PS3's you might be able to fake SMP, but the games would have to be optimized for it, or the performance increase would be abysmal compared to the extra cost of having to buy two PS3's. You might as well just get yourself a PC and have a gaming rig that's easier to upgrade, runs a wider variety of apps, has a decent-resolution monitor and gives you a choice of what OS you want to run. Of course, the PS3 might have all this, but don't bet on it.
Btw, I wonder what Pete Isensee (the Xbox developer guy) means by saying that Microsoft can't get stuff right until version three. Windows is WAY beyond 3.0, and there's still plenty of room for improvement (note the careful wording there).
PS9 (Score:3, Funny)
Don't any of you watch T.V. ????
VideoDrome (Score:2)
When they come to install Crash Bandicoot in my sternum, I am running the other way.
Iraq (Score:2)
Uh huh. (Score:2, Insightful)
Here are some problems with a distributed gaming console that I can think of off the top of my head:
- Latency: The main reason you'd want a lot of processor power in gaming is to calculate physics and graphics. This needs to be done on a damn-near-real time basis. No distributed computing network can provide this. High end clustering, maybe, but nobody is going to pay for multiple PlayStatia to play one game.
- Availability: Sony KNOWS that they are making a device akin to a toaster. When you turn on the console you should be able to play your game. Without worrying about your network connection, whether your neighbor's microwave is disrupting the Super National Ultra Wireless Grid, etc.
- Infrastructure: Don't even get me started. Sony would have to build millions of wireless POPs in a grid across the entire country. Or wire everyone's house when they buy a PlayStation.
- System Load: Say the PS3 is 10x more powerful than the PS2 is now. That means you still need 100 of them to reach the "1000x" figure they are blathering about. This means that if America has a million networked, always-on PS3s, only 1% of them can be in use at any given time. During peak hours this is probably not possible.
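A quick sanity check on that load math, using the parent's made-up 10x and 100-console numbers:

    #include <cstdio>

    int main() {
        // Using the parent's made-up numbers: one player's "1000x"
        // session borrows the idle cycles of 100 other consoles, so at
        // most 1 in 101 machines can be actively playing at any moment.
        const int helpers = 100;
        double maxActivePct = 100.0 / (helpers + 1);
        std::printf("at most %.2f%% of consoles playing at once\n",
                    maxActivePct);
        return 0;
    }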
In other words, this is dumb. Tell me if I'm wrong.
Justin
Just to clarify.... (Score:2, Interesting)
... As far as I can tell from the article, they're talking about *internally* making the PS3 a multi-processor system.
They are looking into basing the architecture on some of IBM's research into distributed computing (specifically, something called grid computing [ibm.com]).
They are *not* talking about *actual* distributed computing using the PS3 - this is purely about the internal design being based on a distributed model to get more performance.
Re:Just to clarify.... (Score:2, Interesting)
Grid computing, as defined in that IBM article, implies geographic separation. Getting 1000X or even 100X of the PS2's processing power into the PS3 within 1 or 2 years is unrealistic. The price of the system simply does not allow it.
Even with internal multiprocessing, you'd still need a huge number of processors.
Justin
Sony has some problems. (Score:3, Interesting)
Speaking as a game developer (Score:2)
Maybe what they really meant to say was they're investigating parallel processing, not distributed computing. If they wait long enough that they can get a 10 times increase in graphics processing power and design the system such that it can run 100 of those processors in parallel, well then there's a 1000 times increase (of sorts, it's not really that easy, nor would that likely turn out to be a reasonable proposition for consoles that are meant to cost <$300!). But otherwise I think this is just marketing being out of touch with reality.
Crazy XBox fans... (Score:2)
Daydreaming at the podium (Score:2, Interesting)
As a game developer... (Score:2)
The interesting thing to note is that artists have neither the tools capable of managing, nor the time in the schedule to spend making, billions of polygons for each model. Increased content means longer development time. As it is, the graphics chips on the current consoles take away so much of the real work that games have little to worry about when fitting in gameplay CPU requirements. Even memory constraints are fairly relaxed these days. I mean, I never thought I'd see the day when the STL is used for console games.
Power? Bah. Improve the libraries. (Score:2, Insightful)
Very interesting Microsoft quote (Score:5, Insightful)
Re:Very interesting Microsoft quote (Score:2)
Hmmmm... (Score:2)
Hey, Sony! How about getting your head out of the tech closet and thinking about making games today that don't play like ass?
How much do you want to bet that even with a playstation three hojillion the Resident Evil series still won't have "custom features" like the friggin' ability to sidestep or save your games anywhere but the God forsaken typewriter?
Honestly, can we get some late 80's gameplay dynamics up in this thing?
With that kind of power, I can only imagine that it is just that much more easy to make a game series like Resident Evil or Syphon Filter look and play like total doo-doo.
region differences (Score:2)
This brings up two points.
Point one is that this guy is wrong; the Japanese xbox disk drive scratched up disks. While the US release went great, the same cannot be said for overseas.
The other point is that the software itself had to be changed for different regions in unpredicted ways. Not only were languages different, but the UI was affected too: the Xbox start-up screen had to be redesigned for the Xbox's European launch because nobody realized that the German "Einstellungen" wouldn't fit in the same text space as "settings."
So with all of these differences (using the Xbox as an example), how is Sony going to make a distributed world-wide game? Everyone would have to be using basically the same software, right? Unforeseen changes needed for different regions could cause problems.
They need to build a more reliable box first! (Score:2)
The PS2 is already notorious for having problems when the cooling fan gets clogged up and fails, and that's often with use by people who turn it off when they're done playing.
Ideally, you want a low power consumption unit that doesn't really ever power off completely. It should be designed to stay on all the time, so it can share CPU resources with other gamers whenever you're not actively playing on it.
Of course, this won't really go over so well unless/until broadband prices drop and it becomes more commonplace. Right now, I think even a lot of DSL customers would unplug a box designed this way because they only have 14K per second or so of upload bandwidth, and they might want to use it for other things besides an idle PS3.
Let's examine the premise... (Score:2, Insightful)
Why 1000x? Is this anything other than a number they just pulled out of their ass?
Q
Dreams or Reality? (Score:2, Interesting)
Is there any harm in aspiring for these things to come true? What if Sony pulls off distributed computing for the PS3? Will the people here still be saying "that's stupid"? What if Sony has biotech running on the PS6 or PS7? If it wasn't for people coming up with crazy ideas, would anything get invented? Innovation is an important part of pushing things forward. If nobody tried to do the crazy things, then how would we know if they were possible?
When Kennedy said let's go to the moon, what if everybody had listened to the people shouting "It can't be done, it's stupid, it's a dumb idea"? There are people out there working on fusion, anti-matter, FTL travel, a grand unified theory, cures for cancer, etc... Are these people stupid and dumb? Hell, all Sony wants is a 1000-fold increase over the PS2. If they want to put biotechnology in their PS6, fine, what's the problem with that?
You want to hear about something stupid and dumb? What about a "next-day delivery service"? Or "being able to hear actors talk in movies"? Or "going to the moon"? Frick! Now those are stupid and dumb ideas.
Pre-conceptions (Score:2)
As for the real-time arguments, a lot of pre-rendering can be done before it gets to the point of being displayed. The renderer could even learn some lessons from the micro-processor world with super-scalar architectures and branch prediction.
Finally, the old "how much power do you really need" and "what's the point if I just have a standard tv/monitor" arguments: imagine how much power rendering an interactive movie with life-like characters real-time would take. It's WAY beyond anything we can do in the home today.
Phillip.
Game requirements.... (Score:2)
Although, I figure that they're just planning on making one box with 16 processors on a single die.
Cryptnotic
Re:distributed? (Score:3, Interesting)
Also, even with these kinds of speeds, how would you keep a game in sync? What about errors? How would you save? Distributed too? And what would keep me from cheating, or worse, using the PS3 as a computer to bypass the SSSCA and share my mp3s?
Re:distributed? (Score:2)
Re:Hohumm (Score:2, Informative)
Use pictograms. They are much more intuitive.
No, they're not. Icons (with VERY few exceptions) are only obvious after they've been explained. If the icons are good, and there aren't too many of them, you need to explain them only once. Try to replace everything with icons (which you're going to have to do if you want no localization problems), and you end up with way too many icons for anyone to remember. Do you know (without looking at the tooltip) what every single icon in, say, Word is for? Didn't think so.
Also, icons don't completely solve the localization problem. Images (especially the abstract drawings used in icons) can have different meanings in different cultures.
Re:Distributed? (Score:2, Insightful)
On the downside, the EULA for the PS3 now requires you to keep the machine on 24/7, and requires you to change disks occasionally so that it can crunch numbers for other games. If you do not have the game requested, you're required to go buy it.
Sorry, but this sounds like either a truly horrible idea, an attempt at cashing in on a hot buzzword, or (most likely) both.
Re:what to do with the power. (Score:2)