AMD Plans 1,000-GPU Supercomputer For Games, Cloud
arcticstoat writes "AMD is planning to use over 1,000 Radeon HD 4870 GPUs to create a supercomputer capable of processing one petaflop, which the company says will make 'cloud' computing a reality. When it's built later this year, the Fusion Render Cloud will be available as an online powerhorse for a variety of people, from gamers to 3D animators. The company claims that it could 'deliver video games, PC applications and other graphically-intensive applications through the Internet "cloud" to virtually any type of mobile device with a web browser.' The idea is that the Fusion Render Cloud will do all the hard work, so all you need is a machine capable of playing back the results, saving battery life and the need for ever greater processing power. AMD also says that the supercomputer will 'enable remote real-time rendering of film and visual effects graphics on an unprecedented scale.' Meanwhile, game developers would be able to use the supercomputer to quickly develop games, and also 'serve up virtual world games with unlimited photo-realistic detail.' The supercomputer will be powered by OTOY software, which allows you to render 3D visuals in your browser via streaming, compressed online data."
Oh Yeah? Well..... (Score:5, Funny)
Re: (Score:3, Funny)
2000 Intel GPUs?? Well, that's like a Radeon 3650, right?
Re: (Score:1)
Uhm, bandwidth? (Score:5, Insightful)
Even if the "work" is offloaded to the cloud, won't you still need an assload of bandwidth on said devices for it to actually amount to anything? It's not like you're going to get PCI Express bandwidth over a DSL or cable internet connection.
Re:Uhm, bandwidth? (Score:5, Insightful)
We can already stream DVD-quality movies encoded at 1 Mbps or so, well within current consumer "broadband" offerings. I'd assume that would be in the target range.
But even if you wanted, for some reason, to go uncompressed, 8-bit 800x600 at 25 fps would still be less than 100 Mbps, which is not totally unreasonable.
I would imagine the latency would be a much bigger problem than bandwidth. If you ever used VNC you probably know what I mean.
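The uncompressed figure in the comment above is easy to sanity-check. A minimal sketch, using only the resolution, bit depth, and frame rate quoted there (nothing here reflects what OTOY actually streams):

```python
# Back-of-envelope check of the uncompressed-stream claim:
# 8-bit color, 800x600 pixels, 25 frames per second.
width, height = 800, 600
bits_per_pixel = 8
fps = 25

bps = width * height * bits_per_pixel * fps
print(f"{bps / 1e6:.0f} Mbps")  # 96 Mbps, just under the 100 Mbps figure
```

So the "less than 100 mbps" claim holds for those (very modest) parameters; real deployments would compress heavily.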
Re:Uhm, bandwidth? (Score:5, Informative)
We can already stream DVD-quality movies encoded at 1 Mbps or so, well within current consumer "broadband" offerings.
No, we can't. Of course, if you've been fooled into thinking that scene crap is "DVD quality", then perhaps this holds true. Otherwise, you would realize that not even H.264 can deliver DVD-quality video (720x480, no artifacts) at less than 1 Mbps.
Videophile-quality cables. (Score:5, Funny)
Yes, you can. You need to use the correct ethernet cables with high-level tin alloy shielding and vibration elimination: http://www.usa.denon.com/ProductDetails/3429.asp [denon.com]
Re: (Score:2, Funny)
You're sure that's not a Monster Cable rebrand?
Re: (Score:2)
Holy fsck. $500 for a 5 foot long ETHERNET CABLE!?!!? For the "serious audiophile"?!?!?
(Um, hello? It's DIGITAL?!?!)
Goes to show, there really IS a sucker born every minute, but at these prices, they'd make out like bandits if they only made 1 sale/week...
Re: (Score:2)
To be fair... (Score:1)
The cable does include directional markings.
Additionally, signal directional markings are provided for optimum signal transfer.
So, you know, the bits don't get confused and take a wrong turn. I hate it when that happens.
Re: (Score:2)
The cable does include directional markings.
Additionally, signal directional markings are provided for optimum signal transfer.
So, you know, the bits don't get confused and take a wrong turn. I hate it when that happens.
Look at the picture. The arrow points in both directions for the "directional markings" which is completely useless even if such a thing did matter...
Re: (Score:2, Funny)
Yeah. But the arrows don't point crosswise.
Re: (Score:2)
You'd think for 500 bucks, they'd at least use gold, platinum, or uranium for the wires. But it's still just the same old copper ethernet cables I can buy from Home Depot for 1/100th the price.
Re: (Score:2)
Bandwidth. That precious commodity.
Obviously, they're going off one or more of these assumptions/instances :
1) They have designed one hell of a compression algorithm. The OTOY site has between fuck-all and nothing on it, and the domain is relatively new (which doesn't say much - if some bright spark at AMD developed a mean compression algorithm that isn't overwhelmingly intensive, and s/he split off, then it would be new).
2) Mobile bandwidth will be making a fantastic leap at rough
Re: (Score:1)
Compression has advanced quite a bit. My Netscape ISP squeezes text websites to just 5% their original size, thereby increasing effective bandwidth by 20 times. If the "Cloud" implements a similar algorithm to handle the data, it could operate quite fast.
Re: (Score:2)
The kinds of data that they're implying that this "Cloud" (not an original name - the Supercomputing Conference has had "Cloud Computing" for several years, basically a pool/grid of various institutions and/or sponsors who contribute compute, storage, etc. to the conference participants) will handle does not lend itself
Re: (Score:2)
The fact that this one is specifically targeted towards rendering graphics doesn't make it any conceptually unique. And just like everyone has been rabbiting along about the wonders of grid computing (and yet there are so many varying alliances right now that I don't think we'll see much progress for a while; companies d
Re: (Score:3, Insightful)
(720x480, no artifacts)
Are you blind? DVD is full of artefacts; it's MPEG, it's ass.
Re: (Score:1)
We can already stream DVD-quality movies encoded at 1 mbps or so, well within the current consumer "broadband" offerings. I'd assume that would be in the target range.
I'm not exactly sure how this will work, but they said you have to offload the data to their servers. So if you are playing a game, wouldn't you have to upload all the data to their servers so they can process it? Consumer internet connections are fairly quick at downloading, but the upload speed seems like it's going to be the problem. My internet connection is 10 Mbps down but only about 700 kbps up.
Re: (Score:2, Insightful)
One thing that is really cool about this technology is that it has the potential to eliminate cheating in games such as first person shooters. A lot of the cheating in the past is because the game client running on a user's
Re: (Score:1)
I think the whole idea sounds stupid. I sit down at my keyboard, I log in, and now I have access to a central supercomputer that does all the processing while my PC acts as a dumb terminal.
This AMD idea sounds like something from the 1970s. A step backwards.
Re: (Score:1)
I hope to god that games don't start using this just to thwart cheaters. The one thing I do like about this, though, is that a user would use the same network bandwidth whether they use 2xAA or 16xAA, assuming that the server computer renders them in about the same time.
Re: (Score:1)
No, latency (Score:4, Insightful)
The bandwidth is only a problem until we build bigger tubes. As much as we all like to bitch about internet here in the US, we're at least capable of increasing the bandwidth quite well. The real problem is dealing with the latency. With enough time and money we could easily push as much data as we could possibly want, but we can only push it so fast.
For some games it probably won't matter, but who'd want to use it for an FPS where regardless of how detailed your graphics are, even a tenth of a second lag is the difference between who lives and who dies? Until we can get around those limitations, I don't foresee the traditional setup changing much.
Re: (Score:1)
Sure, games that rely on dead reckoning (e.g. FPS) aren't the first candidate for this, but it's perfect for deterministic peer-to-peer simulations (e.g. RTS)!
A typical RTS will only simulate at 8-12 Hz. Yes, expect 126-249 ms of lag! But you won't even notice.
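The lag range quoted above follows roughly from the tick rate. A quick illustrative sketch (real lockstep engines add network and input-queue delay on top of this):

```python
# Rough tick-rate-to-latency figures for a lockstep RTS simulation.
# A command issued just after a tick can wait up to ~2 ticks before
# its result is visible on screen.
for hz in (8, 12):
    tick_ms = 1000 / hz
    print(f"{hz} Hz: one tick = {tick_ms:.0f} ms, worst case ~{2 * tick_ms:.0f} ms")
```

At 8 Hz a tick is 125 ms and the worst case is about 250 ms, which brackets the "126-249 ms" claim.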
Re: (Score:1)
You WILL notice the input lag though.
Re:No, latency (Score:4, Interesting)
Re: (Score:2)
MMOs could benefit too.
EVE has a ONE SECOND tick rate. Admittedly the client interpolates so that you see smooth movement, but there is always at least a 1s gap between your weapons firing, or between clicking the button and having a module activate. No-one complains.
It goes a way towards explaining how they can support so many players in one universe at once.
Re: (Score:2)
For some games it probably won't matter, but who'd want to use it for an FPS where regardless of how detailed your graphics are, even a tenth of a second lag is the difference between who lives and who dies?
I might just be talking out of my ass here, but... If latency is your only bottleneck, and you have plenty of bandwidth and CPU on the server, wouldn't it be possible to deliver as many renderings as there are possible inputs, and only use whichever one corresponds to what the player actually does?
A simple example would be a game where, at any moment, the player could be moving up, down, left, or right. The server could generate four different views, one for each possible input. All four are delivered to th
Re: (Score:1)
A problem would be that the number of frames increases exponentially with the time you render ahead. 100ms lag on 60fps would mean something like (number of input options)^6 frames to render. With your four options that would be 4^6=4096 frames. You'd need a system that's more than 4096 times as powerful as the average user's computer times the number of users you have. At this point it's easier to just tell the user to buy his own damn hardware.
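The frame-explosion argument above is just exponentiation: rendering every possible input sequence N frames ahead costs options**N frames. A minimal sketch of the arithmetic, with the comment's numbers:

```python
# Predictively rendering all input branches N frames ahead costs
# options ** N frames per user (illustrative only).
def frames_to_prerender(options: int, lookahead_frames: int) -> int:
    return options ** lookahead_frames

# 100 ms of lookahead at 60 fps is 6 frames; with 4 possible inputs:
print(frames_to_prerender(4, 6))  # 4096
```

With anything close to a realistic input space (continuous mouse look, combined actions), the exponent base blows up far past what any render cloud could absorb.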
Re: (Score:1)
But in an FPS game, camera movement alone is almost continuous. There are millions, probably billions of "possible" actions, from moving the camera to walking, shooting, and all the combinations between them, for every player in the game! Even if the "cloud server" could manage it and the bandwidth was enough, you would probably need a "super" CPU just to process that gigantic stream of data.
Re: (Score:2)
Re: (Score:1)
I'm not sure the rendering hardware is the bottleneck for amateur movie CGI. It's more likely that they simply don't have the necessary artists to create the scenes in the first place. You need a large staff to do the things modern movies do in a reasonable time, and hiring 20-30 professionals (probably more, even) for the task tends to be a bit too expensive for amateur movie budgets.
Re: (Score:2)
While I agree that this is odd with games, I definitely see the potential for 3d animators. It takes my home computers (note the plural) hours/days to render complex scenes, depending on the length of the scene. The advantage in computing power would greatly outweigh the bandwidth cost here, especially if you could just upload the job and wait for the result (instead of sending each frame to be rendered).
But I would imagine it would not take many people to bog this down.
Re: (Score:2)
Even if the "work" is offloaded to the cloud, won't you still need an assload of bandwidth on said devices for it to actually amount to anything? It's not like you're going to get PCI Express bandwidth over a DSL or cable internet connection.
There are services that have low demands on the client and high demands on the server. For example, a game with a huge player population (like several hundred thousand). I think Second Life or Eve Online would be examples of such games. The graphics aren't that demanding on older PCs, and they have a huge player population. So no, you wouldn't need a huge amount of bandwidth, but it's not going to be state-of-the-art graphics.
Re: (Score:2)
You don't need PCI-e bandwidth. All you are doing is transporting 2-dimensional video. We are already very good at doing that over moderate bandwidth connections.
1-2 Mbps will do standard definition video comfortably well.
Re: (Score:2)
You think this is so you don't have to buy a new graphics card? The only reason companies would go for this is because it changes their games from a product to a service, so piracy goes away. Next-gen DRM, if you will (next-gen doesn't have to be worse, however; I avoid buying games due to how invasive the DRM is and know plenty of people who do the same, so the next generation of the stuff damn well better address that).
If it's implemented correctly it would still offer us advantages - play from any comp
Re: (Score:1)
Good luck (Score:5, Insightful)
"VNCing" games through the Internet and possibly a wireless network, and getting decent enough latency and enough throughput for good image quality/FPS? Good luck with that. I'm not saying it won't work, but if it does work satisfyingly and reliably, it'll be an impressive feat.
Well I know StreamMyGame [streammygame.com] does it, but it's meant to be used locally, not over the internet + WiFi, right?
Re: (Score:2)
WiFi itself is enough to completely kill a gaming session. When I'm at home on my laptop, I like to remotely log in to my desktop. It allows me the horsepower of my desktop, along with access to all the files (read: pr0n).
Works flawlessly really, but the difference between ethernet and wifi is perceptible. And as soon as you try gaming over it, it becomes unusable (for any action game at least). Even simple games like kasteroids or kbounce are not worth playing (I get routine 1 second freezes). On the other hand, d
Re: (Score:2)
Replying to myself. The end of my post got truncated by the html parser thingy. Slashdot has to be the bulletin board where you need to write &lt; to get a < sign..
The end of the post should have:
< 1ms). I assume it must be from packet loss, but it very well might be a bandwidth issue too.
Re: (Score:1, Funny)
Slashdot has to be the bulletin board
I'm assuming you used angle brackets to emphasize <only> as well :P
Re: (Score:1)
Re: (Score:2)
Sure...it would need the bandwidth of receiving fullscreen video/audio.
Latency-wise, some games may be more suitable than others. For example, an RTS game like Warcraft III, where actions are carried out ONLY after the command is synced to all other players, and are lagged anyway.
Though I think it has much more value in doing pre-rendering, animation rendering, etc. AMD just rents out CPU hours for you to get your job done.
Only 1,000? (Score:4, Interesting)
Folding@home is at 1.007 PFLOPS from ATI GPUs alone :)
(which is an entirely different sort of "computer", but still)
Re: (Score:2)
Cloud?! TWO IN ONE DAY? (Score:3, Funny)
Attention, AMD Marketroids: Please kill yourselves. Now. Do it now.
*blink*
Yes. All of you.
Re: (Score:1)
It seems the idea has... /sunglasses
clouded their mind.
(YEEEAAAH!)
What about latency in gaming? (Score:4, Interesting)
Re:What about latency in gaming? (Score:4, Insightful)
Having a cloud in your own house would be nice, so everyone could share computing power across multiple computers.
I, for one, do not want my computing power on lease.
Re: (Score:1)
Having a cloud in your own house would be nice, so everyone could share computing power across multiple computers.
Yeah. That would be nice. If you have the hardware all you need is the right [sourceforge.net] software [lcic.org].
I don't know why more people don't do it -- not just homes, either. All sorts of orgs could use their desktops as a grid for on-demand supercomputing if only they would configure it to do that.
Re: (Score:2)
Can I do a joke about how it's dangerous mixing rain and electricity?
Re: (Score:1)
Latency is and would be a huge factor. I truly don't think this would be meant for FPS. I could see it being used for local gaming on a phone, but not a multi player game at all. This is not a technology that will let you play Crysis on your crappy PC.
The whole idea, if I am not mistaken, is for 'mainframe', err 'cloud' GPU to render content that is beyond the capability of the device accessing it. So now, instead of just game data being transmitted over the network, now we are going to render graphics to
Re: (Score:2)
Maybe it could just bake occlusion maps and such and stream those out to you. That stuff can tolerate a little lag once in a while (things will just look weird, or fall back on something less realistic), requires a whole lot of processing per scene, and in, say, an MMORPG-type environment, it only has to be done once for everybody.
online powerhorse? (Score:5, Funny)
the Fusion Render Cloud will be available as an online powerhorse
AMD also described NVIDIA's Quadroplex as more of an online My Little Pony.
How will this save money? (Score:3, Insightful)
Instead of buying a $400 video card, now you're paying AMD to buy that video card for you, paying them to manage that card, and paying your ISP for the bandwidth. The only possible way this works out is if you'd only use your own card 10% of the time; then AMD can utilize theirs at 100% and sell you just the one-tenth you actually use.
Of course, that's great for gamers, who will sporadically play throughout the day, but awful for movie studios who could probably keep a render farm at 100% anyway.
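The utilization argument above reduces to a one-line calculation. A toy sketch with made-up numbers (the $400 price and 10% duty cycle come from the comment, not from any AMD pricing):

```python
# Renting a shared card only beats buying one if your duty cycle is low.
card_cost = 400.0        # hypothetical GPU price, dollars
your_utilization = 0.10  # you game ~10% of the day

# If the provider keeps the card ~100% busy across many users,
# your fair share of the hardware cost is roughly:
your_share = card_cost * your_utilization
print(your_share)  # 40.0
```

A studio render farm already running near 100% utilization gets no such discount, which is the point of the second paragraph.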
Re: (Score:2)
Movie-grade CG tends to be rendered via raytracing, which, AFAIK, is an algorithm more suited to running on a general-purpose CPU than on a GPU.
I'm sure part of the reason that nVIDIA and ATI have been working to develop alternative applications of their GPU technology is that their GPUs could potentially become unnecessary to gamers, should CPUs ever reach the speed where real-time raytracing is practical.
One Problem (Score:5, Insightful)
for reals? (Score:1, Redundant)
Re: (Score:1)
Don't you understand!? With this, my god, we could build an entire virtual world, out of fully interactive, fully physic'd, fully exploding *barrels*!
Re: (Score:2)
I know kung-fu. (Score:2, Funny)
AMD also says that the supercomputer will 'enable remote real-time rendering of film and visual effects graphics on an unprecedented scale.' Meanwhile, game developers would be able to use the supercomputer to quickly develop games, and also 'serve up virtual world games with unlimited photo-realistic detail.'
they have this in the future. don't they call it the matrix?
I look forward to (Score:5, Insightful)
Re: (Score:1, Redundant)
Re: (Score:1)
All that from your flying car, I assume?
Nah, I say from his grave, from the look of things.
Re: (Score:1)
But The Plants! All the plants would die. In fact, the ridiculously negative carbon equilibrium so established would SUCK THE CARBON FROM OUR VERY BONES! (Carbon is a relatively major component of bone tissue, the calcium phosphate component aside.)
Contest, the rematch... (Score:3, Funny)
A comment from the story earlier today about nVidia's new 2-teraflop multicore card:
Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.
Hah! HAH! While nVidia dicks around with expansion cards measured in mere teraflops, AMD is building a SUPERCOMPUTER. That's a /peta/flop, nVidia! If you don't know what that is, here's a hint: take your teraflop. Then add three zeros to the end. BAM!
AMD's penis is now 500 times larger than nVidia's. It's math.
Re:Contest, the rematch... (Score:5, Interesting)
Nvidia's GTX 295 was around 1.7 teraflops I believe, while the (similarly priced) 4870X2 is 2.4. The 'mere' 295 supposedly beats the 4870X2 by 15% average.
The difference is? Nvidia always has pretty good drivers. ATI struggles to allow games to take >50% advantage of even the lowly 3870 (as measured by the card's own performance counters)...let alone a 2.4 tflop card...let alone a massive array of 4870s.
Plus, wouldn't a 1000 GPU 4870 cloud...only allow some 1000 users some fractional percentage of one 4870 capped by latency and other overhead?
Or...are we talking about providing a larger number of mobile devices the equivalent capabilities and speed of 1999's Geforce 256?
Either way...I don't think it'll catch on, and will be a huge money sink for AMD when it needs to be fixing its processor and video card issues for the average, real consumers who are losing faith in AMD's ability to provide reasonable and usefully competitive products.
Revival of the video-arcade (Score:2)
Plus, wouldn't a 1000 GPU 4870 cloud...only allow some 1000 users some fractional percentage of one 4870 capped by latency and other overhead?
Earlier in this thread, people were talking about the latency over the general IPv4 internet - but suppose that AMD/ATI could get the price on this thing down to $20,000 or $10,000 - to the point that an entrepreneur could purchase one of these boxes, and a gigabit [or maybe even 10-gigabit] ethernet switch, and some ethernet cabling, and some base stations [with v
The Really Important Question... (Score:1, Redundant)
I'd love to... (Score:1)
...build games for it - but how does this translate to serving up virtual world games with unlimited photorealistic detail?
Does it draw the perspective for every individual logged on player ahead of time, cache it, and somehow overcome bandwidth and latency concerns to deliver something in higher quality than a local GPU can do?
Or is this about the architecture of the virtual world itself - messaging, AI threads, triggers, events, decision making? It would have to be one incredible world that required more
Whoa!!! (Score:1)
Streaming video games over the net from a server cloud?
Who let the marketing guys out of their cage on this one?
I mean... it will be faster than Intel's local 3D chips sure, but still... come on!
beowulf cluster (Score:1)
Ah, the Big Iron versus micros war again.... (Score:5, Interesting)
Figures. See, most people thought that war had been won long ago. Perhaps it was, but now the Big Iron camp has a new ally: Big Software, who REALLY wants to do away with one-time licenses and purchases and substitute the far more lucrative "Web apps" and the subscription licensing and fees that paradigm will allow. They want to re-brand software as "content" and they want consumers to willingly buy into that. Their latest sneaky flanking maneuver is what you know as Web apps, but the objective is the same.
If you say yes to either one, centralized computing or software subscriptions, you're actually saying yes to BOTH.
Nancy Reagan had the better advice: Just Say No... to both.
Re: (Score:2)
If you say yes to either one, centralized computing or software subscriptions, you're actually saying yes to BOTH.
I think that tinfoil hat has managed to slip over your eyes.
For software that's happy on your desktop or laptop, the best place for it is on that desktop/laptop. And right now, the mass market tends to be not keen on rental software (outside gaming, where it seems to be somewhat workable).
Once you get to the heavier-weight stuff (top-end simulators of various kinds particularly) then prices to buy go up fast. (Why? Because they're difficult to write and the total market isn't that big. Even if the software
Re: (Score:2)
I was of course referring specifically to mass-market off-the-shelf software, not the vertical-market stuff for which licensing and pricing has always been different. There's no tinfoil hat involved, only a lack of specificity.
I might agree that renting use of software - whether local or remote - for infrequent purposes would be a potentially useful OPTION alongside one-time purchases, but NOT as a complete substitution. That complete substitution is what is trying to be sold as a bill of goods now, under
Has anyone considered the possiblity (Score:1)
I hope we'll show implementation in April (Score:1)
Bullshit Stack Overflow (Score:5, Insightful)
"serve up virtual world games with unlimited photo-realistic detail."
Considering that CGI effects in movie houses have only started approaching effects indistinguishable from reality within the last five or so years, this spikes my bullshit meter pretty high.
Factor in that Weta/ILM and the rest are using huge render farms for an extremely non-realtime render process, and my meter explodes.
Even if I take the claim at face value and postulate that it is possible to do this then I am forced to wonder about how many concurrent, real-time, 100% realistic scenes it can process at once.
Sounds like the marketing department wet their pants a bit early on this one.
another issue (Score:1)
FLOP what? (Score:1)
So what happens after it completes those 10^15 floating point operations? Or did the poster mean 1 PetaFLOPS? The S stands for "second"; it's not a plural of FLOP!
Guilty of pedantry myself, really (Score:1)
But there comes a time when the common usage outweighs the derivation of a term. Like now.
Practical benefits (Score:2, Interesting)
The 1970s? Did I step into a time machine? (Score:3, Insightful)
I sit down at my dumb terminal, I log in, and now I have access to a central supercomputer (via the network) that does all the processing.
This AMD idea sounds like something from the 1970s.
What a hype.. (Score:1)
The supercomputer and cloud parts are obviously realistic. The gaming part is just marketing hype as it stands; the internet would "break" if everyone played games and watched HD movies over it on a large scale. The problem is that, on top of the latency that distance brings, there is bound to be a bandwidth bottleneck at some point between the distributor and the consumer. And that is something internet users experience even today, before people have even begun adopting IPTV and the like. That'
Re: (Score:1)
the internet would "break"
Yeh just like Googling for Google.
A Hybrid Future (Score:1)
Fusion Render Cloud? (Score:2)
Don't you mean 'fusion center' cloud?
Latency? (Score:5, Insightful)
And I doubt that streaming a 3d rendering will really save much battery either considering all the network activity.
Re: (Score:1)
Um, I need two of these setups, so I can finally play Crysis with my friend.
Re:Wow, streamed 3D games.. (Score:5, Funny)
What does your friend use?
Re: (Score:1)
Re: (Score:1, Redundant)
Wow ... the effort that went into writing that boggles the mind. I mean, I feel bad wasting 5 seconds of my life typing up this reply.
Re: (Score:1, Troll)
Re: (Score:1, Redundant)
Ok metamods, nuke from orbit for the 'redundant' mod please.
Re: (Score:1, Informative)
Read that way, it's pretty funny.
Re: (Score:1, Flamebait)
Now, I personally enjoy the hell out of drugs, sex, and computers (preferably supercomputers, which is why I'm reading this damn article) - in which order depends on which day
Re: (Score:2)
Compression is, after all, for losers.
Re: (Score:2)
My cellphone has an OpenGL ES rendering engine, as do many of the new generation of smartphones.
Despite that, I'm willing to bet the problem with this cloud computing engine is not the bandwidth, if they get it worked out, but the latency with the display. It's bad enough playing online and having lag issues. But now I have to wait for my screen to update?