AMD Plans 1,000-GPU Supercomputer For Games, Cloud

arcticstoat writes "AMD is planning to use over 1,000 Radeon HD 4870 GPUs to create a supercomputer capable of processing one petaflop, which the company says will make 'cloud' computing a reality. When it's built later this year, the Fusion Render Cloud will be available as an online powerhorse for a variety of people, from gamers to 3D animators. The company claims that it could 'deliver video games, PC applications and other graphically-intensive applications through the Internet "cloud" to virtually any type of mobile device with a web browser.' The idea is that the Fusion Render Cloud will do all the hard work, so all you need is a machine capable of playing back the results, saving battery life and doing away with the need for ever greater processing power. AMD also says that the supercomputer will 'enable remote real-time rendering of film and visual effects graphics on an unprecedented scale.' Meanwhile, game developers would be able to use the supercomputer to quickly develop games, and also 'serve up virtual world games with unlimited photo-realistic detail.' The supercomputer will be powered by OTOY software, which allows you to render 3D visuals in your browser via streaming, compressed online data."

  • by Todd Fisher ( 680265 ) on Friday January 09, 2009 @10:14PM (#26395317) Homepage
    Intel Plans 2,000-GPU Supercomputer For Games, Lightning
  • Uhm, bandwidth? (Score:5, Insightful)

    by Taibhsear ( 1286214 ) on Friday January 09, 2009 @10:15PM (#26395327)

    Even if the "work" is offloaded to the cloud won't you still need an assload of bandwidth on said devices in order to actually amount to anything? It's not like you're going to get pci-express bandwidth capabilities over dsl or cable internet connection.

    • Re:Uhm, bandwidth? (Score:5, Insightful)

      by megaditto ( 982598 ) on Friday January 09, 2009 @10:38PM (#26395475)

      We can already stream DVD-quality movies encoded at 1 Mbps or so, well within the current consumer "broadband" offerings. I'd assume that would be in the target range.

      But even if you wanted for some reason to go uncompressed, then 8-bit 800x600 at 25 fps would still be less than 100 Mbps, which is not totally unreasonable.

      I would imagine the latency would be a much bigger problem than bandwidth. If you've ever used VNC, you probably know what I mean.
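
      A quick back-of-envelope check of those figures (a sketch only; the 8 bits per pixel and 25 fps are the assumptions stated above, not anything AMD has specified):

          # Uncompressed vs. compressed bandwidth for the figures quoted above.
          width, height, bits_per_pixel, fps = 800, 600, 8, 25
          uncompressed_mbps = width * height * bits_per_pixel * fps / 1e6
          dvd_stream_mbps = 1.0                   # the ~1 Mbps DVD-quality stream mentioned above
          print(uncompressed_mbps, "Mbit/s uncompressed")                     # 96.0, just under 100 Mbps
          print(round(uncompressed_mbps / dvd_stream_mbps), "x compression to fit the ~1 Mbps budget")   # 96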

      • Re:Uhm, bandwidth? (Score:5, Informative)

        by Frenchman113 ( 893369 ) on Friday January 09, 2009 @11:14PM (#26395653) Homepage

        We can already stream DVD-quality movies encoded at 1 Mbps or so, well within the current consumer "broadband" offerings.

        No, we can't. Of course, if you've been fooled into thinking that scene crap is "DVD quality", then perhaps this holds true. Otherwise, you would realize that not even H.264 can deliver DVD-quality video (720x480, no artifacts) at less than 1 Mbps.

        • by megaditto ( 982598 ) on Saturday January 10, 2009 @12:26AM (#26395973)

          Yes, you can. You need to use the correct ethernet cables with high-level tin alloy shielding and vibration elimination: http://www.usa.denon.com/ProductDetails/3429.asp [denon.com]

          • Re: (Score:2, Funny)

            by Megatog615 ( 1019306 )

            You're sure that's not a Monster Cable rebrand?

          • by mcrbids ( 148650 )

            Holy fsck. $500 for a 5 foot long ETHERNET CABLE!?!!? For the "serious audiophile"?!?!?

            (Um, hello? It's DIGITAL?!?!)

            Goes to show, there really IS a sucker born every minute, but at these prices, they'd make out like bandits if they only made 1 sale/week...

            • Comment removed based on user account deletion
            • The cable does include directional markings.

              Additionally, signal directional markings are provided for optimum signal transfer.

              So, you know, the bits don't get confused and take a wrong turn. I hate it when that happens.

              • The cable does include directional markings.

                Additionally, signal directional markings are provided for optimum signal transfer.

                So, you know, the bits don't get confused and take a wrong turn. I hate it when that happens.

                Look at the picture. The arrow points in both directions for the "directional markings" which is completely useless even if such a thing did matter...

          • by mog007 ( 677810 )

            You'd think for 500 bucks, they'd at least use gold, platinum, or uranium for the wires. But it's still just the same old copper ethernet cable I can buy from Home Depot for 1/100th the price.

        • Um, yes, everyone.

          Bandwidth. That precious commodity.

          Obviously, they're going off one or more of these assumptions/instances :

          1) They have designed one hell of a compression algorithm. The OTOY site has between fuck-all and nothing on it, and the domain is relatively new (which doesn't say much - if some bright spark at AMD developed a mean compression algorithm that isn't overwhelmingly intensive, and s/he split off, then it would be new).
          2) Mobile bandwidth will be making a fantastic leap at rough

          • Compression has advanced quite a bit. My Netscape ISP squeezes text websites to just 5% their original size, thereby increasing effective bandwidth by 20 times. If the "Cloud" implements a similar algorithm to handle the data, it could operate quite fast.

            • That's the rub, though. Text is easy to compress, especially ASCII, UTF-8, et al. Hence the reason for mark-up languages that are rendered locally by your browser, as well. That's old news.

              The kinds of data that they're implying that this "Cloud" (not an original name - the Supercomputing Conference has had "Cloud Computing" for several years, basically a pool/grid of various institutions and/or sponsors who contribute compute, storage, etc. to the conference participants) will handle does not lend itself

        • Re: (Score:3, Insightful)

          (720x480, no artifacts)

          Are you blind? DVD is full of artefacts; it's MPEG, it's ass.

      • We can already stream DVD-quality movies encoded at 1 Mbps or so, well within the current consumer "broadband" offerings. I'd assume that would be in the target range.

        I'm not exactly sure how this will work, but they said you have to offload the data to their servers. So if you are playing a game, wouldn't you have to upload all the data to their servers so they can process it? Consumer internet connections are fairly quick at downloading, but it seems that the upload speed is going to be a problem. My internet connection is 10 Mbps down but only about 700 kbps up. So that seems like it would be the problem.

        • Re: (Score:2, Insightful)

          by Wallslide ( 544078 )
          The idea is that the only thing you are uploading to the server is input, such as mouse/keyboard/voice information. The game logic and assets all reside on the server itself, and thus don't have to be uploaded by your machine. It's as if you were playing a game over a VNC connection.

          One thing that is really cool about this technology is that it has the potential to eliminate cheating in games such as first person shooters. A lot of the cheating in the past is because the game client running on a user's
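
          A minimal sketch of the loop being described (purely illustrative; the length-prefixed framing, the JSON input message, and the callback names are invented for this example, not anything OTOY or AMD has published):

              # Thin-client game loop: a few bytes of input go upstream, compressed frames come back.
              import json, socket

              def recv_exact(sock: socket.socket, n: int) -> bytes:
                  buf = b""
                  while len(buf) < n:
                      chunk = sock.recv(n - len(buf))
                      if not chunk:
                          raise ConnectionError("server closed the stream")
                      buf += chunk
                  return buf

              def client_frame_loop(sock, get_local_input, display_frame):
                  while True:
                      # Upstream: just the input state (keys, mouse), never game logic or assets.
                      sock.sendall(json.dumps(get_local_input()).encode() + b"\n")
                      # Downstream: the server simulates and renders; the client only decodes and displays.
                      frame_len = int.from_bytes(recv_exact(sock, 4), "big")   # hypothetical length prefix
                      display_frame(recv_exact(sock, frame_len))               # e.g. decode one video frame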
          • I think the whole idea sounds stupid. I sit down at my keyboard, I log in, and now I have access to a central supercomputer that does all the processing while my PC acts as a dumb terminal.

            This AMD idea sounds like something from the 1970s. A step backwards.

          • by Mozk ( 844858 )

            I hope to god that games don't start using this just to thwart cheaters. The one thing I do like about this, though, is that a user would use the same network bandwidth whether they use 2xAA or 16xAA, assuming that the server computer renders them in about the same time.

    • by sreid ( 650203 )
      more or less... I think a high-def movie can be streamed at under 300kbs
    • No, latency (Score:4, Insightful)

      by alvinrod ( 889928 ) on Friday January 09, 2009 @10:43PM (#26395495)

      The bandwidth is only a problem until we build bigger tubes. As much as we all like to bitch about internet here in the US, we're at least capable of increasing the bandwidth quite well. The real problem is dealing with the latency. With enough time and money we could easily push as much data as we could possibly want, but we can only push it so fast.

      For some games it probably won't matter, but who'd want to use it for an FPS where regardless of how detailed your graphics are, even a tenth of a second lag is the difference between who lives and who dies? Until we can get around those limitations, I don't foresee the traditional setup changing much.

      • by Anonymous Coward

        Sure, games that deal with dead reckoning (e.g. FPS) aren't the first candidate for this, but it is perfect for deterministic peer-to-peer simulations (e.g. RTS)!

        A typical RTS will only simulate at 8-12 Hz. Yes, expect 126-249 ms of lag! But you don't even notice.
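
        For reference, the arithmetic behind those numbers (a sketch, assuming a command issued during one tick takes effect at the end of the following tick, as in typical lockstep RTS networking; network round-trip time is ignored):

            # How an 8-12 Hz lockstep simulation rate turns into input-to-action delay.
            for hz in (8, 12):
                tick_ms = 1000.0 / hz
                print(f"{hz} Hz: tick = {tick_ms:.0f} ms, delay roughly {tick_ms:.0f}-{2 * tick_ms:.0f} ms")
            # 8 Hz  -> 125 ms ticks, roughly 125-250 ms delay (the 126-249 ms range above)
            # 12 Hz ->  83 ms ticks, roughly  83-167 ms delay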

        • by KDR_11k ( 778916 )

          You WILL notice the input lag though.

        • Re:No, latency (Score:4, Interesting)

          by i.of.the.storm ( 907783 ) on Saturday January 10, 2009 @02:47AM (#26396507) Homepage
          For some people, maybe. But professional RTS gamers can have 300-400 actions per minute, and some ridiculously good ones hit 500; if they had that much lag, I wager they would notice. And of course, that's on top of the amount of time it takes for the supercomputer to generate the image.
        • MMOs could benefit too.

          EVE has a ONE SECOND tick rate. Admittedly the client interpolates so that you see smooth movement, but there is always at least a 1 s gap between your weapons firing, or between clicking the button and having a module activate. No one complains.

          It goes some way towards explaining how they can support so many players in one universe at once.

      • For some games it probably won't matter, but who'd want to use it for an FPS where regardless of how detailed your graphics are, even a tenth of a second lag is the difference between who lives and who dies?

        I might just be talking out of my ass here, but... If latency is your only bottleneck, and you have plenty of bandwidth and CPU on the server, wouldn't it be possible to deliver as many renderings as there are possible inputs, and only use whichever one corresponds to what the player actually does?

        A simple example would be a game where, at any moment, the player could be moving up, down, left, or right. The server could generate four different views, one for each possible input. All four are delivered to th

        • by KDR_11k ( 778916 )

          A problem would be that the number of frames increases exponentially with the time you render ahead. 100ms lag on 60fps would mean something like (number of input options)^6 frames to render. With your four options that would be 4^6=4096 frames. You'd need a system that's more than 4096 times as powerful as the average user's computer times the number of users you have. At this point it's easier to just tell the user to buy his own damn hardware.
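
          Spelling that arithmetic out (a sketch of the same point; the 60 fps, 100 ms, and four-input figures are just the example numbers above):

              # Speculative rendering: one frame per possible input sequence over the lag window.
              fps, lag_ms = 60, 100
              frames_ahead = round(fps * lag_ms / 1000)          # 6 frames to cover 100 ms at 60 fps
              for options_per_frame in (2, 4, 8):
                  print(options_per_frame, "inputs/frame ->",
                        options_per_frame ** frames_ahead, "frames rendered ahead")
              # 2 -> 64, 4 -> 4096, 8 -> 262144: the cost grows exponentially with the lag you hide.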

        • But in an FPS game, camera movement alone is almost infinite. There are millions, probably billions of "possible" actions, from moving the camera to walking, shooting, and all the combinations between them, for every player in the game! Even if the "cloud server" could manage it and the bandwidth was enough, you would probably have to get a "super" CPU just to process that gigantic stream of data.

      • Comment removed based on user account deletion
        • by KDR_11k ( 778916 )

          I'm not sure the rendering hardware is the bottleneck for amateur movie CGI. It's more likely that they simply don't have the necessary artists to create the scenes in the first place. You need a large staff to do the things modern movies do in reasonable time, and hiring 20-30 professionals (probably even more) for the task tends to be a bit too expensive for amateur movie budgets.

    • While I agree that this is odd with games, I definitely see the potential for 3d animators. It takes my home computers (note the plural) hours/days to render complex scenes, depending on the length of the scene. The advantage in computing power would greatly outweigh the bandwidth cost here, especially if you could just upload the job and wait for the result (instead of sending each frame to be rendered).

      But I would imagine it would not take many people to bog this down.

    • by khallow ( 566160 )

      Even if the "work" is offloaded to the cloud won't you still need an assload of bandwidth on said devices in order to actually amount to anything? It's not like you're going to get pci-express bandwidth capabilities over dsl or cable internet connection.

      There are services that have low demands on the client and high demands on the server. For example, a game with a huge player population (like several hundred thousand). I think that Second Life or Eve Online would be examples of such games. The graphics aren't that demanding on older PCs, and those games have huge player populations. So no, you wouldn't need to have a huge amount of bandwidth, but it's not going to be state-of-the-art graphics.

    • You don't need PCI-e bandwidth. All you are doing is transporting 2-dimensional video. We are already very good at doing that over moderate bandwidth connections.

      1-2 Mbps will do standard definition video comfortably well.

    • by Firehed ( 942385 )

      You think this is so you don't have to buy a new graphics card? The only reason companies would go for this is because it changes their games from a product to a service, so piracy goes away. Next-gen DRM, if you will (next-gen doesn't have to be worse, however; I avoid buying games due to how invasive the DRM is and know plenty of people who do the same, so the next generation of the stuff damn well better address that).

      If it's implemented correctly it would still offer us advantages - play from any comp

    • by malv ( 882285 )
      Compress and stream it like a YouTube video.
  • Good luck (Score:5, Insightful)

    by 4D6963 ( 933028 ) on Friday January 09, 2009 @10:16PM (#26395335)

    "VNCing" games through the Internet and possibly a wireless network, and getting a decent enough latency and enough throughoutput to get a good image quality/FPS? Good luck with that, not saying it won't work, but if it does work satisfyingly and reliably it'll be an impressive feat.

    Well I know StreamMyGame [streammygame.com] does it, but it's meant to be used locally, not over the internet + WiFi, right?

    • WiFi itself is enough to completely kill a gaming session. When I'm at home on my laptop, I like to remotely log in to my desktop. It gives me the horsepower of my desktop, along with access to all the files (read: pr0n).

      Works flawlessly really, but the difference between ethernet and wifi is perceptible. And as soon as you try gaming over it, it becomes unusable (for any action game at least). Even simple games like kasteroids or kbounce are not worth using (I get routine 1 second freezes). On the other hand, d

      • Replying to myself. The end of my post got truncated by html parser thingy. Slashdot has to be the bulletin board where you need to write &lt; to get a < sign..

        The end of the post should have:

        < 1ms). I assume it must be from packet loss, but it very well might be a bandwidth issue too.

        • Re: (Score:1, Funny)

          by Anonymous Coward

          Slashdot has to be the bulletin board

          I'm assuming you used angel brackets to emphasize <only> as well :P

        • You sure it wasn't the lousy wi-fi connection? ;-)
    • by sam0737 ( 648914 )

      Sure...it would need the bandwidth of receiving fullscreen video/audio.

      Latency-wise, some games may be more suitable than others. For example, RTS games like Warcraft III, where actions are carried out ONLY after the command is synced to the other players and are lagged anyway.

      Though I think it has much more value in pre-rendering, animation rendering, etc. AMD just rents out the CPU hours for you to get your job done.

  • Only 1.000? (Score:4, Interesting)

    by Duncan3 ( 10537 ) on Friday January 09, 2009 @10:17PM (#26395343) Homepage

    Folding@home is at 1.007 PFLOPS of just ATI GPUs :)

    (which is an entirely different sort of "computer", but still)

    • These days they've got nVidia ones too though. The nVidia ones get more PPD somehow, even though the ATI clients have been out longer and ATI GPUs seem to have higher theoretical output. And yay, I'm contributing to that major FLOPpage with my ATI GPU.
  • by morgan_greywolf ( 835522 ) on Friday January 09, 2009 @10:17PM (#26395349) Homepage Journal

    Attention, AMD Marketroids: Please kill yourselves. Now. Do it now.

    *blink*

    Yes. All of you.

  • by WiiVault ( 1039946 ) on Friday January 09, 2009 @10:20PM (#26395373)
    I'm all for cloud gaming - it would be great not to have to upgrade my GPU all the time to play new games. However, I wonder how this could be accomplished in a way where lag is so minimal as not to affect gameplay. It seems this would be especially hard if one were to play online games. Correct me if I'm wrong, but it would seem one would need to add the lag from the client to the cloud AND the lag from player to player (or server) in the multiplayer networking. That seems like too much lag for most FPSs, which I'm assuming are the genre that would gain the most from such a supercomputer.
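
    A rough budget for the two additive hops described above (every number below is a guess for illustration only, not a measurement of AMD's planned service):

        # Hypothetical input-to-pixels budget: streamed rendering plus ordinary multiplayer netcode.
        client_to_render_cloud_ms = 30       # guess: round trip between the player and the render cloud
        encode_decode_ms = 15                # guess: video encode on the server plus decode on the device
        cloud_to_game_server_ms = 20         # guess: the usual multiplayer hop, now taken from the cloud
        frame_time_ms = 1000 / 60            # one frame at 60 fps

        total_ms = client_to_render_cloud_ms + encode_decode_ms + cloud_to_game_server_ms + frame_time_ms
        print(f"about {total_ms:.0f} ms from input to updated pixels")   # ~82 ms with these guesses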
    • by Waccoon ( 1186667 ) on Saturday January 10, 2009 @05:44AM (#26397153)

      Having a cloud in your own house would be nice, so everyone could share computing power across multiple computers.

      I, for one, do not want my computing power on lease.

      • Having a cloud in your own house would be nice, so everyone could share computing power across multiple computers.

        Yeah. That would be nice. If you have the hardware all you need is the right [sourceforge.net] software [lcic.org].

        I don't know why more people don't do it -- not just homes, either. All sorts of orgs could use their desktops as a grid for on-demand supercomputing if only they would configure them to do that.

      • Having a cloud in your own house would be nice, so everyone could share computing power across multiple computers.

        Can I do a joke about how it's dangerous mixing rain and electricity?

    • Latency is and would be a huge factor. I truly don't think this would be meant for FPS. I could see it being used for local gaming on a phone, but not a multiplayer game at all. This is not a technology that will let you play Crysis on your crappy PC.

      The whole idea, if I am not mistaken, is for the 'mainframe', er, 'cloud' GPU to render content that is beyond the capability of the device accessing it. So now, instead of just game data being transmitted over the network, we are going to render graphics to

  • by mihalis ( 28146 ) on Friday January 09, 2009 @10:27PM (#26395419) Homepage

    the Fusion Render Cloud will be available as an online powerhorse
    AMD also described NVIDIA's Quadroplex as more of an online My Little Pony.

  • by Lord Byron II ( 671689 ) on Friday January 09, 2009 @10:29PM (#26395437)

    Instead of buying a $400 video card, now you're paying AMD to buy that video card for you, paying them for the management of that card, and paying your ISP for the bandwidth. The only possible way this works is if you only use your card 10% of the time; then AMD can utilize it at 100%, selling you just one-tenth of the total.

    Of course, that's great for gamers, who will sporadically play throughout the day, but awful for movie studios who could probably keep a render farm at 100% anyway.

    • Movie-grade CG tends to be rendered via raytracing, which, AFAIK, is an algorithm better suited to a general-purpose CPU than to a GPU.

      I'm sure part of the reason that nVIDIA and ATI have been working to develop alternative applications of their GPU technology is that their GPUs could potentially become unnecessary to gamers, should CPUs ever reach the speed where real-time raytracing is practical.

  • One Problem (Score:5, Insightful)

    by Akir ( 878284 ) on Friday January 09, 2009 @10:38PM (#26395473)
    They're going to have to write a driver that works before they get that to work.
  • for reals? (Score:1, Redundant)

    by kaizokuace ( 1082079 )
    People are trying too hard just to play Crysis. wtf!
    • by Draconi ( 38078 )

      Don't you understand!? With this, my god, we could build an entire virtual world, out of fully interactive, fully physic'd, fully exploding *barrels*!

  • AMD also says that the supercomputer will 'enable remote real-time rendering of film and visual effects graphics on an unprecedented scale.' Meanwhile, game developers would be able to use the supercomputer to quickly develop games, and also 'serve up virtual world games with unlimited photo-realistic detail.'

    They have this in the future. Don't they call it the Matrix?

  • I look forward to (Score:5, Insightful)

    by sleeponthemic ( 1253494 ) on Friday January 09, 2009 @10:55PM (#26395557) Homepage
    Playing Duke Nukem Forever @ 1900x1200 through the Fusion Render Cloud, occasionally reloading the latest results of the (fully operational) Super Hadron Collider on my Nintendo VR Goggles, powered by a free energy device producing negative infinity carbon emissions.
    • Re: (Score:1, Redundant)

      by shawb ( 16347 )
      All that from your flying car, I assume?
      • by Nemyst ( 1383049 )

        All that from your flying car, I assume?

        Nah, I say from his grave, from the look of things.

    • producing negative infinity carbon emissions

      But The Plants! All the plants would die. In fact, the ridiculously negative carbon equilibrium so established would SUCK THE CARBON FROM OUR VERY BONES! (Carbon is a relatively major component of bone tissue, the calcium phosphate component aside.)

  • by graymocker ( 753063 ) on Friday January 09, 2009 @11:13PM (#26395643)

    A comment from the story earlier today about nVidia's new 2-teraflop multicore card:

    Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.

    Hah! HAH! While nVidia dicks around with expansion cards measured in mere teraflops, AMD is building a SUPERCOMPUTER. That's a /peta/flop, nVidia! If you don't know what that is, here's a hint: take your teraflop. Then add three zeros to the end. BAM!

    AMD's penis is now 500 times larger than nVidia's. It's math.

    • by Zephiris ( 788562 ) on Friday January 09, 2009 @11:23PM (#26395689)

      Nvidia's GTX 295 was around 1.7 teraflops I believe, while the (similarly priced) 4870X2 is 2.4. The 'mere' 295 supposedly beats the 4870X2 by 15% on average.
      The difference is? Nvidia always has pretty good drivers. ATI struggles to let games take >50% advantage of even the lowly 3870 (as measured by the card's own performance counters)...let alone a 2.4 TFLOPS card...let alone a massive array of 4870s.

      Plus, wouldn't a 1,000-GPU 4870 cloud...only allow some 1,000 users some fractional percentage of one 4870, capped by latency and other overhead?

      Or...are we talking about providing a larger number of mobile devices with the equivalent capabilities and speed of 1999's GeForce 256?

      Either way...I don't think it'll catch on, and it will be a huge money sink for AMD at a time when it needs to be fixing its processor and video card issues for the average, real consumers who are losing faith in AMD's ability to provide reasonable and usefully competitive products.


      • Plus, wouldn't a 1000 GPU 4870 cloud...only allow some 1000 users some fractional percentage of one 4870 capped by latency and other overhead?

        Earlier in this thread, people were talking about the latency over the general IPv4 internet - but suppose that AMD/ATI could get the price on this thing down to $20,000 or $10,000 - to the point that an entrepreneur could purchase one of these boxes, and a gigabit [or maybe even 10-gigabit] ethernet switch, and some ethernet cabling, and some base stations [with v
  • Will it be able to run Windows Vista?
  • ...build games for it - but how does this translate to serving up virtual world games with unlimited photorealistic detail?

    Does it draw the perspective for every individual logged on player ahead of time, cache it, and somehow overcome bandwidth and latency concerns to deliver something in higher quality than a local GPU can do?

    Or is this about the architecture of the virtual world itself - messaging, AI threads, triggers, events, decision making? It would have to be one incredible world that required more

  • Streaming video games over the net from a server cloud?

    Who let the marketing guys out of their cage on this one?

    I mean... it will be faster than Intel's local 3D chips sure, but still... come on!

  • Imagine a beow... oh... damn it, that's what they did.
  • by macraig ( 621737 ) <mark...a...craig@@@gmail...com> on Saturday January 10, 2009 @12:07AM (#26395887)

    Figures. See, most people thought that war had been won long ago. Perhaps it was, but now the Big Iron camp has a new ally: Big Software, who REALLY wants to do away with one-time licenses and purchases and substitute the far more lucrative "Web apps" and the subscription licensing and fees that paradigm will allow. They want to re-brand software as "content" and they want consumers to willingly buy into that. Their latest sneaky flanking maneuver is what you know as Web apps, but the objective is the same.

    If you say yes to either one, centralized computing or software subscriptions, you're actually saying yes to BOTH.

    Nancy Reagan had the better advice: Just Say No... to both.

    • by dkf ( 304284 )

      If you say yes to either one, centralized computing or software subscriptions, you're actually saying yes to BOTH.

      I think that tinfoil hat has managed to slip over your eyes.

      For software that's happy on your desktop or laptop, the best place for it is on that desktop/laptop. And right now, the mass market tends not to be keen on rental software (outside gaming, where it seems to be somewhat workable).

      Once you get to the heavier-weight stuff (top-end simulators of various kinds particularly) then prices to buy go up fast. (Why? Because they're difficult to write and the total market isn't that big. Even if the software

      • by macraig ( 621737 )

        I was of course referring specifically to mass-market off-the-shelf software, not the vertical-market stuff for which licensing and pricing have always been different. There's no tinfoil hat involved, only a lack of specificity.

        I might agree that renting use of software - whether local or remote - for infrequent purposes would be a potentially useful OPTION alongside one-time purchases, but NOT as a complete substitution. That complete substitution is what we're being sold as a bill of goods now, under

  • that we could finally play Crysis on this?
  • Our company is planning to present an Nvidia-based GPGPU solution at the Cloud Computing Conference 2009; stay tuned - http://www.cloudslam09.com/ [cloudslam09.com] IMHO, AMD's idea is sound and timely from several points of view. Those who doubt it will just lag behind like SGI did.
  • by elysiuan ( 762931 ) on Saturday January 10, 2009 @12:12AM (#26395907) Homepage

    "serve up virtual world games with unlimited photo-realistic detail."

    Considering that CGI effects in movie houses have only started approaching effects indistinguishable from reality within the last five or so years, this spikes my bullshit meter pretty high.

    Factor in that Weta/ILM and the rest are using huge render farms for an extremely non-realtime render process, and my meter explodes.

    Even if I take the claim at face value and postulate that it is possible to do this, then I am forced to wonder how many concurrent, real-time, 100% realistic scenes it can process at once.

    Sounds like the marketing department wet their pants a bit early on this one.

  • They say that they want it to do games and 3D animation in a mobile web browser. Call me nuts, but Quake4iPhone takes a lot of skill and patience to control reliably...and now they want to try to get Unreal Tournament 5 in that environment? Heh. Almost as much fun as doing...basically anything in Maya or 3DS Max from a phone. "Desirable" is not quite the first adjective that comes to mind.
  • 1 Petaflop = 10^15 floating-point operations.

    So what happens after it completes those 10^15 floating-point operations? Or did the poster mean 1 PetaFLOPS? The S stands for "second"; it's not a plural of FLOP!
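
    For scale, a quick sanity check on the headline figure (a sketch; the roughly 1.2 TFLOPS single-precision peak per Radeon HD 4870 is an approximation, and real workloads never reach peak):

        # 1 PFLOPS = 1e15 floating-point operations per second.
        tflops_per_hd4870 = 1.2              # approximate single-precision peak of one HD 4870
        gpu_count = 1000
        print(f"{tflops_per_hd4870 * gpu_count / 1000:.1f} PFLOPS aggregate peak")   # 1.2, ignoring all overhead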
  • Practical benefits (Score:2, Interesting)

    by cee--be ( 1308795 )
    One practical use for this would be to run staggeringly complex physics calculations in real time. One example would be doing the calculations needed to render a physically realistic sea, with weather conditions, as an animation. You could then send this to users in a sea MMO, for example. There are many other cool game-related things you could do with it, rather than wastefully rendering some uncanny-valley mobile phone game at 2 FPS.
  • by commodore64_love ( 1445365 ) on Saturday January 10, 2009 @04:07AM (#26396855) Journal

    I sit down at my dumb terminal, I log in, and now I have access to a central supercomputer (via the network) that does all the processing.

    This AMD idea sounds like something from the 1970s.

  • The supercomputer and cloud part is obviously realistic. The gaming part is just marketing hype as it stands now; the internet would "break" if everyone played games and watched HD movies over it on a large scale. The problem is that, given the distance, on top of the latency the distance brings, there is bound to be a bottleneck at some point between the distributor and the consumer. And that is something internet users experience even today, before people have even begun adopting IPTV and the like. That'

  • Does anyone else agree with me that the future is unlikely to be entirely offloaded, but rather a hybrid situation? Even the cheapest of phone chipsets will shortly have fairly decent rendering by today's standards. It's not hard to envisage something where a great deal of the processing is handled by the server, while each device does a certain amount of rendering, coping with the immediate 1/10th of a second to remove the lag. Somewhere between AMD's ideological future and current MMOs
  • Don't you mean 'fusion center' cloud?
