New Service Aims To Replace Consoles With Cloud Gaming


ThinSkin writes "Imagine playing bleeding-edge games, yet never again upgrading your hardware. That's the ambitious goal of OnLive's Internet-delivered gaming service. Using cloud computing, OnLive aims to 'make all modern games playable on any system,' thanks in large part to remote servers that do all the heavy lifting. With a fast enough Internet connection, gamers can effectively stream and play games using a PC, Mac, or a 'MicroConsole,' 'a dedicated gaming client provided by OnLive that includes a game controller.' Instead of ever worrying about costly hardware upgrades or the cost of a next-gen console, gamers can expect to fork over about $50 yearly just for the service. If this thing takes off, it could spell trouble for gaming consoles down the road, especially if already-established services like Steam and Impulse join the fray."
This discussion has been archived. No new comments can be posted.

  • Re:No thanks (Score:3, Informative)

    by 0xABADC0DA ( 867955 ) on Tuesday March 24, 2009 @09:47AM (#27310977)

    Instead of normal online game lag, you have lag between you actually pressing a button and the game responding at the server.

    Not necessarily. A LOT of LCD computer monitors have 'input lag' of, say, 50ms (the computer sends the image to the monitor, but you don't see it until 50ms later), whereas LCD TVs don't, and few people complain. Presumably if the game servers are co-located with the ISP, you could get lag much smaller than that.

  • Re:No thanks (Score:4, Informative)

    by Wovel ( 964431 ) on Tuesday March 24, 2009 @09:58AM (#27311093) Homepage
    Except that is not what it says... It says the server will do the lifting for a thin client. The server is not just streaming binaries to be rendered on the client; it is receiving input from the client and returning video to be displayed on it.

    I think Amazon sells crowbars to remove your foot from your mouth.
  • Re:My predictions (Score:4, Informative)

    by truthsearch ( 249536 ) on Tuesday March 24, 2009 @10:09AM (#27311245) Homepage Journal

    It sounds like the device would be a thin client. No local storage and little processing other than graphics, maybe not even local 3D rendering. The device can probably be so cheap that they wouldn't mind the small percentage of loss to hackers. At $50/year they're really charging for the servers and service much more than the client hardware.

  • Re:Caps (Score:4, Informative)

    by Anonymous Coward on Tuesday March 24, 2009 @10:10AM (#27311249)

    You seem to be assuming that this service will stream VIDEO to your unit, but with TFA not being too clear on the subject, my guess is that they will stream just 'polygons' to their 'netconsole', which then displays them as video frames. The bandwidth needed should be far smaller.

    The biggest difference from mmorpgs is that mmorpg servers send program data to the client, which then does most of the calculations (the hard work) and displays the results.

    Also, many slashdotters seem to assume that mmorpgs require huge bandwidth. I think that's wrong. As a well-known example, WoW was quite playable over a 512 kbit/s DSL connection.

    As other posters have said before, the biggest problem with OnLive's approach is the lag, which is inherent to the Internet and will remain so for the foreseeable future. Most mmorpg clients use lots of code and processing power just to minimize the effects of lag on gameplay, with mixed results (go to Dalaran and ask anyone :)

  • Re:Image bandwidth (Score:2, Informative)

    by Wovel ( 964431 ) on Tuesday March 24, 2009 @10:12AM (#27311279) Homepage
    Even the summary says the games are rendered by the servers... The article will tell you they have a proprietary compression algorithm, which sends compressed video at 1.5 Mbit/s for 480p and 5 Mbit/s for 720p/1080i, and nothing higher...

    I will tell you this all works great inside their offices, and probably not anywhere else on this planet.
  • Re:Caps (Score:5, Informative)

    by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Tuesday March 24, 2009 @10:13AM (#27311287) Journal

    However, even with a fiber-optic line I'd have my doubts.

    Doing some quick calculations:

    The highest number I've gotten for Blu-Ray maximum bandwidth is 54 megabits per second. I've seen torrents much smaller that still looked good.

    Assuming uncapped, that's actually doable. Fiber is typically 100 Mbit/s, and I'm sure some places offer gigabit.

    However, that encoding takes hours or days and is certainly not live. So the real problem is latency -- take 50 ms from your LCD monitor, plus whatever a wireless controller adds, plus the latency between you and their servers, plus the lag for them to render, capture, and encode, then decode back at the client... that easily gets up to 200 ms, which I'd consider unplayable.
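The latency budget tallied above can be sketched as a quick back-of-the-envelope sum. All of the per-stage numbers below are illustrative guesses, not measurements of any real service; under these (fairly optimistic) assumptions the total already lands around 150 ms:

```python
# Rough end-to-end latency budget for streamed gaming.
# Every value here is an illustrative estimate, not a measurement.
latency_ms = {
    "controller (wireless)": 8,
    "upstream to server": 20,
    "server render + capture": 17,   # roughly one frame at 60 fps
    "video encode": 20,
    "downstream to client": 20,
    "client decode": 15,
    "LCD input lag": 50,
}

total = sum(latency_ms.values())
for stage, ms in latency_ms.items():
    print(f"{stage:>25}: {ms:3d} ms")
print(f"{'total':>25}: {total:3d} ms")
```

Even without the LCD's 50 ms, the network and codec stages alone eat most of the ~100 ms that twitch gamers usually tolerate.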

    Also, unless the $50/year includes games, it makes little economic sense. These systems are designed to last some four years or so. A Wii can be had for $160, according to a quick Google; this would be $200. A Wii can work when your Internet is down, or when your Internet is not fiber. And a Wii actually has games already -- not as many as its competitors, but some.

    Where I could see this working is in a LAN environment

    Not really. LANs are typically 100 Mbit/s, or gigabit if you're willing to spend money on a good switch. Same situation as fiber.

    The only advantage of a LAN is that, with a good switch, you aren't using everyone else's bandwidth, but if you're proposing this:

    make some kind of "xbox360server" to host all the games as basically virtual machines across a lan,

    That's still likely to be a single uplink port, which means everyone on the LAN is now limited to a combined 100 Mbit/s for their video. It means the concept of a LAN party just got very, very impractical.

    And WTF would be the point, if it's a console anyway? In what way is that "xbox360server" better than a real Xbox 360?

    As for their "no piracy" claim, as a consumer, that doesn't make me want to sign up for the service. That makes me want to go far away, into the open arms of indie developers, who typically ship with reduced or no DRM.

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Tuesday March 24, 2009 @10:35AM (#27311555) Journal

    I was looking for somewhere to attach this comment, and you're it.

    "Cloud" is the modern term for a mainframe, time-sharing-like model.

    One advantage is that your data lives on a server somewhere, meaning someone else is responsible for backup, and you can access it from any "terminal" (typically a web browser, but could also be things like the Steam client).

    Another advantage is a potential pricing model for developers -- Amazon EC2 charges per hour of server time used, at a very flat rate. If you only use an hour, you only pay ten cents.

    The big advantage of an infrastructure like EC2 is shown in pathological cases, like websites which tend to receive more traffic at certain times of the day. So every night, you can shut down whatever capacity you don't need, and stop paying for it -- and Amazon can then allocate it to someone else who needs it at that time, possibly overnight.
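The elasticity argument above is easy to make concrete. Assuming the EC2 small-instance rate of the time ($0.10 per instance-hour) and a purely hypothetical diurnal load curve, flat provisioning for the peak costs noticeably more than paying only for the hours you actually run:

```python
# Flat provisioning vs. elastic scaling, at an assumed $0.10/instance-hour
# (the EC2 small-instance rate circa 2009). The load curve is invented.
RATE = 0.10  # USD per instance-hour

# Hypothetical instances needed in each of 24 hours:
# quiet night, busy daytime, evening tail.
hourly_need = [2] * 8 + [10] * 12 + [4] * 4

flat_cost = max(hourly_need) * 24 * RATE   # provision for the peak, all day
elastic_cost = sum(hourly_need) * RATE     # pay only for what actually runs

print(f"flat:    ${flat_cost:.2f}/day")
print(f"elastic: ${elastic_cost:.2f}/day")
```

The gap widens as the load curve gets spikier, which is exactly the "pathological case" described above.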

    At a different level, you see the same pattern with web applications -- you don't need a computer any more powerful than what it takes to run Firefox. The server can do whatever computing isn't already happening locally -- but most GUI apps spend a lot of time waiting for the user. So when you do a search in Gmail, that takes some server CPU -- but while you're examining the results, that server is off running someone else's search.

    Here's the problem: None of these advantages apply to these guys.

    The "my data is elsewhere" advantage is irrelevant. Steam already provides this. So long as I remember a username and password, I can download all my Steam games, along with all their savegames and settings.

    The idea that a piece of hardware might not fully be utilized by a single user, and could thus be re-allocated, is similarly irrelevant here. Unless they have some sort of weird economies of scale where one video card can serve a thousand users, and cost less than a thousand times the cost of one normal video card... they're pretty much stuck with one machine per gamer.

    And since these machines will have to be geographically close to the gamer to be at all viable, there's going to be very little gain from half the gamers going to sleep just as the other half wakes up. You're still going to get the bulk of your traffic from large groups of gamers coming home and logging in at about the same time.

    The only advantage is the not-having-to-think-about-maintenance bit, which is pretty weak against consoles. A console is something even John Q. Gamer can unpack and plug in himself. Having to do it every four years is really not that big a deal.

    So that's a very long way of saying: I agree with you, "cloud" is being abused. This is clearly someone trying to cash in on the buzzword, without really understanding what it's good for -- it would be like creating an XML representation of a waveform from a sound file.

    This doesn't mean XML is worthless, or that it lacks meaning. It just means that someone drank a little too much kool-aid.

    Similarly, "cloud computing", as vague as it can be, is really about a couple of related concepts that are concrete enough to write down. This is just something that it's really not suited for.

  • Re:No thanks (Score:3, Informative)

    by donaldm ( 919619 ) on Tuesday March 24, 2009 @10:41AM (#27311621)

    So really I don't see lag as a real objection to this. I don't see bandwidth as a huge problem going forward either... lots of people already have fiber going to their house these days.

    For those people who have ADSL, fibre, or even cable, can you answer the following:

    1. Is your download allowance unlimited? If not, what is your limit?
    2. Does your ISP throttle your service after a certain threshold is reached? If so, what is your throttled speed?
    3. If you have a download cap, do you pay for any excess? If so, how much?
    4. What is your average network speed during peak periods?
    5. What is your average network speed during off-peak periods?
    6. What does your service cost?
    7. Do you pay for this service, or does someone else?

    If you can honestly say your ISP provides a high-speed, high-bandwidth connection with totally unrestricted downloads for a low cost that you actually pay yourself, then you are extremely lucky: most of the world does not have this, and will not come close to this ideal (subjectively speaking) for many years to come.
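The cap questions matter because game streaming burns through data quickly. Using the 5 Mbit/s HD bitrate the article quotes, and an assumed two hours of play per day, a rough tally against a monthly cap looks like this:

```python
# How much data would streamed gaming consume against a monthly cap?
# The 5 Mbit/s figure is from the article; the play time is assumed.
BITRATE_MBIT = 5      # Mbit/s for HD streaming
HOURS_PER_DAY = 2     # assumed play time
DAYS = 30

gb_per_hour = BITRATE_MBIT * 3600 / 8 / 1000   # Mbit/s -> MB/s -> GB/hour
monthly_gb = gb_per_hour * HOURS_PER_DAY * DAYS

print(f"{gb_per_hour:.2f} GB per hour")
print(f"{monthly_gb:.0f} GB per month")
```

At roughly 2.25 GB per hour, a moderate gamer would blow through many 2009-era ISP caps on this one service alone.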

  • Re:No thanks (Score:3, Informative)

    by aj50 ( 789101 ) on Tuesday March 24, 2009 @11:36AM (#27312447)

    Unless the game you're playing trusts the client to do its own hit detection (which would preclude any competitive Internet play), it's the server that disagreed with you over whether you hit the person, not their client (although it's possible that lag compensation produces some artifacts).

    The only game I'm aware of that doesn't do server-side hit detection is bzFlag, where each client checks for hits against itself, which would make cheating trivial even if the source code weren't already available. (More server-side logic is planned for v3.0.)

    Further reading:
    http://developer.valvesoftware.com/wiki/Source_Multiplayer_Networking [valvesoftware.com]
    http://developer.valvesoftware.com/w/index.php?title=Lag_Compensation [valvesoftware.com]
    http://my.bzflag.org/w/Lag [bzflag.org]
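A minimal sketch of the lag-compensation idea those Valve pages describe: the server keeps a short history of player positions and rewinds the target to the tick the shooter actually saw before testing the shot. The tick numbers, positions, and hitbox radius below are invented purely for illustration:

```python
# Toy server-side hit detection with lag compensation.
# All positions, ticks, and the hitbox radius are invented for illustration.

HISTORY = {  # target's recorded x-position at each server tick
    100: 0.0,
    101: 1.0,
    102: 2.0,
    103: 3.0,
}
HITBOX_RADIUS = 0.5

def rewound_position(history, shooter_tick):
    """Return the target's position as the shooter saw it at that tick."""
    return history[shooter_tick]

def server_hit_test(aim_x, shooter_tick):
    """Rewind the target to the shooter's tick, then test the shot."""
    target_x = rewound_position(HISTORY, shooter_tick)
    return abs(aim_x - target_x) <= HITBOX_RADIUS

# The shooter aimed at x=1.0 while seeing tick 101; by tick 103 the target
# has moved on to x=3.0, but the rewound test still registers the hit.
print(server_hit_test(1.0, 101))   # hit under lag compensation
print(server_hit_test(1.0, 103))   # miss against the current position
```

This is exactly the source of the "I shot behind cover and still died" artifacts mentioned above: the server honors the shooter's lagged view of the world.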

  • by dazedNconfuzed ( 154242 ) on Tuesday March 24, 2009 @11:46AM (#27312599)

    And yet, many big name companies are able to stream HD quality video over the internet

    Sure - because they buffer the content on your end, and you don't notice the lag between frame sent and frame displayed. Additionally, the content is pre-rendered. Netflix's "Instant" option sure looks instant to me, because when I click "play" I overlook the few seconds of buffer loading while I settle into a comfy chair, and that's not even considering the additional delay of render time.

    It's not a matter of getting HD images to you. It's a matter of getting HD images constructed, delivered, and displayed within about 1/30th of a second of you pressing a button. Big urban bandwidth and lag are fine for delivering HD video, but not for split-second gaming images. There's a big difference between direct CPU-to-GPU-to-display lag and CPU-to-ISP-to-renderfarm-to-ISP-to-CPU-to-display lag, as in orders of magnitude.

  • Re:No thanks (Score:3, Informative)

    by marcansoft ( 727665 ) <(hector) (at) (marcansoft.com)> on Tuesday March 24, 2009 @11:56AM (#27312743) Homepage

    You're a victim of the marketing. There's a difference between "how fast a pixel can flip" and "how long it takes to START flipping after the computer tells it to". 2ms response time means no ghosting. It doesn't mean the LCD processing won't take over 50ms to actually propagate the change to the screen. In fact, very often, these low pixel response times are achieved using driving tricks and heavy preprocessing, which ADD lag by buffering more input frames.

    Long ago, the complaint was ghosting and blurriness in high-motion environments, but that's long since gone. The problem now is LCDs that buffer a bunch of frames in order to perform questionable advanced processing, which adds a ton of lag to the actual picture. Manufacturers don't quote numbers for that, unfortunately.

  • by relguj9 ( 1313593 ) on Tuesday March 24, 2009 @03:56PM (#27316831)

    You seem to be assuming that this service will stream VIDEO to your unit, but with TFA not being too clear on the subject

    Actually, the article is quite clear:

    The secret sauce to making OnLive work is its proprietary, on-the-fly video compression capability. As you're playing the game, the outgoing frame buffers are compressed as a video stream and sent to your local client. Perlman estimates that servers need to be within 1,000 miles of a client, at a maximum, to maintain latencies low enough to ensure playability. User data, such as inputs and commands, will be sent back over the Internet, but those usually consist of fairly small data packets.

    Of course, a broadband connection is required. For standard definition (480p) resolutions, users will need a minimum of 1.5 megabits/sec. A 5 megabits/sec connection will support high definition (720p or 1080i) connections. Initially, the service won't support 1080p or higher resolutions, but that may come later.
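Those bitrates imply very aggressive compression. Comparing the quoted 1.5 Mbit/s for 480p against a raw, uncompressed stream (the frame rate and color depth below are assumptions; the article gives only the bitrate):

```python
# What compression ratio does 1.5 Mbit/s for 480p imply?
# Assumes 30 fps and 24-bit color; the article specifies only the bitrate.
WIDTH, HEIGHT = 854, 480   # widescreen 480p
FPS = 30
BITS_PER_PIXEL = 24
STREAM_MBIT = 1.5          # quoted OnLive bitrate for 480p

raw_mbit = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1_000_000
ratio = raw_mbit / STREAM_MBIT

print(f"raw 480p stream: {raw_mbit:.0f} Mbit/s")
print(f"compression ratio: about {ratio:.0f}:1")
```

Roughly 200:1 is within reach of contemporary video codecs, but achieving it in real time with low latency is exactly where the "secret sauce" would have to be.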

  • by relguj9 ( 1313593 ) on Tuesday March 24, 2009 @03:59PM (#27316899)

    We got some hands-on time with Company of Heroes, and the game certainly seemed to play well on a standard MacBook Pro (running Windows Vista, ironically). We were sitting at the Rearden Steel offices in Palo Alto. According to McGarvey, the server hosting the game was running in Santa Clara, about fifteen miles down the road. Although we only played for a few minutes, there was no visible lag or other latency issues. Of course, fifteen miles isn't 1,000 miles, and the servers didn't have thousands of users trying to run at the same time.

    The article also states that it only requires a 1.5 Mbit/s connection for 480p and 5 Mbit/s for 720p and 1080i. Just really good proprietary video compression software.
