OnLive Gaming Service Gets Lukewarm Approval

Vigile writes "When the OnLive cloud-based gaming service was first announced back in March 2009, it was met with equal parts excitement and controversy. The idea of playing games on just about any hardware, thanks to remote rendering and streaming video, was intriguing, but the larger question remained: how did OnLive plan to solve the latency problem? With the closed beta now underway, PC Perspective put the OnLive service to the test, comparing the user experience of OnLive-based games against the same titles installed locally. The upshot: games that are less dependent on instant input, like Burnout: Paradise, worked pretty well, while games that require fast, twitch-based input, like UT3, did not."
  • by l_bratch ( 865693 ) <luke@bratch.co.uk> on Friday January 22, 2010 @06:30AM (#30857784) Homepage

    The menu video seems to be available, but the in game videos now give:

    "This video is no longer available due to a copyright claim by OnLive, Inc..."

  • But for serious gamers it is common knowledge that remote play will never be as quick as a LAN frag fest.

    Possibly true, but possibly also might not matter, if it's still quick enough. After all, playing on the internet isn't as quick as a "LAN frag fest", and yet the vast majority of gamers, even of twitch-heavy games, are playing on the internet, not on LANs.

  • by slim ( 1652 ) <john.hartnup@net> on Friday January 22, 2010 @06:46AM (#30857850) Homepage

    Even when it's a full product, you won't be allowed to sign up if you're not in a geographically suitable place.

    It seems the eventual plan is to dynamically assign your session to the closest datacentre. But for the time being, each beta tester's ID is assigned a datacentre at registration time, and that's the one that ID will use every time.

    TFA explains that he borrowed the login credentials from a beta tester in another part of the country. Hence he wasn't using a nearby server, as he would have been had he been a real beta tester or, in future, a paying customer.

    It's pretty amazing it worked as well as it did, considering all that.

  • by slim ( 1652 ) <john.hartnup@net> on Friday January 22, 2010 @06:57AM (#30857902) Homepage

    And yet this review - from a sceptic - says it pretty much works. While it's in beta. From a location that would have been excluded from the beta if he'd gone through proper channels.

  • by jbb999 ( 758019 ) on Friday January 22, 2010 @07:41AM (#30858160)
    The major problem isn't overall latency, it's little spikes of latency on an otherwise good line. A moment of 100ms lag on an otherwise good connection doesn't matter for online games because of client-side prediction; at worst there's a tiny moment where the controls don't seem responsive. It's not a problem for normal video either, because players can buffer 250ms, 500ms, or 1000ms of video without any trouble. But here they can't do any significant buffering, or the latency will be too much to play. And even 100ms of sudden latency will cause the picture to lag, freeze, or jump. It might only happen occasionally, but I suspect people won't put up with it.

    And they can't do anything about it either. Even if your ISP's lines and routers are only 10% loaded on average, there will be moments when all of that traffic sends packets at the same instant and they get queued in a router somewhere, just for a tiny time. Tiny amounts of jitter like this are normal and expected, and to be honest I think they will be the downfall of this project, because there is no real way to deal with them. But I guess we'll see :)
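
    To make that trade-off concrete, here is a minimal sketch of how a single 100ms spike plays out at 60fps with and without a client-side playout buffer. All figures (base latency, spike size, spike rate) are illustrative assumptions, not OnLive's actual numbers:

    ```python
    # Minimal sketch: occasional latency spikes vs. an unbuffered 60 fps
    # stream, compared with buffered video playback. All figures are
    # illustrative assumptions, not measured OnLive values.
    import random

    BASE_LATENCY_MS = 30       # assumed steady network latency
    SPIKE_MS = 100             # size of an occasional jitter spike
    SPIKE_PROBABILITY = 0.001  # roughly one spike per 1000 frames

    def late_frames(buffer_ms, frames=60_000, seed=1):
        """Count frames that miss their playout deadline for a given buffer."""
        rng = random.Random(seed)
        late = 0
        for _ in range(frames):
            latency = BASE_LATENCY_MS
            if rng.random() < SPIKE_PROBABILITY:
                latency += SPIKE_MS
            # A frame stutters if its latency exceeds the steady latency
            # plus whatever playout buffer the client keeps.
            if latency > BASE_LATENCY_MS + buffer_ms:
                late += 1
        return late

    for buf_ms in (0, 50, 250):
        print(f"{buf_ms:3d} ms buffer: {late_frames(buf_ms)} stutters "
              f"in 60,000 frames (~17 min at 60 fps)")
    ```

    With no buffer, or one smaller than the spike, every spike is a visible stutter; a 250ms buffer absorbs them all, but a game can't afford the 250ms of input lag that buffer implies, which is exactly the comment's point.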
  • 1 MB/sec... (Score:4, Interesting)

    by V50 ( 248015 ) on Friday January 22, 2010 @07:54AM (#30858218) Journal

    There are still large areas of North America stuck with either stone-age dial-up (in 20-freakin'-10) or slow, expensive satellite. Like mine (I cry myself to sleep over my 1200ms latency). This is absolutely a no-go there, obviously.

    Now, in better-connected places, I'm sort of out of the loop. Whenever I've spent time in cities, either visiting my brother in Ottawa or living in London (Ontario, not the good one) for a few months at a time, it's been my experience that even connections that are supposed to reach 1MB/sec would be lucky to get that in practice, especially at peak times. Furthermore, the sheer number of lag spikes, connection hiccups, and general times when the internet craps out for no apparent reason makes it seem like you'd be dealing with one frustration after another. The number of times I see people get disconnected in World of Warcraft backs up my theory: if maintaining a constant connection at 5KB/s or so (for WoW) is difficult enough, doing the same for a (whopping?) 1MB/s while keeping latency under 100ms would be hellish.

    So is my experience with the Internet indicative of the general population's, or have I just had the misfortune of terrible service? Can people really keep 1MB/s sustained, without lag hiccups, disconnects, lost packets, etc., while staying under 100ms?
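
    For scale, a rough back-of-envelope comparison using the figures in this comment (the rates are the commenter's estimates, not measured values):

    ```python
    # Back-of-envelope throughput comparison using the comment's figures.
    wow_rate_kb_s = 5        # rough WoW bandwidth cited above
    onlive_rate_kb_s = 1024  # the quoted 1 MB/s stream

    ratio = onlive_rate_kb_s / wow_rate_kb_s
    print(f"OnLive needs ~{ratio:.0f}x WoW's sustained throughput")  # ~205x
    # And unlike a download, every byte is latency-sensitive:
    # nothing can be fetched ahead of time and buffered.
    ```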

  • by mykos ( 1627575 ) on Friday January 22, 2010 @08:30AM (#30858352)

    This is an obvious pump and dump scheme, unless they have somehow unlocked technology previously unseen and unknown by mankind, and have done so for the purpose of playing video games.

  • Re:Duuuuuh (Score:5, Interesting)

    by Svartalf ( 2997 ) on Friday January 22, 2010 @08:33AM (#30858368) Homepage

    Actually... it's technically doable, but only with a very, very small number of subscribers.

    Latency and bandwidth will kill the whole thing.

    You have to figure peak values per customer for it to work even remotely the way they portrayed it.

    Given this:

    1.5Mbit/s per user for the SD OnLive feed.

    You can serve, at an absolute maximum, roughly:

    30 users on a T3.
    103 users on an OC-3.
    414 users on an OC-12.
    1658 users on an OC-48.

    You can expect about $250-500k/mo in recurring costs on that OC-48. As another observation, you will likely need to serve only 2/3 to 3/4 of those numbers to keep latency usable, because as you fill the pipe to capacity, traffic becomes subject to the congestion algorithms in the routers and machines at both ends of the pipe. Now, some will say they'll place the hardware at the ISP's end of things... then the ISP gets the joy of this same level of connectivity, and ISPs are bitching about "freeloaders" and "bandwidth problems" right now.
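
    As a sanity check on those figures, here's a quick sketch using the standard line rates for those circuits (the 2/3 headroom factor is the comment's rule of thumb, not a measured threshold):

    ```python
    # Reproduce the capacity math above: users per pipe at 1.5 Mbit/s each.
    # Line rates are the standard T-carrier/SONET figures.
    STREAM_MBIT_S = 1.5  # SD OnLive feed per user, per the comment

    LINKS = {
        "T3":    44.736,
        "OC-3":  155.52,
        "OC-12": 622.08,
        "OC-48": 2488.32,
    }

    for name, line_rate_mbit_s in LINKS.items():
        absolute_max = int(line_rate_mbit_s / STREAM_MBIT_S)
        with_headroom = int(absolute_max * 2 / 3)  # leave room for congestion
        print(f"{name:>5}: {absolute_max:5d} users max, ~{with_headroom} with headroom")
    # Yields 29, 103, 414, and 1658 users respectively (the parent's
    # T3 figure of 30 rounds up instead of down).
    ```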

    OnLive is snake oil trying to be sold to the game industry as a solution to their "control" problem. It's an alternate DRM play. And it can NEVER work in our lifetime. You can't field enough bandwidth cheaply enough to accomplish it.
