
Why Project Flare Might Just End the Console War

An anonymous reader writes "Project Flare, the new server-side gaming technology from Square Enix, turned heads when it was announced last week. The first tech demos do little more than show the vast number of calculations it can handle, with hundreds of boxes tumbling down in Deus Ex, but the potential is there to do much more than picture-in-picture feeds in MMOs. As a new article points out, what's most interesting is the potential to use the technology for games that need more than one system. OnLive may have used this tech before, but only to play games you can buy on disc in the shops anyway; the future is in games that need the equivalent of dozens of PS4s or Xbox Ones to power them. Ubisoft has already partnered with Square on the project."
  • by faragon ( 789704 ) on Sunday November 10, 2013 @04:24PM (#45385899) Homepage
    Why spend power in datacenters when people can use it at home? Other than vendor lock-in, it's nonsense. Another question is how scalable the thing is, etc.
  • Right... (Score:5, Insightful)

    by Pinhedd ( 1661735 ) on Sunday November 10, 2013 @04:31PM (#45385957)

    OnLive was such a bastion of success wasn't it?

  • by Anonymous Coward on Sunday November 10, 2013 @04:33PM (#45385967)

    I'd imagine there'd be some scalability advantages for specific use-cases (re-use of assets, models, animations and the game world across multiple instances of the game), so MMOs could generally benefit from this approach because many users share the same content at the same time, while it would be close to useless for single-player games where basically every player has different content on-screen and in-game than every other player.

  • Great, Square ... (Score:3, Insightful)

    by Anonymous Coward on Sunday November 10, 2013 @04:39PM (#45385999)

    The company that last released a good game 16 years ago. I can barely contain my excitement.

  • No thanks (Score:5, Insightful)

    by epyT-R ( 613989 ) on Sunday November 10, 2013 @04:39PM (#45386003)

    There's a reason I've cut cable TV from my life. Being remote-controlled and the only game in town, it's become overpriced, ad-laden, and content-thin. If that's where gaming is going, I will have to cut that too. The prospect of overpaying to 'stream' a laggy, ad-filled game experience with overly constrained, lossily compressed AV doesn't sound inviting either. I LIKE the idea of having power under the hood locally, so to speak, just as I want server binaries for games so I can run my own servers, and mod tools to make my own mods/maps. That way the game stays alive as long as there are interested players and doesn't die the moment it stops making money for its creators. To top it off, the current 'cloud' model for a lot of software now charges the 'owner-controlled boxed software' prices of the 90s for what amounts to a rent-a-go arcade level of service. What a rip-off.

    The more computing looks like IBM's wet dream of 'service', the less interesting and more oppressive it gets. No thanks.

  • Obvious: latency (Score:5, Insightful)

    by De Lemming ( 227104 ) on Sunday November 10, 2013 @04:39PM (#45386007) Homepage

    Even with modern broadband, latency is still an issue for these kinds of applications. The article gives some examples of server-side gaming enhancements already in use, like "Forza 5 will even use cloud computing to monitor the way you drive, and alter virtual drivers' AI (artificial intelligence) accordingly." That has no need for low latency. But if you want the environment to react immediately to players' actions, there needs to be low latency, and you can't remove the distance (and the related network infrastructure) between the player and the data center.
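    The point about latency is easy to put numbers on. Here is a back-of-the-envelope Python sketch of the input-to-photon delay a streamed game adds; every figure (encode, decode, display buffering) is an illustrative assumption, not a measurement, but the shape of the result holds:

```python
# Rough input-to-photon latency budget for a streamed game. All numbers
# are illustrative assumptions; real figures vary by codec, network,
# and display hardware.
FRAME_MS = 1000 / 60    # one frame of server-side rendering at 60 fps
ENCODE_MS = 5           # assumed hardware video encode time
DECODE_MS = 5           # assumed client-side decode time
DISPLAY_MS = 1000 / 60  # one frame of display buffering

def streamed_latency(network_rtt_ms: float) -> float:
    """Total delay between pressing a button and seeing the result."""
    return network_rtt_ms + FRAME_MS + ENCODE_MS + DECODE_MS + DISPLAY_MS

for rtt in (10, 30, 60, 100):
    print(f"RTT {rtt:3} ms -> ~{streamed_latency(rtt):.0f} ms input lag")
```

    Even with a zero-millisecond network, the fixed pipeline costs alone exceed two 60 fps frames; any realistic RTT on top of that is what FPS players notice.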

  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Sunday November 10, 2013 @04:51PM (#45386073) Homepage

    Some people kept saying "It's not that bad right now, it'll work eventually!", but Microsoft just (accidentally) tested OnLive's idea for low-latency games by introducing some small input lag into Windows 8.1 [slashdot.org]. Guess what? FPS gamers noticed.

    Other game types that don't need super-low latency will, I'm sure, eventually get there, if only because game companies are still annoyingly DRM-focused and this will make piracy impossible.

  • by WaffleMonster ( 969671 ) on Sunday November 10, 2013 @04:53PM (#45386095)

    It's basically a way to keep the price of consoles at a point where people will still buy them, while being able to offer the level of processing power that would be too expensive.

    Current consoles are well beyond the point of diminishing returns with regard to graphics power, while the cost of replicating existing capabilities keeps falling year after year.

    I don't know if it's vendor lock-in, but at least this is a way to offer paying customers a better experience than pirates get, instead of the other way around, as with current DRM.

    On what planet does high latency translate into a better experience?

  • by WaffleMonster ( 969671 ) on Sunday November 10, 2013 @05:20PM (#45386327)

    > Its very simple - power savings, and cheaper thin consoles for end users.

    What power savings? The power is just consumed somewhere else, and as a customer YOU are paying for that too. Let's not forget the additional power required to push an insane number of real-time bits over the Internet for trivial reasons.

    > This would not work too well for multiplayer because of the latency between user-server-user, but would be great for single player.

    Since everyone would experience input latency, and there is no extra network latency on the multiplayer link itself, latency would be the same persistent problem whether it were a single-player or a multiplayer game. The only lag, assuming no operator incompetence, would be in the form of input delay, with very limited opportunities to compensate via prediction algorithms. Nobody who plays on anything approaching a competitive basis would touch this thing.
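    The "prediction algorithms" mentioned here are tricks like dead reckoning, which conventional netcode uses to hide network delay: the client extrapolates an entity's position from its last authoritative state. A minimal sketch (hypothetical names and values) of the technique that becomes unavailable once the server streams finished video frames, since no client-side game state remains to extrapolate:

```python
# Dead reckoning: smooth over delayed network updates by extrapolating
# a remote entity's position from its last known state. With server-side
# rendering there is no client-side state left, so this trick is gone.
from dataclasses import dataclass

@dataclass
class EntityState:
    x: float          # last known position (units)
    vx: float         # last known velocity (units/s)
    timestamp: float  # when the server sent this state (s)

def predict_position(state: EntityState, now: float) -> float:
    """Linear extrapolation from the last authoritative update."""
    return state.x + state.vx * (now - state.timestamp)

s = EntityState(x=100.0, vx=5.0, timestamp=1.0)
print(predict_position(s, now=1.2))  # extrapolates 0.2 s ahead: ~101.0
```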

  • by aaaaaaargh! ( 1150173 ) on Sunday November 10, 2013 @06:36PM (#45386765)

    > It's basically a way to keep the price of consoles at a point where people will still lease them, while being able to offer the level of processing power that would be too expensive.

    Fixed that for you. This has nothing to do with buying anything; it's a temporary lease, and the servers will be shut off sooner rather than later.

  • by clockwise_music ( 594832 ) on Sunday November 10, 2013 @08:01PM (#45387271) Homepage Journal

    > Works great till you realize the USA is currently worse than a third-world country in terms of broadband penetration and up/down speeds...

    Sorry, have you been to any third-world countries? In many you're lucky if you can get dial-up speeds, let alone a constant connection.

  • by wwalker ( 159341 ) on Sunday November 10, 2013 @10:25PM (#45387961) Journal

    Sniper rifles should be the *least* latency-sensitive weapon. In real life, no sniper can hit a running target at any reasonable distance (unless the target is running directly towards or away from them). Even more so if the target is passing a window and is only visible for a fraction of a second, which makes any sort of leading practically impossible.

  • by Anonymous Coward on Monday November 11, 2013 @02:56AM (#45389037)

    >How about 100% cheat prevention? When all the computing is done centrally, how could you possibly cheat in the game anymore?
    Most games already have all, or nearly all, of the processing happening on the server. That doesn't stop cheaters. Sure, they can no longer simply memory-edit the amount of gold/points/health/whatever they have, but there are countless other ways to cheat. Think about botters: botting doesn't rely on client-side processing. Aimbots in FPS games don't need to rely on client-side processing (from the game, anyway) either; they detect enemies on screen and simulate mouse movement to auto-aim.

    >Plus, it totally eliminated the lag factor in FPS, as only the central server do the processing and rendering. Rubberbanding and blinking/shifting enemies will be eliminated.
    No it doesn't. In fact, it's going to make it worse. It might *look* different, but the actual effect will be more detrimental to your gameplay. If you have lag issues with a very, very small amount of information being transferred (i.e. position updates), then what the fuck makes you think upping it to uncompressed 1080p video streaming is going to improve things? Instead, it'll be like trying to watch a YouTube video that is constantly buffering. Even if, and I stress "if," it were to stream halfway decently for you, you would still feel the effects of the lag. Everything will feel sluggish, and the controls will always seem to trigger actions that happen much later than when you pressed the button.

    >With only 1 copy of the world, then the number of players will only be limited by the number of CPU doing rendering from the POV of each player, and that is probably easier to scale as the rendering process is read-only. So you can have MASSIVE number of players in the same game, imagine hundreds of player all in the same battlefield, and that limit can be increased by a server upgrade instead of waiting 5 years for another console generation.
    This can already be done, and it does not require that the video output be rendered on the server, which would require even more massive servers.
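    The bandwidth gap between the two approaches compared in this thread is easy to quantify. A rough Python comparison of a conventional state-update stream against uncompressed 1080p60 video (the packet size and tick rate are illustrative assumptions; real video is compressed, but the ratio stays enormous):

```python
# Bandwidth of conventional netcode vs. streamed video (rough sketch).
# Conventional multiplayer: small state packets at a fixed tick rate.
PACKET_BYTES = 100   # assumed size of one position/state update
TICK_RATE = 30       # assumed updates per second
netcode_bps = PACKET_BYTES * TICK_RATE * 8        # bits per second

# Uncompressed 1080p video at 60 fps, 24 bits per pixel.
video_bps = 1920 * 1080 * 24 * 60

print(f"netcode:   {netcode_bps / 1e3:.0f} kbit/s")
print(f"raw video: {video_bps / 1e6:.0f} Mbit/s")
print(f"ratio:     {video_bps // netcode_bps:,}x")
```

    Under these assumptions the raw video stream carries roughly five orders of magnitude more data than the state updates, which is the commenter's point: a connection that already struggles with the small stream will not cope with the large one.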
