
Should Next-Gen Game Consoles Be Upgradeable? 348

Posted by Soulskill
from the downloadable-hardware-unlockers dept.
MojoKid writes "Historically, console add-ons that boosted the performance of the primary unit haven't done well. Any attempt to upgrade a system's core performance risks bifurcating the user base and increases the work developers must do to ensure that a game runs smoothly on both original and upgraded systems. The other reason is that a number of games rely on very specific hardware characteristics to ensure proper operation. In a PC, swapping a CPU with 256K of L2 for a chip with 512K of L2 is a non-issue, assuming proper platform support; existing software will automatically take advantage of the additional cache. The Xbox 360, on the other hand, allows programmers to lock specific cache blocks and use them for storing data from particular threads. In that case, expanding the amount of L2 cache risks breaking previous games because it changes the range of available cache addresses. The other side of the upgrade argument is that the Xbox 360 has been upgraded more effectively than any previous console; current high-end versions ship with more than 10x the storage of the original, as well as support for HDMI and integrated WiFi. Upgradeability would also forestall the decline in comparative image quality between console and PC platforms."
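The cache-locking hazard the summary describes can be sketched in a few lines. This is a hypothetical illustration, not the actual Xbox 360 API: a game that derives locked-block indices from a hardcoded L2 size will compute a completely different index range if the cache grows.

```python
# Hypothetical sketch: why baking in cache geometry breaks on "upgraded" hardware.
# A game locks the top region of L2 for streaming data, computing line
# indices from an assumed total cache size (as fixed consoles permit).

LINE_SIZE = 128  # bytes per cache line (assumed)

def locked_block_range(l2_bytes, locked_bytes):
    """Return (first, last) line indices of the locked region,
    assuming the locked region sits at the top of the cache."""
    total_lines = l2_bytes // LINE_SIZE
    locked_lines = locked_bytes // LINE_SIZE
    return (total_lines - locked_lines, total_lines - 1)

# Original console: 1 MiB L2, game locks the top 256 KiB.
assert locked_block_range(1 << 20, 256 << 10) == (6144, 8191)

# "Upgraded" console with 2 MiB L2: the same locked-byte count now maps
# to a different index range, so any code that baked in the original
# indices (6144..8191) is addressing the wrong lines.
assert locked_block_range(2 << 20, 256 << 10) == (14336, 16383)
```

On a PC this never comes up because software addresses memory, not cache lines; it is the console practice of programming to exact cache geometry that makes the upgrade unsafe.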
This discussion has been archived. No new comments can be posted.

  • First Post (Score:3, Informative)

    by Anonymous Coward on Tuesday February 07, 2012 @08:33PM (#38961625)

    No

    • Re:First Post (Score:5, Interesting)

      by Anonymous Coward on Tuesday February 07, 2012 @09:11PM (#38961923)

      No

      Correct.

      Platforms become popular for one of two reasons.

      1. A known base system, so developers know what to build for. The Kinect is an outlier: it was advertised as the "next gen" of the Xbox and was interesting enough that people wanted to play with it. It wasn't a memory increase (like the N64's); it was more like the Rumble Pak, which came packaged with a product that required it.

      Apple did well with the requirement of having 1 mouse button as the standard. It forced developers to make simpler interfaces, which made Macs easier to use.

      2. Cheap, replaceable, interchangeable parts. The PC falls into this category, but companies that make consoles and consumer gadgets do not want people poking around inside them. To top it off, all the major console manufacturers have systematically acted against altering their systems.

      • Re:First Post (Score:5, Insightful)

        by Gideon Wells (1412675) on Tuesday February 07, 2012 @09:54PM (#38962345)

        You basically said what I was going to say.

        Essentially, allowing them to be "upgradeable" removes the last barrier, effectively making them computers with odd user-interface devices. So I must say to anyone who wants upgradeable consoles: it's okay. You don't have to be in the closet. PC gaming isn't so evil that you need to hide it under a hipster-like charade. We understand.

        • Re:First Post (Score:5, Insightful)

          by Adriax (746043) on Wednesday February 08, 2012 @12:39AM (#38963361)

          It wouldn't be that nice. Only approved upgrade kits would work, every 6 months a new $100 kit would come out, and developers would be forced into an SDK that automatically keeps any game's minimum requirements lock-step with the console upgrade schedule. The upgrades would be nothing more than unlock codes for clockspeed and features already built into the machine.

          Apple would sue them for ripping off their business model.

    • Re:First Post (Score:5, Insightful)

      by Joce640k (829181) on Wednesday February 08, 2012 @06:17AM (#38964805) Homepage

      No

      Mod parent up.

      The great thing about console programming is that you know every last detail of the target machine. You know what works, what doesn't. You can budget everything right down to the last clock cycle and squeeze out 100% performance from the chips.

      If you take that away then it's game over as far as optimization is concerned.

      • by Rotag_FU (2039670)

        I agree that the ability to squeeze out every drop of performance possible by having a fixed hardware configuration is the strongest reason why consoles should not be upgradeable (except on the peripheral and I/O side like HDMI outputs, hard-drive storage, etc.). However, a close second factor is the fact that having a non-upgradeable machine can also dramatically reduce development and per device costs. Flexibility is great, but costs money. This is why I can have a PC that costs 3-10X the cost of a gam

  • by Master Moose (1243274) on Tuesday February 07, 2012 @08:34PM (#38961629) Homepage

    It is too much of a change from the current gen being downgradable.

    • by an unsound mind (1419599) on Tuesday February 07, 2012 @08:53PM (#38961805)

      Oh give it up. So Sony disabled your hardware's capabilities. So what? At least they didn't totally disable the hardware, which they could. You should be grateful for that. You'd have a right to complain if Sony goons came to your house and cut your hands off, at least if you aren't a pirate. If you're a pirate or complain online about Sony, it's totally justified to cut your hands off, because you are hurting Sony and costing billions of Americans their jobs.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      You're referring to the PS3, the most standards-using (basic USB, standard hard drives, etc) popular console in history?

      • Re:Doubt Sony will (Score:5, Insightful)

        by Darkness404 (1287218) on Tuesday February 07, 2012 @09:43PM (#38962257)
        Yep, the only console that I know of that removes features in firmware updates. It doesn't matter that the hardware was standard; Sony believes it's their hardware to do with as they wish, regardless of what you want.
      • Re:Doubt Sony will (Score:5, Insightful)

        by jensen404 (717086) on Tuesday February 07, 2012 @10:09PM (#38962465)
        And my favorite, a built in power supply that uses the same power cord as most desktops (original PS3) or notebooks (slim PS3). Despite being larger than the Wii, it is a lot more portable.
    • I lol'd, but in a sad way.

  • No. (Score:4, Insightful)

    by americamatrix (658742) on Tuesday February 07, 2012 @08:35PM (#38961635) Homepage
    Isn't the point of them to be simple? n00bs use them. ;)

    Step up to PC gaming if you want to be able to upgrade your stuff.


    -americamatrix
    • More to the point: once they're upgradeable, what's the fundamental difference between PCs and consoles?

      I submit that the next gen consoles can't be more than trivially upgradable, because they wouldn't be consoles any more.

      • Re:No. (Score:4, Insightful)

        by Kelbear (870538) on Wednesday February 08, 2012 @04:35AM (#38964423)

        The difference between PC and console gaming has always been "control".

        Hardware doesn't matter, software doesn't matter; it's about who chooses what goes where. On PCs, users have full control to install or tweak hardware and software (even changing the games themselves through mods). On consoles, it's up to the manufacturer. Giving users access to hardware upgrades would erode the difference between PC and console, but it wouldn't eliminate it.

        There are obvious advantages to both approaches. I'd like for consoles and PCs to stay separate so that I can continue to enjoy the advantages of each.

    • What's wrong with having both platforms? I game on my PC when I want a personalized environment that offers a richer experience. But playing alone in meatspace is no fun at times. Some friends, a case of beer, peanuts, large TV and a comfy couch all demand a console gaming experience. Just turn on the unit, pass out a few controllers and let the good times roll.

      Sheesh. It's not that difficult to choose when and where to use the appropriate gaming hardware, is it? Why are we even having this stupid debate? I

  • by samriel (1456543) on Tuesday February 07, 2012 @08:35PM (#38961645)
    The entire point of game consoles is that developers at least have a chance at a homogeneous platform, where they can make sure the game mostly runs the same everywhere. If you allow upgrading the CPU, GPU, etc., then it's just PC gaming with a weird OS and components that will most likely cost more just because they can.
    • Personally? Consoles for games, open(ish) HW for work - PC or Mac, I don't care.
      • by tepples (727027) <{tepples} {at} {gmail.com}> on Tuesday February 07, 2012 @09:14PM (#38961957) Homepage Journal

        Consoles for games, open(ish) HW for work

        Then what for indie games? Xbox Live Indie Games and nothing else?

        • Honestly, there really aren't that many decent indie games out there. For "indie" games you'd best just stick to Flash games and Minecraft, because that's really all the Xbox Indie games are: either clones of a different game or Flash games with 3D graphics that cost money.
          • that's really all the Xbox Indie games are, either clones of a different game

            One could say the same for mainstream games. What are all the military FPS games other than clones of each other? Even Katamari Damacy is just the old Williams arcade game Bubbles redone as a 3D platformer. The last genre launch I know of was around 1997 when Parappa the Rapper was released.

            or flash games with 3D graphics that cost money

            One could say the same for a lot of Wii disc games in the $20 bin at Walmart.

            So how should one join the industry if one's family is unwilling to move to Austin or Seattle?

    • by foradoxium (2446368) on Tuesday February 07, 2012 @08:59PM (#38961847)

      +1. I was going to make a similar reply. The whole point is so developers can make their game run on 4-year-old hardware, optimized of course. This is why so many console games don't look as nice as their PC counterparts... but they do play on 4-year-old hardware.

      The other nice benefit of consoles is multiplayer: everyone is on equal hardware. Whereas in the PC world, someone playing on 4-year-old hardware might not be able to perform as well as someone with the latest and greatest system (think fps)... that is one benefit of consoles.

      • Re: (Score:3, Informative)

        by Andor666 (659649)

        Even more than 4 years... the Xbox 360 is almost 7 years old, as it was unveiled in May 2005.

    • by Darinbob (1142669)

      The summary essentially says "devs don't want to figure out the size and configuration of caches at run time," something that's commonly done even on some otherwise immutable hardware. After all, there's always the next generation of hardware, and it's nice not to have to redo the software if you don't have to. In other words, devs shouldn't be lazier than necessary.

      I think this attitude really came from early days where the PC was just an awful mess with applications and games completely bypa

      • I think this attitude really came from early days where the PC was just an awful mess with applications and games completely bypassing the nearly nonexistent operating system on a whim (ignoring decades of experience in other platforms).

        Um, it wasn't "on a whim", at least on IBM PCs. On early IBM PCs, ROM BIOS was the closest thing to a HAL; but ROM BIOS was also dog slow (stemming from the fact that RAM was many times faster than ROM, meaning a program that made a lot of ROM calls would tend to be dog slo

    • If you allow upgrading CPU, GPU, etc. then it's just PC gaming

      That and unlike with PCs, there's a culture of plugging consoles into bigger monitors so that people on a sofa can play together in person. Not all games are competitive FPS or RTS where splitting the screen destroys the multiplayer experience. Fighting games, for instance, don't even need a split screen.

      • by timeOday (582209) on Tuesday February 07, 2012 @09:46PM (#38962293)
        I've played tons of splitscreen Halo with my son and it's lots of fun. Even if you're playing against each other, it's a level playing field. If you're playing as a team against others online it's a bit of an advantage because you have two vantage points.

        I've been disappointed with perhaps decreasing support for split-screen in console games. To me it's where consoles really shine above PC games. I haven't upgraded from Forza 3 to Forza 4 because they didn't make much improvement to the splitscreen mode (co-op online play, more than 2 AI cars, etc).

        • Would you buy PC games if they let you plug in multiple USB gamepads and actually use them?
  • Consoles vs. PCs (Score:4, Insightful)

    by omganton (2554342) on Tuesday February 07, 2012 @08:36PM (#38961653)
    I believe that if you want an upgradable gaming/HT platform, then you should build a PC. Consoles are specifically manufactured to run on a set hardware specification. Adding and/or changing the predefined hardware of a console will only add to the development cost of games, which will eventually be passed on to the consumer in the form of even more expensive games. Although the concept seems cool, I don't want next-gen xbox games to cost $100 each.
    • Re:Consoles vs. PCs (Score:4, Interesting)

      by GabriellaKat (748072) on Tuesday February 07, 2012 @11:28PM (#38962957)
      They just might come close to it. I am betting they will cost about $75 new, and this is a huge reason they are also trying to kill the used market. You won't have a choice but to buy new, at the price they set. And then the cracking and hardware modders will really slam it to the console makers, the pirating will boom, and people will go offline or a darknet similar to LIVE/PSN will emerge. But this comment will never be read and modded up...
  • It doesn't matter (Score:5, Interesting)

    by Jiro (131519) on Tuesday February 07, 2012 @08:36PM (#38961655)

    Suppose all those problems were resolved, and after resolving them we concluded "yes, next gen consoles should be upgradeable".

    It wouldn't make any difference. Consoles are proprietary platforms--controlled by one company. The fact that making the console upgradeable would benefit *you* isn't going to result in an upgradeable console. It wouldn't benefit the company, and that's what matters. I mean, I'm sure that PS3 Linux benefitted people.

    (Incidentally, for an example of a successful add-on, look at the PC Engine CD. We just don't remember it much because the system barely got a foothold in the US.)

    • by ShakaUVM (157947)

      Well, imagine if you had the option of buying an Xbox 361 right now, that would function identically to the Xbox 360, but would get better frame rates. Would people want to buy it? Sure. I hate it when cutscenes suddenly drop down to 15fps, or when a game suddenly lags under the weight of all the action on the screen. I'd pay a hundred bucks or so to upgrade the 360 at this point.

      You'd just have to enforce a decree that all games are playable on the low end systems, and try not to have too many different up

      • I would buy a 361 with advanced post-processing in a heartbeat. Forget even changing anything about the code at all; just apply better post-processing with more AA, more aniso, faster frames, etc.
      • by MMMDI (815272)
        At that point, every game would be developed for the 361, and it would "technically" still run on the 360. Horrible framerates, unplayable multiplayer, the whole nine yards... but it technically works!

        This is why I switched from PC to console gaming years ago. If I'm playing against you on a console, it comes down to skill: we both have the same console, so it's a level playing field. On a PC, we could be equally matched in whatever game... but since you spent that extra $100 on your video card, you'r
    • by bonch (38532) * on Tuesday February 07, 2012 @08:52PM (#38961781)

      Not to cut in on the OMG-PROPRIETARY-PLATFORMS rant, but benefiting the company is kind of the point of running a business. And the console business is doing extremely well, much better than the PC gaming market, so mainstream customers are clearly okay with it. The fact that people on Slashdot still rant about PS3 Linux, as if any significant share of the PS3 user base even bothered with it, is illustration enough of how out-of-touch many of the posters are.

      • I thank you.
      • If I sold you a bicycle, but then told you how you were allowed to ride it, how fast would you tell me to go fly a kite? What if I then told you I wasn't kidding, and came back later and removed the back wheel and welded a block of concrete in its place?

        Tough, the EULA is in the pannier in the back, and you agreed to it when you got on. Nah nahnah nahnah nah!

      • by cheekyjohnson (1873388) on Tuesday February 07, 2012 @09:16PM (#38961973)

        but benefiting the company is kind of the point of running a business

        Forget society. I can do whatever I please to make money.

        The fact that people on Slashdot still rant about PS3 Linux as if any significant share

        Of course. If something bad happens to a few people, it doesn't matter. It's only a few people, right? Something "bad" suddenly changes into something "neutral" or "good" because it only happened to a few people!

    • (Incidentally, for an example of a successful add-on, look at the PC Engine CD. We just don't remember it much because the system barely got a foothold in the US.)

      The N64 memory upgrade would be an actual example of a successful console upgrade. Plenty of people bought that and it was well supported.

  • by Daetrin (576516) on Tuesday February 07, 2012 @08:37PM (#38961663)
    Past consoles that had upgrades didn't do too well, in particular those changing aspects that the programmers depend on (the amount of memory being the given example). The "counter-example" is that adding entirely new optional features or additional file storage, which programmers can choose to use or not, and which changes _nothing_ about the regular architecture if they choose not to, doesn't seem to cause any problems. (Which says nothing about how well new games using the optional features sell, just that it doesn't break old games.)

    Using that "counter-example" to argue that perhaps they should allow upgrades to the components the programmers depend on is just weird. Certainly you'd have to include a disclaimer in the docs right from the start about which components might be upgraded in the future. Even so, a large number of programmers would either not notice the disclaimers and fail to account for the possibility in their programming, or decide that dealing with it would be too difficult and thus fail to account for the possibility in their programming.
  • No they shouldn't. The whole reason you get a console is because it just works. Don't have to worry if your video card is good enough, if you have enough ram, if you have the right drivers installed.. etc. You just plug it in, hook it up to a tv, and put your game in. If you want upgradable consoles, then just use your pc and buy a controller.
    • True, this is how I feel. But there are a lot of PC users out there who get second-rate ports; hence my personal love of homogeneous consoles. That's why Sony were right to charge for 'future proof'.
      • We get some second rate ports sure, but the first rate stuff we get FAR outweighs the bad. Even bad ports still look better on PCs.
    • If you want upgradable consoles, then just use your pc and buy a controller.

      And pray that your game even supports controllers. Too many PC games support only a mouse and keyboard, not a HID or Xbox 360 gamepad. And even if they do let you use a gamepad without JoyToKey, they make you use a separate computer and a separate copy of the game (cha-ching [cracked.com]) for players 2, 3, and 4.

  • Good lord no. (Score:5, Insightful)

    by RyuuzakiTetsuya (195424) <taiki@cUMLAUTox.net minus punct> on Tuesday February 07, 2012 @08:38PM (#38961673)

    Unless you mandate that older hardware works just as well as newer hardware, no.

    People will rush to point out things like Kinect, or PSMove, or WiiMotion Plus... Those are accessories. Often cheap too, relatively speaking. The CPU is still the same, the RAM is still the same, game compatibility is still the same(more or less; there are bizarre examples across the board). Having upgradable mass storage or expandable accessories doesn't break the underlying assumptions.

    I think that consoles should be "good enough." Big deal, Battlefield 3 looks amazing on PC. Surprise, it also looks amazing on Xbox and PS3. Increased levels of detail do improve immersion a LOT. But when there's a huge trade off between bleeding edge graphics and stability and compatibility, I'll lean towards stability and compatibility.

  • by timeOday (582209) on Tuesday February 07, 2012 @08:47PM (#38961745)
    Now would be the dumbest time to start making consoles upgradeable. The long lifespan of current-gen consoles shows that the hardware is no longer improving very rapidly in any way that people are willing to pay for. The low cost yet low sales of desktop PCs confirm the same fact. The next-gen consoles ought to be designed to run a generous poly count at 1080p resolution at 120 Hz (i.e. 60 Hz per eye in 3D). Do that, and people will be happy for quite some time.
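That 1080p/120 Hz target is easy to sanity-check with back-of-envelope arithmetic. The overdraw factor below is an illustrative assumption, not a measured figure:

```python
# Rough fill-rate estimate for a 1080p, 120 Hz console target.
# The overdraw factor is an assumed illustrative number, not a spec.

width, height = 1920, 1080   # 1080p
refresh = 120                # Hz (60 Hz per eye in stereo 3D)
overdraw = 4                 # assumed average shaded fragments per pixel

pixels_per_frame = width * height
fragments_per_second = pixels_per_frame * refresh * overdraw

# Roughly a billion shaded fragments per second before any geometry work.
print(f"{fragments_per_second / 1e9:.2f} billion fragments/s")
```

A requirement on the order of a billion shaded fragments per second was well within reach of 2012-era GPUs, which is consistent with the point that the hardware target had stopped moving quickly.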
  • They are called gaming PCs. It's a niche market, and there are reasons for that. The Xbox 360 adding internal WiFi is one thing; changing anything related to processing power is completely different.
  • by Groo Wanderer (180806) <charlie@NoSPaM.semiaccurate.com> on Tuesday February 07, 2012 @08:49PM (#38961759) Homepage

    Upgradable in anything more than a trivial way (HD or optical, for example; basically an add-in card) blows the console economics out of the water. Socketed RAM adds cost and drops speeds vs. soldered. Same with the CPU, and then we get to cooling issues... Given MS's ability to keep the bumps on the 360 from shattering, would you want people to start mucking with that?

    Part 2 is pointed out well above, console = fixed platform = cheap software testing. Upgrades = not fixed platform = testing nightmare.

    While I haven't read the article (yeah, shame on me), I know more than enough about console development, economics and programming. I also talk to people doing the 'next gen' consoles almost every week. Having written for a console, I can tell you directly that 'upgrades' are, and will always be a non-starter. Anyone who posits it seriously is the walking equivalent of a flashing neon 'N00B' sign, complete with arrows. :)

                                -Charlie

  • ...at least I learned about pork brains in milk gravy! Almost made it worth reading.

  • by Khith (608295)

    Yes, I think that upgradeable consoles would be great! We could even come up with a special name for them... Since you could customize them as you choose, that makes it more personal to the owner. They also perform various computations that allow you to play games. Let's call them personal computers! Perhaps we could even shorten that to "PC" if people prefer.

    Just imagine one of these PCs and all of the parts you could put in it. You could even attach different types of input devices! I can even see them us

    • Personal computer? Too personal. If you have more than one gamer in the household, you want something the whole family can use. That's why a Family Computer [wikipedia.org] came with two controllers, and just about every console since then has supported two to four controllers and a TV output so that they could be used with a monitor big enough for the whole family.
  • by Karmashock (2415832) on Tuesday February 07, 2012 @09:10PM (#38961909)

    The Next Gen console is an inexpensive PC capable of playing the newest games with reasonable quality.

    The whole console paradigm is based on two qualities.

    1. Price. Consoles cut corners and lack certain qualities that PCs have, and as a result offer great game performance at a reduced price. This is entirely possible with PC hardware today. If MS builds their own PC from the ground up to be a gaming machine, there's no reason it can't support Windows and have superior game performance.

    2. Ease of use. PCs have been hobbled for years by being too complicated for their own good when it comes to games. More sophisticated gamers have no problem with this, but it can be an issue for many. Take a cue from Apple and lock down these console-replacement PCs by default so casual users don't mess them up. For one thing, restrict multitasking by default, as it harms game performance. If people want lots of background processes running while they play their game, give them a setting that disables the restriction. But by default, just as with typical consoles, have the machine devote all its attention to the game when it's running; everything else is suppressed. Also, since MS would be releasing these machines, there would be no driver confusion, as all the systems would come with exactly the same hardware installed.

    This would also break down the barrier between Xbox users and PC users. This barrier is not in MS's interests. If the Xbox and the PC play the exact same games then no other console is going to be able to compete with them. Exclusive titles just for the xbox that don't get released on the PC don't help the xbox... they hurt the PC.

    As an additional aside, the consoles, and MS especially, need to get serious about producing a REAL media center. Something like XBMC, only better. XBMC is pretty impressive for an open source community-built project, but MS, Sony, Nintendo, or Apple can do better. Stop dicking around. Stop trying to restrict what people can and cannot play on the machine. This only hobbles the utility of the system and ensures it won't catch on. Who gives a damn about Windows Media Center Edition? Who ever cared? It was a flop right out the door because it was half baked. Produce a complete product and release it. We want it.

    Oh, and MS... consider dropping a version of Windows on a phone that can run desktop applications. These smartphones are vastly more powerful than the machines that ran Windows 3.1. I think some have to be faster than those that initially ran Windows XP. If you can't squeeze a version of Windows 8 onto one of those phones with a custom touch UI... then you're fools. A system with that sort of capability would be vastly more useful than any other device on the market.

    • The whole console paradigm is based on two qualities. 1. Price. [...] 2. Ease of use.

      At the risk of sounding like a Monty Python inquisitor [tvtropes.org], make that three qualities: price, ease of use, and local multiplayer. The Wii especially is fun when you have other gamers living with you or when you have friends or relatives visiting you at home. It's a lot easier (and a lot more spouse-acceptable [wikipedia.org]) to buy more controllers than to set up a LAN party.

    • by phriedom (561200) on Tuesday February 07, 2012 @09:55PM (#38962353)
      Your bias is showing. You think PC gaming is superior and that if consoles were more like PCs they would be better. But us console gamers DO NOT WANT the barrier removed. We like to know that none of our opponents are using aimbots, or custom textures that let them see through walls, or macros, or other such cheats. We like the fact that people who mod the hardware of their Xbox in order to cheat run the risk of getting locked out of Xbox Live. Microsoft makes more money on Xbox games than it does on the same games for PC, and PC game sales are dwarfed by the volume of console game sales, so the value of wooing PC gamers onto the console is not that big. It looks to me like that barrier benefits both Microsoft and console gamers.

      Also, Microsoft has been pretty serious about making the Xbox into a media center; when was the last time you tried it? I have a friend who recently canceled cable TV and uses his PC as a DVR for over-the-air programming, then streams from the PC to all the Xboxes in the house. The Xbox also runs Hulu+, Netflix, ESPN, Last.fm, and a UFC channel. I hear the next OS update will add more, and I believe they plan on being able to replace set-top boxes from some cable companies in the future. If they are limited, it is because the content providers want to maintain control, not because MS, Sony, Nintendo, or Apple are "dicking around." You try telling the networks they should stream everything so users don't have to pay for cable.
    • by Sir_Sri (199544) on Wednesday February 08, 2012 @12:15AM (#38963233)

      Actually, as the summary talks about, programming for PCs and for consoles is very, very different in some respects. You can do a lot of things on a console that you just can't on a PC, because you know precisely what resources you have, where they are, and how fast they will be. The PS3 actually has custom libraries from Sony/Naughty Dog that are similar to but different from OpenGL, precisely because OpenGL would be too slow on the PS3; you avoid it if you can. If you started allowing different GPUs, you'd have to move to something like DirectX, which is a good concept for 200 or so different video card models, but not worth the efficiency loss if you'd only ever have, say, 4. DirectX (and OpenGL) manage a lot of the GPU memory system for you. That's easy, but it can be very inefficient, which is why a video card with 1 GB of memory does about as well as a PS3 or 360 with a shared 512 MB. DirectX and OpenGL (and the GDI layer in general on Windows) have to account for the arbitrary nature of whatever else might be in video memory at the time. Right now I have two web browsers, some office applications I left open, a game, and Steam all doing things that might take up memory. That's actually a really tough problem to manage in general, which is why consoles can do some awesome stuff with less: you know exactly how much memory you get. When you could lock down a full-screen application in Windows and boot everything else out, it was easier (though not easier on users, and it had its own complications).

      In short, your point 1 is wrong. If it supports Windows, it has to support general program environments and random crap hanging out on the desktop. Windows is a productivity OS (despite what people may think); you can use a stripped-down version of the kernel, but the actual OS as sold does a lot of things you definitely would not want in a memory-constrained environment, like layers of stuff on the desktop, etc.

      And yes, the idea with Windows 8 is to have a unified environment that executes phone or desktop code: same OS, different skin. Now if they get Intel into a three-way with Nokia, they will have one hell of a product on their hands.
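The fixed-memory advantage Sir_Sri describes can be made concrete. This is an illustrative sketch with made-up subsystem names and numbers, not any console's real memory map: with fixed hardware, the whole memory plan can be validated once, offline, instead of being juggled by a driver at runtime.

```python
# Minimal sketch of budgeting against a fixed console memory pool.
# Subsystem names and byte counts are illustrative assumptions.

CONSOLE_VRAM = 512 << 20  # shared 512 MiB, as on a PS3/360-class machine

budgets = {
    "framebuffers": 64 << 20,
    "textures":     320 << 20,
    "geometry":     96 << 20,
    "misc":         32 << 20,
}

def validate(budgets, total):
    """Fail loudly if the plan exceeds the machine; return leftover bytes."""
    used = sum(budgets.values())
    if used > total:
        raise ValueError(f"over budget by {used - total} bytes")
    return total - used

# Checked once at build time -- no runtime paging, no eviction heuristics,
# no surprises from other applications sharing video memory.
slack = validate(budgets, CONSOLE_VRAM)
print(f"{slack >> 20} MiB slack")
```

On a PC, no such static plan is possible, because the set of applications competing for video memory is unknown until runtime; that is the efficiency gap the comment is pointing at.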

  • It is ridiculous that they are not, given the waste involved in the industry. It is nice that most consoles live a 5-7 year lifespan, but there is very little reason they could not be designed with some modularity that allows them to be upgraded. At this point that basically means a MB/CPU and GPU upgrade. I'd be OK with it even at the same price as a new system, just to reduce the e-waste. Also, it is ridiculous that consoles are not required to offer backwards compatibility, at least through emulation. I used to

    • Also, it is ridiculous that consoles are not required to offer backwards compatibility at least through emulation.

      Nintendo upgraded the GameCube in 2006. The new version had not only more RAM, a slightly faster CPU and GPU, and a distinctive remote controller, but also an online service to buy older consoles' games. The Wii can play downloadable NES, Super NES, and N64 games in Virtual Console emulation, and units from about the first five years of production can play GameCube game discs. What other console has as much backward compatibility?

      • by rAiNsT0rm (877553)

        I never said that they don't, only that they should be required to. The reason they don't is that they feel it promotes the used market, and that people won't buy new versions if they can keep playing their old games. Of course that's B.S.; what it actually does is keep older, less efficient consoles in operation, or make users throw everything away since it usually has little value. The Wii is probably the standout in this area.

        • by tepples (727027)

          they should be required to

          Would you apply that to the PC as well? Even PC back-compat isn't perfect. The 64-bit versions of Windows Vista and Windows 7 can't run DOS games, DPMI games, or Windows 3.1 games. Heck, some games relying on undocumented behaviors of Windows 9x never got patched even to run on NT-based Windows XP.

      • That's a bad analogy. The Wii is backwards compatible one generation: it can play GameCube games, and that's it. If I have some old SNES cartridges lying around, I can't exactly plug them into my Wii. I can't even send them in to Nintendo to have them imaged into Wii ROMs, nor can I go to Walmart and get a device that lets me transfer my NES/SNES/N64/TurboGrafx-16/Genesis/Neo Geo cartridges and play them on the Wii. Instead, I have to buy them again. That isn't backwards compatibility, that is simply Nintendo
  • Considering the price of games, one would think that the next-gen consoles would be nearly free, subsidized by the publishers, especially as efforts to limit the value of used games become more prevalent.

    I don't see Sony doing it, but I could totally see Microsoft partnering with a distribution channel to keep the price of the new consoles down. As it is, they'll probably be closer to $1000 than to the prices of the current consoles.

    I hope they fail. Consoles do nothing to improve the lives

  • by Effugas (2378) * on Tuesday February 07, 2012 @09:45PM (#38962275) Homepage
    The platform that most successfully upgraded itself was the NES. One degree of freedom it had, because there were chips in each cartridge, was deploying new memory management units inside the games themselves. Quite literally, the NES became more powerful for games released later in its life cycle. The SNES did this too, with the Super FX chip inside Star Fox (the most popular DSP in the world for its era), but that wasn't quite "all games ship upgraded hardware".
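
    For anyone curious what "memory management units inside the games themselves" means mechanically, here is a toy C sketch of cartridge bank switching, loosely inspired by NES mappers (the bank count, register behavior, and address layout are simplified for illustration, not any specific mapper's spec):

```c
#include <assert.h>
#include <stdint.h>

#define BANK_SIZE 0x4000u   /* 16 KB banks */
#define NUM_BANKS 8u        /* 128 KB of ROM behind a 32 KB-visible window */

static uint8_t prg_rom[NUM_BANKS][BANK_SIZE];
static uint8_t current_bank = 0;   /* latch inside the cartridge chip */

/* A CPU write to ROM space is intercepted by the mapper and sets the latch. */
void mapper_write(uint8_t value) { current_bank = value % NUM_BANKS; }

/* Reads in $8000-$BFFF go through the switchable bank; $C000-$FFFF is
   fixed to the last bank (a common arrangement for reset/IRQ code). */
uint8_t cpu_read(uint16_t addr)
{
    if (addr < 0xC000u)
        return prg_rom[current_bank][addr & (BANK_SIZE - 1)];
    return prg_rom[NUM_BANKS - 1][addr & (BANK_SIZE - 1)];
}
```

    The console itself never changed; the chip in the cartridge decided which slice of a much larger ROM the CPU saw at any moment, which is how later games grew "bigger" than the console's address space.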

    I suspect that if there were ever to be upgradable hardware, it'd have to work by yearly subscription, and it'd have to cost no more than $50 a year for the part. However, with guaranteed sales in the millions of units (as games would hard-require it), the logistics of fitting some pretty crazy stuff into $50/yr wouldn't be unimaginable. Remember that Xbox Live is already pulling, what, $60/yr?
    • by pecosdave (536896) *

      So - Sega could have made the 32X successful if they had just put it in the cartridges?

      ---hey, who threw that knife at me?

  • by XxtraLarGe (551297) on Tuesday February 07, 2012 @09:51PM (#38962321) Journal
    One of the advantages of a console is that I can count on the games running in a relatively stable manner. The quality of the games usually gets better over time, because developers learn better techniques and more optimizations, precisely because it isn't a moving target.
  • Absolutely Not. (Score:5, Insightful)

    by Jarik C-Bol (894741) on Tuesday February 07, 2012 @10:06PM (#38962439)
    Absolutely Not, and here is why:
    With non-upgradable consoles, you never go to buy a new game and wonder, "wait, will this run on my machine?" That is the appeal of consoles over PC gaming: for the most part, "it just works." You put the disc in and play the game, and it is the same for everyone. No wondering whether your graphics card will make it look like the videos you saw online, no wondering whether it will lag during action sequences, no wondering whether you're going to need to drop another $50 on more RAM to play.
  • by grumbel (592662) <grumbel@gmx.de> on Tuesday February 07, 2012 @10:31PM (#38962615) Homepage

    Making a console user-upgradeable makes little sense: consoles are meant to be compact, and user-upgradeable parts would work against that. In times when you can't even swap the battery in most devices, you can't expect to swap the GPU or CPU.

    On the other hand, consoles should reach a point where they can get upgrades in the middle of a generation, or, more dramatically, the whole "console generation" thing should disappear and updates should become more fluid. Essentially they should act like a TV: want to see a movie in glorious 1080p, you have to buy a new TV, but you can also just use your SDTV and view the movie just fine, at lower quality. Furthermore, your 1080p set can still play old SDTV content. There is a lot of forward and backward compatibility in that system. Consoles don't have that right now; backward compatibility is very limited, and forward compatibility is almost non-existent (except for a few Game Boy Color games). Of course, at some point there would be a cut-off where the old console really would be too old to play some new content, but things like small PSN/Xbox Live games could easily be made flexible enough to run not only on the latest generation of hardware but also the generation before it.
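
    The TV analogy can be sketched in code: content ships with several quality tiers, and each device picks the best tier its hardware supports. A minimal C sketch (the mode list and capability values are hypothetical, not any real console API):

```c
#include <assert.h>
#include <stddef.h>

typedef struct { int width, height; } Mode;

/* Quality tiers the content ships with, best first. */
static const Mode content_modes[] = { {1920, 1080}, {1280, 720}, {720, 480} };

/* Pick the best tier that fits the device's capabilities; newer hardware
   gets the 1080p path, older hardware falls back to the SD tier. */
Mode pick_mode(int max_w, int max_h)
{
    for (size_t i = 0; i < sizeof content_modes / sizeof content_modes[0]; i++)
        if (content_modes[i].width <= max_w && content_modes[i].height <= max_h)
            return content_modes[i];
    return content_modes[2];   /* SD fallback: everything can play something */
}
```

    The same disc (or download) then works across hardware generations, degrading gracefully instead of refusing to run, exactly as an SDTV still shows an HD broadcast at lower quality.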

  • by billcopc (196330) <vrillco@yahoo.com> on Tuesday February 07, 2012 @11:08PM (#38962827) Homepage

    They're consoles. The whole point is to have a consistent hardware base, so developers can custom-tailor their code to the platform, leading to simplified testing and improved stability. One CPU, one memory spec, one GPU... the key parts are consistent.

    You want to upgrade your console? Trade it in for a new one! Or, if you're like me, put it away and take it out from time to time for nostalgia.

  • by DarthVain (724186) on Wednesday February 08, 2012 @12:07PM (#38967629)

    You buy the next generation of console.
