
The 5-Year Console Cycle Is Dead

Pickens writes "The Xbox 360 recently turned five years old, and with no known successor on the horizon for the 360, PlayStation 3 or Wii, CNET reports on the death of the 5-year console cycle — one of the video game industry's most longstanding truisms. For example, the Nintendo Entertainment System came out in 1985, followed by the Super NES in 1991, the Nintendo 64 in 1996, the GameCube in 2001, and the Wii in 2006. But now, why should console makers upgrade their offerings? Consumers are still buying their machines by the hundreds of thousands each month, and ramped-up online initiatives are breathing new life into the systems. A lot of it has to do with the fact that with the current generation of consoles, each company found a way to maximize either the technology behind the devices, or the utility to a wide range of new gamers."
  • by UnknownSoldier ( 67820 ) on Monday November 29, 2010 @08:11PM (#34382076)

    > vast majority of people playing games at 720p max

    Your comment skirts around the issue, but is not entirely accurate. It is not the players, but the game devs themselves who are "not demanding" a new console. The PS3's RSX is roughly equivalent to a 7800 GTX. Most _games_ DON'T render at the native 1080p but at 720p, simply because most (PS3) games are GPU bound. (Xbox 360 games are CPU bound, if you are curious.) That said, the SPUs are currently _still_ underutilized. Naughty Dog said this a few years back, but it is slowly getting better: []
    "I'm more impressed with the hardware the longer we get to work with it. Imagining trying to develop Uncharted without the Blu-ray drive, without the hard drive, or without the Cell processor makes me wonder what kind of game we would have ended up with. It certainly would have required a lot more compromises than I would have been comfortable making. And much like the PS2, I think the longer developers work with the machine, the better the games are going to get. For instance we are only using approximately 1/3 of the processing power of the SPUs on the Cell processor in Uncharted."

    The presentation "Getting Unreal Engine 3 to 60Hz" isn't (yet) available on Devnet, but thankfully can be found here... []

    Other presentations (GDC 2009) worth reading are
    * The PlayStation®3's SPUs in the Real World - A KILLZONE 2 Case Study
    * Practical SPU Usage in GOD OF WAR 3

    It will be REAL interesting to see what Polyphony Digital (Gran Turismo 5) and Team Ico (Ico, Shadow of the Colossus) do next, since these two studios are known to push the PlayStation (2 & 3) to its limits.
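To make the SPU point above concrete, here's a toy sketch of the idea behind SPU-style task parallelism: the main CPU splits vector work into fixed-size chunks and farms them out to a handful of parallel workers. This is illustrative Python, not actual PS3 code — the function names and chunking scheme are made up for the example.

```python
# Sketch of "farm vector work out to parallel units", the pattern the
# SPUs exist for. Six workers here loosely mirrors the six SPUs games
# could use; everything else is a stand-in.
from concurrent.futures import ThreadPoolExecutor

def transform_chunk(chunk, scale):
    # stand-in for T&L-style vector math running on one "SPU"
    return [v * scale for v in chunk]

def transform_parallel(vertices, scale, workers=6, chunk_size=4):
    # split the workload into chunks and dispatch them in parallel
    chunks = [vertices[i:i + chunk_size]
              for i in range(0, len(vertices), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(transform_chunk, chunks, [scale] * len(chunks))
    # gather the per-chunk results back into one stream
    return [v for chunk in results for v in chunk]
```

The "1/3 of the SPUs" quote above is basically saying most of those workers sit idle unless the engine is structured to feed them.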


  • by javakah ( 932230 ) on Monday November 29, 2010 @08:15PM (#34382110)

    Indeed. Slashdot has a very, very short memory. Just a few days ago there was an article featured here about the consoles being too slow. []

    Although honestly, I think the larger danger to the consoles is not the PC market but the mobile market, with the iPad and such. I've been surprised at how much the iPad can actually pull off despite not being a dedicated gaming device (N.O.V.A., etc.).

    This article reminds me a bit of some of the early predictions where people couldn't see the need for more than a few computers in the world. It reeks of something that will come around and bite them in the ass for not progressing quickly enough.

  • by Anonymous Coward on Monday November 29, 2010 @08:26PM (#34382234)

    Also, the technology of game platforms isn't advancing quickly enough any more to make a five-year-lag a competition killer.

    My impression was that it isn't that the technology available to platform manufacturers isn't advancing; it's that game developers aren't really interested in taking advantage of technology beyond what's currently available. I remember an article from a couple of years ago that discussed the massive costs involved in creating new games, and how the return on that investment didn't seem to merit continuing to increase development budgets.

    It stands to reason that if game developers aren't willing to put the money into making games that require faster hardware, console makers don't need to push out new consoles and can, instead, work on creating smaller, quieter, cheaper versions of their current consoles.

  • by aztracker1 ( 702135 ) on Monday November 29, 2010 @08:32PM (#34382324) Homepage
    I have to agree on most of your points, and would like to add that $400+ for a console (at initial release), and even more for a modest-sized HDD for said system, kept people away for a while. Many PS3 purchases were as much for the Blu-ray functionality as for gaming. And the 360 being limited to DVD discs (since the crash and burn of HD DVD) has held it back some. I just bought my kid a 360 this last year; I waited for the RROD issues to be squared away first. I won't buy a Sony product, so the PS3 isn't an option for me. As it stands, the 360 and PS3 are both passable systems, and as Nintendo has shown, playability means a lot more than uber graphics. I do wish Nintendo would come out with a Wii+ or something as a second-gen device... even a bump to 1080p and built-in Blu-ray for an extra $100-150 would be a big seller. It would mean a faster CPU but minimal changes as far as compatibility... maybe an ATI or nVidia discrete graphics chipset. Compatibility is a must imho, though dropping the GameCube controller ports wouldn't make me cry.
  • by aliquis ( 678370 ) <> on Monday November 29, 2010 @08:35PM (#34382356) Homepage

    So basically he claims that if it can run Amiga Bratwurst [] in 1080p there's no need to upgrade the hardware because, hey, it's 1080p?

    Omg the graphics! [] ;D

    (Actually it's very fun, zooming in and out as you approach each other.
    Amiga Roketz [] looked better [] but played worse.
    And then there was Gravity Force [] of course.)

  • by BenoitRen ( 998927 ) on Monday November 29, 2010 @08:46PM (#34382494)

    The Wii, especially, would of course benefit from an upgrade.

    Part of the Wii's success is that development on it is cheap, thanks to not needing to adopt the different development practices made necessary by multi-core CPUs, or to invest in HD graphics.

  • by monkyyy ( 1901940 ) <> on Monday November 29, 2010 @08:52PM (#34382548)

    i think the power of my gaming pc in a handheld would be nice

  • 3DS, iPhone, Android (Score:3, Interesting)

    by PIPBoy3000 ( 619296 ) on Monday November 29, 2010 @08:56PM (#34382572)
    I think most of the innovation is in the hand held arena these days. New markets often get the focus of developers and manufacturers for awhile, but I think in time we'll circle back to consoles as graphics, processing, and sensing technologies improve.
  • by Anonymous Coward on Monday November 29, 2010 @08:56PM (#34382574)

    As someone who has no use for handhelds, I think they'd be nice, but shouldn't be the focus.
    I drive everywhere. I talk to people. I have no need for an electronic distraction to distance me when I'm not at home.
    Your mileage may vary.

  • Premature (Score:4, Interesting)

    by guspasho ( 941623 ) on Monday November 29, 2010 @09:20PM (#34382796)

    Let me check the date. Yep, still 2010, four years after the Wii came out. Wikipedia says the PlayStation came out in 1994, the PS2 in 2000, and the PS3 in 2006, so we shouldn't expect a PS4 until 2012. Doesn't the summary contradict itself?

    But wait, the Xbox came out in 2001 and Xbox 360 in 2005. Where is my Xbox 720???

  • Re:Say again? (Score:4, Interesting)

    by UnknownSoldier ( 67820 ) on Monday November 29, 2010 @10:46PM (#34383468)

    As someone who wrote and implemented OpenGL on the Wii and shipped 2 Wii games that used it, actually, you and the GP are both right, and wrong.

    The Wii was a Gamecube x2, meaning in the real world it was twice as fast. Check the Nintendo forums where Jack Matthews benchmarks the performance (especially memory).

    Nintendo DIDN'T fix _any_ of the hardware GPU rendering bugs in the Wii, which is why the derogatory Gamecube comparison is applicable.


  • by walshy007 ( 906710 ) on Monday November 29, 2010 @11:15PM (#34383694)

    While not a video card, the N64 memory expansion pack doubled the amount of ram in the system.

  • by tepples ( 727027 ) <{tepples} {at} {}> on Monday November 29, 2010 @11:23PM (#34383748) Homepage Journal

    > NES was 256x240. Not to mention the 16 colour limitation on NES.

    What 16 color limit? I count 25: one background color, four sets of three for parts of the tile plane, and four sets of three for sprites, not to mention the tint bits that can be turned on for "rising water" effects. Perhaps you're estimating that some of these sets often share identical colors.
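The 25-color count above is just arithmetic over the NES palette layout as described; spelled out:

```python
# NES simultaneous on-screen colors, per the breakdown above:
# one shared background color, plus four background palettes and
# four sprite palettes of three unique colors each.
background = 1
bg_palette_colors = 4 * 3      # four tile-plane palettes, three colors each
sprite_palette_colors = 4 * 3  # four sprite palettes, three colors each

total = background + bg_palette_colors + sprite_palette_colors
print(total)  # 25
```

(As the comment notes, in practice games often repeat colors across those sets, which is likely where the lower "16 colors" folk figure comes from.)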

  • by UnknownSoldier ( 67820 ) on Monday November 29, 2010 @11:31PM (#34383834)

    > they had to bolt on a GTX to the Cell because the cell didn't have the horsepower to compete with the Xbox 360.

    Huh? The PPU was never designed to do rendering. Looking at the data flow, say for skinning, you have this:

    PS2: CPU (EE) -> vector processors: VU (T&L) -> GPU (VS)
    PS3: CPU (PPU) -> vector processors: SPU (T&L) -> GPU (RSX)

    Ergo, if you pardon the French, you don't know WTF you are talking about.

    One of the reasons the PS3 was initially so expensive was the hardware. Specifically: the Blu-ray drive for one, the PS2 hardware compatibility for two, and all the superfluous flash memory card slots for three.

    There were teething problems because the *whole* industry was changing from single-core to multi-core design. Taking a PC game and porting it to the PS3 will of course give extremely poor performance (because you are letting the hardware go unused); when you design for multi-core from the beginning and then, say, port a PS3 game to the Xbox 360 or PC, you won't have toilet performance on the Xbox 360.
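The PS2/PS3 data flow the comment lays out (CPU prepares batches, vector units do T&L, GPU rasterizes) can be sketched as a three-stage pipeline. This is a toy Python illustration with stage names borrowed from the PS3 path (PPU/SPU/RSX); the 2x2-matrix "skinning" math is a trivial stand-in, not real engine code.

```python
# Hypothetical sketch of the CPU -> vector unit -> GPU flow described
# above: PPU batches work, SPU-style stage does the transform (T&L),
# RSX-style stage consumes the transformed vertices.

def ppu_prepare(vertices, bone_matrix):
    # CPU stage: pair each vertex with its bone transform
    return [(v, bone_matrix) for v in vertices]

def spu_transform(batch):
    # vector-unit stage: apply a 2x2 transform (m = (a, b, c, d))
    return [(v[0] * m[0] + v[1] * m[1],
             v[0] * m[2] + v[1] * m[3]) for v, m in batch]

def rsx_rasterize(transformed):
    # GPU stage: stand-in "rasterization" just counts what it was fed
    return len(transformed)
```

The point of the comment is that none of these stages expects the one before it to also do rendering — the PPU was never the renderer, the RSX was.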


  • No he's quite right (Score:3, Interesting)

    by Sycraft-fu ( 314770 ) on Tuesday November 30, 2010 @12:27AM (#34384378)

    Part of the PS3's problems stem from the fact that the Cell wasn't supposed to be just the CPU; it was supposed to be the GPU as well. Sony had demonstrations to this effect. However, that was all wishful thinking: when the real Cell hardware was delivered, it couldn't stand up to dedicated GPUs. So Sony remade it into the CPU only, for which it was not well suited. They then had a problem in that they didn't have a GPU. nVidia was, of course, happy to oblige, but the thing was they didn't have time for a full redesign. Normally console GPUs are specially designed for consoles. A big thing is sharing system RAM: consoles have less RAM than PCs, but they are single-user and single-process, and so can handle sharing better. Most consoles allow for direct GPU use of system RAM, or total integration. They also often feature things like embedded DRAM (the Wii and 360 do, and the PS2 did). Well, the problem was there wasn't time for that kind of redesign; chip design takes a long time. So nVidia was only able to modify a 7800/7900-series architecture a bit. Not a bad card, but not what you wanted for a console.

    The net effect of this late design change is that the Cell has power that isn't used, and may not be able to be used. The actual PPC CPU part gets starved for time and bandwidth and can't dispatch to the SPUs effectively. There may not be a way around it; IBM canceled further Cell production because it just doesn't stack up well. Regular CPUs do better at general tasks; GPUs do better at vector/stream tasks.

    The PS3 was not a well planned design, it was what they could hack together in the time they had.

  • by Hadlock ( 143607 ) on Tuesday November 30, 2010 @12:37AM (#34384460) Homepage Journal

    > The Wii's GFX output looks pretty dreadful on any decent display

    If you own the games but want better graphics than your Wii can provide, try running them under emulation on your computer. They look fantastic at 1920x1080@60Hz true HD and run just fine on a midrange i5 machine.

  • by Phopojijo ( 1603961 ) on Tuesday November 30, 2010 @12:48AM (#34384562)

    They must have been drinking the same software-rendering Kool-Aid as Tim Sweeney (with all due respect to an otherwise extremely bright programmer).

    "The End of the GPU Roadmap" []

    And while Real-Time Ray Tracing is the "Holy Grail" and is achievable, there is no way VRAM is going away to be replaced with traditional CPU memory. There are so many memory optimizations in the rendering pipeline that it would be stupid to suggest that it all should be tossed out and use slow DRAM instead.

    He was actually talking about something like CUDA or OpenCL programs that look similar to a typical software rendering engine.

    GPUs would still be there... but you would "talk to them" in a similar way you would a CPU. Only with slightly more simple commands that are parallelized across thousands of cores.

    Basically, Tim Sweeney is annoyed at all the DirectX and OpenGL quirks they need to dodge and wants to program each engine basically from first principles — but still use the GPU for calculations that can be split into hundreds or thousands of independent parts.
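The "talk to the GPU like a CPU" idea above boils down to writing shading as a pure per-pixel kernel and mapping it over the whole frame, the way a CUDA/OpenCL dispatch would, instead of going through the GL/DX state machine. A hedged Python sketch (pure Python stand-in; a real engine would launch this kernel across thousands of GPU threads):

```python
# Toy "software renderer as a data-parallel kernel": the kernel is a
# pure function of pixel coordinates, and the dispatch applies it to
# every pixel independently, like a compute grid launch.

def shade(x, y, width, height):
    # toy kernel: a horizontal gradient, one intensity per pixel
    return x / max(width - 1, 1)

def dispatch(width, height, kernel):
    # the "grid launch": every pixel is an independent invocation
    return [[kernel(x, y, width, height) for x in range(width)]
            for y in range(height)]
```

Because each invocation is independent, the same kernel maps cleanly onto thousands of cores — which is exactly the property Sweeney wants to program against directly.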

  • by Phopojijo ( 1603961 ) on Tuesday November 30, 2010 @12:56AM (#34384632)
    Just about every console port leaves the resolution unlocked. ... That said, their internal assets are usually low-res enough that the extra resolution just renders the crappy artwork in even crappier detail.
  • Re:Consoles (Score:3, Interesting)

    by thesandtiger ( 819476 ) on Tuesday November 30, 2010 @03:23AM (#34385640)

    Graphics are not the same as gaming.

    I'd say that if anything is holding back gaming it's the fact that AAA titles are so freaking expensive and are so huge an investment that they need to be incredibly safe.

    But gaming is not being held back. There's PUH-LENTY of innovation going on; it's just happening outside the glare of the major developers. The same economic factors that force AAA titles to play it safe make it a no-brainer for smaller studios to take risks and try to innovate with less expensive games.
