Has the Console Arms Race Stalled?
An article at Eurogamer argues that even with a successor to the Wii on the horizon, the console arms race we've watched over the past few decades is in the process of changing dramatically, with base hardware taking a back seat to software and peripherals.
"Even the most basic yardstick for console improvements has become a little hard to read. It used to seem like a reliable idea that every five years or so, consoles would catch up to the PC — a platform which sees advancements every few weeks — and remain competitive for a while, before the PC's cutting-edge accelerated away. ... However, the upgrade cycle appears to have slowed considerably — with games that actually demand cutting-edge systems being few and far between, and core gamers far more likely to continue happily playing on two-, three- or even four-year-old PCs than they were in the past. ... If not a halt to progress, this is certainly a slowing — and probably one which is welcomed in most quarters. Consumers love improvements in graphical quality, but most would probably prefer to see any major increase in development budget being spent elsewhere — more detailed content, more expansive storytelling, more progress in areas that have been neglected in the former headlong rush to cram more polygons and effects onto every screen."
It's about ROI (Score:5, Insightful)
It's not really saying that the console arms race has stalled, but is instead saying that the graphics arms race has stalled, which is probably true, and that efforts are shifting, which is also probably true.
After all, just as dpi in printers stopped being a selling point once they all got "good enough", and just as megapixels are becoming increasingly irrelevant as a differentiating factor between cameras, so too are the graphics in today's games reaching a point where the return isn't worth the investment for the developers. Graphics are already "realistic enough" for most people, and trying to move things closer to photorealistic gameplay is probably not worth it, since the return they get is minimal, while the effort required is exorbitant. Instead, spending it on improved gameplay or other elements is a better return on their investment.
The success of games like Minecraft just hammers that point home.
Demographic Shock (Score:2, Insightful)
RTRT is the next hurdle (Score:2, Insightful)
Re:Is this a bad thing? (Score:3, Insightful)
I for one have never really seen the point of spending thousands of pounds/dollars on a gaming PC capable of playing the latest games, only to have it surpassed within a few months.
As things currently stand, it's actually opening PC gaming to a far wider audience as the price of an adequate gaming rig is quite reasonable.
Also, I'd rather have longer and better games than slightly better-looking ones. And even then, games with modding support can often receive graphical boosts down the line anyway.
Since when have you ever spent a thousand bucks on a PC and then found yourself unable to play a game that came out just months later? If you're going to the Gray Box Store and buying the cheapest thing you see with a mouse, keyboard, and LCD, then sure, you can't play everything at even decent visual settings. But you're looking at $400 pre-built computers at this point, and even that thing should be able to run Crysis with decent settings*.
Would I like to see games improve in quality? Absolutely! But how? Developers are working on that: they have things like motion controls trying to recreate the communal experience we lost with online play. We have achievements to add to the sense of competition that used to be covered by the guy who got high scores and filled up the Pac-Man arcade machine with his initials, 'BUT'. If you want a longer game, you can do the side quests of hunting 200 chickens. These are all padding, though; actual quality comes from the storylines and the writing, but that is easier said than done. Mods and DLC are the easiest way to address this, but they are not everyone's cup of tea, for obvious reasons.
*HP Pavilion Slimline s5710f PC - $410 on Amazon, pre-built with mouse and keyboard
It's because hardware has stalled (Score:4, Insightful)
Beyond increasing core counts (which appear mostly useless for most gaming engines beyond a couple of cores), not much is happening in the world of CPUs these days.
I remember choosing between a 486 @ 25MHz versus 50MHz for an extra several hundred bucks. That's twice the clock speed within a single CPU generation for those who are keeping track.
A generation later I purchased a Pentium 75MHz, and 18 months after that upgraded it to 233MHz. That's triple the clock speed.
I even remember having a 400MHz Pentium (II I believe) and about a year later upgraded to a 1GHz P3. That's 2.5 times, not to mention the greater efficiency per clock of a P3 vs a P2.
I now sit with a nearly *5 year old* dual core 2.4GHz CPU (overclocked to 3.3GHz mind you) and I can't find even a $1000 CPU that will give me anywhere near a worthwhile performance bump for anything other than super specific parallelizable applications like scientific computations or workstation-style 3D rendering.
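The upgrade history above can be sanity-checked with a quick calculation. This is just the poster's own MHz figures run through Python; the labels are paraphrased from the comment:

```python
# Clock-speed ratios for the upgrades described in the comment above.
# (MHz figures taken from the comment; nothing here is measured.)
upgrades = [
    ("486 25MHz -> 486 50MHz", 25, 50),
    ("Pentium 75MHz -> Pentium 233MHz", 75, 233),
    ("Pentium II 400MHz -> Pentium III 1000MHz", 400, 1000),
]

for name, old_mhz, new_mhz in upgrades:
    print(f"{name}: {new_mhz / old_mhz:.1f}x clock speed")
```

Each of those jumps (2x, roughly 3x, 2.5x) happened within a generation or two and a year or two of waiting; nothing comparable exists for a five-year-old dual core today.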
This transistor-efficiency stall has also hit the GPU market in the past few years. Have a look at how much Nvidia or AMD have pushed top-end GPU performance in the past couple of years. They're making incremental 15-20% bumps per generation; that's nothing like the TNT/3dfx days, when you could count on a 50-100% framerate jump with each successive generation.
Consoles are stalled because GPU/CPU technology is stalled. If CPUs and GPUs were keeping up with the pace of the '90s, we'd have software/games that pushed those limits.
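To see how far apart those two growth rates really are, here's a rough sketch that compounds them over a few generations. The 17.5% and 75% rates are just the midpoints of the ranges quoted above, and the assumption that gains compound cleanly per generation is a simplification:

```python
# Compound relative performance after n GPU generations,
# comparing ~17.5% per-generation bumps (midpoint of the 15-20%
# quoted above) with ~75% jumps (midpoint of the 50-100%
# TNT/3dfx-era range). Both rates are assumptions for illustration.
def compound(rate, generations):
    return (1 + rate) ** generations

for n in range(1, 5):
    recent = compound(0.175, n)
    nineties = compound(0.75, n)
    print(f"after {n} generation(s): recent ~{recent:.2f}x vs 90s-era ~{nineties:.2f}x")
```

After four generations the 90s-era pace yields roughly a 9x improvement versus under 2x at the recent pace, which is the gap the comment is pointing at.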
Re:Demographic Shock (Score:5, Insightful)
Re:Console creators don't have the motivation (Score:5, Insightful)