


Putting Up With Consolitis (369 comments)

An anonymous reader tips an article about 'consolitis,' the term given to game design decisions made for the console that spill over and negatively impact the PC versions of video games. "Perhaps the most obvious indicator of consolitis, a poor control scheme can single-handedly ruin the PC version of a game and should become apparent after a short time spent playing. Generally this has to do with auto-aim in a shooter or not being able to navigate menus with the mouse. Also, not enough hotkeys in an RPG — that one’s really annoying. ... Possibly the most disastrous outcome of an industry-wide shift to console-oriented development is that technological innovation will be greatly slowed. Though a $500+ video card is considered top of the line, a $250 one will now play pretty much any game at the highest settings with no problem. (Maybe that’s what everyone wanted?) Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind, which will be bad for gamers on consoles as well as PC."
This discussion has been archived. No new comments can be posted.
  • by ynp7 ( 1786468 ) on Tuesday February 08, 2011 @02:29AM (#35134720)
    So you're complaining because you can spend a relatively modest sum to play any game that you want without having to worry about a $500-1000 upgrade every year or two? Get the fuck out! This is unequivocally a Good Thing(tm).
  • Re:What...? (Score:4, Insightful)

    by nschubach ( 922175 ) on Tuesday February 08, 2011 @02:38AM (#35134768) Journal

Heck, "Consolitis" has an effect on LCD displays. Anything with more than 1080 lines of vertical resolution is getting harder to find every day. I feel like things are going backwards for PC displays.

  • by Matthew Weigel ( 888 ) on Tuesday February 08, 2011 @02:42AM (#35134782) Homepage Journal

    "Though a $500+ video card is considered top of the line, a $250 one will now play pretty much any game at the highest settings with no problem. (Maybe that’s what everyone wanted?) Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind, which will be bad for gamers on consoles as well as PC."

Making content that looks good at 1080p (or 1920x1200 for some PC monitors) is hard. Some amazingly specialized people spend a lot of time working on it; the more powerful the graphics processor, the more that is possible, but the more art assets have to be created (along with all the associated maps to take advantage of lighting, special effects, shader effects...) and the more programming time has to be spent. Much as the number of pixels on a screen increases far faster than its perimeter, or the volume of a sphere increases faster than its surface area, the work to support ever-increasing graphics power grows faster than the visual difference in the image.

    It's not sustainable, but those advancing graphics processors are a big part of why game developers are moving to consoles: a shinier graphics engine costs more money to develop, which increases the minimum returns for a project to be successful. Anyone who looks at the business side can see that the market of people who have $500 graphics cards is much tinier than the market of people who have an Xbox360 or Playstation3. If you're going to spend that much money on the shiny, of course you're going to shoot for a bigger return too!

    When it takes a big team to develop something... well, that's generally not where the innovation is going to happen.
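The pixels-versus-perimeter point above is easy to check with some quick arithmetic. A small sketch (resolutions chosen for illustration, not taken from the comment): doubling each screen dimension quadruples the pixels to fill while only doubling the perimeter.

```python
# Sketch: pixel count grows with the square of linear resolution,
# while the screen perimeter grows only linearly.

def pixels(width, height):
    # total pixels that art assets and shaders must fill
    return width * height

def perimeter(width, height):
    # a linear measure of screen size, for comparison
    return 2 * (width + height)

for w, h in [(1280, 720), (1920, 1080), (3840, 2160)]:
    print(f"{w}x{h}: {pixels(w, h):>9} pixels, perimeter {perimeter(w, h)}")
```

Going from 1920x1080 to 3840x2160 doubles the perimeter but quadruples the pixel count, which is the commenter's point about content costs outpacing visible gains.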

  • by Feinu ( 1956378 ) on Tuesday February 08, 2011 @03:13AM (#35134930)
    Lower hardware requirements are definitely a bonus, but it comes at the cost of dumbed down controls. While using a keyboard, I have about 20 buttons under my left hand, and an accurate pointing device on my right, along with several buttons. Why would I want to cycle through potential targets by pushing a button? Why do I need to hold down a button (which also has a different function), instead of just pushing a different button? Now I enter a menu, and I have to lift my hand to get to the arrow keys to navigate the menu? Not user friendly at all.
  • by Anonymous Coward on Tuesday February 08, 2011 @03:15AM (#35134932)

You might want to update your argument to this decade. Windows 7's DPI support is close to perfect. Of course, this assumes you are rating the operating system and not the flawed applications which run on it.

  • by tlhIngan ( 30335 ) <slashdot@nOSpam.worf.net> on Tuesday February 08, 2011 @03:29AM (#35134996)

    10 years ago, a good chunk of gaming was done on PCs because consoles were crap - standard def, too-small TVs, and the like, so people bought nice high-end PCs and invested in them. Dropping $2000+ on a PC wasn't unheard of nor unusual.

    These days, spending more than $500 on a PC is very unusual - only Apple and PC gamers do that stuff, and really, it's no surprise why. And that $500 gets you a monitor, keyboard, mouse, speakers and other accessories.

Who's the #1 graphics chip maker in the world? It's not nVidia or AMD, it's Intel. (Sure, nVidia and AMD have the discrete graphics market, but that's a really tiny chunk of the whole PC market.) When PC prices plummeted below $1000 and then below $500 (and laptops became "netbooks" below $500), manufacturers learned that the average PC buyer cares about Gigs (hard drive space), Gigs (RAM) and Gigs (CPU GHz). Nowhere do they really care about graphics - after all, Windows does just fine on Intel graphics, and that's all the user sees.

Higher-end PCs with discrete graphics sell far less; even one with a low-end discrete card may be considered a gaming PC (and little Jonny's mom and pop aren't buying a PC for games, oh no, they want it so Jonny can work).

PC gaming is huge - after all, FarmVille and the like don't require super high end graphics chips, and many popular indie titles have requirements so lightweight that even the cheapest netbooks can play them.

The problem is, as we all know, Intel graphics are crap (though they're supposed to get better now with nVidia), and can barely manage 1080p video decoding, let alone high-def gaming.

So people buy a console as well - and with HDTV, they get high-def on the big ol' 52" HDTV versus their 17"/20" PC monitor (or whatever is free these days). They could play the same games on a PC (it's easy enough to do), but that requires spending money on more PC - they could build/configure a great gaming PC for $600, but that's over the $500 "cap" on PC prices. (Everyone gasps at the price of a $1000 MacBook Air compared to a $300 netbook, despite the Air's better graphics (nVidia vs. Intel), better CPU (the Core 2 Duo is old, but runs rings around the Atom), SSD, RAM, etc.)

    Hell, I tried to convince someone to spend $1000 to buy a decent laptop and they balked.

No, it's not consoles limiting the graphics of games - it's PCs themselves. The number of people with high end $600+ video cards (or probably any nVidia or AMD graphics cards of the past, say, 4 years) is very small compared to the total PC market. And we know PC gaming is larger than console gaming, but it's mostly games that will run on the #1 graphics chip on the market.

And developers go for the money - there are more console gamers out there than hardcore PC gamers with powerful graphics cards (and the willingness to upgrade yearly or so), even though there are more PC gamers in general. Other than that, consoles and PCs are pretty much plug-and-play (and Sony's making the PS3 a PC experience with installers, EULAs, serial keys, online DRM, oh my).

  • Re:What...? (Score:4, Insightful)

    by Spad ( 470073 ) <slashdotNO@SPAMspad.co.uk> on Tuesday February 08, 2011 @04:41AM (#35135288) Homepage

    That's more to do with the fact that all the LCD production lines are churning out huge numbers of 16:9 panels for TVs at 720p and 1080p, so it makes sense to do so for PCs as well (made easier in no small part because the public have now been successfully sold on the idea that 16:9 = HD = Better than a 4:3 monitor somehow).

I managed to track down a pair of 21" 4:3 LCDs that do 1600x1200 for my PC, and I will hold onto them as long as humanly possible because I know it's going to be extremely hard to find a decent sized 4:3 replacement in a few years' time. 16:9 on a PC is just a massive waste of screen space for most things, because 90% of apps and web pages are designed, if not with 4:3 in mind, then to support 4:3, so you end up with pillarboxing (wasted bars at the sides) all the time.
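The wasted-space complaint above is simple arithmetic. A quick sketch (panel sizes are illustrative): when 4:3 content is scaled to the full height of a 16:9 panel, only part of the width is used.

```python
# Sketch: fraction of a 16:9 panel actually used by 4:3 content
# scaled to full panel height (the rest is pillarbox bars).

def usable_fraction_4to3_on_16to9(panel_w, panel_h):
    content_w = panel_h * 4 / 3          # 4:3 content at full panel height
    return (content_w * panel_h) / (panel_w * panel_h)

for w, h in [(1920, 1080), (2560, 1440)]:
    print(f"{w}x{h}: {usable_fraction_4to3_on_16to9(w, h):.0%} of the panel used")
```

On any true 16:9 panel the usable fraction works out to (4/3)/(16/9) = 75%, i.e. a quarter of the screen sits in the side bars for strictly 4:3 content.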

  • Re:Nintendo Thumb (Score:5, Insightful)

    by crafty.munchkin ( 1220528 ) on Tuesday February 08, 2011 @05:42AM (#35135564)
I'm firmly of the opinion that some games need to be played on a console, and others need to be played on a PC. Porting one type of game to the other platform means the ported version ends up a poor imitation of the original.
