
Putting Up With Consolitis

An anonymous reader tips an article about 'consolitis,' the term given to game design decisions made for the console that spill over and negatively impact the PC versions of video games. "Perhaps the most obvious indicator of consolitis, a poor control scheme can single-handedly ruin the PC version of a game and should become apparent after a short time spent playing. Generally this has to do with auto-aim in a shooter or not being able to navigate menus with the mouse. Also, not enough hotkeys in an RPG — that one’s really annoying. ... Possibly the most disastrous outcome of an industry-wide shift to console-oriented development is that technological innovation will be greatly slowed. Though a $500+ video card is considered top of the line, a $250 one will now play pretty much any game at the highest settings with no problem. (Maybe that’s what everyone wanted?) Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind, which will be bad for gamers on consoles as well as PC."
  • by zonker ( 1158 )

    Here I thought this was going to be about Nintendo Thumb.

    • Re:Nintendo Thumb (Score:5, Insightful)

      by crafty.munchkin ( 1220528 ) on Tuesday February 08, 2011 @05:42AM (#35135564)
      I'm firmly of the opinion that some games need to be played on a console, and others need to be played on a PC. Porting one type of game to another platform means that the end result on the ported platform is a poor imitation of the original.
      • Porting one type of game to another platform means that the end result on the ported platform is a poor imitation of the original.

        Ergo, any game developed for both platforms simultaneously is a poor imitation of itself ;)

      • And some games need to be played in the arcade, which consoles killed too.
      • by dintech ( 998802 )

        Certain games suit controller input and others are better with mouse and keyboard. I actually own SNES, Genesis, N64 and PSX console controllers with their USB adapters. I don't see that the experience is too different when playing an emu with the original controller compared to actually playing on the console itself.

        Although there are lots of counter-examples, games that work well with control pads include platformers, beat-em-ups, driving games and arcade games in general. Mouse and keyboard are best suited to RPG, RTS, F

      • Well, porting software between platforms tends to have a negative impact on quality (not always, but good ports tend to be the exception rather than the rule!). So it doesn't really have to be a game to suffer porting problems - there are just as many applications out there that exhibit platform-specific 'features'.

        There is the other minor point that if your only complaint is the control system (as opposed to major crashes, graphics and sound glitches etc), then the dev team have probably done a fairly good j
      • No, some games need to be played with mouse+keyboard, and some with multiple USB gamepads (and possibly a TV connected to the board's HDMI port).

      • Only if the controller is "unique", something like the Wiimote or Kinect. There's no reason you can't use a gamepad with a PC game. I do it quite often, Logitech makes a darn nice gamepad. And joystick games... you pretty much have to use a PC. The place for console games is when you have groups of people around a big screen, or have a novel control interface. There's no reason you can't do everything else better on a PC.

  • by Windwraith ( 932426 ) on Tuesday February 08, 2011 @02:18AM (#35134672)

    If anything games are becoming more like computer games overall. Traditional console RPGs look more like MMOs now, games require patching and even have DRM...a few quirks introduced by lazy companies that do lazy ports don't make "consolitis".

    • Re:What...? (Score:4, Insightful)

      by nschubach ( 922175 ) on Tuesday February 08, 2011 @02:38AM (#35134768) Journal

      Heck, "Consolitis" has an effect on LCD Displays. Anything over 1080 horizontal lines is getting harder to find every day. I feel like things are going backwards for PC displays.

      • This is true; I have been keeping my 24" Samsung as my main screen because of the 1920x1200 resolution (my 2nd screen is an Asus 24" 1920x1080).

        I think there's room for both consoles and PC games, as there always has been; it's just that some gaming companies are getting lazy and going console-style for everything. Sure, it works with things like DCUO - I use my 360 controller with that and it works great - but for other games it's just terribad.

        The only way this will change is if people refuse to buy bad ports from con

      • Re:What...? (Score:4, Insightful)

        by Spad ( 470073 ) <slashdot.spad@co@uk> on Tuesday February 08, 2011 @04:41AM (#35135288) Homepage

        That's more to do with the fact that all the LCD production lines are churning out huge numbers of 16:9 panels for TVs at 720p and 1080p, so it makes sense to do so for PCs as well (made easier in no small part because the public have now been successfully sold on the idea that 16:9 = HD = Better than a 4:3 monitor somehow).

        I managed to track down a pair of 21" 4:3 LCDs that do 1600x1200 for my PC and I will hold onto them as long as humanly possible because I know that it's going to be extremely hard to get a decent-sized 4:3 replacement in a few years' time. 16:9 for a PC is just a massive waste of screen space for most things because 90% of apps and web pages are designed, if not with 4:3 in mind, then to support 4:3, and so you end up with horizontal letterboxing all the time.

        • by AmiMoJo ( 196126 )

          TVs use a different resolution and aspect ratio to monitors. TVs are 16:9 1080 lines and computers are 16:10 1200 lines. Panels made for TVs can't be used in computer monitors most of the time.

          Widescreen works well on PCs. You can have two documents at almost actual size on a 24" monitor. I often have one window for program code and another for a datasheet or web page side-by-side. If anything I could do with a higher resolution because at 1920 pixels each window is under 1024 which is the minimum for the w

      • Dunno, it seems to me that when greed is a plausible explanation, it's probably at least a good chunk of the real explanation.

        Displays had been sold for an awfully long time by diagonal size, to the point where some people think that a 21" is a 21". In reality, at the same diagonal, the closer to square the screen is, the bigger the surface, and the wider the format, the smaller the surface. It's only basic geometry.
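        As a rough aside (the 21" figure echoes the example above; the little C program is just a back-of-the-envelope sketch of that geometry, not anything from the parent):

            /* Back-of-the-envelope sketch: screen area at the same diagonal for
             * different aspect ratios.  Plain ISO C, nothing display-specific. */
            #include <math.h>
            #include <stdio.h>

            /* Area (square inches) of a screen with the given diagonal and w:h ratio. */
            static double screen_area(double diagonal_in, double w, double h)
            {
                double scale = diagonal_in / sqrt(w * w + h * h);
                return (scale * w) * (scale * h);
            }

            int main(void)
            {
                printf("21\" 4:3   : %.1f sq in\n", screen_area(21.0, 4.0, 3.0));   /* ~211.7 */
                printf("21\" 16:10 : %.1f sq in\n", screen_area(21.0, 16.0, 10.0)); /* ~198.2 */
                printf("21\" 16:9  : %.1f sq in\n", screen_area(21.0, 16.0, 9.0));  /* ~188.4 */
                return 0;
            }

        Same diagonal, noticeably less glass as the panel gets wider.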

        For CRTs it didn't make much difference for the total cost, but for flat screen panels it does. Also

    • Re:What...? (Score:4, Informative)

      by Chas ( 5144 ) on Tuesday February 08, 2011 @02:39AM (#35134770) Homepage Journal

      Take a look at Champions Online and its follow-up, Star Trek Online.

      The engine was jacked around with to specifically enable a console port that was to be released simultaneously.

      Then the developer realized that a console port was going to be unsupportable and simply COULD NOT give the flexibility necessary.

      Boom, console port went away. But by that point, all the console-specific stuff was so firmly embedded into the system that it couldn't be excised.

      So what did we get with CO and STO? A pair of SEVERELY half-ass MMOs that were little more than button-mash-fests.

      • by hitmark ( 640295 )

        Explains why CO has an option to play with a gamepad (though such an option also shows up in DDO).

      • Also contrast Deus Ex, a very highly respected PC FPS, with its far less respected sequel Invisible War. The latter was made for consoles, and it shows.
      • No, wait. Don't bother. It is bad. Very VERY bad. So amazingly bad you actually have to watch it to admire the sheer horror of the train wreck.

        Now some FF fan will come by and defend it because it is different. So is having 11 fingers; you are still a freak. Different isn't always better; sometimes we do things a certain way because they work.

        A simple example is chat channels. Most MMOs do them IRC style. It works, has worked for decades, and we all know it. Not FF14; it uses a system they named linkshells whic

      • Control scheme change is even more apparent in the GTA3 series. GTA3 itself was obviously made for the PC first and then for console; the control scheme is pretty evenly matched whichever platform you choose. But then when you get into VC, everything on the PC side is ok until you hit the first mission for Avery Carrington (flying the toy helicopter) rather early in the game. Missing that right analog stick to control your clockwise/counterclockwise motion makes that mission incredibly difficult to pull off

      • While the intended console port was a limiting factor, I think by far the biggest problem was their being made by Cryptic. In fact by the original guys who couldn't even do the maths to see that a "situational" power could be made to stack with itself twice over at level 22, in COH, or produced balance swings so extreme as to go from City Of Blasters (a devices blaster could floor any enemy's accuracy) to City Of Tankers (tanks became basically invulnerable even to hundreds of enemies at a time.) COH has in

    • It's just what we wanted. Yes, we, and that almost certainly includes you. Remember those times when we were playing our precious games, misunderstood by everyone around us? When we wished they would try, and understand?

      Guess what - it happened! Be happy. Games are now made for general consumption (which also impacts traditional console games: many characteristic genres have almost disappeared, the possibilities of controllers are underutilized, presentation is not what it used to be, and even UIs often forget that scrol
      • Don't generalize; I never wished for such a thing.
        I just wanted, and still want, to have fun, period. All my friends and I have been gamers since the 80s. And most of us find the current game generation to be horrid outside of indie gaming or rare gems.

        • by sznupi ( 719324 )
          Well, generalization (etc.) is the force pushing the market in one direction or the other... and I remember quite a lot of young gamers being fed up with how many people "don't get it"

          (BTW, return sometimes to those games from the 80s - and not only to the ones you remember fondly, but to a similarly broad selection like the one you denounce among current games ... and suddenly the latter won't look so bad in comparison)
          • Heh, indeed. I grew up with a Sinclair ZX-81 and later ZX Spectrum. So at one point in 2000 or so I get a Spectrum emulator and look for some tape images online. So I see one particular game and go, "cool! I remember playing that one!" So I download it and play for a quarter of an hour and then it hits me, "I also remember I thought it sucks." :p

    • "If anything games are becoming more like computer games overall"

      No. Clearly Civ 5, Supreme Commander 2 and a host of other games suffer from neutered game design, as well as game designers reining in their game mechanics for accessibility (i.e. dumbing down in the belief of attracting a wider audience).

      Metacritic:
      Supcom 2 - User score: 6.1
      Civ 5 - User score: 7.0

      • I think the dumbing down of games (let's call it "haloization", although "nothing shall be more complex than Doom" would be more appropriate) is different from consolitis.

        Consolitis leads to horrible interfaces, non-user-controllable cameras in third-person games, gameplay appropriate for someone who has to aim with an analog stick and occasionally restricted saving. It happens because the restrictions of consoles are carried over to the PC when the game is ported.

        Haloization leads to stripped-down game
    • If anything games are becoming more like computer games overall. Traditional console RPGs look more like MMOs now, games require patching and even have DRM...a few quirks introduced by lazy companies that do lazy ports don't make "consolitis".

      It isn't about lazy ports. It's about fundamental design decisions.

      There's been some good discussion on Penny Arcade about the design decisions that've gone into porting Monday Night Combat over to the PC. The mouse means you've got nearly instantaneous turn speed, instead of a fixed rotation speed. So you can pull off attacks that are just plain impossible on a console. So they've had to tweak some fundamental mechanics in the game. Stuff that's simply impossible on a console because you have a limite

  • Seems more like major revisions will come in line with consoles; this doesn't necessarily mean the pace of innovation will slow, just that releases will be further apart.
    • In fact it's likely to be a good thing: programmers will need to make the most of current hardware rather than skipping optimisations just because they know new, faster hardware is always around the corner. Just look at the way the graphics quality of games on consoles increases over the lifetime even though the hardware stays the same.
  • was in the author's mind when he wrote this? It was such a letdown compared to all of the other CODs.

    I really feel where the author is coming from because of all the games you hear that "are awesome" on the consoles. You try them on the PC and they are just horrible. Jittery lag, poor graphics, horrendous controls, and the list goes on and on.
    • Honestly, Modern Warfare 1 was already a major letdown compared to previous CODs.

      Instead of participating in monumental historical battles with awesome atmosphere, I found myself playing yet another generic "hunt the terrists" style game, including forced plotlines to show you some scenery other than generic Middle Eastern ruins...

      I know we were all complaining when Medal of Honor got long in the tooth that we wanted something new and fresh, but please, pretty pretty please, can I have another WWII sh

  • by Revvy ( 617529 ) * on Tuesday February 08, 2011 @02:27AM (#35134714) Homepage
    Video cards push pixels and the number of pixels has stalled in the last couple of years. 1920x1080 is the norm, and there appears to be no push to go higher. I read a great rant [10rem.net] last year that effectively summed it up. You can't blame console games for the fact that PC gamers have screens with the same resolution as their TVs. Blame either the manufacturers for failing to increase pixel density or consumers for failing to demand it. You've got to go to a 30" monitor to get a higher resolution, and the price of those beasts scares most people away. Why pay $800+ for a 30" when a pair of 24" 1080p monitors costs half that?

    ----------
    Still waiting for my in-retina display.
    • There's another party to share in the blame game too: OS makers. It's 2011 and we still don't have a truly resolution-independent operating system (or flying cars, but that's another rant). Gamers are only a very tiny subset of the people who buy monitors, so very few manufacturers are going to cater to their needs expressly. Unlike gamers, normal users aren't really clamoring for denser monitors because their software doesn't play well with "unexpected" pixel densities, i.e. everything gets smaller.

      We are
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        You might want to update your argument to this decade. Windows 7's DPI support is close to perfection. Of course, this assumes you are rating the operating system, and not the flawed applications which run on it.

      • There's another party to share in the blame game too: OS makers. It's 2011 and we still don't have a truly resolution-independent operating system

        Nah, the graphics engines of games don't balk at high-res displays; they shouldn't, anyhow...

        The OS has nothing to do with it. You can select font sizes for OS text in XP...

        It's quite simple, you select a resolution, derive an aspect ratio, create a perspective transform, and presto, all 3D games can run at any resolution. Sure, you'll run into performance problems with lower end (including console) hardware that doesn't support newer higher res displays, but that's because the machines have a limited proce
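        A minimal sketch of that resolution -> aspect ratio -> perspective step, assuming classic fixed-function OpenGL with GLU and an already-created GL context (the 60-degree FOV and the example resolution are illustrative, not from the parent):

            /* Derive the aspect ratio from whatever resolution was selected and
             * build a perspective projection from it (fixed-function GL + GLU). */
            #include <GL/gl.h>
            #include <GL/glu.h>

            void setup_projection(int width, int height)
            {
                double aspect = (double)width / (double)height;

                glViewport(0, 0, width, height);   /* render into the whole window */
                glMatrixMode(GL_PROJECTION);
                glLoadIdentity();
                gluPerspective(60.0,    /* vertical field of view, degrees */
                               aspect,  /* follows the selected resolution */
                               0.1,     /* near clip plane */
                               1000.0); /* far clip plane */
                glMatrixMode(GL_MODELVIEW);
            }

            /* e.g. setup_projection(1920, 1200); the same call works at any resolution. */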

    • by cbope ( 130292 )

      Screen resolution may not be increasing by much these days, but that does not mean graphics capabilities and image quality are not improving. Higher and better levels of AA, anisotropic filtering, tessellation, increased geometry of models and world objects... all of these require more graphics card horsepower. Look at Crysis, even today... years after its release, there is still not a single GPU that is capable of pushing 60fps when running at the native resolution of a 30" panel on enthusiast settings. No

    • by Jupix ( 916634 )

      Why pay $800+ for a 30" when a pair of 24" 1080p monitors costs half that?

      Vertical resolution, PPI and having no bezel in the center of your display.

    • by Krneki ( 1192201 )
      Don't forget the TV LCD or plasma; the prices are much better than monitors.

      But I don't see any point right now in higher resolution; on my 42" plasma I can't see the pixels and I play from 1m away. I do use 2xAA, though.

      P.S: I use a dual monitor setup, so I still have my old LCD for everything else.
    • by ghmh ( 73679 )

      1920x1080 is the norm, and there appears to be no push to go higher

      It's worse than that, they went backwards and then stalled: 1600x1200 (4:3) to 1920x1200 (8:5) to 1920x1080 (16:9).

  • by ynp7 ( 1786468 ) on Tuesday February 08, 2011 @02:29AM (#35134720)
    So you're complaining because you can spend a relatively modest sum to play any game that you want without having to worry about a $500-1000 upgrade every year or two? Get the fuck out! This is unequivocally a Good Thing(tm).
    • by Omestes ( 471991 )

      Huh... I always buy the exact middle-of-the-road video card ($100-130), and they generally last me around the life of the rest of my computer, meaning around 4-5 years. You don't need the bleeding edge, ever. Right now I've got an old ATI Radeon 4650; it's lasted me around 3 years now, and I can play Fallout 3: New Vegas, Dragon Age, UT3, and TF2 at the highest settings. WoW (when I played it) at close to the highest settings, and pretty much everything else I'd want to play at either "high" or "highest"

    • The bad thing is that while you CAN play every game on a $200 card, none of them is WORTH playing.
  • by billcopc ( 196330 ) <vrillco@yahoo.com> on Tuesday February 08, 2011 @02:29AM (#35134724) Homepage

    The summary should have read "FiringSquad ad revenue is on the decline, here's an article about nothing, for you to linkspam".

    Yeah, console games usually make for shitty PC ports, which is freakin' pathetic since the console title had to be developed on a PC in the first place, and today's middleware makes the distinctions largely irrelevant. This is not news. The same was true back in the 80's (minus the middleware).

    My biggest peeve? Not the shitty controls. Not the slightly degraded textures. Not the total lack of post-release fixes. No, my biggest peeve is when a stupid console port restricts your choice of display resolution. It is trivial to pull a list of API-sourced geometries and run with it, rather than hardcode for 720p and 1080p... or worse yet: 640x480, 800x600, 1024x768. Yeah ok, I was running 1024x768 fifteen years ago, it's kinda tired.
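    For what it's worth, a minimal sketch of "pulling a list of API-sourced geometries" - here with SDL 1.2's SDL_ListModes, which is just one of several APIs a port could use, not necessarily what any given game does:

        /* Enumerate the fullscreen modes the display actually supports instead
         * of hardcoding 720p/1080p (SDL 1.2). */
        #include <SDL/SDL.h>
        #include <stdio.h>

        int main(void)
        {
            if (SDL_Init(SDL_INIT_VIDEO) != 0)
                return 1;

            /* NULL pixel format = "whatever the current display format is". */
            SDL_Rect **modes = SDL_ListModes(NULL, SDL_FULLSCREEN | SDL_HWSURFACE);

            if (modes == (SDL_Rect **)0)
                printf("No fullscreen modes available.\n");
            else if (modes == (SDL_Rect **)-1)
                printf("Any resolution is available.\n");
            else
                for (int i = 0; modes[i]; i++)
                    printf("%d x %d\n", modes[i]->w, modes[i]->h);

            SDL_Quit();
            return 0;
        }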

    • It is trivial to pull a list of API-sourced geometries and run with it, rather than hardcode for 720p and 1080p... or worse yet: 640x480, 800x600, 1024x768. Yeah ok, I was running 1024x768 fifteen years ago, it's kinda tired.

      While it certainly shouldn't be impossible, it's not trivial. There are considerations for fixed-size graphical UI elements. You can't just blow things up or, even worse, shrink them down. HUD displays look terrible and text gets unreadable. There are also field of view [codinghorror.com] issues.

      Now I think game makers should be professional enough to take these into account, but it is certainly far from a matter of trivially making a couple of API calls.
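      To make the field-of-view point concrete, here's a small sketch of "Hor+" FOV scaling (pure math, no engine API assumed; the 90-degree 4:3 baseline is just an example assumption):

          /* Hold the vertical FOV fixed and widen the horizontal FOV with the
           * aspect ratio - one of the adjustments a port has to get right. */
          #include <math.h>
          #include <stdio.h>

          static const double DEG = 3.14159265358979323846 / 180.0; /* degrees -> radians */

          static double hor_plus_fov(double ref_hfov_deg, double target_aspect)
          {
              double vfov = 2.0 * atan(tan(ref_hfov_deg * DEG / 2.0) / (4.0 / 3.0));
              double hfov = 2.0 * atan(tan(vfov / 2.0) * target_aspect);
              return hfov / DEG;
          }

          int main(void)
          {
              printf("16:10 -> %.1f degrees\n", hor_plus_fov(90.0, 16.0 / 10.0)); /* ~100.4 */
              printf("16:9  -> %.1f degrees\n", hor_plus_fov(90.0, 16.0 / 9.0));  /* ~106.3 */
              return 0;
          }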

    • Actually, it's nothing short of amazing when a console port today doesn't support the full list of resolutions: you can plug your Xbox 360 into a VGA monitor too, and the console supports just about any resolution you can think of, because scaling is free.

  • The one thing worse than consolitis is inline advertisements injected into the text of an article as fake links. D:

    But going back to the subject at hand, the most glaring recent example of consolitis in a game has to be The Force Unleashed. That game had horrible mouse control which made one boss fight basically impossible. With a gamepad you just had to hold both sticks down, but with the mouse you had to constantly move the mouse downwards for 30 seconds at a time. Arggg.

  • by Matthew Weigel ( 888 ) on Tuesday February 08, 2011 @02:42AM (#35134782) Homepage Journal

    "Though a $500+ video card is considered top of the line, a $250 one will now play pretty much any game at the highest settings with no problem. (Maybe that’s what everyone wanted?) Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind, which will be bad for gamers on consoles as well as PC."

    Making content that looks good at 1080p (or 1920x1200 for some PC monitors) is hard. Some amazingly specialized people spend a lot of time working on it; the more powerful the graphics processor, the more that is possible, but the more art assets have to be created (along with all the associated maps to take advantage of lighting, special effects, shader effects...) and the more programming time has to be spent. Much like the number of pixels increases far faster than the perimeter of the screen, or the volume of a sphere increases faster than its surface area... the work to support ever-increasing graphics power grows faster than the visual difference in the image.
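    A quick numeric illustration of that scaling point (the resolutions are just common examples, not figures from the article):

        /* Linear measures of the screen grow much more slowly than pixel count. */
        #include <stdio.h>

        int main(void)
        {
            int w0 = 1280, h0 = 720;   /* 720p baseline */
            int w1 = 1920, h1 = 1080;  /* 1080p */

            printf("perimeter: %.2fx, pixels: %.2fx\n",
                   (double)(2 * (w1 + h1)) / (2 * (w0 + h0)),   /* 1.50x */
                   (double)(w1 * h1) / ((double)w0 * h0));      /* 2.25x */
            return 0;
        }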

    It's not sustainable, but those advancing graphics processors are a big part of why game developers are moving to consoles: a shinier graphics engine costs more money to develop, which increases the minimum returns for a project to be successful. Anyone who looks at the business side can see that the market of people who have $500 graphics cards is much tinier than the market of people who have an Xbox360 or Playstation3. If you're going to spend that much money on the shiny, of course you're going to shoot for a bigger return too!

    When it takes a big team to develop something... well, that's generally not where the innovation is going to happen.

    • You are only looking at part of the equation though; consoles have a huge cost associated with them, namely the royalties you have to pay the console manufacturer for every unit sold. For the big games it's something like $10/game - not trivial. Now compare that to the PC world: you don't have to pay Microsoft/Apple/whoever a dime to release your game on their system. Let's face facts, the reason there isn't a bigger drive to release more PC content is simply because the sales aren't there. For most games
  • by pecosdave ( 536896 ) * on Tuesday February 08, 2011 @02:43AM (#35134784) Homepage Journal

    I really, really miss Loki.

    I still want to kick someone at Epic in the nuts for not following through with the promised Linux port of UT3. (My copy is still sitting there waiting to be played for the first time)

    If you use SDL and OpenGL you can make it work on everything more easily! /rant complete, my version of PC gaming covered, go back to bitching about consoles and Windows, you Microsoft weenie.

    • That was very annoying. I bought UT3 also thinking there would be a Linux client for it. They even showed screenshots of it running in Linux, at one point. Frankly I've given up on Epic Games. It's a shame they went the way they did because every PC game they made until the UT3 engine had Linux clients. The thing I'm curious about is if id will actually follow through with a Linux client for Rage. Since they're not an independent shop anymore, I hope it doesn't impact Linux clients, and source code
      • Dude, Epic has been awesome ever since the old DOS pinball games they used to have!

        Rumor has it pressure from Microsoft put a lid on the Linux version.

        "It may be difficult to get a Linux game ported over to XBOX and certified, all the Linux code could make the certification process very difficult."
        "But it's just GL and SDL code, there is no Linux code exactly".
          "Oh, there's Linux code in there alright...."

  • "Inflammation of the Console"?

    C'mon now, you can butcher the language in more creative ways than that.
  • by Superdarion ( 1286310 ) on Tuesday February 08, 2011 @02:51AM (#35134822)

    Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind

    Well, that seems good to me. One of the deterrents of PC gaming is the ever-changing hardware specs. If consoles have already proved that we can live with the hardware power from 6 years ago and still make games that look quite impressive (at least, sufficiently good), perhaps it's time that computer videocards slow down and allow the population to catch up. It sucks buying a $250 video card just to have to replace it in 2 years, whereas this-generation consoles have lasted 6 years. The solution is, of course, to buy a $500 videocard, which will be good for a few years, but with that money you can get a console with controllers and one or two bundled games, so why bother? Not to mention buying a decent mouse, keyboard, screen and speakers.

    Perhaps we should even learn from the Wii and the indie games, which can run on computers 7 years old! Why must we have a new hyper-mega-powerful $600 video card every year?

    Sure, one could argue that video-game developers could actually take advantage of the new hardware (DX 11, anyone?) and have amazing-looking games, but why bother? Do we really want more realism, graphics-wise, than the MOH and COD franchises currently offer? I think that the success of those franchises, especially the last three CODs, speaks for itself. We don't need a new over-hyped video card every six months; we don't need a thousand different model names that no one understands; we don't need cutting-edge technology to make games. And certainly, we don't need to have such a hostile environment as PC gaming is, which just turns away most would-be gamers.

    That is, truly, what the consoles do right. You don't have to know anything about computers or videogames to pick up one and within minutes start playing your new videogame. You need not install, tweak or configure in any way your games or consoles. You need not update to the latest card drivers. You need not replace any part of your console (except the ones that stop working) every two years; you don't need to worry about system specs, and figuring out if your GT 250 is better or worse than a GT 260 or a HD 5730. Finally, while I'm on it, you need not worry about fucking DRM in your console games, although that's another story (and perhaps the trade is fair, for PC gamers need not fear that their PC manufacturer suddenly bricks their computer... unless sony is involved).

    Besides, everyone keeps complaining about how games nowadays focus on looking stunning and having great sound effects - basically, putting too much effort into the media part of the game while slacking off in other areas, like immersiveness, story, character development and all that. Now they're saying "we should have better graphics now!". I call bullshit.

    • by cbope ( 130292 )

      It's all a tradeoff, or more accurately a price-shift that occurs with consoles. Ever notice that console games tend to cost quite a bit more than their PC equivalent? Thanks, but I'll take my general purpose PC that I can use for many different tasks, is upgradeable, can run games with better graphics than any console (unless it's a damn cross-platform title), and games that are cheaper.

      On a related note, practically every major RPG released for PC recently has been crippled as a cross-platform "port". I'm

    • The problem with the current generation of consoles is simply that they won't be upgradable. Lots of people bought into a console for the first time in their lives, and they will be in for a major disappointment when the next generation comes along. The reason: they have plunked hundreds of dollars into games, and once the next gen hits, there is a huge chance the games will not play on the new console anymore.
      Every console so far has become a doorstopper to some degree after a while, Nintendo being better than the others by t

      • Uh, the previous generation of consoles does not 'become a doorstopper' when the new one comes along.

        Not only did buying a PS3 not make my PS2 stop working, my Playstation 1 still plays all the games I have for it!

        And despite the Jaguar existing, my Atari 2600 still plays all those old carts.

        Backwards compatibility in consoles isn't really needed. (Having it and then DROPPING it, like Sony did with PS2 games on the PS3, is stupid. But if they'd never had it in the first place, it wouldn't have been that big

    • The solution is, of course, to buy a $500 videocard, which will be good for a few years, but with that money you can get a console with controllers and one or two bundled games, so why bother?

      Or buy a cheaper video card and play on lower settings.

      New games generally work on older computers with reduced graphic settings. Yes, you will not be able to see all of the graphic awesomeness of the new game, but if all games targeted your video card, nobody would be able to get the better graphics. Now you can still enjoy the game with lower settings on cheaper hardware, while I can play it on higher settings on more expensive hardware.

      Also, to me, keyboard+mouse is much better than a controller.

      Finally, while I'm on it, you need not worry about fucking DRM in your console games, although that's another story (and perhaps the trade is fair, for PC gamers need not fear that their PC manufacturer suddenly bricks their computer... unless sony is involved).

      DRM is a

  • Reducing the amount of money I have to spend on video cards is not a bad thing. Control and gameplay problems are. Dead Space on PC was totally unplayable because the mouse input was converted to an analog stick-style velocity input, capping its max speed and forcing me to flail wildly at my desk just to turn around. Mass Effect doesn't let me hit escape to back out of menus. Aliens Vs. Predator was about as interactive as Doom -- point at the glowing quest object and hold down the use key; repeat fifty tim

    • by sznupi ( 719324 )
      It goes both ways. For example, titles also revolving around pointing at things ... but a very different kind of it: proper light gun games. They virtually died out with the arrival of the current console generation, apparently sort of replaced by games offering a hybrid kind of gameplay.

      (yes, that's largely due to abandonment of CRT; not much of a... consolation)
    • by Tukz ( 664339 )

      This is exactly the problem.
      I've seen a lot of comments about graphics, but graphics isn't the problem with "consolitis".
      It's controls and game play.

      I've posted on this subject before in another thread, but I'll make a short recap of what I said then.

      The major problem with "consolitis" is that the gameplay mechanics used on consoles do not always work on PC.
      You cannot take a game made for console and port it directly to PC. It just won't work in most cases.
      The game play is all wrong.

      You need to dumb things dow

  • by macraig ( 621737 ) <mark@a@craig.gmail@com> on Tuesday February 08, 2011 @03:19AM (#35134956)

    It's no big deal, really... I had my consils removed when I was a kid and I turned out (mostly) fine. Now I game on PCs and I'm better for it.

  • CoD: Modern Warfare 2 is a pretty good example of consolitis, though certainly not as bad as Black Ops.

    When MWF2 came out there were a lot of complaints from PC gamers about the lack of a console, the lack of dedicated server support, inability to change field of view from default, etc.

    As a PC Call of Duty fan, imagine my surprise and joy when I stumbled upon AlterIW [alteriw.net], a community hacking project that fixes all that. To add insult to injury, the hack is designed to slipstream into a SKiDROW torrent of
  • The Elder Scrolls IV: Oblivion suffers from "consolitis" in that the controls just aren't right for a PC. For example, why can't I click on a chest and have it open automatically to allow me to pick stuff up, without needing to press a button to open it?

    On a console, having a separate "open chest" button made sense, but not on a PC with a mouse.

    • Not sure which console version you are talking about - or maybe I am misunderstanding you. Elder Scrolls: Oblivion does not have a separate "Open Chest" button.
      On my copy you use the same button to open doors, open locks, and talk to people.

      The crosshair is context-sensitive, changing shape depending on whether the item underneath it is an NPC or another object you can interact with in the game.
      The action button on the PS3 (X), I think, is the same button used for all these actions.

      How does this work on a PC - The mech

      • by jonwil ( 467024 )

        Yeah, I think it's the same on PC in that there is a single "action" button. But IMO it would be better if it was more like some PC RPGs where you just click on things that are activatable or actionable.

        • To do that you'd first have to hit a button to release the mouse from mouselook so you had a pointer with which to click on things.

          I'm pretty sure Oblivion will let you remap 'Use Object' to a mouse button if you really want.

          And if the base game won't, I'm positive there's a mod that will.

  • by tlhIngan ( 30335 ) <slashdot.worf@net> on Tuesday February 08, 2011 @03:29AM (#35134996)

    10 years ago, a good chunk of gaming was done on PCs because consoles were crap - standard def, too-small TVs, and the like, so people bought nice high-end PCs and invested in them. Dropping $2000+ on a PC wasn't unheard of nor unusual.

    These days, spending more than $500 on a PC is very unusual - only Apple and PC gamers do that stuff, and really, it's no surprise why. And that $500 gets you a monitor, keyboard, mouse, speakers and other accessories.

    Who's the #1 graphics chip maker in the world? It's not nVidia or AMD, it's Intel. (Sure, nVidia and AMD have the discrete graphics market, but that's a really tiny chunk of the whole PC market). When PC prices plummeted below $1000 and then below $500 (and laptops became "netbooks" below $500) manufacturers know that the average PC buyer cares about Gigs (hard drive space), Gigs (RAM) and Gigs (CPU GHz). Nowhere do they really care about graphics - after all, Windows does just fine on Intel graphics, and that's all the user sees.

    The higher-end PCs with discrete graphics sell far less; even one with low-end graphics may be considered a gaming PC (and little Jonny's mom and pop aren't buying a PC for games, oh no, they want it so Jonny can work).

    PC gaming is huge - after all, FarmVille and the like don't require super high-end ultimate graphics chips, and many popular indie titles have requirements so lightweight that even the cheapest of netbooks can play them.

    The problem is, as we all know, Intel graphics are crap (though they're supposed to get better now with nVidia), and can barely do 1080p video decoding and high-def gaming.

    So people buy a console as well - and with HDTV, they get high-def on the big ol' 52" HDTV versus their 17"/20" PC monitor (or whatever is free these days). They could buy it on a PC as well (it's easy enough to do), but that requires spending money buying more PC - they could build/configure a great PC for $600, but that's over the "cap" of PC prices of $500. (Everyone gasps at the price of a $1000 MacBook Air, comparing it to a $300 netbook, despite better graphics (Intel vs nVidia) and CPU (the Core2Duo is old, but runs rings around Atom), SSD, RAM, etc.)

    Hell, I tried to convince someone to spend $1000 to buy a decent laptop and they balked.

    No, it's not consoles limiting graphics of games - it's PCs themselves. The number of people with high end $600+ video cards (or probably any nVidia or AMD graphics cards of the past say 4 years) is very small compared to the total PC market. And we know PC gaming is larger than console gaming, but they're all for games that can play on the #1 video card on the market.

    And developers go for the money - there are more console gamers out there than hardcore PC gamers with power graphics cards (and the willingness to upgrade yearly or so) - even though there are more PC gamers in general. Other than that, consoles and PCs are pretty much plug-and-play (and Sony's making the PS3 a PC experience with installers, EULAs, serial keys, online DRM, oh my).

    • No, it's not consoles limiting graphics of games - it's PCs themselves. The number of people with high end $600+ video cards (or probably any nVidia or AMD graphics cards of the past say 4 years) is very small compared to the total PC market. And we know PC gaming is larger than console gaming, but they're all for games that can play on the #1 video card on the market.

      I wholeheartedly disagree. I have a Core2Duo from about 4 years ago, and a two-and-a-half-year-old graphics card that cost me $160 new at the time. The most recent game I've played was Lost Cause 2, which gave me recommended settings of 1920x1200 with everything on max. I also played Split Second recently, which is also a console-to-PC port; again, the recommended settings were 1920x1200 with everything on max.

      So ... where is my motivation to buy the latest and greatest video card again? This is a chicken and

    • Basically your entire rant is this: The vast majority of cars are boring grey boxes on wheels used for getting from A to B, so there is no market for special after-market kit...

      Oh wait.

      Or: The vast majority of people just want their cheap food in a restaurant to come with a toy, so there is no market for fine restaurants that do not ask you to supersize your order.

      Oh wait.

      Basically, just because a segment of a market is not the entire market doesn't mean that you can increase your earnings by going for the whole

    • by slim ( 1652 )

      10 years ago, a good chunk of gaming was done on PCs because consoles were crap - standard def, too-small TVs, and the like, so people bought nice high-end PCs and invested in them.

      ... And 20-30 years ago, a good chunk of gaming was done on consoles because home computers were crap. Arcade cabinets had the best graphics because they had expensive custom hardware; consoles played second fiddle, but could at least handle sprites with some aplomb; home computers were in distant third place.

      Things have changed; they will change again.

  • I am still on a 3-year-old mid-range PC graphics card which I got back then for $150; it still runs pretty much every new game which comes out on the PC at mid to high-end settings.
    The reason: the stalling of the upgrade cycle caused by the last console generation.
    The funny thing is, if you want cheap gaming it is currently the PC, the games are cheaper and usually hit the bargain bin earlier, and given that the consoles lack severely on the hardware side and PC only development has come to a standstill or

    • The funny thing is, if you want cheap gaming it is currently the PC, the games are cheaper and usually hit the bargain bin earlier.

      This.

      It seems people never take into consideration the price of the actual games when they're making the PC vs Console price argument.

      In general, AAA titles will run you 10-20% more on release on consoles than on PC. Also, there are thousands of Indie games on the PC that never make it to consoles. Some of them are awesome, and most of them are dirt cheap.

      $600 might seem like a lot compared to $300 for the console hardware, but over their lifetimes the cost will even out. Plus, the PC will let you do a lot

  • ... that is underserved. The game company that can serve the underserved with the right game that doesn't dumb itself down will hit it big.

    The real issue is that game developers are losing touch with gamers by trying to copy WoW and are caught up in multiplayer hysteria. When EA says it is the "end for the single player game", I laugh. These jokers are just creating markets for other, more ambitious and far-seeing people to move into.

  • He mentions this under "Missing Features" but I find the most annoying thing to come from ports of console titles to PC is the inability to save more often than a console user. I don't want "checkpoints", I want to be able to pause, save, quit whenever I like. Or to quicksave after a long drawn out exposition-in-cutscene by an antagonist, prior to another Final Battle. Most of all, I don't want to have to juggle "save slots", because the original console it was aimed at has limited storage. I have terabytes
    • by slim ( 1652 )

      I don't see this as a console related thing. There's nothing about console hardware that prevents arbitrary saves -- modern consoles have plenty of storage.

      Rather, it's a game design thing. It's about usability, challenge and player satisfaction. For example, being able to save immediately before a difficult jump in a platformer would ruin the challenge, and result in a boring game of { save; jump; while(!success) { load; jump; } }

      Sure, they get it wrong often. But that's not because of any constraints brou
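      (A hypothetical toy version of that save-scumming loop, with save/load/jump stubbed out as simple stand-ins rather than any real game's API:)

          #include <stdbool.h>
          #include <stdio.h>
          #include <stdlib.h>

          static void save(void) { puts("quicksave"); }
          static void load(void) { puts("quickload"); }
          static bool attempt_jump(void) { return rand() % 5 == 0; } /* pretend 1-in-5 odds */

          int main(void)
          {
              save();                  /* save right before the hard jump */
              while (!attempt_jump())  /* failed?  just reload and try again */
                  load();
              puts("made it - challenge neutralized");
              return 0;
          }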

  • First thing you learn when selling is people buy FEATURES. Flash a shiny object and hear "ooohhh...me want".
    The benefits exist to answer the objection from their more critical nature (or their wife): "do you really *need* it?".
    The $500 video card is probably just as good as the $250--maybe even better in some way.
    No, you don't need it. You want it. If you want it, then buy the damned thing.

    Buy your billet aluminum shifter knob and enjoy it. It's a complete waste of money, but you get to pretend you're a ra

  • OK, I can believe that in some way, PC gaming is affected by "consolitis" - but by the same token, console gaming has been affected by "PCitis". Console gaming is dominated by first-person / over-shoulder shooters nowadays; a leak from the world of PC gaming which (to me) isn't particularly welcome.

    FPSs with a PC heritage are probably the reason dual-analogue joypads became the norm; that's a mixed blessing because it improved the arena shooter (compare the analogue Geometry Wars with the 8-directional Robo

    • by timftbf ( 48204 )

      This.

      "Games" and "first-person shooters" are *not* equivalent, as many of those complaining about putting a cross-hair exactly where there want seem to think.

      I'd be quite happy for the genre to fade into obscurity on consoles and go back to PCs / mouse / keyboard, leaving console developers to concentrate on things that interest me.

  • Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind, which will be bad for gamers on consoles as well as PC.

    I don't agree. In my 30+ years of computer gaming, I have noticed that software never, ever falls behind. What does happen is that it moves in cycles or waves of innovation where some startup comes out with a completely new way of doing things and then all the big players either copy them, or in the case of EA - buy

  • Though a $500+ video card is considered top of the line, a $250 one will now play pretty much any game at the highest settings with no problem. (Maybe that’s what everyone wanted?) Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind, which will be bad for gamers on consoles as well as PC.

    Because I'm certain all PC gamers want to return to the days of Crysis, when a top-of-the-line computer still wasn't strong enough to run a latest release game with anything more than average graphics settings unless they were willing to drop a few hundred on a new video card every six months.
