PC Gaming 'a Generation Ahead' of Consoles, Says Crytek Boss 412

Crytek co-founder Cevat Yerli spoke recently about the growing gap between modern PCs and consoles like the PS3 and Xbox 360, saying that the desire to develop for multiple platforms is hampering creative expression. "PC is easily a generation ahead right now. With 360 and PS3, we believe the quality of the games beyond Crysis 2 and other CryEngine developments will be pretty much limited to what their creative expressions is, what the content is. You won't be able to squeeze more juice from these rocks." One reason this trend persists is the perception that PC game sales are not high enough for most developers to focus on that platform. Rock, Paper, Shotgun says this indicates a need for the disclosure of digital distribution sales numbers, which could dispel that myth. Yerli's comments come alongside Crytek's announcement of a new military-themed shooter called Warface.
This discussion has been archived. No new comments can be posted.

  • by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Thursday November 25, 2010 @06:29PM (#34345750)

    Before you start saying that these consoles are essentially tapped out, keep in mind that the PS3 isn't near its full potential yet.

    PS3 still not maxed out - Andrew House (SCEE President)
    http://www.computerandvideogames.com/article.php?id=248275 [computeran...ogames.com]

    PS3 hard to develop for on purpose - Kazuo Hirai (SCEE Chairman)
    http://www.computerandvideogames.com/article.php?id=248275 [computeran...ogames.com]

    Now, when you've finally "tapped out this rock", then come back and complain. Until then, blame yourselves for your inability to develop good games that take full advantage of these platforms.

  • by aristotle-dude ( 626586 ) on Thursday November 25, 2010 @06:30PM (#34345756)

    Just look at the newest games and how badly they perform on supposedly "powerful" machines. These games are not more creative, just flashy and poorly coded.

    Take Rage on iOS, for instance: it rivals some console graphics but is not running on powerful hardware. It has to run on a device with less than 512 MB devoted to the game and no access to virtual memory. PC games are written by people who could not code on embedded machines if their lives depended on it. Sloppy code.

  • Bullshit (Score:4, Insightful)

    by NeutronCowboy ( 896098 ) on Thursday November 25, 2010 @06:30PM (#34345764)

    I haven't seen anything innovative done on a PC that couldn't have been done on a PS2. Crysis 2 is innovative? Oh please. Two extra bullet-points on the back of a box do not make a game "innovative". Portal: innovative. Tower of Goo: innovative. Minecraft: innovative. What do they have in common? They could run on hardware that is 10 years old.

    I think Mr. Crytek fails to see past his own problem: the shiny that his company specializes in does very little to make a game special.

  • by mikaelwbergene ( 1944966 ) on Thursday November 25, 2010 @06:35PM (#34345794)

    And it has happened again, as it has happened in every single generation of consoles and as it will in every future one.

    One platform is constantly shifting and upgrading; the other isn't.

    What do you think happens in the gap between console releases?

    Unfortunately they're currently too busy trying to milk motion controls and using that as an excuse to not release new hardware. Hopefully Nintendo will just out of nowhere drop a magic console developed using their profits from their current gen console.

    Either way, some games are better on consoles (fighting, local multiplayer, driving games, etc.), while for other games I prefer mouse and keyboard support (simulation, RTS, FPS, etc.)

  • by Average_Joe_Sixpack ( 534373 ) on Thursday November 25, 2010 @06:35PM (#34345796)

    As long as the developers target the consoles and the PC, you only have to match the specs of that console generation.

  • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Thursday November 25, 2010 @06:48PM (#34345914) Homepage Journal

    PC is easily a generation ahead right now.

    Wii showed that graphical output isn't the only thing that defines a hardware generation. In the seventh generation, while Microsoft and Sony were moving their output forward by a generation, Nintendo moved its input forward by a generation by bundling a Bluetooth handheld pointing device with the console. It took the other guys years to come up with Kinect and Move to match the Wii Remote.

    But the major consoles are still ahead of PCs in how many simultaneous players a game will usually support. This is in part because consoles are ahead in what monitor size their makers can encourage their users to connect. Sure, using a TV as a monitor has been easy since HDTV became common starting in 2006, but home theater PCs are still a rarity for some reason. Is it usability, or is it a plain old path-dependent Catch-22?

  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Thursday November 25, 2010 @06:50PM (#34345932)
    Comment removed based on user account deletion
  • Re:Captain Obvious (Score:3, Insightful)

    by SuricouRaven ( 1897204 ) on Thursday November 25, 2010 @06:57PM (#34345976)
    There's another thing to blame for the Wii's library limitation: Nintendo's marketing. Like Disney, the company is very dependent upon protecting their image as wholesome and family-friendly now, and must do their best to protect it from the taint of being associated with anything sexual or violent. So they are quite strict about what they allow to be published for the Wii. It's not entirely devoid of violent games, but it has proportionally fewer of them than the other major consoles. If you like U-rated games, on the other hand, it's got loads.
  • by Renraku ( 518261 ) on Thursday November 25, 2010 @07:00PM (#34345994) Homepage

    To understand the poor coding, you must understand the game industry and the choices they make. I'll explain using analogies that everyone can understand.

    Example 1: Your task is to build a house. You can make your own brick, cut your own lumber, pour your own concrete foundation using concrete that you mixed, do your own plumbing, etc. The quality of your house is based on however much time you feel like spending to do it yourself. Obviously this would take far far too long, so you opt to use materials already created. You buy all the ingredients. Obviously some may not be up to your standard, but the loss of quality is relatively low compared to the vast amount of time you will save. You've given up a little and gained a lot.

    Example 2: Your task is to build a house. You have three days to do it. The previous house, using the components you purchased, took several weeks to build. Your only solution is to use modular components. AKA, bed room. Living room. Kitchen. Bathroom. Assemble with a crane, connect together on a foundation, voila. A house. The quality suffers quite a bit using this pre-built solution, but you got the job done on time. It was the only way you could do it. You gave up a lot to get the job done on time.

    Example 3: Your employer now realizes you can build houses in three days, and that there's a high demand for your house building services because you did such a good job in example one. Still, your employer thinks you can build it a little faster. Two days to build the house now. They know people won't care about the quality because once they've bought it, they've paid for it. As long as it still meets the most basic definition of a house and doesn't endanger the lives of the people living in it, it's suitable for sale. Your only option is to make a house factory and simply air lift the house in once complete. You don't even have time to secure the thing to the damn foundation.

    So we've gone from perfect house to shitty house that will slide off its foundation in a strong wind. This is how the game industry is. They HAVE to use shitty tools and shitty coding to slop things out the door as fast as they can, because the marketing team has promised Call of Duty Black Ops 2 and 3 to be out by February and won't even tell the developers this until January 25th. Guess what department the executives are in?

  • by fuzzyfuzzyfungus ( 1223518 ) on Thursday November 25, 2010 @07:06PM (#34346036) Journal
    I'm guessing that we should just take the president and chairman of Sony Computer Entertainment Europe, a couple of non-techie suits with a nontrivial stake in saying nice things about their product, at their word when they assure us that the PS3 will achieve photorealistic graphics and save the whales, if only those lazy developers would do it right? Isn't this the same Sony whose PS2 "Emotion engine" was supposed to have been delivering cinematic graphics, according to their marketdroids?
  • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Thursday November 25, 2010 @07:06PM (#34346038) Homepage Journal

    You can play pc games with almost any input device you want, including xbox 360 controllers.

    Say I've hooked up my PC to a TV [wikipedia.org] and connected four Xbox 360 controllers through a USB hub. How many controllers does a typical major-label game designed for the PC support? One. Instead of adding shared-screen play, publishers expect players to buy four PCs, four monitors, and (more importantly) four copies of the game.

  • Re:It took 4 years (Score:3, Insightful)

    by MightyMartian ( 840721 ) on Thursday November 25, 2010 @07:13PM (#34346080) Journal

    It's also an issue of market. High-end gaming PCs make up only a small part of the whole PC market. If you did make games that required the horsepower of a $2000 gaming machine, I doubt you would see much profit. Yes, technically consoles are a generation behind, but if you're looking at selling lots and lots of copies, you want stable hardware specs. Most PC games are probably sitting a generation, or at least half a generation, behind the full-throttle systems as well, simply because you want as large a market as possible, and so have to have at least some level of playability on mid-range PCs. The same rules apply.

    I fail to see what hardware has to do with creativity anyways. Yes, better specs can certainly improve graphics, but that's only one piece of the puzzle.

  • by Nyder ( 754090 ) on Thursday November 25, 2010 @07:15PM (#34346102) Journal

    I think a lot of people are missing the point here.

    They are talking about hardware, not how current PC games compare to their console counterparts.

    See, this is the problem. PCs are capable of so much more, yet we get a dumbed-down console port instead of a game tailored to the extra capability modern PCs can bring you.

    Most PS3 & 360 games are barely 720p, usually less, with crappy AA, etc.

    A modern PC can do 1080p with max AA and not break a sweat. And not break your bank: get an Nvidia 460 1GB card for $200 and you've got yourself a nice card that kicks ass.

    And yes, I'm a gamer. Been so for 30+ years. I prefer my PC for gaming (even got me 3D Vision, which rocks), but I do have an Xbox 360 (JTAG'd), a Wii (softmodded), and will have a PS3 whenever I get enough money for it (yeah, and I'll hack it too, because that's how I roll).

    It's funny, because I remember when arcade games were the better graphics systems, and computers & consoles tried to be that good. Then the computers surpassed both the consoles & arcade games. And we, the computer gamers, have been paying for it ever since.

    (Sorry, but when the PS3 & 360 came out, their graphics weren't really on par with computers; they were already behind, and it's a bigger gap now.)

  • by el3mentary ( 1349033 ) on Thursday November 25, 2010 @07:28PM (#34346168)

    Like Disney, the company is very dependent upon protecting their image as wholesome and family-friendly now

    Yet explain how a Disney subsidiary green-lit Kill Bill.

    I would have thought that was obvious. Miramax was bought by Disney in 1993 in order to let them release more adult-oriented films without hurting the main brand. A Disney subsidiary green-lit Kill Bill precisely because it was a subsidiary and not the main brand.

  • Re:Captain Obvious (Score:3, Insightful)

    by Lord Bitman ( 95493 ) on Thursday November 25, 2010 @07:29PM (#34346186)

    I promise you it is not the graphics that stop me from playing games on the Wii.
    It's the fucking awful games.
    It's the controls.

    Every Wii game I've played has come in one of two flavours:
      a) Games that try to use the Wii-mote as advertised, the result being horribly awkward and ultimately impossible-to-enjoy inconsistent fumbling, as the Wii really sucks at motion control.
      b) Games where the developers realized the limitations of the Wii and compensated by making the controller a prop which doesn't actually do anything. "Let's play pretend! Now you're a sorcerer! Here, hold this stick, it's a MAGIC WAND!" Entertaining for five minutes, maybe, but once you realize that moving the stick around doesn't actually have any more effect on the game than sitting on the controller at the appropriate time, it loses its appeal fast. I can play pretend all by myself /without/ standing in front of a TV.

    I have heard that Wii MotionPlus improves greatly on what amounts to Nintendo saying it had a great idea for a console, then getting really hung over and writing its homework out in five minutes before class. I don't have a compelling reason to blow money on it when apparently all it has going for it is "makes the Wii act like they said it would, on some new games designed for it." Especially when there's a new motion controller for another console which doesn't even need to make wild guesses about where your arms are.

  • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Thursday November 25, 2010 @07:30PM (#34346194) Homepage Journal

    I can slap a disc in a $300 box with no buttons, and play it.

    Unless the game isn't for your $300 console. Imagine that your friend has recommended a PC game to you. You check the developer's web site to see if a version is available for your console, but you find that the developer has posted a rejection notice from the console maker. Various overheads associated with becoming an authorized console game developer are part of why indie games tend to be PC exclusive. Even among major-label games, many are exclusive to a console you don't have, and by the time you've bought all three consoles, you've spent more than a gaming PC costs.

    Hundreds of dollars for video cards, extra memory, high-end CPU's? You gotta be kidding me.

    Yeah, it is silly, especially when a $300 ION nettop with a GeForce 9400 GPU can run indie games, older games, and even some newer games at lower graphics settings.

  • by Anonymous Coward on Thursday November 25, 2010 @07:33PM (#34346208)

    I would agree that there is a generational gap between true gaming PCs and consoles. That's always going to be the case: the upgrade and refresh cycles of gaming PCs are much shorter than those of consoles. However, the console market is much larger than the true gaming PC market. In order to expand beyond this niche, game developers have to target "standard" PCs, and that is where the variability in hardware capabilities is an issue. If I develop a game for a console, every user is going to have essentially the same hardware (storage and peripherals may differ, but the core product is the same). Microsoft has tried to address this with WinSAT scores, Games for Windows certifications, etc. However, at some point game developers have to compromise on a common denominator for hardware specs. To match the size of the console market, my guess is that the PC specifications would be comparable to or possibly less powerful than the latest generation of consoles (Xbox 360, PS3).

  • Interesting (Score:2, Insightful)

    by vampirbg ( 1092525 ) on Thursday November 25, 2010 @07:34PM (#34346220)
    Maybe someone should tell him that it's the GAMEPLAY that matters, not flashy graphics. I never did like Crytek's games because they felt more like tech demos than real games. Also, consoles have one more advantage: if I want to play a game, I just stick the disc in and that's it. No worries about whether my drivers are current, or whether my combination of motherboard + graphics card will cause a problem, etc. It's also much cheaper to be a gamer on consoles. Sure, the games are more expensive, but ask yourself how often you have to upgrade your machine. I did it every 6-12 months, and each time I spent around $500 (new motherboard, new graphics card, and usually a new CPU) just so I could play the latest games with details on max.
  • Re:Bullshit (Score:5, Insightful)

    by blahplusplus ( 757119 ) on Thursday November 25, 2010 @07:43PM (#34346270)

    "Crysis 2 is innovative? Oh please. Two extra bullet-points on the back of a box do not make a game "innovative"."

    The great irony of you saying this is that the reverse is true: console game quality is hurting PC game quality. PC games have been dumbed down for consoles and consolized for multiplatform release.

    Console ports for the PC also get sloppy seconds due to multiplatform release. We saw the awful Games for Windows Live inserted into Gears of War for PC. We also saw how badly Halo and Halo 2 were ported to PC. Halo was originally a PC game they had to fit onto the first Xbox because MS needed a game to sell the system.

    Don't believe that console games have affected PC game quality? Check out SupCom 2's and Civ 5's terrible reviews on Amazon.

    Civ 5
    http://www.amazon.com/Sid-Meiers-Civilization-V-Pc/dp/B0038TT8QM/ [amazon.com]

    Supcom 2

    http://www.amazon.com/Supreme-Commander-2-Pc/dp/B002BXN6GY/ [amazon.com]

  • Re:Bullshit (Score:5, Insightful)

    by Haeleth ( 414428 ) on Thursday November 25, 2010 @08:25PM (#34346456) Journal

    Also blocky models, blurry textures, and horrible terrain pop-in. Yes, it could be done on the PS2, and it's an absolutely amazing bit of work considering the hardware limitations.

    But that doesn't mean it couldn't have been even better with PS3 technology, and even better than that with today's PC technology.

    You don't need flashy graphics to make a good game. But if you acknowledge that the quality of the visuals is one of the things that allowed SotC to become a work of art, how would it not have been improved by the ability to render those visuals exactly as its creators envisaged them, instead of having them limited by technology that was lagging well behind the state of the art?

  • Re:Bullshit (Score:5, Insightful)

    by NeutronCowboy ( 896098 ) on Thursday November 25, 2010 @08:26PM (#34346464)

    Yes, it is indeed pretty clear they're talking about graphics. It is also pretty clear that when they say "is holding back creative expression" and "holding back quality games", what they mean is that all their creative expression and quality work is going into making a game prettier. Which in turn means they have no idea how to make quality games.

    That's what I'm calling bullshit on: the idea that creative expression is identical with fill rates or polygons/sec. I'm sorry you were so gung-ho to call me on my snobbery that you missed that point.

  • Re:Bullshit (Score:2, Insightful)

    by mikaelwbergene ( 1944966 ) on Thursday November 25, 2010 @08:53PM (#34346582)

    Technological innovation doesn't count now? I thought this was news for nerds.

  • by billcopc ( 196330 ) <vrillco@yahoo.com> on Thursday November 25, 2010 @09:00PM (#34346628) Homepage

    Disney didn't make your TV, so the subsidiary doesn't care that Kill Bill is out on DVD. Contrast this with the Wii, where it matters not who made the game: it is known to its users as a "Wii game", and thus has a direct association with Nintendo's brand and image. You play it on a Wii, it says "Nintendo" on the packaging... you get my drift.

    Nintendo's kid-friendly image is a huge part of their business strategy, they automatically win all the overprotective parents who are terrified of the Xbox and its filth-laden Live service, where everyone and everything is a "nigger" and/or "faggot" according to its prominent users. I can't speak of the PS3 since I don't have one, but I would speculate that the it is not much different, due to being marketed to the same adult / hardcore crowd as the Xbox. Hell, there was a (shitty) game on the old Xbox where victory resulted in a "Girls-Gone-Wild" style clip being presented as your reward. You'll never see vodka-doused tits on a Nintendo console, that's for sure!

  • by PopeRatzo ( 965947 ) * on Thursday November 25, 2010 @09:11PM (#34346684) Journal

    The PS3 has way too much power in certain areas that aren't necessary. In areas like GPU and memory, though, it's pathetic.

    For me, the biggest weakness of all consoles is the controller. PS3 and XBox controllers force game developers toward silly simplifying moves like the abominable "third person shooter". I guess if you're into puppetry it might be fun, but if you're looking for anything like an immersive experience, third person shooters aren't going to get you there. No matter what you do, you're looking over the shoulder of a character who, for some reason, doesn't seem to understand that sometimes you want to jump over the box and sometimes you want to use it as cover.

    I wouldn't mind so much if the net effect of the ubiquity of consoles was just that it slowed the development of graphics for PC games, but it's done something much much worse: it's forced PC games to adopt horrible control mechanics and idiotic point of view, and for no better reason than the limitation of the console controller.

    It amazes me, decades in, to see how clumsy console controllers still are. That's not to say that it's impossible to get somewhat used to a console controller, but even when you've mastered them, it's still an ergonomic nightmare. In online gaming with PCs, you can always tell when someone's using a console controller. Not that they're necessarily going to be worse than someone who's using a keyboard and mouse, but there are certain tell-tale signs.

    And the "alternative" controller schemes, like the Wii and even the Kinect, are still completely unable to handle fine movements. If you want to swing a bat or a sword, you can use a Wii, but if you want to strafe while picking off the enemy from a crouched position and switching to a different weapon or reloading, good luck. I'm interested in seeing where the Kinect will go, but until they make Kinect controllers for my PC, I'll never know. I did my best to warm up to a PS3 for more than a year, but (about the time MW2 came out) I finally just gave up and went back to PC gaming. The fact that Sony continues to be hostile to its customers was no small part of that decision.

    The best thing that can happen to PC gaming, in my opinion, is for simple hacks for the PS3 and XBox to become readily available so games can easily be copied and shared. Personally, I'm surprised that so many console gamers have chosen to accept punishment so readily for PC gamers' filesharing. Especially since there's very little evidence that filesharing has in fact hurt PC game developers.

  • by Gadget_Guy ( 627405 ) * on Thursday November 25, 2010 @09:20PM (#34346738)

    Just look at the newest games and how badly they perform on supposedly "powerful" machines.

    That is wrong. At the default (mainstream) settings, virtually all games play well on the current level of gaming computers. I will concede that there have been some console ports that perform so poorly you wonder whether they are running under a console emulator, but that is not representative of all PC games.

    The people who complain about poor performance are those who insist on pushing all the game settings up to maximum. The reason PC games have adjustable settings is to serve the people who spend stupid amounts of money on their systems, to extend the shelf life of the game by future-proofing it, and to make pretty screenshots to help sell the game.

    People often use your argument as a reason why console gaming is better, but that is because console games don't have the option of increasing the video settings to maximum. They are fixed at the mainstream level. And often the default mainstream settings in a PC game will still look better than the console version.

    Finally, if you decide to revisit an old game in a few years' time, your console game won't age as well as a PC game, because you will be able to use all the maximum settings on your upgraded PC. That comparison assumes your PS4 or Xbox 720 will actually run the old software.

  • by jjohnson ( 62583 ) on Thursday November 25, 2010 @09:29PM (#34346772) Homepage

    There was never a need for yearly upgrades. Current games have always been comfortably playable at less-than-max settings on PCs two or three years behind the latest and greatest. It's gamer dick-swinging that led the misguided to constantly chase the "current" hardware; the producers of PC games always allowed for older machines.

  • by Sycraft-fu ( 314770 ) on Thursday November 25, 2010 @09:54PM (#34346880)

    Look, man, there is nothing wrong with liking gameplay. I am a full supporter of the "games need to have good gameplay" idea. However, there is also no need to hate on graphics, which seems too common on Slashdot. A kind of techno-luddism. "Oh, these games would be just as good with older graphics on low-end hardware." No, sorry, but that is false. A game is a rich experience. Part of that experience is visuals, and good visuals go a long way toward making that experience immersive.

    So holding gameplay up as the One and Only Thing is no more valid than holding up graphics as the One and Only Thing. Also, guess what? There's more that a PC offers than just graphics. A big one that DOES relate to gameplay is memory. If you want a game with a big world where a lot goes on, memory is needed. This is part of the reason the PC still sees the best strategy titles (keyboard and mouse are another). 512MB that you have to share with the video card, or 256MB that you don't (360 and PS3 respectively), is awfully tight to try to store a big, active world in. A PC can easily give you a gig or more for the game's dedicated use.
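    The memory point above can be made concrete with a back-of-the-envelope sketch. All the numbers here (bytes per entity, fraction of RAM reserved for code, textures, and audio) are assumptions for illustration, not figures from any real engine:

    ```python
    MB = 1024 * 1024

    def max_entities(budget_bytes, per_entity_bytes=4096, reserved_fraction=0.5):
        """Entities that fit after reserving memory for code, textures, audio, etc.

        per_entity_bytes and reserved_fraction are illustrative guesses.
        """
        usable = budget_bytes * (1 - reserved_fraction)
        return int(usable // per_entity_bytes)

    # A PS3-style 256 MB system-RAM budget vs. a PC handing the game a full gigabyte:
    print(max_entities(256 * MB))    # 32768 entities
    print(max_entities(1024 * MB))   # 131072 entities
    ```

    Whatever the real per-entity cost, the ratio is what matters: four times the memory budget means roughly four times the simultaneously active world.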

    I agree that Crytek needs a bit of STFU, since their gameplay is for shit; they are graphics-only game makers. However, let's not get up on the "fuck graphics" techno-luddism crap. I play many games from many eras. I emulate old games from my youth, and I play current high-end games. While a good story (where applicable) and fun gameplay are key, good graphics and sound are great too.

    Wolfenstein 3D is probably forever the most innovative 3D shooter since it invented the genre. However I'm sorry, but some of its less innovative modern counterparts are far better. Call of Duty 4 was a great game, and part of that was the wonderful graphics and sound. You couldn't do that game on a 286 like Wolf3D. Cut all the graphics, sounds, AI, levels, and so on back to what was required and the game wouldn't even be the same thing.

    Progress on ALL fronts in game design is a good thing. Also, holding up "innovation", meaning doing something that has never been done before, as the be-all, end-all is silly. It is hard to be truly original, and that isn't really a bad thing. We as a species have imagined a lot of things, and there is nothing wrong with building on what is out there. Even the most innovative things do. They are more original than some things, but you can still point out heavy influence from past works and other media.

    I have to agree with Crytek that PCs have it better when it comes to games. They can run any console game that someone bothers to port, at higher resolution, higher FPS, with better graphics and so on. They can also run titles the consoles can't. Look at Civilization 4 and 5 vs. Civilization Revolution. You can't do the full Civ games on a console; they lack the memory to handle it (among other things).

  • Re:Bullshit (Score:3, Insightful)

    by Anonymous Coward on Thursday November 25, 2010 @11:04PM (#34347134)

    This comment is so blatantly ignorant of gaming that I don't even know where to begin.

    Tower of Goo is innovative? Minecraft is innovative? Tell me, how many games have you played in total? Five, maybe ten? Those two titles are both highly derivative of previous games. The fact that they're a fad now does not somehow make them "innovative." Justin Bieber isn't innovative just because he's popular. Portal is the only game you listed that fits the term.

    And if you don't understand the technology that went into Crysis (which is clearly the case), then why are you commenting? You sound like another clueless tool without a smidgen of technical knowledge or expertise. Crysis 2 is not even out yet for you to judge. Go back to sitting on your couch and playing Madden.

  • by Doctor_Jest ( 688315 ) on Friday November 26, 2010 @01:02AM (#34347592)
    They're flat. Not dead. Try reading without a bias next time.
  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Friday November 26, 2010 @02:09AM (#34347808) Homepage Journal

    More like 4 generations behind. The PS3 ran what amounted to a modified GeForce 7800GTX.

  • by suzerain ( 245705 ) on Friday November 26, 2010 @06:04AM (#34348594)

    Figured I'd add onto this... the problem with first-person shooters, for me (or, say, in the F1 racing game where you can have a "looking out the windshield" view vs. a view from behind the car), is that in first-person shooters you're in a tunnel with no peripheral vision.

    In real life, if I were sneaking around with a gun trying to shoot people, I'd be relying on my peripheral vision as much as or more than my direct vision. This is why I, too, prefer the third-person view: at least it opens up the field of view a bit.

  • Re:Ok (Score:3, Insightful)

    by CronoCloud ( 590650 ) <cronocloudauron.gmail@com> on Friday November 26, 2010 @07:34AM (#34348938)

    How about large game worlds? Consoles, with their tiny memory amounts, put real limits on that kind of shit. As an interesting study in this, look at Deus Ex 1 vs Deus Ex 2. DX1 was PC only, running on Unreal Engine 1.

    Deus Ex 1 wasn't PC only, it was on the PS2 as well.

    Also it wasn't streaming,

    Streaming worlds is smart; it enables you to have HUGE worlds with zero load time between zones, like EQOA on the PS2. You could walk/swim from Fayspires to Qeynos and never see a load screen. Who cares if things out of your FOV don't exist and are regenerated?
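    The streaming idea above can be sketched in a few lines. This is a toy model with assumed chunk sizes and radii, not EQOA's or any real engine's code: keep only the chunks near the player resident, evict everything else, and regenerate chunks as the player approaches.

    ```python
    class ChunkStreamer:
        """Toy distance-based world streamer: chunks outside the window don't exist."""

        def __init__(self, radius=2, chunk_size=64):
            self.radius = radius        # chunks kept loaded in each direction
            self.chunk_size = chunk_size
            self.loaded = {}            # (cx, cz) -> chunk data

        def _generate(self, cx, cz):
            # Stand-in for terrain generation or a background disk read.
            return f"chunk({cx},{cz})"

        def update(self, player_x, player_z):
            cx, cz = player_x // self.chunk_size, player_z // self.chunk_size
            wanted = {(cx + dx, cz + dz)
                      for dx in range(-self.radius, self.radius + 1)
                      for dz in range(-self.radius, self.radius + 1)}
            for key in set(self.loaded) - wanted:   # evict out-of-range chunks
                del self.loaded[key]
            for key in wanted - set(self.loaded):   # stream in newly nearby chunks
                self.loaded[key] = self._generate(*key)

    streamer = ChunkStreamer(radius=2)
    streamer.update(0, 0)        # 5x5 window around the origin: 25 chunks resident
    streamer.update(640, 0)      # walk far east: old chunks gone, new ones loaded
    print(len(streamer.loaded))  # still 25: memory use is bounded, world size isn't
    ```

    The point of the design is the last line: resident memory stays constant no matter how large the world is, which is exactly what made huge seamless worlds feasible on memory-starved consoles.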

    The consoles are only 720p devices (1280x720). Yes, they do basic upsampling, but you gain no detail with that. Other than a few rare PS3 games (which suffer in terms of textures and so on because of it) that run at 1080, they all run at 720, and sometimes even less.

    Citation needed.

  • by Wildclaw ( 15718 ) on Friday November 26, 2010 @10:21AM (#34349728)

    However I have a hard time plopping down $1000+ for a gaming PC when games on a $300 xbox or playstation look only marginally worse.

    However, I have a hard time spending $300 on an Xbox/PlayStation when I can buy a $100 graphics card for my PC and get graphics that look marginally better than any console's.

  • by Karlprof ( 993894 ) <karlprof@gmail.com> on Friday November 26, 2010 @05:31PM (#34353052) Homepage Journal
    Well, you can get a perfectly capable gaming PC for $800, and you can save money in the long term by upgrading individual parts instead of buying a whole new console. Also, PC games are often much cheaper than their console equivalents. So I think it's really not true that it's much more expensive to be a PC gamer. :)
