Unreal Creator Proclaims PCs are Not For Gaming 705

An anonymous reader writes "TG Daily is running an interesting interview with Epic founder and Unreal creator Tim Sweeney. Sweeney has always been very clear about his views on the gaming industry, but it is surprising how sharply he criticizes the PC industry for turning the PC into a useless gaming machine. He's especially unhappy with Intel, whose integrated graphics chipsets, he says, 'just don't work'."
  • by PC and Sony Fanboy ( 1248258 ) on Monday March 10, 2008 @09:00AM (#22699410) Journal
    There aren't many GOOD pc games coming out lately. So, if the manufacturers drop the ball on hardware ... it doesn't REALLY matter, because the software developers aren't doing much better either.

    I don't think that it is a downward spiral, either - software companies aren't focusing on consoles because the PC hardware isn't great ... they're focusing on consoles because there is more money in consoles!
  • RTS (Score:3, Insightful)

    by hansamurai ( 907719 ) <> on Monday March 10, 2008 @09:07AM (#22699488) Homepage Journal
    I really believe the last bastion of PC gaming lies in real-time strategy games, a genre that essentially requires a mouse. I guess many die-hards would say the same about first-person shooters, but I am comfortable playing with either a mouse or a controller, and ever since Halo came out back in 2001, the FPS scene has been migrating to the consoles at a pretty quick rate. The PC will always have Counter-Strike, but when the PC gets popular console games such as Gears of War a year after their console release, you can tell that times have changed.

    But yeah, real-time strategy games: I don't think we'll ever see a decent port of, say, Starcraft 2 to the consoles, but I suppose if anyone can pull it off, Blizzard can.

    I'm not really sure if PC games losing to consoles is entirely a bad thing, I think people are just fed up with trying to keep their system up to date with hardware, nasty CD protection schemes that kill their drives, and console ports that can play just as well and in the comfort of their living room.
  • by amazeofdeath ( 1102843 ) on Monday March 10, 2008 @09:07AM (#22699490)
    For example:

    [...] a problem that we have today, and that is the fact that every PC should have a decent graphics card.
    Why would a computer meant for browsing the Internet and reading email need a separate graphics card?
  • by DuncanE ( 35734 ) * on Monday March 10, 2008 @09:08AM (#22699500) Homepage
    Personally I find that only games that require a mouse are worth playing on a PC now anyway, and I don't include FPSes in that either. So really I only play RTSes on the PC, but I would happily play them on a console, and then I wouldn't have to worry about driver issues and bugs due to odd hardware conflicts.
  • i915 (Score:4, Insightful)

    by westcoast philly ( 991705 ) on Monday March 10, 2008 @09:09AM (#22699506)
    Of course integrated graphics aren't for gaming. That's what a dedicated video card is for. If you want to use your PC for gaming (which I do, casually, with dual GeForce 8600 GTSes) you have to add on; it's a simple procedure, as everyone here is probably aware. But integrated graphics are VERY useful for office environments where they don't NEED 3D performance. Wow.
  • by Brian Gordon ( 987471 ) on Monday March 10, 2008 @09:09AM (#22699512)
    Not many good PC games coming out? Who cares about new games when you have such a massive library of games available? Just have to have your Madden 09? Then go away. Like a good game despite Windows 95 graphics? Break out Chip's Challenge and see if you still remember how to beat it. Craving something new? How about Steam's library, which is massive and is actually priced reasonably, unlike any console game at all. And it has free mods for the more popular games that are good for more play hours than the game itself; how many people have bought Half-Life 2 Deathmatch just so they can play SourceForts, and never even launched HL2DM? How about Insurgency? PC gaming is dead? Does Netcraft confirm it?
  • by iainl ( 136759 ) on Monday March 10, 2008 @09:11AM (#22699528)
    But, as ever with Epic staff, he seems to labour under the frankly ludicrous idea that the solution is to stop home and business users who don't need an 8800 from buying anything slower.

    If he's not able to label his game box clearly enough as needing a £300 graphics card, that's his problem, not Intel's. They make chipsets that are perfectly good enough to accelerate Aero Glass, and there are plenty of consumers that only need that.
  • by Lumpy ( 12016 ) on Monday March 10, 2008 @09:17AM (#22699616) Homepage
    Have you even tried to play Unreal Tournament 3? It takes far more PC than most people have, and that same problem plagued id on its last two releases for almost two years. Hell, I know people that STILL don't have a PC capable of running Doom 3 at any playable speed. Gaming companies are killing themselves. They are selling games that require a 4 GHz dual core, 4 GB of RAM, and a $500.00 video card, while the world is happy as hell with its three-year-old 3 GHz Pentium 4 running that $45.00 GeForce 6600 card.

    You can't sell a crapload of copies of a game that only runs on hardware most people don't have.
  • Creativity (Score:3, Insightful)

    by c_g_hills ( 110430 ) <> on Monday March 10, 2008 @09:18AM (#22699630) Homepage Journal
    I fully agree with the sentiment. In the good old days, you had to be creative to get the most out of the hardware you had, and gameplay was at the centre (or center) of attention. These days it is all about how many frames per second you can push from your graphics card and cpu.
  • by Evil Kerek ( 1196573 ) on Monday March 10, 2008 @09:20AM (#22699654)
    and stop having an aneurysm every time someone tries to add a mouse, I'd pretty much stop using my PC to game with.

    I will NEVER use a joystick to play an FPS. Period. It's inferior. Period. A good mouser can beat the best joysticker every time, given a level playing field (and before you start, it's almost NEVER a level playing field, so don't tell me how good you are on a console. The target areas are programmatically larger. The AI is dumbed down. Etc., etc. These are facts; look it up)

    If you even START to suggest adding a mouse option to consoles, the kiddies start pitching a fit and immediately begin insulting your mother. It's pathetic; the fear of having their asses handed to them in combat is funny. I really enjoy my 360, but not having a mouse as an OPTION prevents access to a lot of what is cool on it.

    Until that time, the PC platform will remain strong. Consoles need a mouse. It's just silly they don't have them. If M$/$ony ever get some balls and support a mouse, I think you'll see the PC side take a huge hit. I'd rather play on my 65" HD.

  • Define games (Score:3, Insightful)

    by Thaelon ( 250687 ) on Monday March 10, 2008 @09:22AM (#22699690)
    From the perspective of type of resources "modern" games require, he's right. But large portions of the gaming industry seem to have lost sight of the fact that games do not need to be pretty, only fun. They are games after all.

    In the last six months I've logged more hours playing Mahjong on my N810 than I have playing UT3, EVE Online and Half Life 2 mods combined.

    So from a wider perspective he's not only wrong, but has lost sight of what is important in a game. Not that I don't personally think UT3 is fun as hell; I actually bought that one. But some perspective on his part would be beneficial to him and his customers.
  • by acvh ( 120205 ) <geek.mscigars@com> on Monday March 10, 2008 @09:31AM (#22699786) Homepage
    This article and interview are NOT about mice and joysticks. This article and interview are NOT about PC vs. console.

    This article and interview ARE about how the overwhelming majority of PCs sold in the US do not come remotely close to being able to run current game software. It is almost a plea to Intel to stop making integrated graphics chips, because they suck at running games. If 90% of the PCs sold can't run the software you write and publish, then you aren't going to be a big fan of PC gaming at the moment.

    Yes, we know, if you're posting here you can build your own PC, upgrade your graphics card every six months, and use your mouse and keyboard to headshot Osama Bin Laden in his cave from orbit. That doesn't change the fact that you are a part of a minority, and can expect that other game publishers will begin thinking of bailing out on the PC as a platform.

  • by IamTheRealMike ( 537420 ) on Monday March 10, 2008 @09:33AM (#22699824)

    I like Tim, I especially liked his presentation on programming languages in games, but his comments about 64-bit Vista seem rather out of touch.

    Sweeney: Let's be clear with it. The switch to exclusively 64-bit would clean up all the legacy viruses and spyware programs that have been plaguing us for years. The requirement for much more system memory cannot be an excuse, because most owners of 64-bit processors have at least 1 GB of system memory installed.

    Yeah? It'd also have cleaned up all the "legacy" software people are using, like iTunes, not to mention all the actual legacy software like kids' educational software, drivers for old hardware, etc. I also don't know why he thinks this would have cleaned up viruses and spyware. These guys adapt fast, and the extra anti-patch systems in 64-bit Vista aren't all that strong.

  • by pak9rabid ( 1011935 ) on Monday March 10, 2008 @09:38AM (#22699892)

    I will NEVER use a joystick to play an FPS. Period. It's inferior. Period. A good mouser can beat the best joysticker everytime, given a level playing field (and before you start, it's almost NEVER a level playing field - so don't tell me how good you are on a console.
    Amen. If I had some mod points, I'd give them all to you.
  • by p0tat03 ( 985078 ) on Monday March 10, 2008 @09:38AM (#22699894)

    Because you, the consumer, demand flashier and better graphics. Not to mention that the level of graphics we're talking about is *impossible* to implement on the CPU - the GPU trounces your CPU's performance many times over for matrix math and other calculations.

    Scalability is certainly a problem that game developers face - your game should look fairly decent even on a relatively old card, but PC gaming (especially of the 3D graphics variety) has always been an enthusiast thing. If you're not willing to buy a new $200 video card every year or so, you have no hope of keeping up.

    I object to your description of game devs as "lazy". The usage of the GPU is a matter of necessity, and it's not easy either. Game developers are not taking the lazy way out by "not writing code" and relying on GPU functions - what does that mean, anyway? Do you think there's a magical "awesome graphics" API on your graphics card that we can call to make things shiny? The kind of work we do on the card (shaders) is sometimes a LOT more complex than what we do on the CPU.

    Oh, and DOOM works fine on integrated chipsets because... *drumroll* it doesn't use them! All your 3D work is done on the CPU, and I'm sorry to say that as fast as our CPUs have gotten, they are FAR from fast enough to power all of the pretty graphics you're used to seeing. We are, what, 100 times faster than the CPUs of the DOOM era? But our performance needs for games have progressed leaps and bounds beyond that.

    I'm so tired of buying games for my 10-year-old, then having to disappoint her when it won't install because it doesn't support pixel shader 1.N, 10^27 polygons per second, etc.

    Read the requirements on the box! Every PC game I've ever bought has been *perfectly* clear about its video card requirements up front. After all, PC developers don't want pissed off consumers any more than you like getting disappointed when a game won't run. And seriously, if you're buying things like Lego Star Wars for your child, anything higher than a GeForce 6600 will run it buttery smooth, and that's a $50-100 card these days.

    Honestly speaking, IMHO PC devs have been doing a good job with scalability. The only game recently that required a massive upgrade just to play was Crysis, everything else (Portal, TF2, C&C3, etc.) scales VERY well down to some downright low-end hardware.
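    The GPU-vs-CPU matrix-math point above is easy to make concrete. The sketch below is mine (illustrative, not from the thread): an n x n matrix multiply needs roughly n^3 multiply-adds, and every output cell is an independent dot product, which is exactly the embarrassingly parallel workload a GPU's many shader units excel at while a CPU grinds through it serially.

```python
# Illustrative sketch (not from the thread): why matrix math is the GPU's home
# turf. An n x n matrix multiply needs ~n^3 multiply-adds, and every output
# cell is independent, so the work parallelizes across a GPU's many units.

def matmul(a, b):
    """Plain triple-loop multiply: n*n independent dot products."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def scalar_ops(n):
    """Multiply-add count for an n x n multiply: n ops per cell, n*n cells."""
    return n ** 3

# Doubling the matrix size roughly 8x-es the work:
for n in (64, 128, 256):
    print(n, scalar_ops(n))  # 64 -> 262144, 128 -> 2097152, 256 -> 16777216
```

    The cubic growth is why "just run it on the CPU" stopped being an option somewhere after the DOOM era, as the comment above notes.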

  • by Volante3192 ( 953645 ) on Monday March 10, 2008 @09:43AM (#22699976)
    FPSes can be played with a controller, but you have to add an autolock feature (i.e., Metroid Prime), which seriously drops the difficulty level.

    In multiplayer, an autolock is akin to cheating, even if it's game-supplied, so you're sorta screwed there.
  • by dreamchaser ( 49529 ) on Monday March 10, 2008 @09:43AM (#22699982) Homepage Journal
    Show me the console that can play Crysis. For that matter, show me a console that plays the newer 'Total War' games. Show me the console that supports various MMOs.

    Drink your coffee before you post!
  • by reidconti ( 219106 ) on Monday March 10, 2008 @09:44AM (#22699988)
    Maybe we care more about having fun than about worrying about optimum input devices, highest possible mouse resolution, upgrading our video cards every 6 months, and so on. All to end up with a "gaming" PC that makes too much noise and crashes all the time (or is down for repairs).

    I like to come home, flip on my 360, know it'll work (joke's on me I guess) and play games for an hour or two.. then put it away and go on with my life. It's nice to have a system that just does what it's supposed to do. The game makers know what hardware I'll be using and optimize the game for it. Perfect.

    Go ahead, tar and feather me as a Mac user, but I work with computers all day; the last thing I want to do is come home and mess with one too. I love my job, but home time is relax time.
  • by JSBiff ( 87824 ) on Monday March 10, 2008 @09:45AM (#22700016) Journal
    "Better hardware. You can always throw in a new (or extra) graphics card (relatively inexpensive) or more memory (cheap) in three years and bring your PC up to spec for the latest games. You have to buy a whole new console system at $400-$600 every three years."

    That's sorta true. . . but not so much. . .

    • You can't upgrade the CPU (usually) without upgrading the mobo (that is, while you might be able to upgrade to a slightly faster CPU, usually you can't upgrade to the next generation of CPU which gives the big performance gains vs. the incremental upgrade from 3.0 to 3.2 GHz)
    • You might be able to upgrade the graphics card once; about every 2-3 generations, graphics cards and mobos move to a new physical interface (i.e., the recent transition from AGP to PCI Express), which requires a new mobo
    • You can upgrade the amount of ram, but ram is constantly getting faster, and to use the faster ram requires a new mobo
    • Then the new mobo might possibly require you to get a new hard drive (if, e.g., it supports only SATA and not PATA, or it supports the same physical interface standard but at a slower speed, e.g. the transitions over the years from 33 MB/s to 66 to 100 and beyond) - yes, you could buy a PCI card to provide the old interface, but at that point it might make sense to use the money instead to get a new hard drive (so that the HD isn't a performance bottleneck in your upgraded system)
    • Then when you upgrade the Mobo, so that you can upgrade everything else, the new mobo might require a new case and power supply, or other new components (almost certainly it requires new RAM, but you were planning to buy that anyhow)

    By the time you finish upgrading your computer, you've spent enough money that it might have made more sense to buy a medium-spec next-gen machine instead of trying to upgrade your last-gen machine to high-spec (for that generation), because the medium-spec machine will likely be more powerful than the high-spec last-gen machine. Or you have really bought a new computer, one part at a time, anyhow, and probably spent $400-$600, at least, to do it.
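    The piecemeal-upgrade tally above can be sketched with placeholder prices. Every figure below is my own hypothetical illustration, not from the comment; the point is only that the part-by-part total lands in the same range as a whole medium-spec machine.

```python
# Hypothetical 2008-ish prices (placeholders, not from the comment): tallying
# a part-by-part last-gen upgrade against a medium-spec new machine.
upgrade_parts = {
    "motherboard": 120,    # needed before anything else can move forward
    "cpu": 180,            # next-gen socket forces the mobo swap above
    "ram": 80,             # new mobo usually means new, faster RAM
    "graphics_card": 150,  # mid-range card on the new slot interface
    "psu_and_case": 70,    # new mobo may not fit the old case/PSU
}
piecemeal = sum(upgrade_parts.values())
new_machine = 600  # medium-spec next-gen box, per the comment's $400-$600 range

print(f"piecemeal: ${piecemeal} vs. new machine: ${new_machine}")
```

    With these placeholder numbers the piecemeal route costs about as much as the new box, which is the comment's argument in miniature.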
  • by GauteL ( 29207 ) on Monday March 10, 2008 @09:47AM (#22700046)
    If 90% of all PCs sold can't play 90% of the games sold, whose fault is this? Is it the hardware manufacturers who sell people PCs at a reasonable price, or the game manufacturers who target hardware only found in 10% of PCs? Even if only 1/9th of all the people buying low-end PCs wanted to buy games, that would still double the target market (and that is assuming that all of the people buying "capable" machines want to buy games).

    Games manufacturers could easily start to target the 90% instead if they wanted to increase their market. Even an Intel GMA 950 (which is in an awful lot of PCs and laptops) should be capable of playing 3D games if the graphics are scaled down properly.

    Personally I think a lot of games manufacturers are pissing away the chance for a large increase in their sales, by being way too '1337'. They want to show off their game, and they want to make it look super slick, which is fair enough... but don't come complaining if this rules the game out for a large part of the market.
  • by Anonymous Coward on Monday March 10, 2008 @09:53AM (#22700154)
    Dear Sir, I'm sorry but you appear to have wedged your head in your ass.

    This kind of attitude is exactly why everyone except hardcore PC gamers /really hates/ hardcore PC gamers.
  • by tepples ( 727027 ) <> on Monday March 10, 2008 @09:54AM (#22700162) Homepage Journal

    Do real-time war sims require a mouse, or do they work well with a DS touch screen?
    If you have a DS and an M3 DS Simply or R4DS card, you should look up "A Touch of War", a simple homebrew touch-screen-controlled RTS.
    Which brings me to the next point. The console makers have preferred to lock out smaller developers rather than embrace them. Once every generation, at least one console maker sues retailers that carry some product that allows homebrew, and at least one console maker continuously updates newly manufactured consoles with code that blocks the exploits that homebrew uses to boot without the console maker's digital signature. With PCs and PDAs, independent developers and players of their games don't run nearly as much of a risk of losing their hardware supply channels as they do with the major game consoles and handhelds.
  • by Evil Kerek ( 1196573 ) on Monday March 10, 2008 @09:56AM (#22700196)
    Yes I'm aware of that, however pretty much no game supports it. The support is half-hearted at best and is basically non-support.

    And, yeah, it is Sony's (and MS's) fault. Saying 'OK, you can plug in a mouse' is not the same as 'You must also support a mouse'.
  • by MightyYar ( 622222 ) on Monday March 10, 2008 @10:00AM (#22700250)

    I think he sees laptops as analogous to handheld game systems.
    It is, and that is an excellent point. What I don't understand is why he is willing to see portability as an acceptable reason to trade off performance, but cost is not... All he has to do is design a low-end version of his game - no different than making one for the DS or one of the consoles. If he doesn't do it and there really is the market he seems to think there is, one of his competitors will be happy to eat his lunch.

    Which would be consistent with the rest of the article: With the consoles, end users are guaranteed to use "the gear that the developers do".
    Then his developers need to go out and buy some low-end Dell machines. They'll never sell stuff developed on a 16GB RAM dual-quad-core $2000 workstation to folks running the 3-year-old low-end P4. He needs to accept that the market is stratified and develop a laptop version of his games. People aren't going to spend an extra $50 on a better graphics chip just so his development process can be slightly easier.
  • by Altima(BoB) ( 602987 ) on Monday March 10, 2008 @10:06AM (#22700344)
    What you're saying makes a lot of sense, especially given the fact that the computer mouse and keyboard was designed specifically to ergonomically play First Person Shooters and game controllers were designed to type letters and open folders.
  • by Ron_Fitzgerald ( 1101005 ) on Monday March 10, 2008 @10:13AM (#22700452)
    I bought the Wii mostly for fun family games for me and my girls. I purchased Resident Evil 4 for Wii just for laughs, but I was blown away. I wasn't even using the gun molds for the controller, just the remote and nunchuk. Absolutely fun to play it this way; I was very surprised how fun it was.
  • Irony (Score:3, Insightful)

    by kerrbear ( 163235 ) on Monday March 10, 2008 @10:16AM (#22700510)
    > criticizes the PC industry for transforming the PC into a useless gaming machine

    Humorously ambiguous sentence
  • by phozz bare ( 720522 ) on Monday March 10, 2008 @10:16AM (#22700518)
    I personally think that the PC, particularly under Windows XP, is a terrible gaming platform. I find myself cringing every time people complain about how bad Linux/Mac are for games, as opposed to the great and wonderful Windows. Here's a little list of annoyances I can think of off the top of my head right now:

    * The need to install a game on your hard disk. Why can my Gamecube run any game within seconds of plopping the CD in and turning it on? (...and it's not like I can legitimately run the game without the original CD anyway.)
    * The horribly slow and ugly process of switching from the Windows desktop to full screen. First the screen flickers. Then the screen turns black. Then the desktop shows up for a second, "magnified" (because the resolution is lower). Then more blackness. Finally, the game shows up. Hard disk grinding throughout this time. Reverse this process when the game is over.
    * Occasionally some stupid popup (like an instant message or a warning about my swap space running low) will force the game out of full-screen mode and back to the desktop. This cuts you out of the action for at least 30 seconds, as the disk grinds its way to swap everything back in and the resolution change as described above occurs yet again.
    * The occasional background process causes the game to stutter or jump slightly every once in a while.
    * I've rarely ever seen a 3D, or even a 2D, game on the PC that has consistently smooth animation and scrolling at the refresh rate of the monitor with no tearing - things that are a given in almost any console game. That is, the frame rate should equal the refresh rate, with frame updates happening while the screen is not being redrawn.
    * When quitting a game, very often all windows that were previously open are now confined to the upper-left corner within bounds equal to the size of the game's full-screen resolution.
    * Sometimes the same goes for all desktop icons. So what if you've spent time arranging them in a particular way? They're all bunched up in a 320x200 corner now, sorry.
    * No matter how good your hardware, a game will always give you the impression that something needs upgrading (see the stuttering phenomenon mentioned above).

    In my experience the Mac is much better in most of these respects. I've never tried gaming under Linux or Vista, and I do realize some of these points may have been fixed in Vista.
  • by totally bogus dude ( 1040246 ) on Monday March 10, 2008 @10:18AM (#22700550)

    If you constantly have to repair your PC, or if it "crashes all the time", then you're using it wrong. I get home from work, flip on my PC, surf the 'net, check my email, watch a video, play some games, and it just does what it's supposed to. It has done so since I built it, and I even swapped out the motherboard to replace my Athlon CPU with an Intel Core Duo a year or two ago and it still works (okay, I admit I was a bit surprised by this).

    Yes, every now and then I may replace a component; I got a new video card about 6 months ago for example, and while the cards I had then were pretty good it did give a noticeable boost to performance, and it was worth it. On a console, you get what you're given, and the only way to upgrade it is to buy a new one when it comes out. That has its benefits and its drawbacks; clearly you think it's a benefit and I can understand that, but I do like to be able to make my gaming PC more powerful whenever it suits me and my budget rather than having to wait until a new console is available with games to make it worthwhile. I suspect the XBox 360 will be showing its age compared to PC titles by the time it gets a replacement, but this is the first generation of console games that have actually been comparable to gaming PCs so I could be wrong.

    Also, games for the consoles seem to be noticeably more expensive than PC games. It might just be because it's easier to pirate PC games, but it may also be to help make up for the manufacturer's losses in selling you the console hardware in the first place.

  • by Anonymous Coward on Monday March 10, 2008 @10:20AM (#22700578)
    I suppose you are stuck in the mindset that the more precise your aiming, the better the control scheme is. I won't begin to argue that dual analog or any of the weaker console shooting layouts compete with the mouse. But I find the Wii remote to be a better aiming scheme in some contexts for the very reason you think it is worse: aiming isn't dead easy. Much like shooting a real gun (despite an order of magnitude or two less kickback), shooting with the remote will move your hand, so shooting the same position 10 times is actually challenging and fun, as opposed to just 10 clicks. If you can 180-turn instantly, how can a game scare you from behind? If you never miss, how can an enemy with a moving weak point be intimidating?

    Maybe being able to shoot the hell out of everything doesn't make the game better. I'm not saying we should move to Resident Evil-style convoluted controls, but a human-esque skillset when you play human characters would be a step forward in many cases.
  • by AvitarX ( 172628 ) <`gro.derdnuheniwydnarb' `ta' `em'> on Monday March 10, 2008 @10:21AM (#22700606) Journal
    Let's not leave out turn-based strategy either.

    Civilization, or even X-COM, would be awful with a control pad. I think computers excel when lots of information needs to be displayed; even though console resolutions now beat/match the PC, there is something to be said for being 18-30 inches from a 17" monitor as far as reading and seeing details goes.

    Of course many wondered how Puzzle Quest would fit on a DS, but it did, so there are ways to get around these things.
  • by Weegee_101 ( 837734 ) on Monday March 10, 2008 @10:27AM (#22700740) Homepage
    I think you're missing the point here, though. It sounds to me like he's more pissed off with the fact that Dell, HP, and the other vendors are slapping the cheapest video card they can into their computers, ripping out the PCI Express slots, then selling the computer for $800 and marketing it as an "Entertainment PC". I admit I agree with you a little, but the Intel chipsets really are pretty terrible. They usually pull out most of the flashy shaders and such for video games, leaving developers a tiny toolset with which they could make an engine reminiscent of the Quake engine.
  • by AIkill ( 1021773 ) on Monday March 10, 2008 @10:30AM (#22700802)
    From what I have been seeing, the main problem this season was not the game companies themselves, but rather the fact that there was a major OS upgrade this year. I have a couple of friends in the game dev world, and they say that they had to change gears due to Vista's release.
    The other problem is M$'s game dept. It should be noted that they have been trying to strong-arm all games that are slotted for PC into being released for the 360 only, and then released for PC after about 5-6 months. As a case in point, Mass Effect: that game was slotted to be both 360 and PC, but then was changed to 360 only almost overnight and with little explanation. However, now it's slotted to be released for PC sometime in May. Another case would be Star Wars: The Force Unleashed. That game was slotted to be released for all consoles (PS2 and DS included) and PC. Then, suddenly, it's not being released for PC but for the 360 instead.
    However, I do have to agree with you about the performance fanboys. Most games these days (and consoles haven't been spared either) seem to be more like tech demos to show off better and prettier graphics while sacrificing gameplay. All game devs need to see that we don't need these kinds of games. What we really need is for game devs to see that some games (in particular those on 360) can, for the most part, be ported with little effort to the PC. In terms of the 360, there is no reason why all its games can't be played on the PC. Of course, that will never happen, because M$ is too short-sighted to see the long-term profit in that.
  • by crossmr ( 957846 ) on Monday March 10, 2008 @10:31AM (#22700830) Journal

    I don't get people who claim how frequently they have to upgrade their machines, or how much time they allegedly spend maintaining them. I'm calling BS on it, and the person who modded you up is some clueless console fanboy.

    I upgraded last summer to a Core 2 Duo, an 8800 GTX, and a Sound Blaster X-Fi. I bought the previous machine three years ago, and in those three years the only thing I'd done was add 1 GB of RAM and a TV tuner card. During that time I played all the latest and greatest, including first-person shooters, all the way along.

    I have no plans to upgrade 6 months from when I bought that unless I travel back in time, and likely I won't upgrade the graphics card for another year and a half.
    I can't recall the last time I had a problem so severe on my machine that I had to stop anything I was doing and focus on it rather than do what I wanted to do on the machine.

    But if you fool yourself into thinking that a Radeon 9250 is a good upgrade choice, or that you'll get a free iPod for punching that damn monkey, I could see why you might have to upgrade often or spend a lot of time "maintaining" your machine.

    Not everyone who plays on a PC is some kind of hardcore LAN player who spends hours every day optimizing his water cooling and trying to squeeze another MHz out of his overclock. However, optimum input goes hand in hand with fun. It's not much fun stumbling your way through bad controls, which used to happen on the PC when some developers thought it was a good idea not to let players map controls (that only happens in bad console ports now). Anyone who can look at it objectively should be able to realize that there are certain types of games which just lend themselves to mouse/keyboard input and that joysticks fail at.

    As another benefit, should something actually go wrong with my PC, I'm only inconvenienced for as long as it takes me to get a part and put it in. If it's something non-critical, like one of my storage drives, optical drives, sound card, TV tuner, etc., I'm only without it for as long as it takes me to power the machine down, put the new part in, and turn it back on.

    I don't have to sit around twiddling my fingers while Microsoft, Nintendo, or Sony get the unit back to me.

  • by Talchas ( 954795 ) on Monday March 10, 2008 @10:34AM (#22700882)
    I believe his point was that if you played against PC players when you were on the Xbox, you would get slaughtered due to mouse/keyboard >>> controller for an FPS.
  • by guidryp ( 702488 ) on Monday March 10, 2008 @10:35AM (#22700898)
    I really agree with Tim here. This was the perfect opportunity to transition to 64-bit. Most compatibility issues with Vista are Vista-related, not 64-bit-related. This would have given us access to memory beyond 2 GB, accelerated 64-bit application development, and might have even given me a reason to go with Vista. If you are breaking a lot of drivers and programs anyway, why not go 64-bit at the same time and gain some benefit in the process? Heck, Apple managed to swap to a whole new CPU architecture with minimal pain. You need to have stones to move forward.

    But by giving everyone a choice again, with all the OEMs pushing 32-bit, there is practically no movement to 64-bit and practically no new capabilities exercised: no 64-bit games, etc.

    Another thing: MS should have upped the minimum hardware requirements for Vista. A 64-bit processor, 1GB of memory, and graphics capable of at least running the interface. That is how bad Intel integrated graphics is. It can't even run Vista's bloated interface (hence the lawsuit). No surprise it can't run games.

    There should be some kind of game certification as well, and the bar needs to be high enough that Intel integrated graphics fails even the minimum standard.

    It needs to be made absolutely clear that standard integrated graphics are incapable of running games.
  • Don't blame Intel (Score:5, Insightful)

    by cerelib ( 903469 ) on Monday March 10, 2008 @10:35AM (#22700900)
    Blaming Intel and integrated graphics for the decline of PC gaming is a cop-out. These game companies have been operating under the principle that a game with better graphics is a better game. Instead of creating new and innovative ways to game on a PC, they enhance the graphics of an old game and call it a new game. Don't blame Intel if your game does not work on their GPU platform when you are using the latest cutting-edge extensions and expecting large amounts of video RAM. The fact that some of these companies are listing specific graphics cards as system requirements should indicate that there is a problem. At that point you are limiting your audience on your own. If you want a big audience, you should target machines with integrated graphics and then find ways to scale up when there is more power, instead of targeting the latest and greatest and then complaining that you can't scale back to make it work. By promoting the idea that better graphics equals better game, they entered into a stupid race, and they can only blame themselves.
  • by stonecypher ( 118140 ) <> on Monday March 10, 2008 @10:43AM (#22701024) Homepage Journal

    I object to your description of game devs as "lazy".
    So do I. The fault lies at the game designers' feet. If you look at the top ten selling games of all time, you'll find that none of them are graphics-quality powerhouses: The Sims, Diablo, RollerCoaster Tycoon, Grand Theft Auto. Yes, making a game visually crispy will get a lot of dollars, but it doesn't win the top of the tree, and the last time it did (Quake 1) it was largely coincidental. What makes epic dollars is gameplay. Always has been, always will be. Is the industry drowned out by stupid companies that focus solely on visuals? Yes, and some of them are breathtakingly profitable. But the real winners are games like Civilization, whose graphics are so rudimentary and choppy that it's kind of embarrassing to play.

    The problem is that most designers don't know how to make a new game, and instead of stepping down, they throw themselves into the eye candy columbine.
  • by crossmr ( 957846 ) on Monday March 10, 2008 @10:46AM (#22701082) Journal
    The problem is the people who buy a business class machine, like one of the Dell machines intended solely for office work, e-mail/surfing and expect it to be a gaming machine.

    There is nowhere that this is more apparent than The Sims franchise where people who are not gamers suddenly want to play a game and find they can't or that the performance sucks.

    The problem lies with the fact that PCs are not consoles and people have choice. If every PC was sold as something capable of handling games, the price would be much higher. You wouldn't be able to get those $300 desktops for grandma to check her email on.

    Don't blame the industry for giving consumers a choice. Blame the consumers for not educating themselves and making a proper choice. Better yet, educate consumers. Run an ad campaign, set up a website as a resource for explaining the difference between an e-mail machine and a gaming machine.

  • by Blakey Rat ( 99501 ) on Monday March 10, 2008 @11:03AM (#22701374)
    I agree that a joystick is inferior, but so is a mouse.

    Try a trackball, it'll kick your ass at games.

    I love all these people posting that the mouse is the ultimate game control device who act as if they've studied and critiqued every device imaginable for the role. You ask, "ever tried a trackball?" and the answer is no, even though trackballs are cheap, available, and work better for games. Just admit you like the mouse out of habit.

    (If you HAVE used a trackball and rejected it, I apologize, but the vast majority of gamers have not.)
  • by Directrix1 ( 157787 ) on Monday March 10, 2008 @11:03AM (#22701390)
    That's amazing. Now how about you give the prices of: case, power supply, CD/DVD drive, keyboard, laser mouse, monitor, speakers, and value of your time spent on building it. :-P
  • by Ash Vince ( 602485 ) on Monday March 10, 2008 @11:07AM (#22701444) Journal

    Maybe we care more about having fun than about worrying about optimum input devices, highest possible mouse resolution, upgrading our video cards every 6 months, and so on. All to end up with a "gaming" PC that makes too much noise and crashes all the time (or is down for repairs).
    Sorry, but you must have last played PC games quite a long time ago.

    I currently own a PC bought several years ago (Athlon XP 3200+, GeForce 6800 GT and 1GB of RAM). OK, this was fairly expensive when I bought it, but it has been good for me ever since. We are not talking about 6 months between upgrades; we are talking 3-4 years, since long before your 360 came out. That discounts your first point about upgrades: I will only need to upgrade when games I want to play start coming out Vista-only, and that hasn't happened yet.

    Optimising mice and video cards? If you mean selecting what resolution to run each game at, this is hardly a chore; most games will autoconfigure by looking at your PC's specs now. It is also amazing how many games still run at the top resolution of my monitor (1280x1024) even though the PC is now several years old.

    Makes too much noise or crashes all the time? Nope, never. If a PC crashes nowadays then something is wrong with it, probably in hardware. I know Windows has a reputation for being buggy, but I have had very few issues with Windows XP.

    So now that I have shot down all your bad points about PC gaming, let me elaborate on the better points:

    1) Multifunctional

    With a PC you can do other stuff as well as play games. You need to write the occasional letter? No problem. Almost all of us nowadays need to do the CV thing occasionally, and a lot of companies now accept Word document CVs, so you do not even need a printer.

    2) Higher Resolution

    PCs can support much higher resolutions than your TV; this has been true for years.

    3) Cheaper games

    Since your 360 is actually a cheap PC in disguise that was sold at cost, Microsoft has to make money somehow; it does that by adding an extra licence fee to the games. It then uses a patent or hardware device to prevent people producing software for the system without paying MS a licence fee. This fee makes console software more expensive.
  • They are better (Score:3, Insightful)

    by Sycraft-fu ( 314770 ) on Monday March 10, 2008 @11:12AM (#22701522)
    But still crap overall. The major problem is that they use system RAM. Graphics is very RAM-bandwidth intensive, and system RAM just can't provide that. Part of the problem is that you are fighting for access to it with the CPU, but the other part is that it is just slow by graphics standards. I mean, consider that the brand spanking new high-end RAM for a motherboard is DDR3-1333. That's 1333MHz in RAM speak (meaning 1333 million transfers per second). Most people don't have that, even with high-end systems, since it is brand new. Most are on DDR2-667 or DDR2-800, which are, of course, 667MHz and 800MHz. OK, now compare that to a high-end graphics card. These days they have RAM in the 1800-2000MHz range. What's more, they have a very wide memory controller (or rather a lot of parallel 64-bit controllers), between 256-bit and 384-bit on today's high end.

    The upshot of it is that a high-end motherboard might have a theoretical max of 10GB/sec of memory bandwidth, while a high-end graphics card can have as much as 10 times that (the 8800 Ultra has a theoretical max of 103GB/sec).

    Now if you talk about more realistic systems, where you'd actually be using integrated graphics, it isn't even that high. You have a system with DDR2-533 and, well, that's 4.2GB/sec peak, and remember that's shared with the CPU. Even the cheap 8400 has more than that (6.4GB/sec peak) and, of course, that is all dedicated to it, no sharing.

    So while the Intel chips themselves aren't all that bad (they aren't great either, don't get me wrong), they are just going to be permanently crippled with regards to games so long as they are sharing slower system memory. Doing graphics operations on lots of pixels just demands lots of memory bandwidth. Doesn't go over so well when the bandwidth is low, and you have to fight with the CPU for access to it.
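    For anyone who wants to sanity-check the figures above, peak memory bandwidth is just the effective transfer rate times the bus width in bytes. A minimal sketch (clocks and bus widths are approximate and vary by part; real-world throughput is lower than these theoretical peaks):

    ```python
    def peak_bandwidth_gb_s(mega_transfers_per_sec, bus_width_bits):
        """Theoretical peak memory bandwidth in GB/s.

        mega_transfers_per_sec: effective data rate (e.g. 533 for DDR2-533).
        bus_width_bits: width of the memory interface (64 per DDR channel).
        """
        bytes_per_transfer = bus_width_bits / 8
        return mega_transfers_per_sec * bytes_per_transfer / 1000

    # Shared system memory: single-channel DDR2-533 on a 64-bit bus
    print(peak_bandwidth_gb_s(533, 64))    # roughly 4.3 GB/s, shared with the CPU

    # Dedicated card memory: 8800 Ultra-class GDDR3 on a 384-bit bus
    print(peak_bandwidth_gb_s(2160, 384))  # roughly 103.7 GB/s, all for the GPU
    ```

    The same arithmetic reproduces the 4.2GB/sec and 103GB/sec numbers quoted above, which is the whole point: integrated graphics fights the CPU for a pipe an order of magnitude narrower than a discrete card's.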
  • by sqldr ( 838964 ) on Monday March 10, 2008 @11:25AM (#22701746)
    "Those are fundamental flaws of Windows, not PCs."

    and linux. and bsd. and mac os.
  • by poot_rootbeer ( 188613 ) on Monday March 10, 2008 @11:27AM (#22701790)
    So yeah, the guy's right, Intel's graphics adaptors are terrible.

    Terrible at polygon shading, maybe, but that doesn't matter for 95% of what the PCs that have them are used for.

  • by GauteL ( 29207 ) on Monday March 10, 2008 @11:35AM (#22701932)
    "Those are fundamental flaws of Windows, not PCs. Don't blame the hardware manufacturers for Microsoft's blunders."

    Please.... you can't blame Microsoft for everything. It isn't Microsoft's fault that ATI has shoddy drivers. It isn't Microsoft's fault when two hardware manufacturers implement a spec subtly differently so that the system crashes if you combine the two pieces of hardware in one system. It isn't Microsoft's fault when a hardware manufacturer releases a firmware update that fixes a bug which some other manufacturer actually depends on as a 'feature'.

    The number of different combinations you have to test to cater for every possibility is simply staggering, so the best you can do is to test the most likely combinations and hope that most follow the specs so well that this works for most people.
  • Re:Proof that (Score:3, Insightful)

    by TheLink ( 130905 ) on Monday March 10, 2008 @11:37AM (#22701964) Journal
    Yeah, PCs are "useless gaming machines" for the games the "Unreal" creators keep trying to make.

    But they are perfectly fine for WoW, Counterstrike, Warcraft 3, Starcraft, The Sims, Bejewelled, Freecell and many other games that millions around the world are playing _NOW_.

    I've been playing Guild Wars on my years old Athlon XP system, and what bothers me more is network latency than system "grunt" - high ping makes playing hard.

    If the latest UT didn't sell well or doesn't work on computers that 90% of the _target_ market own, I think it's more Tim Sweeney's fault than Intel's fault.

    If they weren't targeting the mass market then no problem right? If they were, then maybe they should start giving out free 8800GT video cards with their game. But that costs $$$? Uhuh, so why should customers subsidize your game when it's not really better than UT2k4?

    Many people aren't downgrading to Vista from XP for similar reasons.

    I know that more than one colleague upgraded their vid cards to play games like Crysis and Bioshock. And after a few days, one said he was spending more time playing Warcraft 3 "DOTA" and the other was playing the GTA series. They both agreed the Crysis and Bioshock were nice to play, but I suppose they don't have as much "staying power".
  • by tepples ( 727027 ) <> on Monday March 10, 2008 @11:43AM (#22702078) Homepage Journal

    Well, buying a console means playing on your TV.
    And buying a PC means having to buy three more PCs so that people who live with you and people that you or they invite over to your house can play with you. Not enough PC games support shared screen play through a USB hub and TV output, even when their console ports do.
  • by Ephemeriis ( 315124 ) on Monday March 10, 2008 @11:54AM (#22702232)

    The PC is fundamentally flawed ... by being a moving target. How fast is a PC? What graphics chipset does a PC have? A developer has to make the game tweakable, so that it works on everyone's PC and the people with the lithium-cooled turbofan graphics card can stop moaning that it doesn't play at 15241x19841 in 64 bit colour.
    I've always thought this was part of the appeal of PC gaming myself.

    Sure, not everyone wants to stay on the upgrade treadmill... I fell off it a while back myself, and my system is nowhere near "bleeding edge" anymore... But it's nice to be able to constantly push the limits of what the hardware and software can do. The Wii/360/PS3 is only capable of a certain level of performance even under the best of conditions. And in a few years it'll be obsolete, and replaced by a new console. And everyone will rejoice because the new console lets you do new and wonderful things that you couldn't do before.

    But on the PC you don't need to wait a few years for your entire platform to be declared obsolete to get new and wonderful things. All you have to do is throw in a new video card, or physics accelerator, or more RAM, or a faster CPU, or whatever. This lets developers constantly push the envelope. And it isn't even just a matter of making new games do cool things. I can throw a new video card in my system and see better performance in my old games as well.

    And, to be honest, most PC titles are fairly scalable. I was able to play Oblivion on a machine that had not been substantially upgraded in about four years. It didn't look great, but it played, and I enjoyed myself quite a bit. The same thing can be said for Half-Life 2, and Portal. So you certainly don't have to constantly upgrade your machine if you don't want to...
  • by MightyYar ( 622222 ) on Monday March 10, 2008 @11:57AM (#22702292)

    So yeah, the guy's right, Intel's graphics adaptors are terrible.
    But you aren't seriously arguing that every computer buyer should have to pay extra money to make the life of game developers easier, are you? Because I maintain that integrated graphics will always suck as long as they are using system memory, and giving them their own memory will cost money. If for no other reason, integrated video will always exist for the corporate market - and that by extension makes it available to the broader market.
  • by Ephemeriis ( 315124 ) on Monday March 10, 2008 @12:15PM (#22702598)

    I don't get people who claim how frequently they have to upgrade their machine, or how much time they allegedly spend maintaining it. I'm calling BS, and calling the person who modded you up a clueless console fanboy.

    I upgraded last summer to a core 2 duo, an 8800 GTX, and a SB X-Fi. I bought the machine 3 years ago. In that 3 years the only thing I'd done was add 1 GB of ram to it and a TV Tuner card. During that time I played all the latest and greatest including first person shooters all the way along.

    I have no plans to upgrade 6 months from when I bought that unless I travel back in time, and likely I won't upgrade the graphics card for another year and a half.
    I can't recall the last time I had a problem so severe on my machine that I had to stop anything I was doing and focus on it rather than do what I wanted to do on the machine.
    I used to be a pretty big gamer myself... Used to spend almost every spare dollar upgrading something. Birthday, Christmas, whatever - the ideal gift was an upgrade of some sort. I never had the income to be "bleeding edge", but it was a fun hobby.

    That all ended about four years ago. I changed jobs, my lifestyle changed, bought a house, and I just didn't have the time or resources to put into constant upgrades like that. That computer served me very well over those four years. I was able to play pretty much anything I wanted to - World of Warcraft, Condemned, Half-Life 2, Portal, WarCraft III, Oblivion. Sure, I had to turn down the options on some of them...some of them ran a little slow...but I was still able to enjoy myself.

    This year, for Christmas, I decided it was time to upgrade. I spent approximately $600 to build a new PC from scratch. Dual-Core CPU, 4 GB RAM, decent video card, LCD monitor... Nothing bleeding edge, but a substantial upgrade for me. I can play absolutely anything on the market right now, most of it with the settings completely maxed out. And unless the industry changes dramatically in the next year or two, I should get the same 4+ years of use out of this computer.

    And my old computer has been recycled into a very nice media center PC.

    The folks who claim that you have to constantly pour hundreds of dollars and hours of time into PC gaming are simply doing it wrong. Sure, some folks get a kick out of being bleeding edge... But you don't have to do that just to play games on the PC. You can get a perfectly good gaming PC for nearly the same price as a console, and get nearly the same life out of it.
  • by Evil Kerek ( 1196573 ) on Monday March 10, 2008 @01:18PM (#22703724)
    If the console came with, say, a mouse and some sort of layout controller (a Nostromo, for instance), then there would be more de facto support. I am not saying the devs should be forced (and whatever on content control; the devs are currently 'forced' to a standard controller anyway); if a mouse/keyboard were a standard part of the console, it would take care of itself.

    And yah, sometimes I don't know why I bother to post here *smile*.

  • by Z34107 ( 925136 ) on Monday March 10, 2008 @01:47PM (#22704440)

    Absolutely right; upgrade treadmill is easier than ever nowadays.

    Get a nVidia chipset that can support SLI. Buy one "second-from-the-top" video card for today (8800 GTS or GTX) and when it becomes obsolete, pick up a second one from the bargain bin. 2x videocards doesn't necessarily mean 2x the framerate, but it helps.

    Intel just switched to a 45nm process and is rolling out their new architecture, so I doubt any new CPU sockets are going to crop up. Heck, I heard a lot of existing motherboards may support Nehalem with a BIOS patch. Plus, Intel's low-end dual- and quad-core chips overclock extremely well - instead of upgrading, overclock until you burn the shi*t out of it. By the time that happens, what you were originally going to upgrade to will be dirt cheap.

    DDR3 memory is coming out, but probably won't supplant DDR2 for quite a while yet. If your motherboard doesn't support DDR3, you'll still be good for a long time. <baselessprophecy/> Memory is cheap - $120 last year got you 1GB; nowadays, that'll get you 4, at least according to Maximum PC.

    Storage is cheap, and the new terabyte drives will eventually come down in price. $1500 can get you a "no compromises" PC, and with planning, will be upgradeable for a long way to come. My little brother's gaming rig was purchased January of 2001 for <$2000 and has had no work done to it other than a vid card upgrade (nVidia 8600 something-or-other.) But, it does just fine on everything but Crysis.

    Interestingly enough, I play Team Fortress 2 on a LCD HDTV through the component out dongle on my 8800 GTX video card. It kicks the pants off of the Xbox 360 version. Oh well for console superiority.

  • by Anonymous Coward on Monday March 10, 2008 @02:08PM (#22704854)
    This is so true... a little while back I had the idea to upgrade my processor. JUST the processor. I had plenty of RAM, a decent video card, gobs of hard drive space... just wanted a bit more juice for the CPU.


    Unless I wanted a completely trivial upgrade, I needed a new motherboard for a decent CPU upgrade. And of course the new motherboard was PCI-E and my old one was AGP, so I needed a new graphics card. And the memory was DDR2 instead of DDR1 so I needed new RAM. And the drive interface was SATA not IDE so I needed a new hard drive.

    At which point I said, well fuck it, get a new case too and now it's a brand new system, not an upgrade. :P
  • by try_anything ( 880404 ) on Monday March 10, 2008 @05:47PM (#22708560)

    I've always thought this was part of the appeal of PC gaming myself. Sure, not everyone wants to stay on the upgrade treadmill... I fell off it a while back myself, and my system is nowhere near "bleeding edge" anymore... But it's nice to be able to constantly push the limits of what the hardware and software can do.
    I think this is the point of the article -- PC gaming only works for people who think this way. People who want to be able to play games without understanding what goes on inside their computer are screwed. Say a kid's dad buys him a computer for his birthday. Then the kid can't play any games, because his father didn't understand that most computers have insufficient graphics to play currently available games. PC gaming is *only* for people who can at least name the major components of their computer, decipher the requirements on game boxes, read gaming websites to get the scoop on upcoming technologies and games, and spend their money efficiently to get the hardware required to play the games they want.

    I don't blame PC retailers. They should sell cheap computers. They shouldn't artificially inflate the price of PCs just so every PC is a gaming box. The responsibility falls squarely on game creators who only want to work with the latest technology and make the prettiest games. Obviously this attitude alienates them from the majority of potential buyers, because most people aren't obsessed with the latest computer technologies. Given $300 to spend, many people will not choose to upgrade their computer. They might spend it on a tent, a new outfit, brewing equipment, clothes for their kids, medical care, or a trip to see their favorite band instead of blowing it on gaming technology. According to Tim Sweeney, this makes it impossible for him to market games to those people. Those ignorant bastards, spending money on their favorite pastimes instead of spending it on video cards and RAM upgrades. How horrible for the gaming industry to have to put up with that kind of behavior.

    The computer industry should not take that choice away from them, and it seems to me that is what Tim Sweeney is asking for. He wants to close the 100x spread between gaming boxes and cheap retail boxes and reduce it to 10x or less. From the interview, this is how I understand his logic:

    1. I only want to create games targeted at the newest and most expensive gaming platforms.
    2. I can only scale a game's processing requirements down by about 10x.
    3. Many people prefer to buy cheap computers with integrated graphics, which are 100x slower (for gaming) than high-end machines.
    4. Clearly we would all be better off if every PC owner could buy current games.
    5. Therefore, the PC industry should not enable people to make the "mistake" of buying cheap computers with integrated graphics. Retail sellers should only allow people to buy machines that have at least 10% of the gaming capacity of the newest machines.

    Obviously if Tim Sweeney is concerned about people with integrated graphics, he could design games that run on integrated graphics. He wants to have his cake and eat it too -- he wants the prestige of designing for bleeding-edge gamers and the large market inherent in designing for average computer users. And he wants the retail PC industry to accomplish this for him by forcing everyone into a narrow range of hardware choices.
