Can a New GPU Rejuvenate a 5-Year-Old Gaming PC?
MojoKid writes "New video card launches from AMD and NVIDIA are almost always reviewed on hardware less than 12 months old. That's not an arbitrary decision: it helps reviewers make certain that GPU performance isn't held back by older CPUs, and it can be particularly important when evaluating the impact of new interfaces or bus designs. That said, an equally interesting perspective is to measure the impact of upgrading the graphics card in an older system, one without the substantial performance gains of integrated memory controllers, high-speed DDR3 memory, deep multithreading, or internal serial links. As it turns out, even with a midrange graphics card like a GeForce GTX 660, substantial gains of up to 150 percent can be had without a complete system overhaul."
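For scale, a "150 percent" gain means 2.5x the frame rate, which is the difference between a slideshow and smooth play. A quick sanity check with made-up numbers:

```python
# Illustrative arithmetic only: what a 150% performance gain means in fps.
baseline_fps = 24          # hypothetical old-card frame rate
gain = 1.50                # a 150% improvement multiplies fps by (1 + 1.50) = 2.5
upgraded_fps = baseline_fps * (1 + gain)
print(upgraded_fps)        # 60.0 fps
```

So a card that merely takes a game from borderline to vsync-smooth justifies the headline number.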
This just in, duh (Score:4, Informative)
Re:Older = how old? (Score:5, Informative)
The thing is, most serious gamers willing to plunk down $400 for a video card aren't going to skimp on upgrading the rest of the computer.
And a GTX 660 is not a $400 card; it's more like $200.
The real issue is that most games are designed to run on consoles with their ultra-crappy CPUs, so they do very little on the CPU even on a PC. With a CPU monitor running in Windows, I've rarely seen my i7 go over 20% usage in any game I've played.
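Worth noting that an overall usage figure can hide a CPU-bound game: on a 4-core/8-thread i7, one fully saturated main thread barely moves the aggregate number. A sketch with hypothetical per-core loads:

```python
# Illustrative: aggregate CPU% on an 8-logical-core i7 when one game
# thread is pegged at 100% (per-core figures below are made up).
logical_cores = 8
per_core_load = [100, 25, 10, 5, 0, 0, 0, 0]   # one saturated core
overall = sum(per_core_load) / logical_cores
print(overall)   # 17.5 -- under 20%, yet the main thread is the bottleneck
```

So "20% usage" on its own doesn't prove the CPU isn't the limiting factor.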
Re:This just in, duh (Score:5, Informative)
That said, ~48% of Steam users still have a dual-core CPU, according to the Steam hardware survey.
SSD (Score:5, Informative)
One thing that helped boost my older system was switching the drive to an SSD.
Re:no surprise there (Score:4, Informative)
Re:no surprise there (Score:4, Informative)
I'm still using a Core 2 Quad and am getting by fine (GeForce 9800 GPU). Sure, I don't turn the graphics all the way up in games, but I'm doing all right.
Re:no surprise there (Score:5, Informative)
I had a comparable processor, which I bought at Christmas 2009. However, some newer games such as MechWarrior Online and Planetside 2 are heavily CPU-bound, and the machine was struggling with them. I upgraded to an i7-3770K and the improvement was dramatic (30-40 fps -> 60 fps in MWO and 40-50 fps -> 90 fps in Planetside). The graphics card did not change, as it was already rather powerful (Radeon 6970) and not a bottleneck at the detail levels I was using.
This was literally the difference between unplayable and playable, so if you play those games, there absolutely is a reason to upgrade.
Re:Older = how old? (Score:5, Informative)
To be fair, at 25 years old and with over 200 games bought on Steam, I think I fit the target market for PC games pretty squarely, and I just upgraded the 8800 GTS to a GTX 550 Ti in my computer, which is around 6 years old.
I went from needing to run at medium/low settings at 1080p to being able to run just about everything maxed out at 1920x1200, for about $120.
Re:no surprise there (Score:2, Informative)
I'm still using a Core 2 Duo E6300 from 2006 with 4GB of RAM. Until recently I was using a 9800 GTX, which, yeah, was fine as long as you didn't turn everything up to full. I recently traded up to an NVIDIA GTX 570, passed down from another, more recent machine. Quite a nice improvement. The article is right that if you're going to upgrade anything on an old machine, the graphics card is probably it. Midrange now (i.e., ~$200) is usually a substantial improvement. On the other hand, I thought that was obvious: graphics is the most common bottleneck.
I've never paid $500+ for the cutting-edge graphics card of the day. It's not worth it. Too much of a premium. When configuring new systems I usually buy the $200-$250 "midrange" card, then wait a few years and buy what is then the $200-$250 card, which by then usually has the performance of what I would have paid $500+ for originally. That way the equivalent costs are spread out over more time, and now I've got two cards (sometimes I can re-sell the old one for $100 or $50, or re-task it in a different machine).
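The math on that staggered-midrange strategy works out something like this (all prices hypothetical):

```python
# Illustrative cost comparison: one flagship card now vs. two staggered
# midrange cards that end up at roughly flagship-class performance.
flagship_now   = 500   # hypothetical cutting-edge card today
midrange_now   = 225   # midrange card today
midrange_later = 225   # midrange card a few years later (~old flagship perf)
resale_old     = 75    # optional resale of the first midrange card

staggered_net = midrange_now + midrange_later - resale_old
print(staggered_net)   # 375: similar end performance, spread over time
```

And unlike the flagship route, the spend is split across two purchases, with a second card left over to resell or re-task.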
My Experience (Score:4, Informative)
About a year ago I stuck a GTX 550 Ti in a machine that was at the time pushing five years old.
I generally upgrade video cards at least twice after the initial build of my computers, every 2 years or so. My need to upgrade other components is generally low, because... really... who needs a top-of-the-line processor? I generally stick to the top of the mid tier, and it does anything I might need done for the next 5-6 years. As far as RAM goes, whenever I get a new motherboard I just put in as much RAM as it supports, and I have been known to spend more on RAM than CPU when building a computer.
I just recently rebuilt my computer (new motherboard, CPU, RAM, and a second GPU) for about $550, and that got it to a point where it can play Crysis 2 with max settings. I expect it will be able to play any game the makers throw at it for another two years before performance starts to become a real issue. Maybe longer, because it seems to me that game-makers are getting better at building games that still run (albeit less prettily) on older hardware.
If it hadn't been for some recent hardware failures I'd probably STILL be rocking the last machine, which would be over 6 years old now. I just didn't feel like throwing money down the drain buying a replacement motherboard that used an old-ass socket.
I think the only reason to buy absolute top-of-the-line hardware these days is to stroke your e-peen.