Graphics Games

Can a New GPU Rejuvenate a 5 Year Old Gaming PC? 264

Posted by samzenpus
from the making-the-best-of-it dept.
MojoKid writes "New video card launches from AMD and NVIDIA are almost always reviewed on hardware less than 12 months old. That's not an arbitrary decision — it helps reviewers make certain that GPU performance isn't held back by older CPUs and can be particularly important when evaluating the impact of new interfaces or bus designs. That said, an equally interesting perspective might be to compare the performance impact of upgrading a graphics card in an older system that doesn't have access to the substantial performance gains of integrated memory controllers, high-speed DDR3 memory, deep multithreading or internal serial links. As it turns out, even using a midrange graphics card like a GeForce GTX 660, substantial gains of up to 150 percent can be achieved without the need for a complete system overhaul."
This discussion has been archived. No new comments can be posted.

Can a New GPU Rejuvenate a 5 Year Old Gaming PC?

  • No (Score:2, Insightful)

    by Hsien-Ko (1090623)
    AGP bridges suck.

    PCI-E DDR2 rigs aren't even that old or even considered "obsolete" either.
    • AGP bridges suck. PCI-E DDR2 rigs aren't even that old or even considered "obsolete" either.

      You obviously didn't read the article. They tested whether there was any benefit to upgrading the graphics card, and the figures show that there is. They didn't use an AGP motherboard. And it doesn't matter whether you call the system old or not, because the topic was whether you could improve a 5 year old system.

      The answer is yes. It doesn't matter what your theory says, because in practice you can extend the life of an old system with a single hardware upgrade.

  • Older = how old? (Score:4, Insightful)

    by girlintraining (1395911) on Friday January 25, 2013 @01:42AM (#42688007)

    The thing is, most serious gamers willing to plunk down $400 for a video card aren't going to skimp on upgrading the rest of the computer. That's why nobody reviews it: Because you, McThrifty, aren't the target market and nobody's going to send you free hardware to test since your readers are, well... cheap.

    Most of those hardware reviews you see online get the newest video cards for free specifically because their reviews are tailored to the guy who has a McDuck-sized vault of cash ready to be spent getting that extra .8 FPS out of Crysis.

    • Re:Older = how old? (Score:5, Informative)

      by 0123456 (636235) on Friday January 25, 2013 @01:46AM (#42688027)

      The thing is, most serious gamers willing to plunk down $400 for a video card aren't going to skimp on upgrading the rest of the computer.

      And a GTX 660 is not a $400 card, it's more like $200.

      The real issue is that most games are designed to run on consoles with their ultra-crappy CPUs, so they do very little on the CPU even on a PC. I've rarely seen my i7 go over 20% CPU usage in any game I've played in Windows with the CPU monitor running.
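One reason a readout like that can mislead: most task managers average usage across every hardware thread, so a game pegging one core of a hyperthreaded quad barely moves the total. A minimal sketch of the arithmetic (the function name is mine, not from any monitoring tool):

```python
# Hypothetical sketch: why a single-threaded game looks nearly idle in a
# whole-system CPU readout. Total CPU% is averaged across all hardware
# threads, so one fully saturated thread on an 8-thread i7 reads as
# 100/8 = 12.5% overall.

def total_cpu_percent(busy_threads: int, hardware_threads: int) -> float:
    """Overall CPU% shown when `busy_threads` are pegged at 100%."""
    return 100.0 * busy_threads / hardware_threads

print(total_cpu_percent(1, 8))  # -> 12.5
print(total_cpu_percent(2, 8))  # -> 25.0
```

So "20% usage" on an 8-thread i7 can still mean one or two threads are completely saturated.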

      • Re:Older = how old? (Score:5, Interesting)

        by Rockoon (1252108) on Friday January 25, 2013 @05:14AM (#42688821)

        And a GTX 660 is not a $400 card, it's more like $200.

...and it has a 140W TDP, significantly more than the 8800 GT or 9800 GT NVIDIA card that was $200 when they pieced together their 5-year-old system, so they need a new power supply too.

        • Maybe they do, maybe they don't. 1000W power supplies have been available for a very long time.

          My desktop system has a 750W Cooler Master power supply at the moment, and I'm using maybe half of its capacity under load. That's for an i5 2500k overclocked @ 4.8GHz, a Hyper 212+ heatsink*, 16GB of RAM (4x4GB), a Radeon 6970 graphics card, a DVD burner, a 60GB Intel 520 SSD as cache, and a 3TB mechanical drive, on a Z68 motherboard. I could buy a video card with a 500W draw and still have some juice left over.

          I

Hardly. Gamers rarely stress even a 400W supply, and even 5 years ago 500-550W units were the most common PSUs to be had. In fact, in the last 5 years, the only reason I needed to upgrade my PSU was that mine lacked the power connectors for my HD 6950. That is, mostly bad luck.
However, a $100 graphics card of today is most likely going to leave any high-end card of 5 years ago well back in the dust. Probably worth sticking one in; it should be good enough for most games.

    • Re:Older = how old? (Score:5, Informative)

      by B1oodAnge1 (1485419) on Friday January 25, 2013 @04:41AM (#42688721)

To be fair, at 25 years old and with over 200 games bought on Steam, I think I fit the target market for PC games pretty squarely, and I just upgraded the 8800 GTS to a GTX 550 Ti in my computer that is around 6 years old.
I went from needing to run at medium/low settings at 1080 to being able to run just about everything maxed out at 1920x1200, for about $120.

    • by sa1lnr (669048)

      The top end/most expensive are not what Nvidia or AMD make their money on.

  • This just in, duh (Score:4, Informative)

    by redmid17 (1217076) on Friday January 25, 2013 @01:44AM (#42688013)
    Is Ric Romero posting stuff to Slashdot? "Upgrading the largest bottleneck for game performance can substantially improve your playing experience!" Whether or not it's worth doing is another matter, but anyone who's built their own computer or even reads websites like tom's hardware or benchmarking sites knows this.
The question was whether it improved it enough to be a viable upgrade. And the answer is yes, assuming the CPU in the system is a quad core or better. Dual core, no. Luckily people stopped buying those around 2007 (E8400)

      • Re:This just in, duh (Score:5, Informative)

        by illaqueate (416118) on Friday January 25, 2013 @02:03AM (#42688101)

That said, ~48% of Steam users still have a dual core according to the Steam hardware survey

        • Re: (Score:2, Insightful)

          by Anonymous Coward

That is easy to explain, as a fair number of new laptops are still dual core. People try to game on laptops.

      • by redmid17 (1217076)
        People stopped buying dual cores around 2007? Maybe you missed all the current computers with Core i3s and Core i5s?
        • by 0123456 (636235)

          Maybe you missed all the current computers with Core i3s and Core i5s?

          Most or all desktop i5s are quads. Laptop i5s are duals with hyperthreading and 'turbo' mode. I think both desktop and laptop i3s are dual with hyperthreading.

          So finding a dual-core that can't run four threads is becoming difficult.

          • So finding a dual-core that can't run four threads is becoming difficult.

            They are trivial to find, they are just marketed under Intel's lower end brands "celeron" and "pentium".
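A quick way to see what the OS actually reports, as a sketch: Python's `os.cpu_count()` counts *logical* processors, so a dual-core chip with Hyper-Threading and a plain quad-core both show 4 — which is exactly why the core-vs-thread distinction above is easy to miss.

```python
import os

# os.cpu_count() reports logical processors: a dual-core CPU with
# Hyper-Threading and a true quad-core both report 4 here, so this
# number alone can't tell the two apart.
logical = os.cpu_count()
print(f"logical processors: {logical}")
```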

        • i5s are quad core in about half the cases: http://en.wikipedia.org/wiki/Intel_Core#Core_2_Quad [wikipedia.org]
      • by Osgeld (1900440)

Not really true. I dropped a rather inexpensive HD 6870 into my tri-core and game performance noticeably increased: more detail at longer distances with a higher framerate ... on a PCI-E 2.0 box.

I plan on upgrading the rest eventually, but going from a GTS 250 to the ATI = pretty substantial difference

.... take that money you'd spend on the GPU and spend it on a motherboard with an i7 and integrated GPU, and you'll likely get a speedup as well, with faster processing for everything else.
    • Some games hit the CPU much heavier these days than they used to. Many games really don't perform well if they aren't given multi-core CPUs with reasonable speed.

      So how much upgrading a given component makes a difference depends on what else you have in your computer. If your system has a CPU that was top of the line 5 years ago, but an integrated GPU, then ya a new GPU will probably be the best use of money. However if the CPU is underpowered, then a new GPU will do little if anything.

      Also you are right in

      • by EvanED (569694)

        Back in the P3 days I recommended a discrete GPU to everyone because the integrated ones were that bad. Now with Sandy/Ivy Bridge they are quite good. You can game on them, even new games. No they don't do as well as a discrete GPU, but they really are more powerful than you might think.

        Hmmm, my research from a few months ago suggested otherwise, at least to some extent. My home desktop is from 2008; it had a GeForce 8800 GTS in it which unfortunately decided to go kaput. The timing was kind of bad because

        • by adolf (21054) <flodadolf@gmail.com> on Friday January 25, 2013 @03:20AM (#42688471) Journal

          My home system is from 2008 also, and sports a pair of 9800GTs.

          I've gone through many of the same thought processes as you, and come to many of the same conclusions.

          Here's what I've gleaned:

          1. A five-year-old video card (or a pair of them) should be trivially-cheap to replace with an efficient and modern equivalent, but it's not.

          2. The prettiest games I want to play today bog my Q6600 CPU more than my video cards, which just loaf along on such titles.

          3. I need more RAM. 4GB isn't enough and DDR2 is fucking expensive. A motherboard+CPU sidegrade is damn near free with 2x4GB DDR3, compared to 2x4GB of DDR2 by itself. And getting a significantly faster CPU at the same time isn't significantly more expensive.

          4. Integrated graphics, no matter the claims by people who say they're quite good enough, suck in comparison to even quite old dedicated hardware.

          5. Conclusion: To upgrade my 5-year-old gaming rig piecemeal, keep the GPU(s), replace everything else, and ignore integrated graphics.

          • by SScorpio (595836)

Did you do this upgrade yet? It would be trivial to run off integrated graphics, and then test again with the GPU(s). Since you were running SLI, you should get better performance. But it would be interesting to compare a single 9800 GT to an HD 4000.

      • by turing_m (1030530)

        Some games hit the CPU much heavier these days than they used to. Many games really don't perform well if they aren't given multi-core CPUs with reasonable speed.

        One thing to bear in mind with gaming benchmarks - they are performed running just the game, to keep everything else equal. In real world use it's nice to have the flexibility not to have to close down your browser and other applications, especially if you aren't the only user logged into the system. For that reason, you want more cores than you ne

  • SSD (Score:5, Informative)

    by Barlo_Mung_42 (411228) on Friday January 25, 2013 @02:03AM (#42688105) Homepage

    One thing that helped boost my older system was switching the drive to an SSD.

    • EVERY component is relevant to gaming performance: HD/SSD, RAM, CPU, and GPU are all important, especially with some of the latest games. And you need to get enough juice from the power supply (without immediately killing it), you have to be able to keep it all cool, you want a motherboard that isn't itself a bottleneck or otherwise a hindrance, and of course you don't want to watch the action on a 17" CRT. So while I wouldn't recommend relying on integrated graphics or a $50 card, you can't forget about a
      • by Slugster (635830)
I have a computer several years old that I upgraded to an SSD because the mechanical drives were failing. I saw a significant improvement in gaming after the SSD swap: with FPS games, previously I had to turn most of the visual effects off because the video was rather choppy. Now they're all left on and the game still runs just fine.

SSDs don't cure everything, nor do they speed everything up (some stuff takes just as long, because it's set to take X amount of time anyway). But for a lot of things the instant-re
        • by prefect42 (141309)

          You must be comically RAM starved to get a frame rate boost from upgrading to an SSD (swapping of some form), or the game has to be shoddily written to be constantly hitting disk and blocking on it.

  • A newer/better GPU can indeed improve the graphics and gaming performance of an older computer, but it won't make it perform like a newer machine with other superior hardware. Duh.

    It seems like perfect common sense, but obviously not everyone gets it so I'll state it like this: If you took a shiny 2012 BMW V8 engine and plopped it into your rusty 1982 BMW 733i, your car would be faster and more fuel efficient (assuming you could even mount the new motor and get everything hooked up), but it wouldn't autom
  • Here, have a look at this Anandtech E-350 review:
    http://www.anandtech.com/show/4499/fusion-e350-review-asus-e35m1i-deluxe-ecs-hdci-and-zotac-fusion350ae/15 [anandtech.com]

They pair a very low-end AMD CPU with the best GPU on the market at the time. Result: the CPU does affect performance. No surprises there.

    You need to be more specific with your hardware.

    Also, take a look here:
    http://www.anandtech.com/bench/CPU/48 [anandtech.com]

They pair a very low-end AMD CPU with the best GPU on the market at the time. Result: the CPU does affect performance. No surprises there.

      The quoted test was probably more limited by the PCIe slot running at x4 instead of x16 rather than the low spec CPU. If you can't get data to the graphics card quickly enough then even the fastest CPU will be hampered.
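For a rough sense of the gap being described, here are the commonly quoted per-direction PCIe rates (~250 MB/s per lane for PCIe 1.x after 8b/10b encoding overhead, doubled for 2.0), as a back-of-envelope sketch; the function name is mine:

```python
# Approximate usable PCIe bandwidth per direction, using the commonly
# quoted effective per-lane rates (after 8b/10b encoding overhead).
PER_LANE_MB_S = {"1.x": 250, "2.0": 500}

def pcie_bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Aggregate one-direction bandwidth in GB/s for `lanes` lanes."""
    return PER_LANE_MB_S[gen] * lanes / 1000

print(pcie_bandwidth_gb_s("1.x", 16))  # -> 4.0  (full x16 slot)
print(pcie_bandwidth_gb_s("1.x", 4))   # -> 1.0  (the x4 slot above)
print(pcie_bandwidth_gb_s("2.0", 16))  # -> 8.0
```

A fourfold drop in slot bandwidth is plausibly a bigger handicap than the CPU in that test.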

I've lately upgraded the GPU every other generation (I buy mid-range cards like the 660) and the CPU every 4 years or more. It's been fast enough for my purposes.

  • If card A has a performance of x (which I'll define as 1) and card B a performance of x+2, wouldn't that mean it's two times better?

    The article keeps saying three times better, but wouldn't the correct way to phrase that be "It's three times as good?"

    Similar things with percentages. If something has 200% the value of something else, it's twice as valuable and not two times more valuable, right?

    I notice similar things in German, which is my main language. Am I just a grammar Nazi (badum-tis) or does that bot

    • If card A has a performance of x (which I'll define as 1) and card B a performance of x+2, wouldn't that mean it's two times better?

      The article keeps saying three times better, but wouldn't the correct way to phrase that be "It's three times as good?"

      Similar things with percentages. If something has 200% the value of something else, it's twice as valuable and not two times more valuable, right?

      I notice similar things in German, which is my main language. Am I just a grammar Nazi (badum-tis) or does that bother you too?

      Yeah, since when you talk about how much BETTER something is than something else, you are describing the difference between the two things. So if card A has performance 1 and card B performance 3, then the difference is 2. So card B is 2 better (or, in this case, even "two times (the performance of A)" better). On the other hand, if you talk about how something is "x times as good", you are looking at the whole thing, not just the difference, so in this case, B would be 3 times as good.

      Same with percentages
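The distinction works out like this in plain arithmetic, using the card-A = 1, card-B = 3 numbers from above:

```python
# Ratio vs. difference: "N times as good" is a ratio of the two values,
# while "N times better", read strictly, multiplies only the difference.
a, b = 1.0, 3.0  # performance of card A and card B

times_as_good = b / a         # 3.0 -> "three times as good"
times_better = (b - a) / a    # 2.0 -> "two times better" / "200% more"
percent_of = 100 * b / a      # 300.0 -> "300% of A's performance"

print(times_as_good, times_better, percent_of)
```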

  • I'm running a new-ish HD5970 card on a five-year-old Intel Core 2 Duo, 2.66GHz. Over those years I've also added an SSD (boot + apps), some extra hard drives, and an extra monitor. The machine is very reliable and quick enough that I really don't need to upgrade. Although I definitely will upgrade this year; five years is really old for a PC.

But I can emphatically say: "no". Excuse my grammar. I have a P4 HT clocking around 3.4GHz. I would overclock it if the mobo weren't an HP OEM and what-have-you. I also have a 1GB NVIDIA card. My point is that my computer meets the minimum specs for pretty much every modern game sans the CPU. You simply can't run a newer game on a single-core machine without serious gameplay consequences. If I had even a Core 2 Duo, the rest of my setup would beat the shit out of games like Crysis. So, once again: a new GPU w
Just install a 10-year-old OS and games. You'll be blown away by the performance :P
I had an Intel Q6600 system (quad 2.4GHz cores), and it wasn't able to keep up with some new updates in games my son wanted to play. Bought a new GPU, and now I can play what he wanted to play (WoW) at maximum settings, no problem. Your mileage may vary, but it worked for me.

I think one of THE biggest bottlenecks for any computer is an insufficient amount of RAM.

That's why I've always suggested that if you can afford it, you install the maximum RAM allowed by the motherboard. Most motherboards that support CPUs with x86-64 instructions can take 8 GB of RAM, and with 8 GB the performance improvement can be quite high, since 1) you no longer need to use the hard disk as virtual memory and 2) programs have more "breathing room" to run.

    I used to run a computer w
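If you want to check how much RAM a box actually has before deciding, here is a POSIX-only sketch (these `sysconf` names exist on Linux; Windows would need a different API, such as `GlobalMemoryStatusEx` via ctypes):

```python
import os

# POSIX-only: total physical RAM = page size x number of physical pages.
page_size = os.sysconf("SC_PAGE_SIZE")    # bytes per memory page
page_count = os.sysconf("SC_PHYS_PAGES")  # pages of physical memory
total_gb = page_size * page_count / 1024**3
print(f"installed RAM: {total_gb:.1f} GiB")
```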

    • by kenh (9056)

      First off, your two reasons are exactly the same.

Second, the OS and apps obviously need RAM, but to base your opinion on the vast improvement between 512 Meg and 2,048 Meg on an XP box is kinda pointless. Two Gigs of RAM was the sweet spot for XP and typical desktop use. Four Gigs made the machine more responsive, but the cost typically didn't justify upgrading to 4 Gigs.

      Windows 7 really runs well with 4 Gigs, and when the next 4 Gigs only costs another $20 why not go to 8 Gigs - but that's on a modern deskto

My gaming machine is actually 5 years old - built in 2008, it originally housed a GeForce 9800 GTX. The CPU is an Intel quad-core Q9300 - pretty low end at the time - plus 4GB of DDR2 RAM - very old, very out of date.

On that machine, I could play the likes of BF3 on low settings reasonably well. I swapped the graphics card for a GeForce 560 Ti and now I can play BF3 on med/high at 1920x1200. Nothing else has changed: same old CPU, same DDR2 RAM.

This depends on the CPU in use. If the machine is not CPU bound, then a new GPU will work. If it's already CPU bound in some games, then while I don't doubt you'll get some improvement, you're already at the limit.

A new CPU and motherboard is often cheaper than the GPU upgrade, so it's something you could factor in later. Call it your personal Tick/Tock in line with your gaming :)

  • I just went through this very decision process, but for a desktop machine, not a gaming system. I picked up a Dell Optiplex 755 with a decent Core 2 Duo CPU at a surplus sale. I upped the RAM to 8 Gigs and was quite happy, but then I started thinking about the graphics subsystem. This box had integrated Intel graphics, and that left something to be desired under Windows 7. So one quick trip to local computer store later, and for less than $40 I dropped an HD6450 in it and am quite pleased. The system now su

When you want to toss an upgraded GPU into an older system, keep an eye on the PCIe spec level; there's no sense buying the latest whiz-bang video card if your system only has a first-gen PCI Express slot.

    Another concern will be power - many older systems have smaller power supplies or power supplies that provide just enough power for power-hungry older system components.

  • My Experience (Score:4, Informative)

    by JobyOne (1578377) on Friday January 25, 2013 @10:04AM (#42690049) Homepage Journal

    About a year ago I stuck a GTX 550 Ti in a machine that was at the time pushing five years old.

    I generally upgrade video cards at least twice after the initial build of my computers, every 2 years or so. My needs for upgrading other components are generally low, because...really...who needs a top of the line processor? I generally stick to the top of the mid tier and it does anything I might need done for the next 5-6 years. As far as RAM goes, whenever I get a new motherboard I just put as much RAM as it supports in it, and have been known to spend more on RAM than CPU when building a computer.

    I just recently rebuilt my computer (new motherboard, CPU, RAM, and a second GPU) for about $550, and that got it to a point where it can play Crysis 2 with max settings. I expect it will be able to play any game the makers throw at it for another two years before performance starts to become a real issue. Maybe longer, because it seems to me that game-makers are getting better at building games that still run (albeit less prettily) on older hardware.

If it hadn't been for some recent hardware failures I'd probably STILL be rocking the last machine, which would be over 6 years old now. I just didn't feel like throwing money down the drain buying a replacement motherboard that used an old-ass socket.

    I think the only reason to buy absolute top-of-the-line hardware these days is to stroke your e-peen.

I'm using an Intel i5-2400/16GB RAM with two GeForce 9800 GTs for dual-head. Older tech, but only one year old to me. Why? For a bargain price, it works great for me and was a significant improvement over the AMD Athlon XP 3000+/GeForce 6200 I was using.

    Latest/Greatest hardware is nice, but expensive. I also tend to play older games like Quake, Unreal, COD, MOH which run great on more modern hardware.

  • I threw a GTS450 into my socket 939 board with an AMD Toledo Athlon X2 and 3GB of dual channel, low timing DDR1. It was faster than my Geforce 8600 but not by much. I brought my 450 over to my new i5-2400 system and it was like night and day. This thing tore my games a new ass framerate-wise. It would seem the x16 PCI-E slot was holding it back on my old board compared to the new x16 2.0 or 2.1 slot or whatever. Plus, the PCI-E controller is in the i5 itself if I'm not mistaken. So as long as your boa
  • by Meeni (1815694) on Friday January 25, 2013 @02:05PM (#42693045)

You can play almost everything from last year with quality visuals on an old CPU teamed with a new GPU. But here are the tricks:
* You need at least 2GB of memory. If you don't have this, don't even try.
* The CPU must be dual-core, at least. Single-core CPUs don't work anymore (I tried both on the same machine and the difference is night and day; it just happened that I could get a compatible dual-core CPU for free, otherwise it would have been impractical). If the CPU is not dual core, it does prevent decent performance, even with a top-notch GPU.
* Upgrade the HDD to an SSD. The old hard drive that comes with your 10-year-old rig will slow everything down. This is the second most beneficial upgrade besides the video card.

