Digging Into the Electrical Cost of PC Gaming

New submitter MBAFK writes "My coworker Geoff and I have been taking power meters home to see what the true cost of PC gaming is. Not just the outlay for hardware and software, but what the day-to-day costs really are. Assuming a 20-hour-a-week habit and $0.11 per kWh, actually playing costs Geoff $30.83 a year. If Geoff turned his PC off when he wasn't using it, he could save $66 a year."
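Back-of-the-envelope, those numbers pin down the assumptions (a rough Python sketch, not the article's methodology; the ~270 W average draw is inferred from the quoted figures rather than reported):

# Sanity check of the summary's numbers. The wattage is an inference
# from $30.83/year at 20 hours/week and $0.11/kWh, not a measurement.
RATE = 0.11                # dollars per kWh
GAMING_HOURS = 20 * 52     # hours of play per year

implied_kw = 30.83 / (GAMING_HOURS * RATE)
print(f"implied average draw while gaming: {implied_kw * 1000:.0f} W")   # ~270 W

# The $66/year savings from powering off implies roughly:
idle_kwh = 66 / RATE       # ~600 kWh/year burned while not in use
print(f"consumption avoided by shutting down: {idle_kwh:.0f} kWh/year")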
  • PC gaming? (Score:3, Insightful)

    by MsWhich ( 2640815 ) on Tuesday May 29, 2012 @09:19AM (#40141951) Homepage
    I'm not sure how this has anything to do with the cost of PC gaming, considering that my mother, who only uses her computer for Facebook and TurboTax, could see the exact same benefits by doing the exact same things the article suggests.
    • Re:PC gaming? (Score:5, Informative)

      by 0123456 ( 636235 ) on Tuesday May 29, 2012 @09:22AM (#40141977)

      Running PC games can easily take 300-500W with a high-end graphics card. Posting on Facebook probably uses 30-50W on a modern desktop PC (plus whatever the monitor uses in both cases).

      • Re: (Score:3, Funny)

        by Anonymous Coward

        I can't play video games anymore since I'm running a bitcoin mining operation with my graphics card. It's pretty expensive to run.

      • by MBAFK ( 769131 )

        I actually looked at this when I had the power meter out. Playing Spider Solitaire draws about 102 watts on the same machine that needs 157 to play Dawn of War 2. That machine idles at 100 watts.
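        Taking those wattages at face value, the marginal cost of the game itself is tiny (a quick Python sketch; the 20 hours/week and $0.11/kWh figures are borrowed from the summary):

        RATE = 0.11          # $/kWh, from the summary
        HOURS = 20 * 52      # gaming hours per year

        def yearly_cost(watts):
            return watts / 1000 * HOURS * RATE

        idle, solitaire, dow2 = 100, 102, 157    # measured draws in watts
        print(f"Dawn of War 2, total:         ${yearly_cost(dow2):.2f}/yr")              # ~$17.96
        print(f"Dawn of War 2, above idle:    ${yearly_cost(dow2 - idle):.2f}/yr")       # ~$6.52
        print(f"Spider Solitaire, above idle: ${yearly_cost(solitaire - idle):.2f}/yr")  # ~$0.23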

        • by gmack ( 197796 )

          It really would have been helpful to know what hardware you tested on. I get that the CPU and GPU both likely downclock when idle, but does it have a hard drive that spins down when not in use (WD Green and friends)? Does it have an SSD? What monitor did it have? Was it an LED backlight or one of the older types that use more power?

          I also can't imagine anyone not setting their monitor to power off when idle.

      • by antdude ( 79039 )

        I play computer games once in a while, so I still need those high-end cards (not SLI/Crossfire). I really wish there were a way to make those video cards run at low power by idling most of their features when not needed. Sure, I can swap the hardware, but that's annoying.

      • Comment removed based on user account deletion
        • by Smauler ( 915644 )

          So I'm sorry, friend, but there really isn't a point to ePeen cards unless you are just going for bragging rights or are doing serious GPGPU work, because the games just ain't stressing the systems that hard.

          This page [tomshardware.com] has benchmarks for that card with modern games. The 4850 seems to average 30-40 fps in most games at 1680x1050 (Crysis 2 was worse); the benchmarks there don't show a minimum (which is usually about half the average). That's a bit crappy.

          I run at 1920x1200, and am thinking about getting one of

    • by Sloppy ( 14984 )

      If your mother only uses her computer for Facebook and TurboTax but draws 100W while idle, then your mother needs building advice. Nudge her into moving to an Ivy Bridge Core i3 (and use the integrated graphics; don't add a graphics card) when they come out in a couple of months.

      (Actually if that's all she does, maybe even an Atom or Bobcat system will be enough, but in 2012 I don't recommend going that way.)

      • So spend $500 on a new PC to save $40/year on electricity?

        • Re:PC gaming? (Score:5, Interesting)

          by TheLink ( 130905 ) on Tuesday May 29, 2012 @10:47AM (#40143261) Journal

          Yeah, every now and then Slashdot has these silly articles about PC power consumption, "kill a watt" meters, etc.

          The power consumption of modern PCs (post-P4) has come down to a level where most home users would usually be better off looking for savings in other areas: driving more efficiently, or using less cooling/heating (and making it more efficient through insulation, sealing, etc.).

          As for gaming, sure, a high-powered gaming rig will use a few hundred watts (and usually less if you're not doing SLI). But that's far from the most energy-hungry way of having fun. Your hobby could be drag racing, or hiking/rock climbing somewhere that requires a one-hour drive, or even baking cakes. FWIW, even cycling and other sports might be more energy-hungry if you replace the calories burnt by eating more of the stuff that requires a fair bit of energy to produce (e.g. US corn-fed beef).

          From various sources:
          1 pound of beef = 13-15 pounds of CO2 ( http://www.usatoday.com/news/nation/environment/2009-04-21-carbon-diet_N.htm [usatoday.com] )
          1 kWh = 2.3 pounds of CO2 ( http://cdiac.ornl.gov/pns/faq.html [ornl.gov] )
          so 1 pound of beef ≈ 5.7-6.5 kWh

          So if all that exercise makes you eat an additional half pound of beef (400 kcal), that's roughly the equivalent of running a 300W gaming rig + monitor for 9 to 11 hours.

          In contrast, 1 pound of chicken = 1.1 pounds of CO2.
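          Spelling the conversion out (same sources as above; the 300W rig + monitor is the round figure used in the comparison):

          CO2_PER_LB_BEEF = (13.0, 15.0)   # pounds of CO2 per pound of beef
          CO2_PER_KWH = 2.3                # pounds of CO2 per kWh generated
          RIG_KW = 0.300                   # gaming rig + monitor

          lo, hi = (c / CO2_PER_KWH for c in CO2_PER_LB_BEEF)
          print(f"1 lb of beef ~ {lo:.1f}-{hi:.1f} kWh")    # ~5.7-6.5 kWh
          print(f"0.5 lb of beef ~ {lo / 2 / RIG_KW:.0f}-{hi / 2 / RIG_KW:.0f} hours of gaming")   # ~9-11 hours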

          I've even seen many people here who say they still prefer to use incandescent lighting. It doesn't take that many bulbs to use as much as a gaming rig, and even fewer to match a Facebook/browsing PC or notebook. A single fluorescent tube lamp already uses about 40W.

          • by mcgrew ( 92797 ) *

            I've even seen many people here who say they still prefer to use incandescent lighting. It doesn't take that many bulbs to use as much as a gaming rig, and even fewer to match a Facebook/browsing PC or notebook.

            People who refuse to use CFLs because "the color's not right*" or "it takes too long to start up" aren't the kind of folks who are worried about electric bills or global warming. Also, fluorescents run far cooler, so your AC costs drop with them.

            * That "the color looks wrong" is scientifically incorrect. You

          • by mjwx ( 966435 )

            Yeah, every now and then Slashdot has these silly articles about PC power consumption, "kill a watt" meters, etc.

            Still, a lot of people haven't got the message: "If you turn shit off when you're not using it, your power bills go down." That seemed to be the overriding message behind the summary, and it goes for a lot of things: TV, aircon/heater, lights and whatnot. The only device I have on 24/7 in my house is the fridge. Some people are actually surprised I don't have an A$1,000-per-quarter power bill (I pay about A$80-120 per quarter at roughly A$0.22 per kWh).
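            Those bills imply a fairly modest continuous load (a rough Python sketch; 91 days per quarter assumed):

            RATE_AUD = 0.22      # A$ per kWh
            DAYS = 91            # days in a quarter (assumed)

            for bill in (80, 120):
                kwh = bill / RATE_AUD
                watts = kwh / DAYS / 24 * 1000
                print(f"A${bill}/quarter -> {kwh:.0f} kWh (~{watts:.0f} W continuous)")   # ~166-250 W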

            If I do have a PC on for a download or something, I use

        • It all depends where you live; electricity prices seem to vary massively across America and presumably even more around the world.

          Further complicating matters, your local climate, building design, and heating or cooling systems affect the real cost of indoor electricity usage for you. If you live in a cold climate and use resistive electric heating, then running your computers is effectively free because it just displaces heating. OTOH, if you live in a hot climate where you are running aircon all the time, then

        • by kesuki ( 321456 )

          Why not grab a Kindle Fire, show her how to use it, and realize it uses less TEI (total environmental impact) for way less than said $500 system (though i3s sell at Walmart for roughly $375)? If she can handle a B&W e-ink screen, the DX has unsurpassed screen size and lifetime 3G for Whispernet, though it costs as much as a Walmart PC, and unlike the Fire it is not in color (yet; color e-ink has been POCed). I've heard that a rooted Kindle lasts 8 hours a charge, and if left unrooted lasts 1-2 months with wifi/3G disabled r

    • Comment removed based on user account deletion
  • Kill-a-watt meter (Score:5, Interesting)

    by stevegee58 ( 1179505 ) on Tuesday May 29, 2012 @09:20AM (#40141965) Journal
    I bought a kill-a-watt meter a while back when I started dabbling in Bitcoin mining and it was a real eye-opener.

    It's a very similar problem to OP's situation since Bitcoin mining and gaming both use high performance video cards.
    • I bought a kill-a-watt meter a while back when I started dabbling in Bitcoin mining and it was a real eye-opener.

      It's a very similar problem to OP's situation since Bitcoin mining and gaming both use high performance video cards.

      You can't say that and leave us hanging - did it cost more in electricity than you gained by mining bitcoins?

      • Re: (Score:3, Interesting)

        by Anonymous Coward

        Back when BTC was above $8 and you were using modern Radeon cards, it was roughly break-even. If the rig is in a room that needs to be air-conditioned, I would ballpark triple the energy costs. I decided it wasn't worth it unless it was winter.

        • A/Cs have a coefficient of performance of about 3 (they move 3 watts of heat for every watt of electricity), so you need only add about a third to the figure when adding in A/C.

      • Even at $5/BTC I'm still profitable with electricity at $0.07/kWh.
        • by DavidB ( 23006 )

          I ran a kill-a-watt test recently. It costs about $5.50/month to run the PC idle and $13.50/month to run it as a miner with an ASUS EAH6850 graphics card. I mine at about 230 Mhash/sec, which earns about $22.63/month at current difficulty and exchange rate.
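          From those figures, the margin works out as follows (a quick Python sketch; "current difficulty and exchange rate" means as of the comment, of course):

          idle_cost = 5.50      # $/month with the PC sitting idle
          mining_cost = 13.50   # $/month with the miner running
          revenue = 22.63       # $/month at ~230 Mhash/sec

          print(f"profit vs. not running the PC at all: ${revenue - mining_cost:.2f}/month")                 # $9.13
          print(f"profit vs. a PC that idles anyway:    ${revenue - (mining_cost - idle_cost):.2f}/month")   # $14.63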

    • Did you factor the cost of the meter into your calculation? I'm not sure Geoff here did that.
  • Components (Score:4, Interesting)

    by SJHillman ( 1966756 ) on Tuesday May 29, 2012 @09:22AM (#40141981)

    What about switching out power-hungry gaming cards for newer, more efficient ones? This year's mid-range model may have performance comparable to last year's upper-mid-range model but might draw half the power. Over time the lower power consumption adds up, not to mention you can get by with a smaller power supply. Likewise, trading in your hard drives for a solid-state drive (maybe keeping a green HDD for extra storage)? And for old-timers, switching out CRTs for LCDs? Overall, I think it'd be easier for people to upgrade to more energy-efficient components than to change their PC usage habits. Lowering the sleep/HDD shutoff/monitor shutoff timers can make a big difference too, without having to remember to shut down your PC every day or wait for it to reboot. Not an option for everyone, but gamers usually aren't on a shoestring budget, or else they wouldn't be able to afford the PC and the games in the first place.

    • by gl4ss ( 559668 )

      You'd be playing for a couple of years to justify the cost of upgrading just for that reason.

      The whole debate is stupid: the ratio of time spent (presumably happily) to money spent on electricity is pretty much nothing compared to just about any hobby. Hell, even just buying sneakers is more expensive per year.

      Not to mention the energy costs incurred when the equipment was made.

      Just buy a phone and play with it? Uses much less energy. The games suck, though.

    • by Rogerborg ( 306625 ) on Tuesday May 29, 2012 @09:45AM (#40142245) Homepage
      Indeed, I scrap my hardware every 2 months so that I can be absolutely sure that I'm saving money and preserving the environment.
      • Smart choice! My water usage also dropped tremendously when I started to buy a new set of dishes instead of washing them.
    • by xdroop ( 4039 )

      This doesn't work for the same reason that virtualization rarely yields absolute savings. Instead of "doing the same with less", the pointy heads see all this newly freed-up hardware and decide to re-use it. You end up "doing even more with the same". So your costs per work unit go down, but your absolute costs stay the same (or go up once virtualization costs are factored in).

      The same goes for people buying hardware. We rarely say "oh, I can buy this computer that has A) the same performance and B) bet

      • by laffer1 ( 701823 )

        It really depends on the situation. For example, I build packages for my open source project. The computer science department donated 20 machines for use in a cluster while I was there, and I could build around 2000 packages in 10 days. After I left the university, I had to do it with my own computing equipment. Today I can build the same software in about 2 days with my desktop computer. If I were paying for the electricity to run 20 Dell Optiplex systems with Pentium 4 1.7GHz-2.0GHz CPUs + IDE disks to the

    • I don't think the parent is suggesting that you buy components to replace fully functioning and useful parts just to save electricity. Potentially, though, you could save real, actual money by buying newer parts rather than upgrading your current, old hardware.

      I ran an 8800GTX until it died, but around 6 months ago I had decided I needed an upgrade (before it failed). If I had gone ahead with the upgrade, I would have paid £100 for the card, and another for a 1kW PSU to handle the draw. Those
      • I should have stipulated that the graphics card upgrade was going to be a second 8800GTX in SLI, meaning graphics alone would have drawn around 650W.
      • Yes, I was referring to regular upgrades you might do anyway. For example, the Radeon HD 7850 (this year's mid-range model) and the 6950 (last year's upper-mid-range model) have comparable performance, but the 7850 draws about two-thirds the power or less, depending on benchmarks. The 6950 sells for less, but the power consumption may make the total cost of ownership similar to or greater than the 7850's.

      • Even with a lousy HDD-of-no-particular-importance, I find that the big timesuck on boot isn't the booting but the "getting all the browser pages and documents and whatnot back to where I left them" problem (yes, even in applications that support session restore, you still run into issues like webpages that have decided to nuke the contents of form fields and such).

        For that reason alone, the only real choice is between suspend-to-RAM and suspend-to-disk. With your contemporary soft-off PSU burning a few w
  • Wow, earth-shattering news here: turning off your PC when you're not using it saves you a significant amount of money! What about factoring in cooling costs? High-end gaming machines put out a lot of heat too. Since many gamers are using SSDs these days, sleeping your computer is great; they resume so fast. It's just common sense. I make sure everyone in my house shuts down or sleeps their machines at night if there is not a valid reason why they are on. It really does help. The real problem with this
    • Wow, earth-shattering news here: turning off your PC when you're not using it saves you a significant amount of money!

      Significant? $5.50 a month is hardly "significant".

      Nor is the $30-and-change-per-year cost of gaming "significant".

      • Significant? $5.50 a month is hardly "significant".

        Nor is the $30-and-change-per-year cost of gaming "significant".

        Especially when you factor in that gaming will keep you away from other hobbies that might be more expensive, such as RC airplanes/cars, porn, collecting things, cars, girls (plus you don't need to worry about having kids, which cost even more money!), along with many other things.

    • by mcgrew ( 92797 ) *

      sleeping your computer is great; they resume so fast.

      Another advantage (in Windows; Linux doesn't have this problem) is avoiding having to restart every application after a boot. I don't mind booting my Linux box, but I HATE booting Windows. Ironically, I almost never have to boot the Linux box but am forced by its updates to boot the Windows box.

  • Now do a calculation of how much of your employer's time you wasted doing your calculation!

    If you make all the bad assumptions the RIAA makes, I bet you can make it hit a cool million, easy!

  • True costs? Where is the vitamin D deficiency, the light sensitivity, the prices for Bawls and Red Bull, the price of pizza, the radon exposure from your mom's basement, the Depends for long raid nights, the divorce costs, the hardware costs and software licensing, and the general lowering of testosterone levels? Of course, the benefits are: water savings because of fewer baths, no social costs (coffee shops, movies, dates, video rentals, vacations, etc.), no expensive presents for friends, less electricity used in the house because no ot
  • by Cylix ( 55374 )

    I would suspect C3 sleep states are supported on a majority of systems by now; perhaps I was just lucky when I picked up the hackintosh board a few years ago. Now I simply use a reasonably long idle timer and the system goes to sleep/powers off. It takes a few seconds to come back out of that state, which wholly beats a cold start.

    I guesstimate my home system gets about 3-4 hours of usage each day during the week. In addition, there are plenty of other devices around the house which support other core service

  • by nashv ( 1479253 ) on Tuesday May 29, 2012 @09:33AM (#40142111) Homepage

    All in all, that is really peanuts in terms of electricity bills. If you are spending roughly 2 hours a day gaming, then as a normal person with a full-time job and a family you would have very little time left for much else that can sink money.

    Considering that yearly electricity bills routinely reach $1,000+ for a standard household [eia.gov], the roughly 10% added by gaming is pretty insignificant compared to other hobbies... like racing cars, for example.

    Sure, there may be cheaper hobbies, but I honestly don't think anyone well-settled enough to be practising a daily hobby and deriving enjoyment from it finds it a problem to spend 8 bucks 50 cents a month on their recreation.

  • If only my phone service was that cheap.
  • As compared to...? (Score:4, Insightful)

    by geekmux ( 1040042 ) on Tuesday May 29, 2012 @09:53AM (#40142345)

    I'm not sure what exactly the article is trying to convey here, as measuring electrical consumption is merely fine-tuning an existing expense related to a hobby, and an obscenely small amount of money at that (c'mon, ~$30/year? People will spend twice that much in a month on caffeine just to fuel said hobby).

    Compare playing video games to spending money on cable TV. Or going to the movies. Or riding a bike outside. Discussing literally pennies of electrical savings per day seems rather pointless when you're spending considerably more to sustain that kind of hobby in the first place.

  • As of last month's bill I am paying 28.8 cents per kWh. I'm not sure how much power my computer uses, but with my Nvidia GTX 280 and an overclocked 4GHz dual-core CPU I would assume at least 400 watts, particularly while playing a game. So let's say 12 hours for a day of gaming: that's 4.8 kWh, or $1.38 per day of marathon gaming. If you assume 4 days per week, that works out to $22.12 per month or $265.42 per year. Of course my computer may actually use 500 or 600 watts while gaming. What interests me more is how much pow
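    The same arithmetic as a Python sketch (the 400 W draw is the assumption above; 4 gaming days a week counted as 16 days a month):

    RATE = 0.288       # $/kWh
    DRAW_KW = 0.400    # assumed system draw while gaming
    HOURS = 12         # hours in a marathon day

    day = DRAW_KW * HOURS * RATE
    month = day * 16                # 4 days/week x 4 weeks
    print(f"${day:.2f}/day, ${month:.2f}/month, ${month * 12:.2f}/year")
    # -> $1.38/day, $22.12/month, $265.42/year; scale DRAW_KW up for a 500-600 W system.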

  • Keep the inside of your computer clean: with clogged filters and dusty fans, it takes more power to keep the computer cool.
    • by ledow ( 319597 )

      A fan uses a handful of watts. Literally. But your graphics card can easily pull hundreds of watts.

      A fan, dirty or not, is the least of your worries power-wise. Most storage devices draw more than fans do.

      Again, yet another case of worrying about minor pittances while your house's heating/air conditioning pulls kilowatts for hours on end.

      • Hey, there is the cost of components killed by heat, both in terms of your pocketbook and the environment. Of course, turning the PC off when you're not using it also helps keep it from getting clogged up with dust so fast.

  • by drdrgivemethenews ( 1525877 ) on Tuesday May 29, 2012 @10:32AM (#40142985)
    You're working on one of the smallest possible incremental changes in your house's electrical usage. What's the point?

    The wall warts (AC adapters) scattered about your house almost certainly use and waste more electricity than your PC. The US EPA guesstimated in 2005 that around 200 gigawatts (6% of US total power) goes through these things, and a significant portion of that (30-50%) is wasted.

    See http://www.buildinggreen.com/auth/article.cfm/2005/3/1/Efficiency-Standards-for-AC-Adapters/ [buildinggreen.com]. Getting all your wall warts onto centrally controlled power strips would seem like an interesting and money-saving challenge. If anyone has done that, I'd love to hear about it.
  • Turning off your computer saves electricity!

    I mean seriously, wtf.

  • Reality sucks, eh? (Score:4, Insightful)

    by TheSkepticalOptimist ( 898384 ) on Tuesday May 29, 2012 @10:49AM (#40143301)

    Someone just moved out of his parents' house and realized that electricity actually costs money. Spoiler alert: 40-minute hot showers also cost a lot on the water and gas bills.

    It's hilarious to me when teens/early twenty-somethings leave the protected isolation of their parents' nest or university dorm and suddenly get a good ol' dose of reality.

    • by dkf ( 304284 )

      40 minute long hot showers also costs a lot on the water and gas bills.

      In fact, heating water is one of the more expensive things you can do in energy terms (water has quite a large thermal capacity, after all). A quick back-of-an-envelope calculation puts the cost of a 40-minute shower somewhere in the region of 10-12 kWh. (The standard US shower flow rate is 2.5 gallons per minute, assuming you're looking to raise the water temperature by around 50°F.)
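      Unfolding that envelope (8.34 lb/gal and 3412 BTU/kWh are standard constants; water-heater losses would push the number a bit higher):

      FLOW_GPM = 2.5      # US shower flow, gallons per minute
      MINUTES = 40
      DELTA_F = 50        # temperature rise, degrees Fahrenheit
      LB_PER_GAL = 8.34   # weight of a gallon of water
      BTU_PER_KWH = 3412

      gallons = FLOW_GPM * MINUTES              # 100 gallons
      btu = gallons * LB_PER_GAL * DELTA_F      # 1 BTU heats 1 lb of water by 1 F
      print(f"{btu / BTU_PER_KWH:.1f} kWh")     # ~12.2 kWh, the top of that range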

  • If he weren't interested in gaming he could likely make do with a much less powerful GPU and/or possibly a more power-efficient CPU. The combination of those two would reduce his power consumption even further during non-gaming-related computer usage (or idling).
  • To really figure the electrical cost of gaming, you have to figure out what else people would be doing if they weren't playing games. Some activities, like watching TV, would use as much or more power.

    My guess is if we calculated the energy use of those other activities, gaming might be a net energy saving activity.

  • ... too expensive.

    I used to use an old notebook for day-to-day computing. A Celeron M450, to be exact.

    But the damn thing died, and I ended up resurrecting my Athlon XP 3.0G with an ATI HD 3850 to do the job.

    (OK, I'm hearing a lot of laughs, but this machine was, a long time ago, a powerful computer! =P)

    The crude fact is that my electric bill rose 25% (sigh). In one year, the accumulated difference will be more than the market price of this computer.

    Things could be worse, however. My "Media Center" is a A

  • I can pretty much find a bunch of equivalent expenditures and compensate in one manner or another.

    If you want to see real money, figure the hours spent gaming instead being put toward a second income. That might make you wince.

"Pok pok pok, P'kok!" -- Superchicken

Working...