Power Consumption of a Typical PC While Gaming

cliffski writes "How much does your PC really draw in terms of power when idle, when in sleep, and when playing a demanding game? I don't trust everything the manufacturers of hardware say, so I thought I'd get myself a watt measuring device and run a few tests on some of the gear I leave on all the time, and the gear I go to the trouble of turning off. The Linksys router drew 8 watts, the monitor drew a fairly noticeable 30-31, but what surprised me was how little power the base unit drew, even when playing Company of Heroes. Also, the variance of power draw for Vista seemed minimal, regardless of what you got the machine to do."
  • by mr_mischief ( 456295 ) on Wednesday June 25, 2008 @02:56PM (#23938855) Journal

    Any decent electronics project book will verify that any copper or aluminum wire will gain resistance with increasing temperature.

    If you want a quick link, though, how about this article at Dan's Data about power supplies [] which actually gives some basic theory? It's a little suspect in that it's a review of a particular brand of power supply, and Dan's Data isn't as widely known as Tom's Hardware or Anandtech. What do you want from the very first Google result for the search "warmer power supplies draw more current", though? It also happens that he's right (about the issue, anyway -- I've never reviewed or purchased Topower power supplies).

  • I need one of those (Score:2, Informative)

    by legoman666 ( 1098377 ) on Wednesday June 25, 2008 @02:59PM (#23938911)
    I have 4 monitors on my machine: 3 19" LCDs and 1 22" LCD. The PC itself is a Q6600 @ 3.1 GHz and 2 HD3870s, also overclocked. The CPU and both GPUs are water-cooled. There are also 4 hard drives and a sound card.

    I think I've estimated the power draw at around 450W under full load (not including the monitors; 3 of them are turned off when I play games).

    Luckily I don't pay my electric bill.

  • by cliffski ( 65094 ) on Wednesday June 25, 2008 @03:00PM (#23938931) Homepage

    Hi. There's a single hard drive in there, a typical 'shipped with the box' PSU, no wireless card or anything fancy.
    The PC is from Mesh Computers, about a year old.

  • by ivan256 ( 17499 ) on Wednesday June 25, 2008 @03:04PM (#23938995)

    I can hardly believe that the router mentioned was using 8 watts; what is the time period there? I know the power supply on my Linksys router is rated in the milliamps, so basic conversion would indicate to me that is not possible. I am probably thinking of something backwards, though.
    Time is a component of the "watts" unit. One watt is one joule per second. So the time period is irrelevant.

    8 watts at 120 volts (simplistically speaking []) would only be about 66 milliamps.
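
    As a hedged sketch of the arithmetic above (Python chosen arbitrarily; the function name is mine):

    ```python
    # Real power (W) = RMS voltage (V) x RMS current (A) for a resistive
    # load, so current = power / voltage. Power factor is ignored here.
    def current_draw_amps(power_watts, voltage_volts):
        return power_watts / voltage_volts

    # An 8 W router on a 120 V supply implies roughly 67 mA.
    print(round(current_draw_amps(8, 120) * 1000))  # -> 67
    ```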

  • by compro01 ( 777531 ) on Wednesday June 25, 2008 @03:30PM (#23939411)

    450W at idle?!

    The power rating on the power supply indicates how much it is capable of supplying. It will not draw more than is needed, plus some for the thermal losses in the power supply (a good many supplies are better than 80% efficient).

    My fairly powerful system (3 GHz Core 2 Duo, 2 GB RAM, 500 GB hard drive, 8800GTS, 700W generic-brand power supply, and a 17" CRT from 7 years ago) draws about 370W running flat out according to my UPS (a 780W/1200VA APC unit) and sits at about 200-250W at idle (for various definitions of idle).

  • by Amouth ( 879122 ) on Wednesday June 25, 2008 @03:30PM (#23939421)

    On top of that, the ratings you see on the power blocks are normally the DC output. The Kill A Watt measures the power draw on the AC side, before conversion, so there is no doubt that router is running on less than 8 watts, as you lose some in the conversion as heat. And most bricks aren't what you would call efficient devices; they are cheap.

    But as the power total isn't much, the efficiency isn't that big of a deal.

    If you are only 50% efficient in conversion but that loss is only, say, 10 watts, it's no big deal compared to a loss of several hundred watts with an old AT non-switching power supply.
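
    A minimal sketch of that brick-loss arithmetic (Python; the numbers are illustrative, not measurements):

    ```python
    # AC power drawn at the wall for a given DC load and converter
    # efficiency (0 < efficiency <= 1); the difference is lost as heat.
    def wall_draw_watts(dc_output_watts, efficiency):
        return dc_output_watts / efficiency

    # A router needing 4 W of DC through a 50%-efficient brick pulls
    # 8 W at the wall; half the measured draw is conversion loss.
    print(wall_draw_watts(4, 0.5))  # -> 8.0
    ```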

  • by sm62704 ( 957197 ) on Wednesday June 25, 2008 @03:54PM (#23939767) Journal

    Laser printers are power hogs by the very way they work. They're space heaters! If you have a laser printer running, it most likely consumes more power than all your other hardware combined, and certainly puts out more heat.

    The way a laser printer works is that the laser beam puts an electrostatic charge wherever it lands - which would be where you want the paper to be black.

    The charge on the paper attracts the toner, which is black plastic ground into a fine powder. A heater in the unit, running at a few hundred degrees Fahrenheit, melts the black plastic onto the paper.

    If you care about your wallet (let alone global warming from the coal they have to burn for its electricity), you'll keep the laser turned off most of the time.

  • by Anonymous Coward on Wednesday June 25, 2008 @03:54PM (#23939781)
    Really? []
  • by tepples ( 727027 ) <> on Wednesday June 25, 2008 @04:19PM (#23940225) Homepage Journal

    Time is NOT a component of the "watts" unit. [...] The joule is the SI unit of energy, measuring heat, electricity and mechanical work.
    The watt is a unit of power, and power is the rate of energy transfer. A watt is one joule per second (1 J/s); there's a time component on the bottom of the fraction.

    A watt is current times voltage; e.g. at 100 volts and one ampere, you have 100 watts.
    Right, but voltage itself is derived from power. A volt is a watt per ampere, or a joule per ampere-second, or a joule per coulomb.

    but if you check your electric bill you'll see that you aren't billed for watts used, but kilowatt hours used.
    I seem to remember reading that electric companies bill for both provisioning and usage: one rate for peak power (in watts) that could be used and another for energy (in kilowatt hours or megajoules) that is used. I will grant that some electric companies don't show provisioning as a separate line item on single-family residential accounts.
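    The power-versus-energy distinction above can be made concrete with a small sketch (Python; the 150 W figure is just an example):

    ```python
    # Energy = power x time. Utilities bill energy, not power.
    def energy_kwh(power_watts, hours):
        return power_watts * hours / 1000  # kilowatt-hours

    def energy_megajoules(power_watts, hours):
        return energy_kwh(power_watts, hours) * 3.6  # 1 kWh = 3.6 MJ

    # A 150 W PC left on for 10 hours uses 1.5 kWh, i.e. about 5.4 MJ.
    print(energy_kwh(150, 10))  # -> 1.5
    ```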
  • by camperslo ( 704715 ) on Wednesday June 25, 2008 @04:53PM (#23940787)

    Any decent electronics project book will verify that any copper or aluminum wire will gain resistance with increasing temperature.

    While what you say is true, there is no reason to believe that resistance losses are a significant portion of the total losses in our power supplies, or that those losses increase by a significant percentage over the temperature range seen. Without proper analysis, facts can be used to jump to the wrong conclusions. (Compare with the dangers/effects of high-power microwaves and discussion of WiFi, for example.)

    In power supplies I've built that were similar but not identical to PC supplies, most of the losses were switching/conduction losses in the power transistor(s) and in the rectifiers. In the case of the rectifiers the conducting voltage drop actually gets SMALLER at higher temperatures.
    In practice, the main concern about elevated temperatures in a PC is an increase in the failure rates of components. Some simply fail if too hot. Thermal cycling can also cause cracks in solder connections over time. That means high temperature operation elevates failure rates both when it occurs and to a lesser extent later.

  • Re:Consoles (Score:5, Informative)

    by CastrTroy ( 595695 ) on Wednesday June 25, 2008 @05:02PM (#23940899) Homepage
    Here's a comparison [] that shows XBox 360 vs. PS3 vs. Wii vs. PC in many different areas including standby, idle, gaming, and movies (Wii not included in movies).
  • by Macman408 ( 1308925 ) on Wednesday June 25, 2008 @05:23PM (#23941151)

    Typical air conditioning can remove heat from your house with a 30-45% penalty; e.g., running a 100W appliance might cost another 35W in air conditioning. Incidentally, at least for air-conditioning planning, I've seen a human listed as producing about 600 BTUs per hour, or 175 watts. So your room might be warming up as much from you using the computer as it is from the computer itself.
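
    That air-conditioning penalty is easy to sketch (Python; the 35% figure is the midpoint of the range quoted above):

    ```python
    # Every watt an appliance dissipates must be pumped back out by the
    # A/C, costing an extra ~30-45% on top of the appliance's own draw.
    def total_cost_watts(appliance_watts, ac_penalty=0.35):
        return appliance_watts * (1 + ac_penalty)

    # A 100 W appliance effectively costs about 135 W in a cooled room.
    print(total_cost_watts(100))  # -> 135.0
    ```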

    Also, some people seem surprised that their computer has a 450W power supply even though it is only drawing 150 Watts. This is because a power supply needs to supply the peak power for all accessories that might ever be installed. If you buy a computer from Apple or Dell, the power supply needs to have enough capacity to handle not only what you're getting, plus power for extra hard drives, PCIe cards, USB devices, FireWire devices, and anything else that might be added later. Furthermore, the power budget that they work with during design likely takes into account the maximum power specified for every single chip on the motherboard, even though it is unlikely that any one of them could reach that limit, much less all of them at the same time. The CPU's specifications might require 80W, even if it's only for a few milliseconds, and for the worst combination of operating temperature, manufacturing variance, and CPU load. There's a large margin built in to the design to ensure that your computer's power needs won't exceed what can be supplied.

    In my case, my Dual 2.7 GHz PowerMac G5 has a 600W power supply, even though the peak usage I've measured is around 250W. Another 90W or so is reserved for PCI/PCI-X slots that I don't use, plus there's capacity needed for another hard drive, and 4 more sticks of RAM. Add in 15W available for bus power on FireWire, 2.5W for USB bus power... Then there's the difference between the actual sustained peak usage and the specification's instantaneous peak usage, which increases the requirements significantly. Pretty quickly, it adds up to something pretty close to 600 Watts.
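
    The headroom accounting in that G5 example might be sketched like this (Python; every line item is illustrative, echoing the rough figures above rather than Apple's actual budget):

    ```python
    # Rough worst-case power budget for a machine that only ever
    # draws ~250 W in practice; all line items are hypothetical.
    budget_watts = {
        "measured peak draw": 250,
        "unused PCI/PCI-X slots": 90,
        "extra hard drive": 25,
        "4 more RAM sticks": 20,
        "FireWire bus power": 15,
        "USB bus power": 2.5,
        "spec vs. sustained-peak margin": 150,
    }
    print(sum(budget_watts.values()))  # -> 552.5
    ```

    The total lands near, but safely under, the 600 W supply rating.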

    I borrowed a Watts Up meter from the local library (the local power company supplies them to area libraries). I'd suggest that those interested in learning about their power consumption check if there is a similar program in your area, or ask if the power company, library, environmental group, or other organization would be interested in starting one. Or, offer to buy one and donate it when you're done, and encourage others to do the same.

  • by IYagami ( 136831 ) on Wednesday June 25, 2008 @05:50PM (#23941527)

    You should take a look at []

    This is the main information:
    Power Consumption in Games
    PS3: 185.9 W average
    Xbox 360: 176.54 W
    PC (see link for more information): 156.6 W
    Wii: 16.8 W

  • by mollymoo ( 202721 ) on Wednesday June 25, 2008 @05:59PM (#23941629) Journal

    So does your computer; power supplies get less efficient the warmer the room is. So while you're using only 200 watts at 70 degrees, at 85 degrees it's probably past 250.

    Resistance does increase with temperature, and a thermally controlled fan will spin faster and draw more current. But enough for a 25% rise in consumption from a 15 degree (in unspecified units, I guess you mean Fahrenheit) temperature rise? That seems like a hell of a lot for a fairly modest rise in temperature.

    For an 80% efficient power supply, an increase of 25% overall consumption is more than double the power loss. The reality is very complex, but we can pick out a few relevant numbers to get a feel for the magnitudes involved. Empirical testing would be easier than an analysis, but here's some food for thought:

    For copper, the resistance rises by about 0.4% per degree Celsius rise. Your roughly 7 Celsius rise would increase it by a whopping 2.8%. You'll have melted the insulation well before even a 50% rise in the resistance of your copper wire.
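
    That copper figure works out as follows (Python; 0.4%/degree Celsius is the coefficient quoted above):

    ```python
    # Fractional resistance increase of copper for a temperature rise,
    # using the ~0.4% per degree Celsius coefficient.
    def copper_resistance_rise(delta_celsius, alpha=0.004):
        return delta_celsius * alpha

    # A 7 C rise raises copper resistance by only about 2.8%.
    print(round(copper_resistance_rise(7) * 100, 1))  # -> 2.8
    ```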

    If you look inside a power supply, you'll see a big fat heatsink. Attached to that are rectifiers and switches - diodes and FETs. That's where a big proportion of your power supply's inefficiency comes from. Looking at the first power FET datasheet I have to hand (for a Fairchild HUF75337P3), the on resistance increases by something like 1% per degree Celsius rise. For diodes, the forward voltage drop actually decreases with increasing junction temperature, so they get more efficient. For an International Rectifier 12CWQ03FN it looks to be about 0.2% lower per degree Celsius rise.

    The YS-Tech 80mm fans in this box next to me consume 0.84W at full speed. That's a slow fan though, I wouldn't be surprised if more typical ones used 2-3W at full speed.

    Hardly a complete analysis, but I just can't see where you're getting this additional 50W from. I think you're out by an order of magnitude.

  • Re:No sources needed (Score:3, Informative)

    by pushing-robot ( 1037830 ) on Wednesday June 25, 2008 @06:08PM (#23941747)

    So, if you have a 200 watt power supply, making 200 the 80%, you would be drawing around 250 watts of power.

    A very common fallacy is that a PSU always draws as many watts as it's rated for; in other words, a 500-watt PSU constantly draws 500 watts or more. This is incorrect; your PSU only supplies (and draws) as many watts as your computer currently needs.

    "80 Plus certified" means the PSU was tested to be 80% efficient at 20%, 50%, and 100% load. Assuming you have a fairly low-end system, your 200-watt PSU may never supply more than 100 watts, and therefore (being 80% efficient) never draw more than 125 watts. If you added a component to your system that consumed an extra 20 watts, your PSU would supply an additional 20 watts, and draw an additional 25 watts (again, 80% efficiency). Simple as that.
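
    The efficiency arithmetic in that paragraph, as a quick sketch (Python; a real PSU's efficiency varies with load, so the flat 0.8 is a simplification):

    ```python
    # Wall draw for a given DC load at a fixed conversion efficiency.
    def wall_draw(load_watts, efficiency=0.8):
        return load_watts / efficiency

    print(wall_draw(100))                   # -> 125.0
    print(wall_draw(120) - wall_draw(100))  # -> 25.0 (an extra 20 W load)
    ```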

  • Re:Accuracy (Score:4, Informative)

    by Quelain ( 256623 ) on Wednesday June 25, 2008 @06:31PM (#23942051)

    I don't think they are accurate at all on switch mode power supplies. I have one which is definitely wrong when measuring a PC PSU.

    I think they expect to see peak current at the peaks of AC voltage, but a switch mode PSU will take small bites of current which may or may not coincide with the voltage peaks.
