How-To Geek

What Exactly Does the Wattage Rating on a Power Supply Unit Mean?

Your PSU is rated 80 Plus Bronze at 650 watts, but what exactly does that mean? Read on to see how wattage and power efficiency ratings translate to real-world use.

Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

The Question

SuperUser reader TK Kocheran is curious about power supplies:

If I have a system running at ~500W of power draw, will there be any tangible difference in the outlet wattage draw between a 1200W power supply versus, say, an 800W power supply? Does the wattage only imply the maximum wattage available to the system?

What is the difference? And what, for that matter, do the 80 Plus designations mean on modern PSUs?

The Answer

Contributors Mixxiphoid and Hennes share some insight into the PSU labeling methods. Mixxiphoid writes:

The wattage of your power supply is what it could potentially supply. In practice, however, the supply won’t ever reach that. I always treat 60% of the rated capacity as the true maximum capacity. Today, however, there are also Bronze, Silver, Gold, and Platinum power supplies, which guarantee a minimum level of efficiency (at least 80%). See this link for a summary of 80 PLUS labels.

Example: If your 1200W supply has an 80 PLUS label on it, it can supply up to 1200W to your components but may draw up to 1500W from the wall. I think your 800W supply will be sufficient, but it won’t give you much safety margin.
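Mixxiphoid’s arithmetic can be sketched in a few lines of Python. The efficiency figures below are the 50%-load, 115 V minimums published by the 80 PLUS program; treat them as illustrative assumptions and check the current spec before relying on exact values:

```python
# Wall draw implied by an 80 PLUS efficiency tier.
# Efficiencies are the 115 V, 50%-load minimums for each tier
# (assumed here for illustration; verify against the current spec).
TIER_EFFICIENCY = {
    "80 PLUS": 0.80,
    "Bronze": 0.85,
    "Silver": 0.88,
    "Gold": 0.90,
    "Platinum": 0.92,
    "Titanium": 0.94,
}

def wall_draw(dc_load_watts: float, tier: str) -> float:
    """AC watts drawn from the socket to deliver dc_load_watts to the PC."""
    return dc_load_watts / TIER_EFFICIENCY[tier]

# A 600 W DC load on a plain 80 PLUS unit draws 600 / 0.80 = 750 W at the wall.
print(round(wall_draw(600, "80 PLUS")))  # 750
```

Note that the divisor is the efficiency at the PSU’s *current* load point, not a single fixed number, which is why the curve discussed below matters.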

Hennes explains the value of a system-appropriate PSU:

The wattage implies the maximum available wattage to the system.

However note that the PSU draws AC power from the wall socket, converts it to some other DC voltages, and provides those to your system. There is some loss during this conversion. How much depends on the quality of your PSU and on how much power you draw from it.

Almost any PSU is very inefficient when you draw less than 20% of its max rated power. Almost any PSU has less than peak efficiency when you draw close to its max rated power. Almost any PSU has its optimum efficiency around 40% to 60% of maximum load.

Thus if you get a PSU which is ‘just large enough’ or ‘way too big’, it is likely to be less efficient.
[But note that your PC does not consume a fixed or constant level of power. At idle, when not much is happening, the DC power consumed will be low. Perform a lot of processing and I/O operations and the power demand goes up.]
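The curve shape Hennes describes can be sketched as a toy model. The numbers below are invented purely to illustrate the shape (poor at light load, a plateau around 40–60%, a slight drop near full load) and are not measurements of any real unit:

```python
# Illustrative (made-up) PSU efficiency curve matching the shape described:
# poor below ~20% load, best around 40-60%, slightly worse near full load.
def efficiency(load_fraction: float) -> float:
    """Rough sketch of efficiency vs. fraction of rated load."""
    if load_fraction < 0.20:
        return 0.60 + load_fraction          # climbs steeply out of idle
    if load_fraction <= 0.60:
        return 0.85                          # broad plateau near the optimum
    return 0.85 - 0.05 * (load_fraction - 0.60) / 0.40  # gentle fall-off

for frac in (0.10, 0.50, 1.00):
    print(f"{frac:.0%} load -> {efficiency(frac):.0%} efficient")
```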

A nice example of a real-world efficiency graph is this:

will there be any tangible difference in the outlet wattage draw between a 1200W power supply versus, say, an 800W power supply?

The 800 Watt PSU would run at 62.5% of max rating. That is a good value.
The 1200 Watt PSU would run at only about 42% of its maximum rating. That is still within the normally accepted range, but at the low end. If your system is not going to change, then the 800 Watt PSU is the better choice.

Note that even with a good (Bronze or Silver rated) PSU, you are still losing about 15% during conversion. With a 15% loss, your computer would use 500 Watts, but the PSU would draw about 588 Watts from the wall socket.
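The comparison above works out like this (a minimal sketch, assuming the flat ~85% efficiency Hennes uses):

```python
# Load fraction and wall draw for the two candidate PSUs discussed above,
# assuming a flat 85% conversion efficiency for both.
SYSTEM_DRAW_W = 500
EFFICIENCY = 0.85

for rated_w in (800, 1200):
    load_pct = SYSTEM_DRAW_W / rated_w * 100
    wall_w = SYSTEM_DRAW_W / EFFICIENCY
    print(f"{rated_w} W PSU: {load_pct:.1f}% of rating, ~{wall_w:.0f} W at the wall")
```

With a flat efficiency both units draw the same at the wall; in reality the 1200 W unit sits lower on its efficiency curve, so its actual wall draw would typically be somewhat higher.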

Clearly, you should aim to have your PSU sized appropriately for your system: putting a high-load PSU into a basic desktop machine doesn’t meaningfully increase your safety margin and decreases your efficiency, costing you more money in the long run.

Have a useful link or comment to add to the discussion? Sound off in the comments below. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.


Jason Fitzpatrick is a warranty-voiding DIYer who spends his days cracking open cases and wrestling with code so you don't have to. If it can be modded, optimized, repurposed, or torn apart for fun he's interested (and probably already at the workbench taking it apart).

  • Published 11/27/12

Comments (22)

  1. srsly tho

    It’s really important to buy a quality PSU too.
    Buy a cheap one and it could blow up and take all the other components in your computer with it.

  2. LadyFitzgerald

Even more important than overall wattage is the amperage rating of each rail. One has to take care not to exceed each rating.

  3. Two Replies

    For a second there I thought you were going to explain the power supply input voltage switch (and the dangers of messing with it).

    It would be a good subject for another HowToGeek article! ;-D

  4. tech27

So.. I still don’t understand the whole thing… Does buying a high-load PSU that surpasses the actual wattage your system needs cause problems? Does it damage the components? Say my system only needs a 500W PSU: if I buy a 1200W PSU and put it in my system in place of the 500W one, does it cause any problems? Or can you explain the disadvantages of using a higher-than-suggested PSU in your system?

  5. Bill

tech27 – A device only draws as much power as it needs – it doesn’t draw as much as is available – unless it’s shorted. LadyFitzgerald has the most accurate answer… to really get the optimum match, you’d have to measure all your individual loads under worst-case conditions (drives running, chips working at their hardest), give yourself a safety margin (e.g. 20%), and then pick a supply that matches. I usually just get a rough idea: for the motherboard with processor, drives, etc., you can look up burdens (loads on the system) for pretty much every component, and buy a quality supply somewhere above that. Usually the burdens listed by manufacturers are max loads anyway, so you automatically get some margin.
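Bill’s sizing method can be sketched as follows; the component wattages below are hypothetical placeholders, not real measurements:

```python
# Sketch of the sizing method Bill describes: sum worst-case component
# loads, add a safety margin, then round up to a realistic PSU size.
# The component figures are hypothetical examples, not measurements.
import math

components_w = {"CPU": 95, "GPU": 170, "motherboard": 50, "drives": 30, "fans": 10}
MARGIN = 0.20                      # Bill's suggested ~20% headroom

worst_case = sum(components_w.values())     # total worst-case DC load
target = worst_case * (1 + MARGIN)          # load plus safety margin
psu_size = math.ceil(target / 50) * 50      # round up to a 50 W step
print(worst_case, round(target), psu_size)  # 355 426 450
```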

  6. sorinc

@tech27 — having a larger PSU than necessary would make it run inefficiently, i.e. draw more power from the socket than necessary. A smaller-than-necessary one would make your system unstable, e.g. your video card could crash your system. Your PSU should be sized to sit as close as possible to the 50% vertical in the graph above. So, if you add up the components’ consumption in your system (video card + CPU + HDD + a factor for the rest of the components) and you come up with 400W, you should buy an 800W PSU.

  7. john3347

tech27 – the efficiency factor that the author talks about would come into effect in your proposal to replace a 500 watt power supply with a 1200 watt power supply in a computer that needs a maximum of 500 watts to run everything. Your 1200 watt power supply would be running at a very low load most of the time and would be drawing considerably more power through the meter you pay your electric bill on than the 500 watt power supply would under the same conditions. In other words, if your computer is using 50 watts of power, your 500 watt power supply would be drawing considerably less power from the wall outlet than the 1200 watt power supply would – at the same 50 watt need. This wasted power is given off in the form of heat, and heat is the biggest enemy of electronic components.

  8. 1200burpees

    Everything you asked is explained in the article.

  9. Chris

    If you want to understand power supplies go to Hardware Secrets. The Antec web site has an excellent calculator you can use to see just how much power you need. Some of the responses here show a total lack of understanding.

  10. Salvatore Hidalgiano

How is this information going to help me when buying a new PC?

  11. Frank

    wow – so much uninformed discussion … all you need is a plug-thru meter (cheap from supermarket/hardware stores) to tell you how much power you’re using at any point in time, e.g. I’ve typically measured 80W for old CRT-screen PCs (remember those?) and maybe 60W total for LCD/flat-screen desktop PCs.

I’ve measured an office laser printer to use 600 Watts, but only for a second or two while it’s actually burning the toner onto a page – otherwise it goes back to a tiny fraction of that on idle.

My old desktops I’ve had open had either a 150W power supply or, in newer ones, 200W, so I’m not familiar with who would need 1200W – I presume only the most heavy-duty applications – maybe servers or top-end gaming rigs – I’m not familiar as I don’t use them – my 10″ netbook typically draws between 15-20 Watts so it hardly gets warm after 2 hours of use.

    1200W is more than my bathroom bar radiator heater – in other words – that’s a lot of heat!

by the way – comments on the usual teaser graph
– unlabelled left axis – I figured out it meant % efficiency
– unexplained purpose of 230VAC, 120VAC, 90VAC – do people have a choice to buy different models?
– so I’ll guess someone did the usual copy’n’paste without a lot of understanding
– and the shock-horror maximum difference in efficiency displayed seems no more than about 4% – not something I’d get out of bed to chase …

    so – a fake technical document written by someone who doesn’t really appear to know what they’re talking about – way to impress … maybe not.

  12. Mike

You can buy an oversized PSU; its only downside is wasting electricity.

  13. Private

Hmmm, say for the sake of argument that you have a load of 85 Watts distributed across all outputs on your power supply and that your power supply has an efficiency of 100% (impossible, of course). That means that all the input power gets converted to the output and that your power supply generates no heat, because of the 100% efficiency.

    Your next door neighbor, on the other hand has a power supply load of 100 Watts and his efficiency is 90%. His power supply dissipates 11.11 W in waste heat at that load.

Where the problems arise is when you’re running the power supply close to its maximum load. Let’s use your neighbor’s power supply as an example. He goes out and purchases the newest, most powerful video card, maximum memory, and the largest hard drives he can lay his hands on. This drives his power supply output up to 1000 Watts and he’s dissipating 111.11 Watts in heat. For a 1000 Watt power supply he’s pushing it to an early death due to the temperature of his power supply components. If his apartment (or mother’s basement) has no air-conditioning and the temperature rises to 100 F, then if he’s lucky his power supply will go into thermal limiting and shut down before any permanent damage gets done. If he’s not, he will experience the release of the magic smoke before too long.

    A 50% loading gives you plenty of headroom as to efficiency vs cooling.

    The preceding numbers assume a resistive load, linear behavior, etc (all unrealistic of course). PCs have dynamic loading that varies from idling to maximum and everywhere in between.

    The 110/230 switch allows those with access to 220 VAC to use it. A 1 A load on 110 VAC dissipates 110 Watts. A 1 A load on 220 VAC dissipates 220 Watts. Running your power supply on 220 VAC draws half the current that it would on a 110 VAC line. Lots of power supplies have the 110/220 switch built into the circuitry so it switches automatically.
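The arithmetic in this comment is easy to check (a small sketch; the 550 W figure in the current example is an arbitrary illustration, not from the comment):

```python
# The arithmetic from the comment above: waste heat at a given efficiency,
# and current drawn at 110 V vs 220 V for the same power (I = P / V).
def waste_heat(output_w: float, efficiency: float) -> float:
    """Watts dissipated as heat inside the PSU."""
    return output_w / efficiency - output_w

print(round(waste_heat(100, 0.90), 2))    # 11.11
print(round(waste_heat(1000, 0.90), 2))   # 111.11

# Same wall power needs half the current at double the voltage.
for volts in (110, 220):
    print(f"{volts} VAC: {550 / volts:.1f} A for a 550 W draw")
```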

  14. Ray Cooke

I have an intel quad core @2.4g + 8gig ram. 3 sata hdds + 2 dvd writers. I have a 450w psu and it's running absolutely fine.
Why should I change it?

  15. Private

“I have an intel quad core @2.4g + 8gig ram. 3 sata hdds + 2 dvd writers. I have a 450w psu and it's running absolutely fine.
Why should I change it?”

    Don’t change it. Symptoms of marginal power supply include shutting down for no apparent reason, difficulty in booting, and strange errors. If you’re running fine then that isn’t an issue.

  16. pbug56

An oversized PSU does not always waste power. A high-quality one (i.e. ‘Gold’) is fairly efficient. I built a desktop with an early i7 (overclockable), video card, 2 HDs, video input digitizer, etc. It has an 800 or 850 W Gold PSU. I ran it through a wattmeter, and average use when active was only about 100 watts.

  17. clamo

it means how much power can be drawn from it to power your computer. The more watts, the more you can run off a PSU without forcing it to reboot/power down if too much is drawn.

it doesn’t mean how much power your computer is actually drawing. The average computer pulls around 300 watts.

High-end computers will pull more. I use a 1000W PSU, aka 1 kilowatt, with my high-end system (X58 main board, socket 1366 Core i7 2.8, and a GTX 470); at full load it pulls around 480 watts max.

to measure a computer’s power draw all you need is a watt meter. Newegg has them. I got mine a while back, and they are great for other appliances too.

  18. Salvatore Hidalgiano

    Great comments and lessons !! Thanks for the info.

  19. Ekator

    I think there is some misunderstanding in these comments. Efficiency cannot be measured strictly by the power draw from the wall alone. There’s a difference between power drawn and power consumed. The difference between the two is power wasted. So no… A wall meter by itself will not suffice if efficiency is what you are trying to assess.

  20. Dic

    Without the benefit of a meter, how can one tell which outlet from the PSU is what voltage? This matters to me at the moment, as I’m about to add a 5 v. SSD to a PC.

  21. John

Seems there is some confusion here also in the sense that a basic formula is amps x volts = watts. This was the principle used above by ‘Private’, who said [copy/paste: “A 1 A load on 110 VAC dissipates 110 Watts. A 1 A load on 220 VAC dissipates 220 Watts.”] Now this formula doesn’t include resistance. Resistance is less at 240v than 120v, so it is cheaper, because you pay for resistance. If your pc uses 400 watts… it uses 400 watts worth of power regardless of whether it runs on 120v or 240v (hence the formula A x V = W rewritten like 440w

  22. John

Seems there is some confusion here in the sense of power. I will provide a basic formula and side note you can use. Note that this is not the true, complete formula (that is too complicated for many people…) but the part of it we all can use. Here is the basic formula: Amps x Volts = Watts. By the way, this was the principle used above by ‘Private’, who said [copy/paste: “A 1 A load on 110 VAC dissipates 110 Watts. A 1 A load on 220 VAC dissipates 220 Watts.”] Now this formula doesn’t include resistance. Resistance is less at 240v than 120v, so it is cheaper, because you pay for resistance. You can look at resistance like this. If you were to fill your mouth with water and push it out to fill a glass through a big malt straw, then did the same thing through a really small straw – and by the way, you had to do it with the big straw as fast as you could, then match that time through the small straw – well, I think you get the idea. The extra hard work you do with the small straw someone will charge you for as resistance… it takes you more work to do it, and it does the generator too… it uses more gas when it lugs under a harder load. So you are charged for the 480 watts your pc used + resistance, depending upon your voltage (more on that in a minute…). If your pc uses 480 watts… it uses 480 watts worth of power regardless of whether it runs on 120v or 240v (hence the formula A x V = W rewritten as 480w = 120v x A makes Amps = 4, which reversed again says 4A x 120V = 480 watts. Look at it in 240 volts: 480w = 240v x A makes Amps = 2, which reversed again says 2A x 240v = 480w).
Now that said, we see 480 watts used regardless of voltage. The missing complicated factoring is only the resistance difference between pushing 120v versus 240v through the straw. There is very little difference in such small usages as a computer, because the windings in the power supply are the same size straw, so the only resistance factor is related directly to the voltage difference itself. Commercial applications use bigger straws to compensate for large volume, which helps reduce resistance (you only have to pay for the large wire once… you have to constantly pay for the electricity).
Last… I want to comment on something above I ‘thought’ was going somewhere… then didn’t. And that is that there is an advantage to having a bigger power supply. I used to be a computer technician and would get calls where systems had erratic symptoms… and these are common with inadequate (under-powered) power supplies. Let’s say your system needs 400 watts out of your 480w power supply. If you feed your 480w PSU with 120v, it draws about 565w, less the ~15% efficiency loss mentioned above: 565w @ 85% = 480.25 watts of output. THAT is AS LONG AS the input is 120v. Now if your neighbor (or neighborhood) has loads that are significant (machines, large equipment, things that use LOTS of power – common in business-zoned areas, BTW) near enough to affect the power lines going to your place, this is what happens. Let’s say power sags take the voltage down to 90v from 120v. Substituting into the Ohm’s-law shortcut we just used, A x V = W, we get 4A x 90V = 360 Watts. Oops… we needed 400 and we only have 360. [I used 90 because it was easier to understand than 100; 90 is a little extreme overall, but short-term drops can happen.] So those smaller drops still have an effect, and when combined with the ‘moment’ your pc has all its power-consuming things happen at once, you can experience these odd problems. Many places install UPS systems (Uninterruptible Power Supplies), which provide the necessary voltage from battery storage when it drops, so your pc experiences NO ill effects. For residential-zoned areas – say you live in an apartment building that at different times of the day can have wild fluctuations, because each room runs not directly from the power company but off the main circuitry of the building itself – UPS systems are expensive, so a larger power supply is how we resolved about 98% of these problems.
Put in a larger power supply and figure it out like this. Let us use the 90v example to cover any instance of that. Our power supply Ohm’s-law application would have to be like this: A x V = W, or to make it easier, using values in the order we know them: W = V x A. Filling in the blanks to match the original 480w PSU we get this:
480 = 90 x A, which makes Amps = 5.33. Now to proportionally change that back to the 120v PSU that we will buy, we go A x V = W, or 5.3 x 120v = 636 watts. Now you know that if you buy a PSU close to this range you get AUTOMATIC protection from fluctuations in the power SOURCE, and your power supply output ratios STILL PROVIDE THE ADEQUATE POWER YOU NEED. And don’t even bother worrying about the difference in power usage. Put your wattage meter on it and you’ll probably find you won’t be able to tell the difference. I never used a power meter on one, and whatever accuracy they have I don’t know, but my guess is you won’t see any difference.

Hope this helps. Bigger to this degree, considering any future hardware upgrades, doesn’t hurt anything and in fact provides extra protection. Depending on the age of some units – years ago such voltage changes could burn out your electronics and motors – the complexity of power supplies today is so awesome that they themselves aren’t damaged by such changes; they will protect your equipment and themselves by shutting off in most cases before physical damage happens. So your pc still works when you restart it after the power deficiency, and the physical damage (most likely) isn’t done to the parts, thanks to smart PSUs – but what about the work in the open files not saved, or the settings and configurations, or even a page on the internet you took 2 hours to find whose address wasn’t saved because it conked out just before you were going to save it? I’ve had it happen.

