How much energy do your smartphone, laptop, and tablet chargers really use? Should you unplug them when you aren’t using them to save power and money? We measured exactly how much power a variety of common chargers use, and how much keeping them plugged in will cost you each year.
You’ve probably heard of “vampire power”—the power a device draws in standby mode when you aren’t using it. But just how much vampire power do chargers use, and is it worth the hassle of unplugging them when you aren’t using them?
How We Measured It—and How You Can, Too
We used a Kill A Watt electricity usage meter to measure the power usage of a variety of popular chargers. They’re currently under $25 on Amazon, giving you an easy way to measure your devices, too. Plug the meter into an electrical socket, and then plug another device into the meter. The meter sits between the two and tells you how much energy the device is using. This is very useful if you want to measure your energy use, allowing you to identify power-hungry appliances and devices that should be replaced or adjusted. Look up the rate your electricity company charges you and you’ll be able to figure out exactly how much that electricity will cost you, too.
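That last step is simple arithmetic. Here’s a minimal Python sketch of it; the wattage and rate below are placeholder values, so plug in your own meter reading and your utility’s rate:

```python
def annual_cost_cents(watts, cents_per_kwh, hours=8760):
    """Yearly cost of a device drawing `watts` continuously, given a rate in cents/kWh."""
    return watts / 1000 * hours * cents_per_kwh  # watts -> kWh/year -> cents

# Example: a hypothetical 60 W appliance left on all year at 13 cents/kWh:
print(f"${annual_cost_cents(60, 13) / 100:.2f} per year")  # roughly 68 dollars a year
```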
So, with a meter in hand and a variety of chargers lying around, we got to work and tested them so you wouldn’t have to.
How Much Vampire Power Does a Charger Use?
We plugged in a variety of chargers—iPhone, iPad, MacBook, Android phone and tablet, Windows laptop, Chromebook, and even Nintendo’s 3DS charger. It was immediately obvious there was a problem with the very idea of our test. Having heard about the evils of vampire power and the need to unplug devices when we’re not using them, we were surprised to see that not a single charger used a detectable amount of vampire power when it was plugged into an outlet.
In other words, the meter’s display read a big 0.0 watts, no matter what charger we plugged into it.
But Surely They’re Drawing Some Power!
It’s not entirely accurate to say that each charger was using 0 watts, of course. Each charger is using some fraction of a watt. And it should certainly be detectable at some point!
With that in mind, we had a new idea—plug a power strip into the meter, and then plug multiple chargers into the power strip. Then, we could see just how many chargers it takes for the meter to be able to measure some noticeable electrical draw.
The power strip itself—despite its red LED light—registered 0.0 watts when we plugged it in. We started plugging in chargers and watched the meter continue reading 0.0, even after several chargers were plugged in.
Eventually—with six separate chargers plugged in—we arrived at a solid, measurable reading.
The total vampire power draw of our power strip, combined with chargers for an iPhone 6, iPad Air, MacBook Air (2013), Surface Pro 2, Samsung Chromebook, and a Nexus 7 measured a grand total of 0.3 watts.
Aha! How Much Money Is That?
Finally, we have a measurement to work with: 0.3 watts.
Let’s assume these are all plugged in 24 hours a day, 7 days a week, over an entire year. There are 8,760 hours in a year, so 0.3 watts times 8,760 hours works out to 2,628 watt-hours, or 2.628 kilowatt-hours (kWh).
According to the EIA, the average cost of electricity in the US is 12.98 cents per kWh. This means that those 2.628 kWh of electricity will cost about 34.1 cents over an entire year. Even using the most expensive electricity rates in the US—30.04 cents per kWh in Hawaii—that’s only about 79 cents per year.
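The arithmetic above can be sketched in a few lines of Python, assuming a constant 0.3-watt draw and using the two rates just quoted:

```python
# Annual cost of a constant 0.3 W standby draw, at the rates quoted above.
WATTS = 0.3
HOURS_PER_YEAR = 8760

kwh_per_year = WATTS * HOURS_PER_YEAR / 1000
print(f"{kwh_per_year:.3f} kWh per year")  # 2.628 kWh

for label, cents_per_kwh in [("US average", 12.98), ("Hawaii", 30.04)]:
    print(f"{label}: {kwh_per_year * cents_per_kwh:.1f} cents per year")  # 34.1 and 78.9
```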
The real cost is actually lower, as you’ll be charging your devices with these chargers sometimes, so they won’t always be drawing vampire power. You’ll probably unplug them to take them with you sometimes, too.
But let’s use the highest number—79 cents per year. Divide that by the six chargers (being charitable and ignoring the power strip), and you get about 13 cents per year per charger in Hawaii. At the average US rate, that’s roughly 5.7 cents per charger on your electrical bill.
This Isn’t Meant to Be Precise, But It Answers the Question
This isn’t meant to be a completely scientific or precise test, of course. Some of the chargers likely use more power than others, so the real cost to leave your smartphone charger plugged in for an entire year is probably below 13 cents.
Either way, this shows us that the amount of vampire power consumed by your chargers is extremely small and really isn’t worth worrying about. If you like the convenience of leaving your chargers plugged in, go for it.
RELATED: How to Make Your PC Use Less Power
Yes, it’s true that you could save a tiny amount of electricity by unplugging your chargers, but you could save a much larger amount of electricity by looking to heating, cooling, lighting, laundry, your computer and other more significant power drains. Don’t sweat the chargers.
These are all relatively modern chargers, of course—the oldest one here is from 2012 or so. Much older chargers might actually use a noticeable amount of vampire power. For example, if you still have a cell phone or other portable electronic device from the ’90s, its charger might continually draw a noticeable amount of power if you leave it plugged in—but even that amount of vampire power probably won’t make a noticeable dent in your electricity bill.