If you're anything like me, you have a number of gadgets within arm's reach that are either on charge right now, or will need charging at some point during the day. Maybe it's a smartphone or tablet, or a laptop, or a wearable. They all have one thing in common - they need power.
And that power costs you money. But how much money?
Over the years I've seen a lot of estimates of how much power smartphones, tablets, laptops, and other devices consume, but it has always concerned me that these numbers appear to be pulled out of the air, copied from another website that pulled them out of the air, derived from published battery capacity figures, or derived from lab testing as opposed to real-world usage.
As regular readers will know, I'm a big fan of real-world testing. The only drawback of real-world testing is that my "real world" is going to be different to your "real world," which means that your mileage can, and probably will, vary. But, as long as a few variables are nailed down, these differences shouldn't be too great.
I also have to make some assumptions. I'll outline those as I go.
Finally, I have a lot of ground to cover, and that means I might use terms that are unfamiliar to you. I'll provide links to information where you can learn at your leisure.
I'm going to be using two bits of kit:
Electricity is priced by the kilowatt-hour (kWh), a unit of energy equal to 3.6 million joules. A device rated at 1,000 W running for one hour will use 1 kWh, while a device rated at 100 W will take 10 hours to consume 1 kWh.
Because consumer electronics draw so little power, I will also be using the watt-hour (Wh), where 1,000 Wh equals 1 kWh.
According to figures published by the US Energy Information Administration for January 2016, the average cost per kWh in the US was $0.12. This is the figure I will be using.
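If you want to run the numbers for your own devices and your own electricity rate, the arithmetic boils down to a couple of lines. Here's a rough Python sketch; the annual_cost helper and the $0.12 default are just my own shorthand, not anything official:

```python
PRICE_PER_KWH = 0.12  # US average for January 2016, in dollars

def annual_cost(wh_per_day, price_per_kwh=PRICE_PER_KWH):
    """Turn a daily consumption in watt-hours into a yearly dollar cost."""
    kwh_per_year = wh_per_day * 365 / 1000  # Wh per day -> kWh per year
    return kwh_per_year * price_per_kwh
```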
I've taken different approaches depending on the device I'm testing.
For smartphones and tablets, rather than finding out how much power it takes just to charge the battery from 0 percent to 100 percent and trying to fudge that into some real-world figure, I did what most people do: put these devices on charge overnight and measured the power consumption. I chose this method for two reasons:
I know that some people make heavy use of their devices while they are on charge (I'm usually one of those people), but this didn't feel representative to me. I'm not looking at how much power I can use, but instead looking for a more normal usage pattern.
When it comes to laptops, I want to know two things:
Again, I'm trying to replicate the "real world" (or at least my "real world") as best I can.
None of this is perfect, but overall I'm happy with the approach. If readers have any suggestions, I'm open to feedback and commentary.
OK, with all that out of the way, let's answer those burning questions you have about charging devices.
My test subject is the iPhone 6 Plus, which has the biggest battery that Apple offers. I'm also a pretty heavy user, and this meant that going all day was sometimes tricky (the things I do for you).
During this time I chose not to charge the device during the day or to charge it while in the car.
Here's what I found.
During an overnight charge, the iPhone consumed an average of 19.2 Wh. A minuscule amount, but over a year that translates into 7 kWh, which will set you back $0.84.
Smaller iPhones will consume less, while heavier users will see that bill rise, but you'd have to be a really heavy user for your iPhone to cost you more than a few dollars a year.
Following a similar methodology, an iPad Air 2 consumes around 35.3 Wh during an overnight charge. Over a year that works out at 12.9 kWh, costing $1.55.
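If you want to check my arithmetic, the annual_cost helper from the sketch above reproduces both figures:

```python
print(round(annual_cost(19.2), 2))  # iPhone 6 Plus: ~$0.84 a year
print(round(annual_cost(35.3), 2))  # iPad Air 2: ~$1.55 a year
```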
Heavy users might find they need to charge the battery more than once a day (or they might keep it tethered to the power adaptor for longer), while lighter users might get away with only charging it every few days, reducing the cost of ownership.
The test subject here is a 15-inch MacBook Pro with Retina display, which has a pretty beefy battery.
I found that an overnight charge of the battery consumed an average of 128.5 Wh. That works out at 46.9 kWh per year, or about $5.63.
But, remember that laptops pull more power when they're connected to mains power.
I found that an hour of usage consumed an average of 65.2 Wh. At five hours of usage per day, that translates into 119 kWh over a year, costing you $14.28.
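For the curious, that plugged-in figure works out like this (five hours a day is my usage pattern, not necessarily yours):

```python
wh_per_day = 65.2 * 5                   # 326 Wh per day of plugged-in use
kwh_per_year = wh_per_day * 365 / 1000  # ~119 kWh per year
print(round(kwh_per_year * 0.12, 2))    # ~$14.28 a year at $0.12/kWh
```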
Anyone who owns one of these will know how long it goes on a single charge. I find that I only charge mine overnight once a month, and that charge consumes just 7.7 Wh. Over a year that's less than 0.1 kWh, which costs a little over a cent.
Yes, but only a small amount. My testing suggests that a genuine Apple iPhone charger left plugged in draws about 130 Wh a month, or about 1.5 kWh a year, which works out to around $0.18.
Doesn't seem like a lot, does it? But take the following into account:
Factor in the total environmental cost of these chargers. Millions of chargers left plugged in 24/7 are consuming millions of kilowatt-hours every year. And each kilowatt-hour results in about a pound of CO2 being released into the atmosphere.
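To put a rough shape on that, here's the vampire-draw arithmetic scaled up. The 100 million figure is purely an illustrative assumption on my part, not a measured count of idle chargers:

```python
kwh_per_charger = 130 * 12 / 1000        # one idle charger: ~1.56 kWh a year
print(round(kwh_per_charger * 0.12, 2))  # ~$0.19 a year (the $0.18 above uses the rounded 1.5 kWh)

idle_chargers = 100_000_000              # illustrative assumption, not a measurement
kwh_total = idle_chargers * kwh_per_charger
print(kwh_total)  # ~156 million kWh a year; at ~1 lb of CO2 per kWh,
                  # that's on the order of 150 million pounds of CO2
```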
Are these figures high? Are they low? What do they mean?
On the whole, these figures are pretty low, especially when you consider that a gaming PC can eat through about 3 kWh a day, and a regular desktop about 0.5 kWh per day.
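The same arithmetic puts those desktop figures into dollars:

```python
print(round(3.0 * 365 * 0.12, 2))  # gaming PC at ~3 kWh/day: about $131 a year
print(round(0.5 * 365 * 0.12, 2))  # regular desktop at ~0.5 kWh/day: about $22 a year
```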
But remember, it's not nothing either. Every device you have needs power, and the more devices you have, the more it adds up. Every smartphone, every tablet, every laptop, every TV, every IoT thermostat or smoke alarm, every router, every piece of home entertainment gear.
It all adds up.
And don't forget the chargers. They add up too. And when you think about how many devices are sold every year - last quarter Apple sold 75 million iPhones alone - that's a lot of devices sucking on the power grid, which in turn generates a lot of CO2.