AskPablo: Computer Standby

A few weeks ago I received the following question from A. Stevenson: “Any idea what a desktop computer uses when not turned on, or when it’s in ‘sleep’ mode?” This was in response to my Phantom Power article from January 1, 2007. Since I use a notebook computer I had to ask my dad to put his Kill-A-Watt meter to work on one of his office computers.

The particular computer is a typical office desktop. It is about two years old, has a CRT monitor, and about eight peripherals, including a printer. Just to be thorough, the line voltage was 118 V at 59.9 Hz.
The first measurement was made with both the computer and the monitor on. During normal operation the power draw varies between 120 and 130 W. Most of this energy ends up as waste heat, some as light (from the monitor), and some as sound (from the fans), making it a very effective “personal space heater,” as Sun Microsystems co-founder Scott McNealy likes to call them.
When the screensaver comes on, the power draw drops to 110 W, a reduction of only 10 to 20 W. Remember this the next time you walk through a corporate cubicle farm at night and see the company logo bouncing from corner to corner on every screen, in perfect synchronization. Add up all the energy wasted in US offices when no one but the janitor is even there, and it would be enough to make an oil executive giddy with giggles and “yee-haws.”
But what about standby or hibernate mode? Surely I am doing a good thing for old mother earth by using this energy-saving feature, right? Well, yeah. But no, not really. The measurements show that even while the computer is in hibernate mode and the monitor is in standby mode, the system still draws 70 W.
But before you start to cry, let’s see how much those peripherals are actually contributing to the problem. With the computer turned off completely, the meter still registers 50 W. These 50 W, the equivalent of three full-power CFL bulbs, are wasted just so that the peripherals can be ready at a moment’s notice.
So, if we attribute 50 W of every measurement to the peripherals, we actually get 70 to 80 W for the computer when it is on, 60 W when it is on and running the screensaver, and 20 W when it is hibernating.
So, to figure out how much energy you or your office is wasting by leaving computers in hibernate mode (the same can be done for screensavers), you just need to make some assumptions. Let’s say you use the computer for 8 hours per day, as in a typical office. The rest of the time, 16 hours, the computer is in hibernate mode. So 16 hours times 20 W equals 320 watt-hours, or 0.32 kWh (roughly $0.05) per day, which comes to 116.8 kWh per year. That is just for the computer, though. And since most people have their computer on the same power strip as their peripherals, those are probably going to be on as well. So, based on the data from my dad’s office, a computer left in hibernate mode for 16 hours a day, along with its peripherals, uses 409 kWh per year (16 h × 70 W = 1.12 kWh/day, or 408.8 kWh per year)!
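For the tinkerers out there, the arithmetic above can be sketched in a few lines of Python. The electricity rate of $0.15/kWh is my own assumption for illustration (it is roughly consistent with the $0.05 per 0.32 kWh quoted above); plug in your own rate and hours.

```python
# A minimal sketch of the annual-energy arithmetic above.
# The $0.15/kWh electricity rate is an assumed figure, not a measured one.

def annual_kwh(watts, hours_per_day, days=365):
    """Energy in kWh per year at a constant draw of `watts` for `hours_per_day`."""
    return watts * hours_per_day / 1000 * days

# Computer alone in hibernate mode (20 W net of peripherals), 16 h/day:
computer_only = annual_kwh(20, 16)      # 116.8 kWh/year
# Computer plus always-on peripherals (70 W total), 16 h/day:
with_peripherals = annual_kwh(70, 16)   # 408.8 kWh/year

rate = 0.15  # assumed $/kWh
print(f"Hibernate alone:  {computer_only:.1f} kWh/yr (~${computer_only * rate:.2f})")
print(f"With peripherals: {with_peripherals:.1f} kWh/yr (~${with_peripherals * rate:.2f})")
```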
Here’s my favorite part of the analysis. If you turn off the power strip, the computer and the peripherals will be off. So a computer and peripherals that draw 0 W while off for 16 hours every day will use 0 kWh per day (16 h × 0 W = 0 kWh) and 0 kWh per year! Amazing! When in doubt, turn it off, unplug it, put it in a box, sit on it, and don’t let it out.
Pablo Päster, MBA
Sustainability Engineer