I read recently – and I can’t find the link – that energy-efficient light bulbs emit less heat, so central heating has to work a little harder. This doesn’t cancel out all of the saving: it is more efficient to burn gas at home than to generate electricity and heat your home with light bulbs, and where there is air conditioning, not generating heat from lighting in summer is a double saving. It set me thinking about what is a real saving and what is just paying lip service to the green agenda.
Desktop PCs are obvious consumers of electricity, and Windows Vista was the first operating system to let organisations manage power settings centrally with group policy. It was also more reliable than Windows XP at going into and out of sleep, resuming in a second or two. An up-to-date organisation can ensure its desktops go to sleep after 15 or 20 minutes, and Vista and Windows 7 can wake from sleep to run scheduled tasks like backup, disk defragmentation and updates. People have challenged my advocacy of sleep, saying “But sleep still uses some power”. When hibernated or shut down you’d expect a machine to use no power at all, but I found the new “net-top” I was building drew the same current when shut down as when sleeping – “off” is sometimes “standby”.
Watts and kilowatt-hours can seem a bit abstract, so as a rule of thumb: using 1 watt for 1 year costs £1 (that’s 8.76 kWh, so £1 implies 11.4p per kWh – which will do here as a representative cost on a UK home tariff). Obviously prices vary around the world, and so do emissions, depending on the power stations feeding the grid and on losses as electricity is transmitted. The Carbon Trust’s emissions figure for electricity off the UK grid is 544g of CO2 per kWh consumed, so for a second rule of thumb: 1 watt for 1 year creates 5kg of CO2. For comparison, that’s equivalent to burning about 2 litres of petrol or 1.8 litres of diesel. Tax policy means 5kg of CO2 emissions saved from petrol or diesel saves about £2.50, but 5kg saved from domestic electricity only saves £1, so I would say cutting down on travel is easier to do and has a bigger financial impact than saving electricity. A logical conclusion from that is that using more energy in IT in order to reduce travel is a good thing. But that’s not my subject for this post.
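The two rules of thumb are easy to check with a few lines of Python – a quick sketch, assuming the 11.4p/kWh tariff and the Carbon Trust’s 544g/kWh figure quoted above:

```python
# Sanity-check the "1 watt for 1 year" rules of thumb.
HOURS_PER_YEAR = 24 * 365        # 8760 hours
PRICE_PER_KWH = 0.114            # £ per kWh (representative UK home tariff)
GRID_CO2_G_PER_KWH = 544         # Carbon Trust figure for UK grid electricity

def annual_kwh(watts):
    """kWh consumed by a constant load of `watts` over one year."""
    return watts * HOURS_PER_YEAR / 1000

def annual_cost(watts):
    """Approximate yearly cost in pounds."""
    return annual_kwh(watts) * PRICE_PER_KWH

def annual_co2_kg(watts):
    """Approximate yearly emissions in kilograms of CO2."""
    return annual_kwh(watts) * GRID_CO2_G_PER_KWH / 1000

print(annual_kwh(1))     # 8.76 kWh
print(annual_cost(1))    # just under £1
print(annual_co2_kg(1))  # just under 5 kg of CO2
```

So the shorthand holds: £1 and roughly 5kg of CO2 per continuous watt per year.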
I recently bought a plug-in power meter from Maplin. I wanted to compare my newly built computer against the old one to look for savings, and basically because I’m a bit nerdy like that. I remember seeing the Standby Saver on the BBC’s Dragons’ Den; it “sees” a remote switching the TV off and cuts the power to the TV and the things connected to it (like a DVD player or Xbox), so they don’t bleed power. When the remote turns the TV back on, power is restored to everything. The Saver’s web site proclaims “reduce your electricity bill by up to £43 per unit”. The first company I found selling it claimed “On average this product will save the user over £37 per year”. So I wondered: would I average a net saving of 40 watts over a whole year? There is a second version which connects to a computer via USB: when the computer goes to sleep, the monitor, printer and other peripherals get powered down – is that worth investigating?
The answer for my household appears to be a resounding no. Our secondary TV is an old 14″ CRT that is used so rarely it’s left unplugged; just as well, since it consumes 12W on standby (compared with 50W when watching TV). A large CRT TV might get most of the way to the 40W target, but I haven’t had one of those for some years. My LCD TV is never truly “off”, registering 1.4W on standby (against 120W when running). A standby saver would power off my early-model Xbox 360, which registers 2.8W on standby (and 180W when gaming – the new models should be lower), its steering wheel – 1W on standby and up to 13W when running – and Kinect, which registers zero on standby and no more than 7W when running. That’s a total of 5.2W. The TV has integrated Freeview (no set-top box), and one wouldn’t normally put any kind of recorder onto the standby saver: my 20-year-old VCR uses 10W when idle, and it’s used so rarely I could save myself a tenner a year by disconnecting it. Even including it, I can’t get to a third of the saving promised.
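The sums above can be reproduced from my standby readings and the £1-per-watt-per-year rule of thumb – a rough sketch (the watt figures are my measurements; the usage assumptions are mine):

```python
# Standby draw (watts) of everything a standby saver could switch off
# in my living room, from my own meter readings.
standby_watts = {
    "LCD TV": 1.4,
    "Xbox 360": 2.8,
    "steering wheel": 1.0,
    "Kinect": 0.0,
}

total = sum(standby_watts.values())
print(total)                       # 5.2 W in total

# At roughly £1 per continuous watt per year:
print(f"saving about £{total:.2f} a year")   # nowhere near £37-£43

# Even adding the rarely used VCR's 10 W idle draw:
print(total + 10)                  # 15.2 W, still well short of 40 W
```

The claimed 40W saving would need a household full of older, hungrier kit than mine.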
Turning to my computer: I bought its 15″ LCD monitor back in 2000; like the TV, it only has a standby mode, which uses 4W (against 25W in use). I’ve bought a new 20″ monitor for the “net-top” which uses just under 1W when idle and 37W in use – 50% more power for double the screen area. My inkjet printer draws 1.2W when idle (I didn’t test it printing, but a self-test was just short of 10W). Of all the peripherals my scanner is the worst, drawing 9W whether it is being used or not – worse than the TV, Xbox, monitor and printer put together. Even if I left it on (which I don’t), I can’t get the total idle power past 15W.
The shock in all the power consumption numbers is the old computer. Dell say it shipped in November ’03, and its graphics weren’t state of the art even then. The new machine is based around the Intel Atom 510, which has integrated graphics; they support Aero Glass but wouldn’t cut it for gaming. But it only uses 29W when running – there are graphics cards out there which use several times that. With 2 cores and hyper-threading, it out-processes the 2.2GHz Celeron in the old machine, even with a clock speed of 1.66GHz (its speed is fixed, unlike its cousins found in netbooks). In sleep or standby it uses 5W. I get a system total with the new monitor of 66W running and 6W sleeping.
The old machine also goes to “standby” rather than “off”, drawing about 1W, and when running the old Celeron board needs about 85-90W, giving a total with monitor of around 110W. But what does it use in sleep mode? 10W? 15W? No. 35W. I thought I’d missed a decimal point on the meter, but it is thirty-five watts. The Atom uses less power awake than the old machine uses when it is sleeping.
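Putting the two systems side by side makes the gap concrete. A rough sketch, using the measured totals above; the 40-hours-a-week usage pattern is my assumption, not a measurement:

```python
# Annual running cost of old vs new system, assuming the machine is
# active 40 hours a week and asleep the rest of the time.
PRICE_PER_KWH = 0.114   # £ per kWh, as before

def annual_cost(active_w, sleep_w, active_hours_per_week=40):
    """Yearly electricity cost in pounds for one active/sleep pattern."""
    sleep_hours = 168 - active_hours_per_week
    kwh = (active_w * active_hours_per_week
           + sleep_w * sleep_hours) * 52 / 1000
    return kwh * PRICE_PER_KWH

old = annual_cost(110, 35 + 4)   # Dell + 15" monitor, both in standby
new = annual_cost(66, 6)         # Atom build + new monitor

print(f"old: £{old:.0f}, new: £{new:.0f}, saving: £{old - new:.0f}")
```

On that pattern the old system costs over £50 a year and the new one around £20 – and most of the difference comes from those thirty-five sleeping watts.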
This isn’t absolute proof of much: newer should be more efficient, and Atom-based systems can run business applications quite well enough, with a power consumption more like that of a terminal. Beyond that your mileage will vary. If you are thinking about refreshing hardware, it’s hard to find out how much energy went into its manufacture, but it may be worth getting a meter and doing some tests of your own to see how much will be used in its working life.
For those who might be interested, here are my numbers. Feel free to use the comments to share yours.
| Device | Active typical | Max recorded | Sleep / Standby | “Off” |
|---|---|---|---|---|
| Acer Monitor 225 HQ 22″ (19×10.5″) monitor (2010 model year) | 37 W | 44 W | 1 W | 1 W |
| Self-build Atom 510 PC | 29 W | 36 W | 5 W | 5 W |
| Dell Optiplex GX 270 (2003 model year) | 80 W | 90 W | 35 W | 1 W |
| Samsung SyncMaster 520TFT 15″ (12×9″) monitor (2000 model year) | 24 W | 30 W | 4 W | 4 W |
| HP Scanner | 9 W | 9 W | 9 W | |
| Epson R2400 Inkjet Printer | 8 W | | 1 W | |
| Sony 14″ CRT TV (approx. 1990) | 50 W | | 12 W | |
| Samsung LE26R41 LCD TV (approx. 2006) | 118 W | 128 W | 1 W | |
| Sanyo VHS VCR (approx. 1990) | 21 W | 28 W | 10 W | |
| Xbox 360 Console (approx. 2006) | 170 W | 186 W | 3 W | |
| Xbox 360 force feedback wheel | 10 W | 13 W | 1 W | |
| Xbox Kinect | 5 W | 7 W | 0 W | |