According to a recent study by Jonathan Koomey of Lawrence Berkeley National Laboratory, the energy consumed by US servers accounts for 0.6% of overall US electricity consumption. Add in the energy used to cool these systems and that number doubles to 1.2%, the same amount of energy consumed by all US televisions. If current trends continue, "server electricity usage" could increase 40% by 2010 as computing needs expand. The whole thing sounds like operator error to us.

Over the last five years, US servers burned through 5 million kW of power, the equivalent of five 1GW power plants, or more than "the total possible output from the Chernobyl plant" when it was operational. But as they say, one man's trash is another man's treasure, and computer companies everywhere see business opportunities in these inefficiencies. AMD, the company that sponsored the study, is offering energy-efficient processors, and Intel now is too. The US Environmental Protection Agency is trumpeting the study and hopes that other companies will see an opportunity to reduce energy consumption. Is this another case of hoping our technology will save us from our technology?
Check out this TH article for more ideas on how to reduce the footprint of your own computer.