Server power

A new study by the Lawrence Berkeley National Laboratory (pdf available here) finds that:

Aggregate electricity use for servers doubled over the period 2000 to 2005, both in the U.S. and worldwide. Almost all of this growth was the result of growth in the number of the least expensive servers, with only a small part of that growth attributable to growth in the power use per unit.

Total power used by servers represented about 0.6% of total U.S. electricity consumption in 2005. When cooling and auxiliary infrastructure are included, that number grows to 1.2%, an amount comparable to that for color televisions. The total power demand in 2005 (including associated infrastructure) is equivalent (in capacity terms) to about five 1000 MW power plants for the U.S. and 14 such plants for the world. The total electricity bill for operating those servers and associated infrastructure in 2005 was about $2.7 B and $7.2 B for the U.S. and the world, respectively.
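A rough back-of-envelope check makes the power-plant equivalence concrete. Taking total U.S. electricity consumption in 2005 as roughly 3,800 TWh (a figure assumed here, not given in the excerpt) and treating the plants as running at full output:

$$ 0.012 \times 3{,}800\ \text{TWh/yr} \approx 46\ \text{TWh/yr}, \qquad \frac{46 \times 10^{12}\ \text{Wh/yr}}{8{,}760\ \text{h/yr}} \approx 5.2\ \text{GW} \approx 5 \times 1000\ \text{MW}. $$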

Nicholas Carr comments:

The estimate that servers account for 1.2 percent of overall power consumption in the U.S. is, as the San Francisco Chronicle reports, considerably lower than some previous estimates, which put data center power consumption as high as 13 percent of total U.S. consumption. It should be noted that the study, underwritten by AMD, looks only at power consumption attributable to servers, which represents about 60% to 80% of total data center power consumption. Electricity consumed by storage and networking gear is excluded. The study also excludes custom-built servers, such as the ones used by Google. The number of servers Google runs is unknown but is estimated to be in the hundreds of thousands.
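One rough implication, if one assumes the 60% to 80% server share carries over to the infrastructure-inclusive figure (an assumption, since the study states the share for servers alone): grossing up the 1.2% estimate puts total data center consumption somewhere around

$$ \frac{1.2\%}{0.8} \approx 1.5\% \quad\text{to}\quad \frac{1.2\%}{0.6} \approx 2.0\% $$

of U.S. electricity use, still an order of magnitude below the 13 percent figure Carr mentions.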

It all helps explain why Sergey Brin & Co. are getting so exercised about power consumption.