The amount of electricity consumed by data centres worldwide grew by 56% between 2005 and 2010, a new study has found. That growth was significantly slower than it had been between 2000 and 2005, when data centre electricity consumption doubled.
The study’s author, Stanford consulting professor Jonathan Koomey, attributed the slowdown to the adoption of virtualisation technology, which had made data centres more efficient, and to the economic downturn of 2008, which slowed demand for data centre services.
Koomey estimates that in 2010, data centres accounted for between 1.1% and 1.5% of the world’s total electricity consumption. Data centres in the US accounted for a slightly higher proportion of the country’s consumption – between 1.7% and 2.2%, Koomey estimates.
The figures were calculated by combining market research company IDC’s server shipment statistics with estimates for the average electricity consumption for each server. Koomey then added estimates for the typical consumption of storage, communications and data centre infrastructure such as cooling and power supply.
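The bottom-up approach described above can be sketched roughly as follows. The structure mirrors the methodology – installed server base times average draw, plus storage and networking, scaled by infrastructure overhead – but every number and parameter name is a hypothetical placeholder, not a figure from the study:

```python
# Illustrative sketch of a bottom-up data centre energy estimate.
# All inputs below are hypothetical placeholders, not the study's figures.

HOURS_PER_YEAR = 8760

def data_centre_energy_twh(servers_installed, avg_server_watts,
                           storage_comms_fraction, pue):
    """Estimate annual data centre electricity use in TWh.

    servers_installed      -- installed base derived from shipment statistics
    avg_server_watts       -- estimated average power draw per server
    storage_comms_fraction -- storage + networking load as a fraction of server load
    pue                    -- power usage effectiveness (cooling/power-supply overhead)
    """
    it_load_w = servers_installed * avg_server_watts * (1 + storage_comms_fraction)
    total_wh = it_load_w * pue * HOURS_PER_YEAR
    return total_wh / 1e12  # Wh -> TWh

# Hypothetical inputs: 30 million servers at 250 W average,
# 40% extra for storage/networking, PUE of 1.9.
print(round(data_centre_energy_twh(30e6, 250, 0.4, 1.9), 1))  # -> 174.8
```

The design point is that nothing here is metered directly: each term is an estimate layered on an estimate, which is why the study reports its results as ranges rather than single figures.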
He then added to this an estimate of web giant Google’s electricity consumption, in acknowledgement of the fact that it builds its own servers and is therefore not included in IDC’s server shipment figures. Based on reports of Google’s data centre operations, he estimated that Google’s data centres used around 0.01% of the world’s total electricity consumption in 2010.
Koomey suggests that this is less than it might have been. "This result is in part a function of the higher infrastructure efficiency of Google’s facilities compared to in-house data centres [and the] lower electricity use per server for Google’s highly optimised servers."
The report predicts that cloud computing will reduce the global energy footprint of data centres, "because cloud computing installations typically have much higher server utilisation levels and infrastructure efficiencies than do in-house data centres."