Data centers’ power use less than was expected

Data centers’ unquenchable thirst for electricity has been slaked by the global recession and by a combination of new power-saving technologies, according to an independent report on data center power use from 2005 to 2010.

The report, by Jonathan G. Koomey, a consulting professor in the civil and environmental engineering department at Stanford University, found that the actual number of computer servers fell significantly short of 2010 forecasts. Demand for computing was lower than expected after the financial crisis of 2008, and technologies like more efficient computer chips and server virtualization, which allows fewer servers to run more programs, further reduced the number of machines needed.

The slowing of growth in consumption contradicts a 2007 forecast by the Environmental Protection Agency that the explosive expansion of the Internet and the computerization of society would lead to a doubling of power consumed by data centers from 2005 to 2010.

In the new study, prepared at the request of The New York Times, Mr. Koomey found that electricity used by data centers worldwide grew significantly, but the increase was only about 56 percent from 2005 to 2010. In the United States, power consumption grew by 36 percent, according to Mr. Koomey’s report, titled “Growth in Data Center Electricity Use 2005 to 2010.”

“Mostly because of the recession, but also because of a few changes in the way these facilities are designed and operated, data center electricity consumption is clearly much lower than what was expected, and that’s really the big story,” said Mr. Koomey.

Though Mr. Koomey was unable to separate the impact of the recession from that of energy-saving technologies, the slower-than-expected growth is surprising because data centers, buildings that house racks and racks of computers, have become so central to modern life. They are used to process e-mail, conduct Web searches and handle online shopping as well as banking transactions and corporate sales reports.

Moreover, in the period studied, more services that depend on data centers, like cloud computing and streaming of music and movies, became popular.

The influential report issued by the E.P.A. in August 2007 estimated that national energy consumption by computer servers and data centers would nearly double from 2005 to 2010, to roughly 100 billion kilowatt-hours a year at an annual cost of $7.4 billion. It predicted that by 2011 the centers’ demand for power in the United States would rise to 12 gigawatts, or the output of 25 major power plants, up from 7 gigawatts, or about 15 power plants.
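Those figures hang together arithmetically. Below is a minimal sketch of the conversion, assuming round-the-clock operation and inferring the size of a "major power plant" from the gigawatt figures and plant counts (that roughly 500-megawatt size is an inference here, not something stated in the report):

```python
# Rough sanity check of the E.P.A. projection cited above.
# Assumptions (not stated in the report): data centers draw power
# continuously all year, and plant size is inferred from GW / plant count.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

# ~100 billion kWh per year converts to average continuous demand in GW.
annual_energy_kwh = 100e9
average_demand_gw = annual_energy_kwh / HOURS_PER_YEAR / 1e6  # kW -> GW
print(f"Average demand: {average_demand_gw:.1f} GW")          # ~11.4, i.e. roughly 12 GW

# The implied size of a "major power plant" in both comparisons:
print(f"2011 projection: {12 / 25 * 1000:.0f} MW per plant")  # ~480 MW
print(f"2005 baseline:   {7 / 15 * 1000:.0f} MW per plant")   # ~467 MW
```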

Industry consultants and executives agreed with Mr. Koomey’s new analysis, but they also indicated that the slower growth might be temporary.

“Of course, the market is expanding,” said Jimmy Clidaras, principal engineer for platforms and infrastructure at Google. “We’re doing stuff today in the cloud that we never would have thought of. Music used to be at home and now it’s in the sky.”

“The numbers do make sense,” said Kenneth Brill, founder of the Uptime Institute, an industry consulting group based in Santa Fe, N.M. “But they shouldn’t be taken as indicating the problem’s over. There is certainly increasing energy consumption and that should be a concern for everyone.”

The slowdown in the rate of growth of electricity use is particularly significant because it comes in the midst of the biggest build-out of new data center capacity in the history of the industry.

That build-out is fueled by insatiable demand for new Internet services and a shift to so-called cloud computing, hosted largely in commercial data centers and in the large data farms operated by companies like Amazon, Apple, Google, Microsoft and Facebook. It has prompted growing discussion of the share of the nation’s electricity that will be consumed by vast data centers now being constructed at a record pace.

But the new report indicates that electricity used by global data centers in 2010 remained relatively modest. “Electricity used in global data centers likely accounted for between 1.1 percent and 1.5 percent of total electricity use, respectively. For the U.S. that number was between 1.7 percent and 2.2 percent,” according to the report.

In an earlier paper, Mr. Koomey reported that the power used by servers in data centers represented about 0.5 percent of world electricity consumption in 2005. When cooling and auxiliary infrastructure were included, that figure was about 1 percent, he wrote. The worldwide demand for data center power in 2005 was equivalent to the output of about seventeen 1,000-megawatt power plants.
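The same energy-to-power conversion explains the 17-plant equivalence. A minimal sketch, assuming world electricity consumption in 2005 of roughly 15,000 terawatt-hours (an approximate figure that is not in the article) and plants running at full output year-round:

```python
# Back out the power-plant equivalence from the ~1 percent share cited above.
# Assumption (not from the article): world electricity consumption in 2005
# was roughly 15,000 TWh; plants are treated as running at full output.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

world_consumption_twh = 15_000   # approximate 2005 world total (assumption)
data_center_share = 0.01         # ~1 percent including cooling and infrastructure

data_center_twh = world_consumption_twh * data_center_share   # ~150 TWh per year
average_demand_gw = data_center_twh * 1_000 / HOURS_PER_YEAR  # TWh -> GWh -> GW

print(f"Average demand: {average_demand_gw:.1f} GW")                 # ~17 GW
print(f"Equivalent 1,000-megawatt plants: {average_demand_gw:.0f}")  # ~17
```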

As part of his latest research, Mr. Koomey obtained a more detailed estimate of Google’s contribution to the global growth of data center power consumption than had previously been publicly available. Google, which generally builds custom computer servers for its data centers, has been secretive about the number of servers it uses to power services like Google Search and YouTube.

However, in May a Google executive told Mr. Koomey that the company’s total data center electricity use was less than 1 percent of the Koomey report’s estimate of electricity consumed by data centers worldwide.

If the estimate is accurate, it could confirm the widely held industry perception that Google, with its many large data centers, is more efficient than the mainstream of the data center industry. A vast majority of data center designers choose standard industry equipment rather than equipment specialized for particular computing tasks.
