Jonathan Koomey, a Stanford University professor and project scientist at Lawrence Berkeley National Laboratory, was the first person to quantify data center energy use. Many experts would argue that his 2007 study put data center energy consumption on the industry's map.
In this Q&A, Koomey discusses why those numbers were so important, how server energy consumption has slowed, and why data centers shouldn't get a bad rap as energy hogs. This Q&A is part of a series of interviews with five data center professionals who have changed the industry.
How did your 2007 data center energy use study come about?
Jonathan Koomey: The folks at AMD [Advanced Micro Devices Inc.] became convinced efficiency would help them achieve their next jump in market share. They wanted to give a quantitative assessment of the size of the data center energy problem, which is why they originally funded the work.
AMD saw efficiency as a strategic advantage, so it supported studies that would focus attention on the issue. The study also helped promote the idea that you could do something about the problem.
Why were those numbers important?
J.K.: Anytime you're trying to understand a problem, you need to scope it out in a quantitative way. People didn't have any sense of how important data center power use was, and the study gave them that. It also showed that the power use had doubled in a five-year period, which was a substantial growth rate. That got people's attention.
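To put that growth rate in perspective, here is a quick back-of-the-envelope calculation (my own illustrative arithmetic, not from the study itself) of the annual growth rate implied by a doubling over five years:

```python
# Implied compound annual growth rate if data center electricity use
# doubles over a five-year period, as the 2007 study found.
years = 5
annual_rate = 2 ** (1 / years) - 1
print(f"{annual_rate:.1%}")  # roughly 15% per year
```

A sustained growth rate of roughly 15% per year is why the numbers got so much attention.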
When you first published the study, people thought data center energy consumption would double again. Did it?
J.K.: I haven't finished a detailed analysis of 2010 data center electricity use yet, but IDC's most recent forecast for server sales has shown a substantial slowing of the server-installed base in the past few years. I don't think power use will have doubled since 2005, but we'll have to wait until I estimate those numbers to find out.
Why has energy consumption growth slowed?
J.K.: You had Intel and AMD making processors more efficient, SPEC [Standard Performance Evaluation Corp.] rolling out a power-to-performance metric, increased virtualization, and -- probably most important -- the biggest economic shock since the Great Depression.
But that's not the whole picture. Energy-efficiency issues are as much about people and institutions as they are about technology. The industry is starting to realize this and make changes.
People think of this industry as sophisticated and high tech, but the human factor has a huge effect on the efficiency and cost effectiveness of an installation. Once companies have structures in place to deal with misplaced incentives -- such as facilities and IT having separate budgets -- they improve efficiency rapidly, because the economics of preserving the status quo just don't make any sense!
Data centers get a bad rap for being energy hogs, but their work often replaces manual, physical processes that were more energy-intensive. What are the implications of moving from physical to digital processes?
J.K.: People get obsessed with direct electricity use, but they forget that moving bits is usually better than moving atoms. I did a study with colleagues at Carnegie Mellon comparing downloading music to buying it on a CD, which found that downloads caused 40% to 80% fewer emissions than purchasing a physical CD. That study is coming out in the peer-reviewed Journal of Industrial Ecology this year, and it traces the whole lifecycle of making CDs as well as the electricity used for downloads.
I think the result is an important one: replacing atoms with bits is generally good for the environment, and we should keep in mind this bigger picture when evaluating how much electricity is used by data centers and other information technology equipment.
Let us know what you think about the story; email Matt Stansberry, Executive Editor.