As data centers have become larger and more centralized, their energy use has been on the rise. Research firm IDC recently determined that businesses spend almost as much on power and cooling as they do to run the servers themselves. In response, the industry has developed metrics to gauge energy efficiency; these tools provide broad measurements that are useful to enterprises but are often better suited to very large data centers.
The Green Grid consortium was formed in 2007, and one of its first initiatives was to develop power usage effectiveness (PUE), a data center energy use metric. The specification is designed to help IT professionals determine how energy efficient their data centers are and then monitor the impact of any efficiency efforts.
The standard covers all IT equipment, including servers, hubs, routers, wiring patch panels and storage systems. All of these devices generate heat, so data centers run special air conditioning and power systems designed to keep the equipment cool. The PUE benchmark is the ratio of a facility's total power draw to the power consumed by the IT equipment itself; the difference is the electricity spent keeping that equipment cool, backed up and protected. Typically, IT managers oversee the computer systems -- servers, storage and networking equipment -- and facility managers deploy the environmental infrastructure, which includes power, cooling and airflow.
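The calculation itself is simple. A minimal Python sketch of the ratio described above, using hypothetical meter readings rather than figures from any real facility:

```python
# PUE is total facility power divided by the power drawn by the IT
# equipment alone; a PUE of 1.0 would mean zero overhead.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: 1,000 kW at the utility meter,
# 500 kW measured at the IT racks.
print(round(pue(1000, 500), 2))  # 2.0 -- half the power goes to overhead
```

A falling PUE means a larger share of the facility's power is reaching the IT gear rather than the infrastructure around it.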
Heavy data center hitters reducing the most
The traditional enterprise data center usually falls in the 1.9 to 2.0 PUE range, meaning the facility draws roughly twice the power its IT equipment consumes, with most of the overhead going to cooling. These centers often use air conditioning systems to push chilled air through raised floors and water-cooling systems to transfer heat away from servers, network devices and storage units.
"The leading IT vendors have invested a lot of money to reduce their data centers' energy usage," said Eric Woods, a senior analyst at Pike Research Inc. Because of such work, Apple, Microsoft, Google and Facebook have ratios in the 1.1 PUE range. However, special steps enabled them to reach those numbers.
These energy-efficient data centers are located close to power generation facilities. For instance, Facebook Inc. built a data center in Prineville, Ore., to be close to cheap hydropower. The proximity lets line voltage stay high until the power is near the servers and switches, and stepping it down that late reduces the energy lost in the power lines.
Instead of air conditioning, the Facebook site uses ambient, outside air cooled by evaporation. The air is piped to the cool side of servers, blown into channels between baffles that steer it to the hottest components. The heat is collected in a hot aisle and flushed from the building. This process uses a fraction of the electricity typically seen with air conditioning systems.
Location plays a key role in energy efficiency. Facebook built a second data center in Forest City, N.C., where summers are warm and humid, and again tried the ambient air cooling technique. The approach worked, but the center had to contend with higher temperatures and far more humidity. Instead of working with 65-degree Fahrenheit air, it operated at temperatures up to 85 degrees Fahrenheit; rather than a maximum of 65% relative humidity, it functioned at up to 90%. Those conditions required more fan-driven airflow, which increased energy usage.
Google Inc. is another vendor trying to construct more energy-efficient data centers. Since 2008, it has been tracking its PUE performance. Even though it has a broad mix of older and new data centers, the company reduced its overall PUE from 1.23 in 2008 to 1.12 in 2012, with one facility reporting a PUE of 1.06.
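Because the overhead power for a fixed IT load works out to (PUE - 1) times that load, Google's improvement roughly halved its non-IT overhead. A quick illustration, assuming a hypothetical 10 MW IT load (Google does not publish this figure):

```python
# Overhead (cooling, power distribution, etc.) = (PUE - 1) x IT load.
# The 10 MW IT load is a hypothetical figure for illustration only.
it_load_mw = 10.0
overhead_2008 = (1.23 - 1) * it_load_mw  # ~2.3 MW of overhead
overhead_2012 = (1.12 - 1) * it_load_mw  # ~1.2 MW of overhead
print(round(overhead_2008 - overhead_2012, 1))  # 1.1 MW saved, same IT load
```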
Facebook can scour the Earth for the best data center location, but most companies have more limited options when building and fine-tuning their data centers. Rather than the high, dry-air plains of eastern Oregon where summer nights are cool and winters cold, businesses often need to build their sites somewhere near their main office. Consequently, they often reside close to expensive power and operate year round in less than ideal conditions.
Solving the data center energy use problem
There are steps smaller companies can take to improve energy efficiency. Firms should start with airflow: align the delivery of cool air with the collection of waste heat so air moves naturally, without extra fans or ducting. Preventing heated air from short-circuiting over the top of or around server racks helps, as does sealing raised-floor openings, such as cable cutouts, so cooled air reaches the equipment instead of escaping before it gets there.
Another step is measuring energy use. "In the past few years, a number of tools have emerged from established vendors as well as startups that help businesses monitor energy usage," said John Pflueger, board member of The Green Grid from Dell Inc.
Businesses also need to examine their older servers. The Environmental Protection Agency determined that old servers consume 60% of a data center's energy but deliver only 4% of its processing power. Underused systems represent another area of potential savings: an idle server consumes 50% to 60% as much power as a fully loaded one. Virtualization lets firms consolidate the workloads of older and underused servers onto fewer, fully utilized machines and cut the wasted energy.
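The potential savings can be sketched from the idle-draw figure above. The server counts and wattages below are hypothetical, chosen only to show the shape of the calculation:

```python
# Rough sketch (hypothetical figures): power saved by consolidating
# lightly used physical servers onto fewer virtualized hosts.
IDLE_FRACTION = 0.55  # idle draw as a share of full load (the 50-60% figure)

def consolidation_savings_kw(num_servers: int, full_load_kw: float,
                             hosts_after: int) -> float:
    # Before: every physical server sits mostly idle.
    before = num_servers * full_load_kw * IDLE_FRACTION
    # After: a handful of virtualization hosts run near full load.
    after = hosts_after * full_load_kw
    return before - after

# 20 mostly idle 0.4 kW servers consolidated onto 3 busy hosts:
print(round(consolidation_savings_kw(20, 0.4, 3), 2))  # 3.2 kW saved
```

The sketch ignores virtualization overhead and cooling savings, both of which would shift the numbers in practice.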
Data centers typically convert AC power from the grid into DC power to charge their uninterruptible power supply (UPS) batteries -- which provide backup power whenever there's even a flicker of an outage from the grid. But that DC current must then be converted back to AC as it enters the center's power distribution units (PDUs), and then it returns to DC when it hits the server racks. Each conversion wastes some energy. Keeping everything in the data center on DC power eliminates three energy-wasting conversions: at the UPS, at the PDU and at the front end of the power supply unit on servers.
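Those per-stage losses compound across the chain. A small sketch with assumed efficiencies for each conversion (the article gives no specific figures):

```python
# Hypothetical per-stage efficiencies for the three conversions
# described above: UPS, PDU and the server power supply unit.
from functools import reduce

stages = [0.94, 0.97, 0.90]  # assumed efficiency of each conversion stage
overall = reduce(lambda a, b: a * b, stages)
print(round(1 - overall, 3))  # fraction of power lost across the whole chain
```

Under these assumed figures, nearly a fifth of the power is gone before it reaches the server components, which is why eliminating conversion stages is attractive.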
As technology has evolved, energy use has become a hot button issue for data center managers. During the past few years, new metrics such as PUE have emerged that help these managers monitor and curb energy use. While most enterprise data center managers do not have the funds or the flexibility to deploy their centers in optimal locations, they can take steps -- such as virtualizing more of their equipment -- to lower their energy bills.
About the author:
Paul Korzeniowski is a freelance writer who specializes in cloud computing and data-center-related topics. He is based in Sudbury, Mass., and can be reached at firstname.lastname@example.org.