Liquid cooling drives down costs and improves efficiency

Everything old is new again, and data center cooling is no exception. As density increases, liquid cooling looks more appealing.

My father used to joke that his fashion sense was trendy every 20 years. It wasn’t too long after I heard him make that comment that the grunge movement brought his flannel shirts and worn jeans back into the mainstream.

Data center cooling might soon follow a similar pattern: As data center densities increase, liquid cooling is gaining currency as an efficient technique. It was in style during the 1970s with mainframes and during the 1980s in Cray supercomputers, and now it’s coming back in the form of specialized racks and direct liquid-cooled servers from IBM and others.

Why the resurgence of liquid cooling? When it comes right down to it, air is a poor heat conductor. Much of the heat carried away from a server is actually transported by the water vapor in the air. That’s partially why the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) specifies minimum relative humidity levels and server operating temperature ranges.

Furthermore, to cool effectively with air, you need to start with colder air, which takes energy to produce. If you can’t make the air cooler, you need to move it faster, and fans consume more energy.
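 
The penalty for moving air faster is steep. By the fan affinity laws, fan power rises roughly with the cube of airflow; the minimal sketch below (in Python, with illustrative numbers rather than measurements from any real facility) shows how quickly that adds up.

```python
def fan_power(base_watts: float, airflow_ratio: float) -> float:
    """Fan affinity law: for a fixed system, fan power scales roughly
    with the cube of the change in airflow (and fan speed)."""
    return base_watts * airflow_ratio ** 3

# Pushing 20% more air through the same racks costs about 73% more fan power.
print(round(fan_power(100.0, 1.2), 1))  # 172.8 W, up from a 100 W baseline
```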

This is tremendously inefficient. The power usage effectiveness (PUE) metric, developed by The Green Grid, measures how much total power a data center draws for every watt of computing work it delivers. The Uptime Institute found that the average data center has a PUE of 1.8. That means for every 1.8 watts consumed by the data center, only 1 watt reaches the computing workload.
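 
To make that math concrete, here is a minimal sketch in Python (the wattage figures are illustrative, not measurements from any particular facility) of how PUE relates total facility power to the power that actually reaches the IT load.

```python
def pue(total_facility_watts: float, it_watts: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return total_facility_watts / it_watts

# A facility drawing 1.8 MW overall to deliver 1.0 MW to servers has a PUE of 1.8.
print(pue(1_800_000, 1_000_000))  # 1.8

# At that ratio, cooling and power-distribution overhead soak up
# roughly 44% of every watt the facility pulls from the grid.
print(round(1 - 1 / 1.8, 2))  # 0.44
```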

The Two-Pronged Approach

So how do we drive up efficiency and drive down costs? First, vendors must increase servers’ operating temperature ranges. For example, Dell’s Fresh Air initiative has optimized several server models to run continuously at 80 degrees Fahrenheit, the top end of ASHRAE’s recommended temperature range. So you can turn up the thermostat and save a ton of money (or use free air cooling).

Second, push liquid cooling. Many data centers are skipping computer room air conditioners. Instead, they’re installing racks with integrated cooling and planning for liquid cooling directly to servers. eBay’s Phoenix data center was built to accommodate in-row, rear-door or direct liquid cooling, even for the modular data centers on the facility’s roof. The online auctioneer boasts an unprecedented PUE of 1.018 for individual liquid-cooled rooftop modules in the winter, an overall year-round PUE of 1.35 and the flexibility to add liquid-cooled servers indoors.
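 
To put those PUE figures in perspective, here is a rough, hypothetical comparison in Python. The 1 MW IT load and the electricity rate are assumptions for illustration, not eBay’s numbers; only the PUE values of 1.8 (the Uptime Institute average) and 1.35 come from the text above.

```python
IT_LOAD_KW = 1_000      # assumed IT load, purely illustrative
RATE_PER_KWH = 0.10     # assumed electricity price in dollars, purely illustrative
HOURS_PER_YEAR = 8_760

for label, pue_value in (("average facility", 1.8), ("eBay Phoenix, year-round", 1.35)):
    total_kw = IT_LOAD_KW * pue_value
    annual_cost = total_kw * HOURS_PER_YEAR * RATE_PER_KWH
    print(f"{label}: PUE {pue_value} -> {total_kw:,.0f} kW draw, ~${annual_cost:,.0f}/year")
```

Under those assumptions, the lower PUE saves roughly $400,000 a year on the same workload.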

Do we even care about these efficiencies in the face of the cloud? Of course we do, because they drive down prices. And while some organizations have moved to public clouds, many others are keeping their data centers. Driving down data center operating costs just adds to the cost benefits of consolidation, automation and standardization. But liquid cooling isn’t something that people consider often, because moving air to cool servers is so traditional. Like my father’s plaid flannel, though, liquid cooling has been done on a large scale, and its time is coming again.

About the author:
Bob Plankers is a virtualization and cloud architect at a major Midwestern university. He is also the author of The Lone Sysadmin blog.

This article originally appeared in the December/January issue of Modern Infrastructure.

