Using free cooling in the data center

In this Q&A, an expert discusses the differences among free cooling technologies and explains when data center facilities will see a significant ROI using these approaches.

Data center cooling can represent a significant portion of a facility’s power consumption. So it’s no surprise that we’ve seen a sharpened focus on metrics such as power usage effectiveness (PUE) and, more recently, carbon usage effectiveness (CUE), which help data centers gauge how much of their utility power is lost to supporting infrastructure.
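Both metrics reduce to simple ratios: PUE is total facility energy over IT equipment energy, and CUE is the carbon emitted by that facility energy over IT equipment energy. The sketch below is a minimal illustration of the arithmetic, not part of the original interview; the example figures and the grid emission factor are hypothetical.

```python
# Minimal sketch of the two efficiency metrics mentioned above.
# PUE = total facility energy / IT equipment energy (dimensionless, >= 1.0)
# CUE = CO2 emitted by total facility energy / IT equipment energy (kg CO2/kWh)

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: how much of the utility feed reaches IT gear."""
    return total_facility_kwh / it_equipment_kwh

def cue(total_facility_kwh: float, it_equipment_kwh: float,
        grid_kg_co2_per_kwh: float) -> float:
    """Carbon usage effectiveness: emissions per useful IT kilowatt-hour."""
    return (total_facility_kwh * grid_kg_co2_per_kwh) / it_equipment_kwh

# Hypothetical example: 1,800 MWh total, 1,000 MWh delivered to IT,
# and a grid emission factor of 0.4 kg CO2 per kWh.
print(pue(1_800_000, 1_000_000))        # 1.8
print(cue(1_800_000, 1_000_000, 0.4))   # 0.72 kg CO2 per IT kWh
```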

In many climates, data center managers can help reduce these cooling costs by taking advantage of outside environmental conditions to cool IT equipment. Under the right conditions, these technologies can save so much in electrical costs that we often call it “free cooling.” In this podcast, Nick Martin, assistant site editor for SearchDataCenter.com, speaks with Bob McFarlane, a principal in charge of data center design for Shen Milsom & Wilke, to find out more about these so-called free cooling technologies.

Nick Martin: What are the types of free cooling technologies available, and what are the pros and cons of each?

McFarlane: There are really three types of free cooling:

  1. Air-side free cooling, which is what people think of first. This is where outside air is brought into the data center directly through filters or indirectly through heat exchangers.
  2. Adiabatic, which is a variation on air-side free cooling in which the air is brought to some sort of chamber and used along with water evaporation to cool the air.
  3. Water-side free cooling, where a cooling medium, such as water or glycol, circulates directly through cooling towers rather than the chillers or compressors.

As far as the pros and cons are concerned, it's a little complicated. Water-side free cooling is easiest. Most data centers use chilled water to cool their systems anyway, so it's logical to use water-side free cooling because the piping and the cooling towers are already in the plan. That makes the cost relatively small. Power for pumps and cooling towers is still required, of course, but it's an easy concept to follow.

The problem is the changeover. You don't want to change from free cooling to mechanical refrigeration often, because short-cycling compressors is not good. Manual changeover is supposedly simple, but it can get rather involved. There are many cases where the gain isn't worth the effort of putting the system in if it's not going to be used. Automated controls solve that problem, but they can get complex, and they need a wide window to ensure the system doesn't keep switching back and forth between free cooling and mechanical refrigeration.
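That "wide window" is essentially a deadband, or hysteresis, control. Below is a minimal sketch of the idea; the 10 C and 16 C thresholds are hypothetical, and a real plant would also enforce minimum run and rest timers on the compressors.

```python
# Hedged sketch of deadband (hysteresis) changeover logic: engage free cooling
# only well below the setpoint, and fall back to mechanical refrigeration only
# well above it, so the compressors never short-cycle. Thresholds are assumed.

FREE_COOLING_BELOW_C = 10.0   # engage economizer when outdoor air drops below this
MECHANICAL_ABOVE_C = 16.0     # return to chillers when it climbs above this

def next_mode(current_mode: str, outdoor_temp_c: float) -> str:
    if current_mode == "mechanical" and outdoor_temp_c < FREE_COOLING_BELOW_C:
        return "free_cooling"
    if current_mode == "free_cooling" and outdoor_temp_c > MECHANICAL_ABOVE_C:
        return "mechanical"
    return current_mode  # inside the 10-16 C window: hold state, no flapping

# Temperatures hovering near one threshold never flip the mode repeatedly:
mode = "mechanical"
for t in [12.0, 9.5, 11.0, 15.0, 15.9, 16.5]:
    mode = next_mode(mode, t)
    print(f"{t:5.1f} C -> {mode}")
```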

There's no simple answer to all of this. The ease of using air-side free cooling is even more deceptive than water-side. I actually get people who think they can just open the window in the corner. The problem is, the air volume needed to cool data centers is very large. You need approximately 160 cubic feet per minute (CFM) per kilowatt. That's at a velocity of 2,500 feet per minute, which is about 28 miles per hour. So the air is really flying through the ducts. That means a 250 kW data center would take 40,000 CFM of air and a two-by-eight-foot opening to get the air in. A one-megawatt data center would take 160,000 CFM and roughly a four-by-16-foot opening just to get the air into the data center. That's a big hole, and that's a lot of duct work.
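The arithmetic behind those figures is straightforward. Here is a small sketch that reproduces it from the two rules of thumb McFarlane cites, roughly 160 CFM per kilowatt moved at about 2,500 feet per minute:

```python
# Worked version of the airflow arithmetic above: airflow scales with IT load,
# and the opening area is the volume flow divided by the duct velocity.

CFM_PER_KW = 160       # rule-of-thumb airflow per kilowatt of IT load
VELOCITY_FPM = 2_500   # duct air velocity in feet per minute (~28 mph)

def airflow_cfm(it_load_kw: float) -> float:
    return it_load_kw * CFM_PER_KW

def opening_area_sqft(it_load_kw: float) -> float:
    # area (sq ft) = volume flow (cu ft/min) / velocity (ft/min)
    return airflow_cfm(it_load_kw) / VELOCITY_FPM

for kw in (250, 1_000):
    print(f"{kw:>5} kW: {airflow_cfm(kw):,.0f} CFM, "
          f"{opening_area_sqft(kw):.0f} sq ft opening")
# 250 kW -> 40,000 CFM and 16 sq ft (a 2 x 8 ft opening);
# 1,000 kW -> 160,000 CFM and 64 sq ft (roughly 4 x 16 ft).
```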

But the real problem with air-side free cooling is the air quality. Even clean air requires filtering. Filters reduce the airflow, they expend energy to pull the air through them, and they require cleaning and replacement, which adds labor to the price tag. Humidity also has to be controlled. The best locations for free cooling are generally lower-humidity areas. High humidity requires dehumidification, and that requires mechanical refrigeration. Mechanical refrigeration, which is part of what we're trying to reduce with free cooling, uses a lot of energy.

Adiabatic cooling is a variation of air-side free cooling that uses evaporation. It's usable only in low-humidity environments, but it can significantly extend the hours of free cooling.
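The physical limit on adiabatic cooling is the wet-bulb temperature: evaporation can bring the supply air no lower than that, which is why dry climates work and humid ones don't. The sketch below uses Stull's (2011) empirical wet-bulb approximation, valid roughly for relative humidity between 5% and 99%, to show the effect; it's an illustration, not a design tool.

```python
import math

# Rough sketch of why adiabatic (evaporative) assist needs dry air: supply air
# can at best approach the wet-bulb temperature. Stull (2011) approximation.

def wet_bulb_c(temp_c: float, rel_humidity_pct: float) -> float:
    t, rh = temp_c, rel_humidity_pct
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh) - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# Same 30 C day, different humidity: dry air leaves far more evaporative headroom.
for rh in (20, 50, 80):
    print(f"30 C at {rh}% RH -> wet bulb ~{wet_bulb_c(30, rh):.1f} C")
# ~15.9 C at 20% RH, but only ~27.1 C at 80% RH: little cooling left to extract.
```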

Martin: What do administrators need to consider regarding free cooling management?

McFarlane: Really, this comes down to whether or not the facilities personnel are capable of handling the type of system you choose. Even if you control the system as part of IT, facilities people are going to get involved in big mechanical systems. Do they understand the changeover capabilities and parameters? Do you have control, or do they? If the outside temperature starts to climb, can they react fast enough? Automatic systems may fail. I would say the main thing administrators have to consider is their facilities side.

Martin: What are free cooling costs and return on investment?

McFarlane: That's a major “it depends.” Condenser systems with economizer coils probably have the least capital cost. You can order air conditioners with economizing coils without adding a lot to the cost of the unit. You may have to add features to the cooling towers to prevent freezing in some climates, but none of that is really a major expense. Chiller plants become more expensive if you use the chilled-water option. Manual changeover systems are relatively inexpensive, but, as I said, they may be a problem to use; automatic changeover systems are going to cost more.

Air-side systems are going to have significant capital costs, even if you're going to use heat exchange systems, which are more appealing. Trying to pull air through a bunch of filters is not the best choice. The Kyoto wheel, which is an air-to-air heat exchanger, is a large-diameter, slowly rotating wheel that is gaining popularity and is well-proven. APC by Schneider Electric has brought a device to market called the EcoBreeze that combines adiabatic cooling with a chiller and a changeover system. So it's not completely air-side free cooling, but has all the automation built into it.

These are large physical systems and carry a high capital cost. So the return on investment depends on location, climate and the effectiveness of the controls. What percent of the year can you run without mechanical refrigeration? Automatic systems may not react to every opportunity to use free cooling. And it depends on your electrical costs. The higher the utility power cost, the higher the potential for a fast return on investment. If most of your free cooling time is at night, are utility power costs lower at night? In many places, power costs less at night.
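A back-of-envelope payback model makes that “it depends” concrete. Every figure in the sketch below is hypothetical; a real estimate needs site weather data, the economizer hours the controls will actually capture, and the local tariff structure.

```python
# Back-of-envelope payback sketch. All inputs are assumed placeholders.

def simple_payback_years(capital_cost: float,
                         cooling_load_kw: float,
                         free_cooling_hours_per_year: float,
                         mechanical_kw_per_cooling_kw: float,
                         tariff_per_kwh: float) -> float:
    # Energy the compressors would have burned during the free-cooling hours
    avoided_kwh = (cooling_load_kw * mechanical_kw_per_cooling_kw
                   * free_cooling_hours_per_year)
    annual_savings = avoided_kwh * tariff_per_kwh
    return capital_cost / annual_savings

# Example: $200k economizer retrofit, 500 kW of heat to reject, 3,000 free
# cooling hours/year, 0.3 kW of compressor power per kW of cooling, $0.12/kWh.
print(f"{simple_payback_years(200_000, 500, 3_000, 0.3, 0.12):.1f} years")
# 500 * 0.3 * 3,000 = 450,000 kWh avoided; $54,000/year; ~3.7 years to pay back.
```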

What is your operating temperature? Are you following the ASHRAE TC 9.9 recommendations, which now allow up to 27 degrees Celsius (80.6 degrees Fahrenheit) for inlet temperatures? That envelope was set to allow you to use more hours and days of the year for free cooling. If you're doing that, then your return on investment may be a lot better than if you're still trying to run your equipment at low temperatures.
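One way to see the effect of that wider envelope is to count the hours per year that outside air is cool enough for a given inlet limit. The sketch below does this against synthetic weather; the 3 C approach temperature is an assumed placeholder, and a real study would use measured or typical-meteorological-year (TMY) data for the site.

```python
import random

# Toy illustration: a warmer allowable inlet temperature converts more of the
# year into free-cooling hours (fan and heat-exchanger losses are ignored).

def free_cooling_hours(hourly_temps_c, inlet_limit_c, approach_c=3.0):
    # Outside air must sit a few degrees below the limit to absorb heat;
    # the 3 C "approach" here is an assumption, not a measured value.
    return sum(1 for t in hourly_temps_c if t <= inlet_limit_c - approach_c)

# Synthetic year of hourly temperatures for a temperate climate (hypothetical).
random.seed(1)
temps = [random.gauss(15, 8) for _ in range(8760)]

for limit in (18, 22, 27):   # legacy setpoints vs. the ASHRAE TC 9.9 27 C limit
    print(f"inlet limit {limit} C -> {free_cooling_hours(temps, limit):,} h/yr")
```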

Martin: Are these free cooling options suitable for a data center retrofit, or are they only practical and cost-effective if they are integrated into a new build?

McFarlane: Water-side free cooling may be retrofitted, but it depends on the design. If the data center is already water-cooled, it may be practical, but it would probably be disruptive unless you have totally redundant systems. In a retrofit like this, you're cutting into chilled water or condenser lines and stopping the cooling.

Air-side free cooling is almost certainly restricted to new builds. It would be terribly disruptive to put ducts and openings of that size in an existing facility unless you can segment the facility and create a clean module in part of your data center. It also depends on the type of existing cooling design. The outside air has to be brought into some type of distribution system, which usually means large ducts rather than under-floor systems. If the ducts don't exist, a retrofit with ducts of that size is going to be a major undertaking. So I'd say that free cooling is probably going to be much more practical in new builds than in retrofits.

Martin: Do you see more data centers adopting free cooling in the future?

McFarlane: Most new data center projects tell us that they’d at least like to consider free cooling. It's often driven by the pursuit of a Leadership in Energy and Environmental Design (LEED) certification, the desire for energy savings, corporate philosophy, or sometimes it's just curiosity, because the term free cooling floats around as a buzzword and not everyone knows what it means and what's involved in it.

The real big deal now is going to be ASHRAE 90.1. It is going to be code in many jurisdictions, which means it's a requirement, not an option. ASHRAE 90.1 is going to eliminate the present exceptions for data centers. It's going to require that free cooling be included in the designs of all new data centers. If 90.1 is adopted as part of the environmental code in your jurisdiction, and chances are it will be, then you're going to have to incorporate free cooling into your data center design. ASHRAE 90.1 does not consider mission criticality, so you're not going to be able to use the old argument that it might reduce your reliability. So yes, we're going to see free cooling adopted in data centers almost universally as time goes on.

This was first published in November 2011
