
Best practices for a data center economizer system

A free cooling economizer system can be a cost-effective way to cut energy use and reduce maintenance costs for mechanical cooling systems.

Using free cooling with an economizer system to reduce energy costs is a hot topic in the data center. Any time you can turn off your air-conditioning units or limit the use of chillers that remove the heat generated by the servers, you will not only cut operating costs but also slow the wear and tear on your mechanical cooling systems. Julius Neudorfer, CTO of data center design and consulting firm North American Access Technologies, talks about some best practices for a data center economizer system.

Question: What does “free cooling” with an economizer system mean from an enterprise data center perspective?

Julius Neudorfer: We all like the word “free” and cooling represents a significant portion of energy costs in the data center. In a traditional enterprise data center, cooling is usually accomplished via the use of mechanical refrigeration, involving the use of either a compressor unit within the data center (CRAC) or an external one that provides chilled water to an air handler in the data center (CRAH).

Free cooling generally refers to an economizer system, which can reduce or eliminate the need to operate your mechanical refrigeration when the outside air temperature is at or below a point where it can provide direct or indirect cooling.

In most cases free cooling does not eliminate the need for a compressor-based system. You still need a complete mechanical cooling system in place that must be operational, even if you only need it for a small portion of any given day, in order to keep the data center cool during warmer days. It also does not eliminate or reduce fan energy, just mechanical cooling energy.
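The mode selection Neudorfer describes can be sketched as a simple decision rule: run on outside air alone when it is comfortably below the supply setpoint, let the economizer assist mechanical cooling in between, and fall back to compressors otherwise. This is a hypothetical illustration only; the setpoint and margin are made-up values, not any vendor's control logic or ASHRAE guidance.

```python
# Hypothetical sketch of economizer mode selection. The 65°F setpoint and
# 5°F margin are illustrative assumptions, not recommended operating points.

def cooling_mode(outside_temp_f, supply_setpoint_f=65.0, margin_f=5.0):
    """Pick a cooling mode from the outside air temperature (°F).

    Full free cooling needs outside air comfortably below the supply
    setpoint; between the thresholds the economizer pre-cools air that
    the mechanical system finishes (partial free cooling).
    """
    if outside_temp_f <= supply_setpoint_f - margin_f:
        return "free"        # economizer alone can meet the load
    if outside_temp_f <= supply_setpoint_f + margin_f:
        return "partial"     # economizer assists mechanical cooling
    return "mechanical"      # compressors/chillers carry the full load

print(cooling_mode(50.0))    # free
print(cooling_mode(68.0))    # partial
print(cooling_mode(95.0))    # mechanical
```

Note that even in "free" mode the fans still run, which matches Neudorfer's point that an economizer offsets mechanical cooling energy, not fan energy.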

Question: Free cooling clearly has a major impact on data center design. Is free cooling best for a new build, or can it be successfully introduced to an existing build?

Neudorfer: While free cooling systems can be retrofitted in some cases, it is more cost-effective if incorporated into a new build. There are two basic types: air-side and water-side. The direct air-side system is getting the most attention at the moment. It basically introduces outside air into the cold aisles of the data center whenever the outside air is within an acceptable temperature and humidity range. It is very difficult — in many cases nearly impossible — to retrofit an air-side economizer system into an existing data center.

Use of a water-side economizer system has been more common in larger data centers especially in cooler climates, but is generally effective only when ambient temperatures are substantially below the supply water temperature, typically 45°F. It is possible but expensive to retrofit a chilled water system with a heat exchanger for water-side economizers, but it isn’t practical while the system is operating. Opportunities to do the retrofit are limited.

The Green Grid has publicly available maps that estimate the percentage of free cooling available for both air and water systems by region. However, there are far more opportunities for free cooling with air-side economizers, since they can potentially provide 100% free cooling up to 80°F and remain partially effective at ambient temperatures up to 100°F. It is important to analyze the potential number of days that are cold enough for the economizers to operate effectively before incorporating them into a new design. This is especially true when weighing retrofit installation obstacles and when calculating the return on investment (ROI).
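The sizing analysis above boils down to counting hours in local weather data. A minimal sketch, using the 80°F full-free-cooling and 100°F partial-effectiveness figures from the text; the hourly temperature series here is a toy example, and a real ROI study would use a full year of local weather data and the site's actual limits.

```python
# Back-of-the-envelope estimate of free-cooling hours from hourly ambient
# temperatures. Thresholds follow the 80°F/100°F figures in the article;
# the sample data is invented for illustration.

def free_cooling_hours(hourly_temps_f, full_limit_f=80.0, partial_limit_f=100.0):
    """Return (full_hours, partial_hours) of air-side economizer potential."""
    full = sum(1 for t in hourly_temps_f if t <= full_limit_f)
    partial = sum(1 for t in hourly_temps_f
                  if full_limit_f < t <= partial_limit_f)
    return full, partial

# Toy example: one day with a cool night and a hot afternoon.
temps = [62, 60, 58, 57, 56, 58, 63, 70, 76, 82, 88, 95,
         99, 101, 102, 100, 96, 90, 84, 78, 72, 68, 66, 64]
full, partial = free_cooling_hours(temps)
print(full, partial)   # 14 hours full free cooling, 8 hours partial
```

Scaled to a year of hourly data, the same count feeds directly into the ROI comparison against the cost of installing the economizer.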

Question: Are there any important complementary technologies the free-cooling curious should consider in tandem with economizer system deployment?

Neudorfer: In any data center, preventing warm and cold air from mixing is important. It is even more so for an economizer system, water or air, to maximize effectiveness. By isolating the hot exhaust air, you improve the operating range and create more opportunity for free cooling days. This is particularly true if servers are consolidated onto blade servers, since they generally have a Delta-T of up to 36°F and therefore higher exhaust temperatures than most small servers, which typically have a Delta-T of 15°F to 20°F.

Containerized data centers offer near-perfect hot/cold isolation, and some are specifically designed to maximize the use of free cooling. They are well worth considering if you are just starting a build.

Question: What are some major considerations in evaluating a site for free-cooling suitability?

Neudorfer: In addition to the temperature, local humidity is also important for both air-side and water-side systems. Specifically, high ambient humidity precludes or minimizes the usefulness of adiabatic systems and forces the need for mechanical cooling to de-humidify even if the ambient temperature is within the proper range. High humidity levels also affect any evaporative-based water-side economizers.
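The site evaluation Neudorfer describes means gating economizer use on humidity as well as temperature. A hedged sketch of that check, using dew point as the humidity measure; the dew-point band is an illustrative assumption, not a published ASHRAE envelope, and real controls would track the current guidelines for the site's equipment class.

```python
# Hedged sketch: allow air-side free cooling only when outside air is both
# cool enough AND within an acceptable humidity band. The 42–59°F dew-point
# band is an assumption for illustration, not an ASHRAE recommendation.

def economizer_ok(temp_f, dew_point_f,
                  max_temp_f=80.0, min_dew_f=42.0, max_dew_f=59.0):
    """True if outside air can be brought inside without mechanical help."""
    if temp_f > max_temp_f:
        return False                 # too warm for full free cooling
    if not (min_dew_f <= dew_point_f <= max_dew_f):
        return False                 # too humid (or too dry): run mechanical
    return True

print(economizer_ok(70.0, 50.0))     # True: cool air, moderate humidity
print(economizer_ok(70.0, 68.0))     # False: humid air needs dehumidification
```

This is why a site with frequent high ambient humidity can fail the free-cooling ROI test even when its temperatures look favorable.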

Keep in mind that a direct air-side economizer system is subject to introducing things from the outside air including smoke, particulate matter or corrosive gases from adjacent sites. That could trigger false smoke alarms or cause corrosive damage to IT equipment, especially if combined with higher humidity from adiabatic cooling. In fact, ASHRAE is currently studying the corrosive damage issue.

Question: What are some of the biggest data center design oversights you see from free-cooling adopters? How could they have been avoided?

Neudorfer: Like anything else that is new, there is a learning curve. In this case water-side economizer systems have been sorted out since they’ve been around longer. Air-side systems are still in their infancy, and there aren’t really any “off-the-shelf” versions. They are usually designed into the building structure, and each one is unique. While some of the basic components are standard — motorized vents and dampers — the control system design, integration and tuning are still custom and in the early stages of commercial development.

In fact, Facebook’s Prineville, Ore., data center — primarily cooled by fresh air and adiabatic systems — experienced a serious control system program error several months after it was commissioned in early 2011. The system responded incorrectly to rapidly changing external weather conditions and shut off all outside air suddenly. It then re-circulated the warm internal air while simultaneously trying to use adiabatic cooling to control the rising temperatures. The injection of high levels of water mist into the re-circulated air caused condensation on the servers, shorting out the power supplies and forcing a shutdown. They were able to correct the programming and have operated successfully since that episode, but it shows these systems still have serious kinks.

While I believe free cooling with air-side economizer systems will become more common, it should be approached with caution. Once mastered, it offers tremendous opportunities for energy savings.

ABOUT THE AUTHOR: Julius Neudorfer has been CTO and a founding principal of NAAT since its inception in 1987. He has designed and managed communications and data systems projects for both commercial clients and government customers.

