
Weigh your various data center cooling options

Not interested in moving to Antarctica? Consider other cooling options by analyzing these pros and cons for methods like in-row coolers and free cooling.

Data centers have options when it comes to a cooling strategy -- beyond turning the whole room into an arctic chill zone.

Data center cooling options, from containment to self-cooled cabinets to outside air, come with their own advantages and drawbacks.

To determine which data center cooling methods are worth implementing, review the pros and cons in relation to your needs. Is cooling ride-through highly important? What about saving energy? Flexibility in design?

Cooling containment

Cooling containment means separating cool or cold inlet air from hot discharge air. Containment involves erecting barriers -- curtains, blanking panels, walls or other designs -- to keep the zones separated. The achievable level of cooling containment depends on budget, facility design and project planning stages.

Pros: Containment prevents air flows from mixing, so the air conditioning system doesn't work as hard to achieve target inlet temperatures. Sometimes containment can reduce the number of operating air conditioners.

Cons: Strict data center layout adds design planning time, fire safety considerations and cost to the facility, especially if retrofitting an existing space. The more containment is implemented, the more these drawbacks come into play.

Cabinets must be arranged front-to-front and back-to-back so that aisles of cool intake air alternate with aisles of hot exhaust air. Close any gaps between cabinets with fillers, and block all unused rack spaces with blanking panels.

To prevent air flow over the tops of cabinets and around the ends of rows, hang fire-rated, anti-static plastic curtains at the ends of rows and above cabinets. For full containment, erect solid end panels with doors, as well as either solid barriers between the tops of cabinets and the ceiling, or ceiling panels at cabinet height.

Localized cooling

Source-of-heat coolers include in-row air conditioners, in-row coolers, above- and below-cabinet coolers, self-cooled cabinets, rear-door coolers, immersion cooling tanks and direct liquid cooling.

In-row air conditioners, such as the Tripp Lite SMARTRACK series, are packaged like equipment cabinets to stand within or at the end of rows. Some units discharge air to the cool aisle, while others direct air toward higher-load cabinets.

In-row coolers provide chilled water-, condenser water- and refrigerant-based heat transfer, with and without humidity control -- InRow RC and RP families from APC by Schneider Electric, for example. Fan speed and cooling capacity are usually controlled via temperature sensors on adjacent cabinets. With enough chilled water reserve and small pumps on an uninterruptible power supply (UPS), they can ride through a power outage.
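The control logic behind these units is simple in principle: fan speed rises as the inlet temperatures on adjacent cabinets climb above a setpoint. The following Python sketch illustrates that proportional behavior; the sensor and fan functions are hypothetical placeholders, not any vendor's actual interface.

```python
# Minimal sketch of proportional fan-speed control for an in-row cooler.
# read_inlet_temps() and set_fan_speed() are hypothetical placeholders,
# not any vendor's real API.

SETPOINT_C = 24.0   # target inlet temperature at adjacent cabinets (C)
MIN_SPEED = 30.0    # keep some airflow even when the aisle is cool (%)
MAX_SPEED = 100.0   # fan speed ceiling (%)
GAIN = 20.0         # extra fan speed per degree C above setpoint (%/C)

def read_inlet_temps() -> list[float]:
    """Placeholder: return inlet temperatures (C) from sensors on adjacent cabinets."""
    return [23.5, 24.8, 26.1]

def set_fan_speed(percent: float) -> None:
    """Placeholder: command the cooler's fans to the given speed."""
    print(f"fan speed -> {percent:.0f}%")

def control_step() -> None:
    hottest = max(read_inlet_temps())           # react to the worst-case cabinet
    error = hottest - SETPOINT_C                # degrees above target
    speed = MIN_SPEED + GAIN * max(error, 0.0)  # proportional response
    set_fan_speed(min(speed, MAX_SPEED))

# A real controller would repeat this on a short poll interval.
control_step()
```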

Above/below-cabinet cooling is similar to in-row cooling; it uses refrigerants. Above-cabinet units discharge cool air directly in front of cabinets and pull hot air back in over the cabinet tops. Some units mount on top of cabinets while others suspend from the ceiling in the middle of the cool aisle.

Below-cabinet units work in the opposite direction. One example is the Uptime Rack family, such as the ECC 13. These units can also ride out a power failure on UPS.

Data center cooling by the numbers

Typical server operation today: inlet temperature of 24° C (75° F)

Long-term reliable server operation: ≤ 27° C (80.6° F)

Acceptable operation for short-term, extraordinary events: ≤ 35° C (95° F)

Reliable operation for new, extreme-environment servers: ≤ 45° C (113° F) with up to 90% relative humidity

Temperatures provided by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)
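Those thresholds translate directly into monitoring rules. Here is a minimal Python sketch that classifies a measured inlet temperature against the ASHRAE figures above; the function name and messages are illustrative, not part of any standard.

```python
# Classify a server inlet temperature against the ASHRAE figures quoted above.
# The function name and messages are illustrative only.
def classify_inlet_temp(temp_c: float) -> str:
    if temp_c <= 27.0:
        return "OK: within range for long-term reliable operation (<= 27 C)"
    if temp_c <= 35.0:
        return "WARNING: acceptable only for short-term, extraordinary events (<= 35 C)"
    if temp_c <= 45.0:
        return "ALERT: tolerable only for extreme-environment servers (<= 45 C)"
    return "CRITICAL: above even extreme-environment limits (> 45 C)"

for reading in (24.0, 29.5, 38.0, 47.0):
    print(f"{reading:.1f} C -> {classify_inlet_temp(reading)}")
```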

Self-cooled cabinets are the epitome of closed-loop systems. These cabinets recirculate air within their enclosures, cool via water or refrigerant, and work without supplemental cooling in the room. Emerson Network Power's Liebert XDK racks with integrated cooling are one example.

Rear-door coolers are radiators through which chilled water circulates to remove the heat. They replace normal cabinet doors, cooling the hot exhaust air and then discharging it back into the room. Examples include Motivair's Chilled Door products, which are often optional or aftermarket rack accessories.

Immersion cooling uses a nonconductive/noncorrosive oil bath to remove heat directly from circuits. In a power failure, the oil bath provides an enormous thermal mass to keep equipment cooled with only a small circulating pump running on UPS. While highly efficient, immersion requires modified hardware and makes equipment messy to work on.

Direct liquid cooling is the ultimate expression of source-of-heat cooling. Cooling liquid circulates through a heat sink in direct contact with hot components, such as processors. It is used for some high-performance hardware already, and it is likely to gain popularity.

Pros: When cooling units operate close to the computing equipment, they save the cost of fans pushing cooled air all the way across the room, under the floor or through ductwork. At-the-source cooling also prevents cold air from warming up before it reaches the rack inlet.

Since the cooling units are modular, it's possible to relocate them as compute needs change. Source-of-heat cooling adds flexibility into the data center layout, and prevents redesigns of the building structure or major computer room air conditioner (CRAC) upgrades. Rear-door coolers, for instance, satisfy the total cooling requirement if installed on enough cabinets. In many instances, however, that means every cabinet.

Cooling units easily capture hot return air. Systems like in-row coolers work well with containment cooling designs. They use less power than some conventional data center cooling options.

They also handle high-power-density cooling, such as 10 kW or even 25 kW cabinets.
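A quick sensible-heat calculation shows why densities like these overwhelm room-level cooling. The Python sketch below estimates the airflow a single cabinet needs using the standard relationship airflow = power / (air density x specific heat x temperature rise); the 12-degree temperature rise and cabinet loads are assumed values for illustration.

```python
# Rough airflow needed to carry away a cabinet's sensible heat load.
# Standard relationship: volumetric flow = power / (density * specific_heat * delta_T).
# The temperature rise and cabinet loads below are illustrative assumptions.
AIR_DENSITY = 1.2        # kg/m^3, near sea level
SPECIFIC_HEAT = 1005.0   # J/(kg*K) for air
DELTA_T = 12.0           # assumed temperature rise across the servers (K)

def required_airflow_m3h(load_kw: float) -> float:
    """Airflow in cubic meters per hour to remove load_kw of sensible heat."""
    flow_m3s = (load_kw * 1000.0) / (AIR_DENSITY * SPECIFIC_HEAT * DELTA_T)
    return flow_m3s * 3600.0

for load in (5, 10, 25):   # kW per cabinet
    m3h = required_airflow_m3h(load)
    cfm = m3h * 0.5886     # convert to cubic feet per minute
    print(f"{load:>2} kW cabinet -> ~{m3h:,.0f} m^3/h (~{cfm:,.0f} CFM)")
```

At 25 kW and a 12-degree rise, the answer works out to roughly 6,200 cubic meters per hour (about 3,700 CFM) for one cabinet, which is far more air than a conventional raised-floor layout delivers to a single rack.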

Cons: Chilled-water coolers introduce the danger of leaks. Adding a network of pipes under the floor or over cabinets can be complicated, but proper installation with leak-detection systems and drains reduces concerns.

Refrigerant, a gas rather than liquid, is more difficult and expensive to install, balance and alter, though it does eliminate water dangers.

Some source-of-heat options, like self-cooled cabinets, are large, heavy and expensive. They introduce heterogeneity into facility maintenance, which may mean more upkeep.

Many source-of-heat coolers remove only sensible heat -- the heat you can feel -- and are designed for very high heat loads. Sensible cooling does nothing to manage moisture, so many source-of-heat coolers have no humidity control, requiring data center operators to install separate humidity control units.

Many of these units won't operate at low heat loads. You will likely have to mix localized cooling with conventional CRACs to cool lower-heat-density racks and cabinets.

Free air cooling

If the air outside a data center is regularly below 27 degrees Celsius (around 80 degrees Fahrenheit), free cooling is possible -- directly cooling equipment with filtered outside air, or cooling circulating water by outside air, requiring only pumps and no mechanical refrigeration. Kyoto wheels are one example of a free cooling design.

Evaporative or adiabatic cooling also uses free air that passes over a moist medium. This technique absorbs heat through evaporation and works in hotter climates.
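A simple way to gauge whether a site is a candidate is to count how many hours per year the outside air stays below the threshold. The Python sketch below does that for a list of hourly temperatures; the 27-degree cutoff comes from the rule of thumb above, while the sample readings are purely illustrative.

```python
# Estimate how much of the year a site could run on free cooling, given
# hourly outdoor dry-bulb temperatures in Celsius. The 27 C cutoff follows
# the rule of thumb above; the sample data is purely illustrative.
FREE_COOLING_LIMIT_C = 27.0

def free_cooling_fraction(hourly_temps_c: list[float]) -> float:
    """Fraction of hours in which outside air is cool enough for free cooling."""
    usable = sum(1 for t in hourly_temps_c if t < FREE_COOLING_LIMIT_C)
    return usable / len(hourly_temps_c)

# Stand-in for a year of hourly readings (8,760 values in practice).
sample_temps = [18.0, 22.5, 26.0, 28.5, 31.0, 24.0, 19.5, 16.0]
fraction = free_cooling_fraction(sample_temps)
print(f"Free cooling available ~{fraction:.0%} of sampled hours")
```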

Pros: Free air cooling significantly reduces facility energy use and maintenance tasks. It is often touted as a green or sustainable computing initiative.

Free cooling is an option for a wider mix of data center locations as companies upgrade to higher-operating-temperature servers. Free cooling can also work with mechanical cooling methods.

Cons: To use free cooling year round, the data center must be located in an area where temperature and humidity stay within predictable limits. This may change as computing equipment becomes more resilient.

The investment to install free cooling systems is sometimes high. The transition from mechanical to non-mechanical cooling is usually the biggest challenge in a free cooling design. If mechanical systems still work, removing them might be a hard sell.

About the author:
Robert McFarlane is a principal in charge of data center design at Shen Milsom and Wilke LLC, with more than 35 years of experience. An expert in data center power and cooling, he helped pioneer building cable design and is a corresponding member of ASHRAE TC9.9. McFarlane also teaches at Marist College's Institute for Data Center Professionals.

This was last published in October 2014
