
Tackle the cooling challenges that edge computing poses

Edge computing is the next big trend, but recognize its obstacles before riding the wave.

Remote offices have always had their own servers and network connections, but major processing and storage stayed in the home data center, with dedicated cooling and power. Until now.

Edge computing has come of age. Decentralized processing and storage beyond the data center is becoming the new norm. As demands for increased processing power and reduced latency grow, self-contained micro data centers, or data centers in a rack, are rapidly emerging. Unlike the small servers and switches we've dealt with in the past -- which operated just fine in office-quality cooling environments -- many of these edge computers use high-density blade technologies that generate high heat densities and require constant cooling.

Edge computing at this level encounters obstacles:

  • Limited existing building cooling capacity outside a centralized data center;
  • Night, weekend and holiday cooling shutdowns or temperature shifts;
  • Lack of redundancy and reliability;
  • Physical building constraints complicating special cooling installations;
  • Confined spaces for installation of edge computing racks; and
  • Energy efficiency constraints of ASHRAE Standard 90.1.

Losing your cool -- and your power

Limited and intermittent cooling, as well as low facility reliability, are common in commercial buildings, and particularly in the buildings that tend to house remote offices. Therefore, many data-center-in-a-rack offerings are available with built-in, high-performance cooling systems. One option is rear door cooling coils that remove and neutralize the heat before it escapes the cabinet. Another is internal chilled water coils and fans, or even direct liquid processor cooling. These server cooling methods all require a connection to water.

If the building can't supply water with sufficient capacity and reliability, you'll need more conventional cooling. Cabinets with built-in compressorized refrigeration are large and expensive, so the facility designer usually adds air conditioners to the room. These mechanical cooling setups require piping to heat exchangers outside the building -- usually on the roof. This assumes space is available for the outdoor heat exchangers, for a required service area around them and for the piping to them. In a low-rise building, this may not be difficult, but in a high-rise, it can be challenging and often prohibitive.
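As a rough illustration of why building water capacity matters, the standard HVAC relationship -- heat in BTU/hr equals 500 times the flow in gallons per minute times the water temperature rise in degrees Fahrenheit -- ties a cabinet's load to the chilled water the building must supply. Below is a minimal Python sketch of that arithmetic; the 20 kW load and 10-degree rise are illustrative assumptions, not design values.

    # Sketch: chilled water flow needed to absorb a rack's heat load.
    # Uses Q (BTU/hr) = 500 * GPM * delta_T (deg F), where
    # 500 ~= 8.33 lb/gal * 60 min/hr * 1 BTU/(lb*deg F) for water.

    BTU_PER_KW = 3412  # 1 kW of IT load rejects about 3,412 BTU/hr

    def chilled_water_gpm(it_load_kw: float, delta_t_f: float) -> float:
        """Gallons per minute of chilled water to carry away the IT load."""
        return it_load_kw * BTU_PER_KW / (500 * delta_t_f)

    # Assumed example: a 20 kW cabinet with a 10 deg F rise across a
    # rear door coil needs roughly 13.6 GPM of reliable chilled water.
    print(f"{chilled_water_gpm(20, 10):.1f} GPM")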

The most important consideration for air conditioners in edge computing spaces is the choice of units. Businesses often choose wall-mounted, home-style units because they're inexpensive and small. Unfortunately, they're also unsuitable. Computing equipment emits sensible heat, and thankfully, computers don't sweat, so the latent cooling capacity common to residential and commercial building air conditioners is unnecessary. Those units have a low sensible heat ratio, meaning only 50% to 75% of the advertised cooling capacity tends to be sensible -- which is what we need to cool data center IT equipment. Air conditioners selected for the edge computing space should be designed and rated for the IT environment.
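To see how a low sensible heat ratio erodes usable capacity, divide the sensible IT load by the unit's SHR to get the nameplate capacity you'd actually have to buy. The Python sketch below illustrates the arithmetic; the 15 kW load and the two SHR values are assumptions within the typical ranges above, not vendor figures.

    # Sketch: nameplate capacity needed when only part of a unit's rated
    # cooling is sensible. IT gear is essentially an all-sensible load.

    def required_nameplate_kw(sensible_load_kw: float, shr: float) -> float:
        """Nameplate capacity for a unit with the given sensible heat ratio."""
        return sensible_load_kw / shr

    it_load_kw = 15.0  # sensible IT heat, assumed for illustration
    for shr in (0.65, 0.95):  # home-style unit vs. IT-rated unit (assumed)
        print(f"SHR {shr:.2f}: buy {required_nameplate_kw(it_load_kw, shr):.1f} kW nameplate")

    # A home-style unit at SHR 0.65 must be oversized to ~23 kW to deliver
    # 15 kW of sensible cooling; an IT-rated unit needs far less headroom.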

The trouble with remote space

Air conditioners suitable for IT cooling take space. Appropriate wall-mount units exist, but the ideal location is most often in the ceiling. Floor-mounted units take floor space, which is usually tight in remote offices, and the units can be bigger than the computing cabinet. It's also easier to manage air flows from a ceiling-mounted unit. This is critical; no cooling unit of any type will be effective unless the implementation enables proper control of its air flow.

In addition to directing cool air to the fronts of the cabinets, the cooling design must consider the path that hot air takes after it's discharged from the computers. If it's not separated from the cool supply air, the two air volumes mix, with two undesirable effects. Hot air recirculation raises the temperature of the cool air entering the computing hardware, which wastes expensive cooling energy and reduces cooling effectiveness. Cool air bypass reduces the temperature of the discharge air returning to the air conditioners, and the effect return air temperature has on air conditioner performance can be dramatic. It may seem counterintuitive, but the warmer the air returning to the air conditioners, the more real cooling capacity is available from the air conditioner coils. You actually get more cooling capacity from the same air conditioner by returning hotter air to it. Separating the air flows yields big energy savings, as well as less capital spent on oversized cooling units.

Restrict cool and warm air to their proper paths with air ducts, or with containment air barriers that physically separate the fronts and backs of cabinets. These standard data center techniques can be physically challenging in the confines of a small room with only one or two cabinets, or even impossible with the wrong cooling unit and/or location. Considering the cost and importance of high-performance computing systems, their environment should be a priority, addressed by experts familiar with the intricacies of IT environmental control.
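The airflow arithmetic behind recirculation and bypass is straightforward: required airflow in CFM is roughly BTU/hr divided by 1.08 times the air temperature rise in degrees Fahrenheit, and the server inlet temperature is just a weighted mix of supply air and recirculated exhaust. The Python sketch below illustrates both; all temperatures and the 20% recirculation fraction are assumptions for illustration.

    # Sketch: required airflow and the penalty of hot air recirculation.
    # CFM = BTU/hr / (1.08 * delta_T), with 1.08 = 0.075 lb/ft^3 * 60 * 0.24.

    BTU_PER_KW = 3412

    def required_cfm(it_load_kw: float, delta_t_f: float) -> float:
        """Airflow needed to remove the IT load at the given air temperature rise."""
        return it_load_kw * BTU_PER_KW / (1.08 * delta_t_f)

    def inlet_temp_f(supply_f: float, exhaust_f: float, recirc_fraction: float) -> float:
        """Server inlet temperature when a fraction of hot exhaust recirculates."""
        return (1 - recirc_fraction) * supply_f + recirc_fraction * exhaust_f

    print(f"{required_cfm(10, 20):.0f} CFM to cool 10 kW at a 20 deg F rise")  # ~1,580
    # With 65 F supply and 95 F exhaust (assumed), 20% recirculation
    # pushes server inlets from 65 F to 71 F -- wasted cooling capacity.
    print(f"inlet = {inlet_temp_f(65, 95, 0.20):.0f} F")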

A bad economy

Another constraint has only recently taken effect. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) produces the well-known Standard 90.1, "Energy Standard for Buildings Except Low-Rise Residential Buildings." This standard is adopted into law in virtually every U.S. state and municipality, and is therefore enforced by local inspectors and code officials. The 2010 edition removed the exemption for data centers, so it now places stringent energy efficiency requirements on computer cooling systems. In most cases, it requires the installation of an economizer. An economizer reduces the need for mechanical refrigeration -- and the energy it would use -- by cooling with ambient air when the outside temperature is low; in water-side designs, that means a large cooling tower, often located on the roof of the building. Even in desert climates, economizers can be very effective at night.

Unfortunately, the specific requirements of ASHRAE 90.1 are often impractical or uneconomical, so enforcement may actually preclude an edge computing installation. It usually takes several years for the newest editions of standards to be adopted in any locale, but by now the 2010 version, and even the 2013 edition, are in effect nearly everywhere. A new standard, ASHRAE 90.4, will provide a more practical approach to achieving energy efficiency in data centers in the future. Smaller installations, such as most edge computing cabinets, will still fall under the requirements of 90.1.
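One rough way to gauge whether an economizer can pay off in a given climate is to count the hours the outdoor air is cold enough for free cooling. The Python sketch below is purely illustrative; the 55-degree threshold and the sample temperatures are assumptions, not requirements from ASHRAE 90.1, and real studies use full-year weather data.

    # Sketch: count "free cooling" hours from hourly outdoor dry-bulb temps.

    def economizer_hours(hourly_temps_f: list[float], threshold_f: float = 55.0) -> int:
        """Hours cool enough to offset mechanical refrigeration (assumed threshold)."""
        return sum(1 for t in hourly_temps_f if t <= threshold_f)

    # Toy desert-climate day (assumed): hot afternoons, cool nights.
    day = [58, 55, 52, 50, 49, 48, 52, 60, 70, 80, 88, 94,
           98, 100, 99, 96, 90, 82, 75, 70, 66, 63, 61, 59]
    print(f"{economizer_hours(day)} of 24 hours allow free cooling")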

Edge computing is the wave of the future. But before committing to the purchase of this newer computing technology, have a qualified professional examine the environment in which the hardware will be located, to ensure it will comply with applicable codes and receive proper support.

Next Steps

Deployment options for turnkey data center IT

How to treat backup for remote office IT

Explore the benefits of edge data centers

