Controversial hot aisle containment practices

Is your hot aisle too hot for OSHA? New data center trends are causing concern. You might need to adapt your practices to stay within the law.

If your data center relies on hot aisle containment for efficient cooling, there are practices and laws designed to protect the staff working inside it. Some are proving controversial.

Hot and cold aisles aren't the only facility designs fomenting disagreement in the data center industry. Building design, configuration and operation standards are also a concern, including mantraps for security and economizers for energy savings.

Network switch air flow

Many large network switches don't conform to hot/cold aisle containment designs. Data centers can use baffles from rack and cabinet manufacturers to force these nonstandard air flows into conventional front-to-back patterns, but space for baffling isn't always available in existing rack rows, particularly when hundreds of cables already terminate on immovable patch panels. Different network switches have different nonstandard air flow patterns, so one baffle design does not necessarily fit all. And when mounted in baffled racks, the switches' added air resistance may increase fan speed and energy use. The result is that these expensive switches are most often mounted in standard racks, circulating and re-circulating their own exhaust air.

It's not just the big switches that have cooling problems. Top-of-rack switches are becoming more prevalent as a way to consolidate the high numbers of network connections in server cabinets, and their air flows are usually conventional front-to-back. But switch ports are usually on the front of the switch, while server connections are at the back of the cabinet. Too often, this results in switches being mounted in reverse so their ports face the servers, negating their front-to-back air flow designs.

Should you undertake cabinet reconfigurations to accommodate switches? Hot aisle/cold aisle containment has been part of data center building practice for so many years that perhaps we should instead insist that switch manufacturers conform to standard practices like everyone else.

New fire suppression designs

NFPA 75 and 76 Fire Protection Standards were revised in 2013, mostly to address the fire suppression problems created by containment. Isolating hot and cold aisles significantly improves cooling and energy efficiency, but if you don't have fire suppression heads in every aisle, the air barriers block sprinklers and/or fire suppression gas. These barriers have to move so gas or water can penetrate the contained aisles. Containment system manufacturers addressed fire safety with fusible links and heat-shrink panels, but by the time the links melt or the panels shrink and the air barriers drop out of the way, the fire is already raging. The NFPA revisions don't recognize any of these means of removing the containment barriers.

Upon smoke detection, cooling containment barriers must release electrically, and they cannot fall where they create a tripping hazard or impede egress from an aisle. That's a tall order. Putting fire protection heads in each aisle is easy in new data center builds, but difficult, expensive and disruptive in retrofits.

The fire codes also invoke Article 645 of the NEC (NFPA-70), virtually mandating the dreaded Emergency Power Off or EPO button.

While these changes may hinder energy efficiency, the fire safety situation trumps cost savings.

Aspirating smoke detection aspirations

On the topic of fire protection, what about early-warning systems, such as very early smoke detection apparatus (VESDA) and fire alarm aspiration sensing technology (FAAST)?

VESDA and FAAST are commonly recommended for data center facilities because they catch a developing fire long before it triggers the main fire suppression systems, which can prevent serious damage. But the high-velocity air that most computer room air conditioners discharge makes these systems difficult to design and install.

Local laws can impose restrictions that make VESDA and FAAST impractical. Consider the new requirements of NFPA 75 and 76. If the design requires containment to drop upon smoke detection, which smoke detection does that mean? Do curtains drop away at the earliest detection by the aspirating system? Or do they drop only when the main suppression system kicks in, filling the pre-action pipes or starting the countdown to gas release?

The aspirating system can raise alarms at various threshold levels to provide an early warning and then trigger suppression if the smoke density continues to increase. But what if local law requires the entire building to be evacuated at the first sign of smoke? Does this render the early detection system impractical, or require its threshold to be set so high that it does no more than a conventional smoke detector? If you encounter circumstances that inhibit the usefulness of an early detection system, will you fight them or just forgo the potential benefits?

Wider operating temperature range

The ASHRAE TC 9.9 Thermal Guideline recommends inlet temperatures for standard servers as high as 80.6 degrees Fahrenheit (27 degrees Celsius), and allows as high as 104 degrees F (40 degrees C) for special classes of hardware. The wider range significantly increases the number of hours and days that free cooling can be used instead of mechanical refrigeration. Even without free cooling, slightly higher operating temperatures save significant energy and electrical cost.

All the major manufacturers agreed that both new and legacy equipment can handle operation at 27 degrees C. The 2011 update to ASHRAE's guideline showed that the effect of higher temperatures on server life expectancy is minimal if cooling is properly managed.

Temperature and noise

High equipment densities and increased operating temperatures have brought noise and heat exposure into question lately. The Occupational Safety and Health Administration (OSHA) in the U.S. regulates human heat and noise exposure based on a combination of level (dB of noise or amount of heat) with exposure duration and rest time.

Data center noise is reaching levels that require adherence to OSHA guidelines. Consider putting employees in hearing conservation programs and providing them with hearing protection. Focus on staffers working in fully contained aisles, where reverberation within the enclosed spaces amplifies noise levels.

In-row and overhead coolers, while highly efficient, are problematic for noise control. Employees working in a loud aisle for less than four hours of the day are probably not at risk. Assess loud aisles to see if the noise levels are dangerous.
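
To put rough numbers on that assessment, here is a minimal sketch of the OSHA noise exposure math from 29 CFR 1910.95, which permits 8 hours at 90 dBA and halves the allowable time for every 5 dB increase. The example day is hypothetical and no substitute for actual sound-level monitoring.

```python
def permissible_hours(level_dba):
    """Allowable daily exposure (hours) at a given A-weighted sound level,
    per OSHA 29 CFR 1910.95: 8 hours at 90 dBA, halved for every 5 dB increase."""
    return 8.0 / (2 ** ((level_dba - 90.0) / 5.0))

def daily_dose(exposures):
    """Daily noise dose (%) from (level_dba, hours) pairs.
    A dose of 100% or more exceeds the permissible exposure limit."""
    return 100.0 * sum(hours / permissible_hours(level) for level, hours in exposures)

# Hypothetical day: three hours in a contained hot aisle measured at 95 dBA.
print(f"Permissible time at 95 dBA: {permissible_hours(95):.1f} hours")  # 4.0 hours
print(f"Dose for 3 hours at 95 dBA: {daily_dose([(95, 3)]):.0f}%")       # 75%
```

Anything approaching a 100% dose puts a technician over the limit, and a 50% dose -- an 85 dBA eight-hour average -- is what typically triggers a hearing conservation program.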

Temperature remains less of a health concern, although technicians working in some of today's hot aisles might disagree. Temperature exposure limits are based on the wet bulb globe temperature (WBGT). Globe temperature is taken with a special thermometer placed at the center of a 6-inch-diameter black copper sphere; it is primarily a radiation measurement affected mostly by solar angle and wind speed, so indoors, with no solar load, it is essentially the same as the air temperature.

WBGT takes humidity into account as well, though data center humidity levels should never be above 60%. This makes a WBGT estimate relatively easy to calculate, as the sketch after this list shows. Let's use some worst-case parameters:

  • Maximum inlet temperature of 80 degrees F (27 degrees C)
  • Maximum dew point temperature of 59 degrees F (15 degrees C)
  • Temperature delta through the servers of 25 degrees F (14 degrees C)
  • Resulting hot aisle temperature of 105 degrees F (40.6 degrees C)
  • Resulting WBGT of 83.3 degrees F (28.5 degrees C)
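
As a rough sanity check, that WBGT figure can be approximated in a few lines of Python. This is a sketch under stated assumptions, not a heat-stress assessment: it uses the standard indoor WBGT weighting of 0.7 times wet bulb plus 0.3 times globe temperature, treats the globe temperature as equal to the dry-bulb air temperature (no solar load indoors), and estimates the wet bulb from the dew point with the Magnus and Stull approximations rather than a psychrometric chart.

```python
import math

def relative_humidity(t_dry_c, t_dew_c):
    """Relative humidity (%) from dry-bulb and dew-point temperatures (Magnus approximation)."""
    def saturation(t_c):
        return math.exp(17.625 * t_c / (243.04 + t_c))
    return 100.0 * saturation(t_dew_c) / saturation(t_dry_c)

def wet_bulb(t_dry_c, rh_pct):
    """Approximate wet-bulb temperature (deg C) using Stull's 2011 empirical fit."""
    return (t_dry_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_dry_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

def wbgt_indoor(t_dry_c, t_dew_c):
    """Indoor WBGT (deg C): 0.7 * wet bulb + 0.3 * globe temperature.
    With no solar load, the globe temperature is taken as the dry-bulb temperature."""
    t_wb = wet_bulb(t_dry_c, relative_humidity(t_dry_c, t_dew_c))
    return 0.7 * t_wb + 0.3 * t_dry_c

# Worst-case hot aisle from the list above: 105 F (40.6 C) air at a 59 F (15 C) dew point.
wbgt_c = wbgt_indoor(40.6, 15.0)
print(f"Estimated hot aisle WBGT: {wbgt_c:.1f} C / {wbgt_c * 9 / 5 + 32:.1f} F")
# Prints roughly 28-29 C (mid-80s F), in line with the figure in the list above.
```

Plugging higher discharge temperatures into the same function shows how quickly the WBGT climbs toward the OSHA work/rest limits discussed next.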

Most hot aisle work is wiring, classified by OSHA as light hand work in a standing position wearing light clothing. Under these conditions, the maximum allowable WBGT is 86 degrees F (30 degrees C), so continuous light work is legal in the hot aisle. Even moderate work activity is permissible for 50% of the time with 50% rest.

But for newer ASHRAE Class A3 and A4 equipment -- allowed up to 113 degrees F (45 degrees C) operating temperature at up to 90% relative humidity -- OSHA regulations will kick in. An inlet temperature of 104 degrees F (40 degrees C) results in discharge temperatures of perhaps 125 degrees F (51.7 degrees C), creating a WBGT of 89 degrees F (31.7 degrees C). At this temperature, only light work is permissible, and only for 25% of the time with 75% rest. Moreover, 125 degrees F is above the OSHA maximum safe touch temperature for heated surfaces.

More details are available on the OSHA website.

Liquid cooling

Nothing is as controversial today as running water through the data center and its equipment. If we keep increasing heat densities, we know liquid cooling must follow, but it still makes us paranoid.

A liquid refrigerant could be less unnerving than water, even if it's not as efficient and is considerably more expensive. What about full oil immersion? Is that too esoteric for most of us? If it comes to a decision between high-performance computing and liquid cooling, which will win out?

About the author:
Robert McFarlane is a principal in charge of data center design at Shen Milsom and Wilke LLC, with more than 35 years of experience. An expert in data center power and cooling, he helped pioneer building cable design and is a corresponding member of ASHRAE TC9.9. McFarlane also teaches at Marist College's Institute for Data Center Professionals.
