It's not the heat, it's the humidity. Data center humidity is often an afterthought compared to temperature, but keeping it at the right level prevents computer components from failing and keeps energy from being wasted.
Modern air-cooling units in a data center, whether they are computer room air conditioners (CRACs) in a raised-floor environment, overhead coolers or units nestled in a row of server racks, display a relative humidity reading that data center managers can adjust. But within the industry there is debate about the proper humidity range to ensure the safety of data center equipment, as well as whether there are better ways to measure humidity in the room than the relative humidity reading.
It's safe to say that most data center managers aren't meteorologists, but it's important to understand the basics of data center and server room humidity, which can affect how long your computer equipment lasts and how much you pay for electricity.
How does data center humidity work?
Humidity is a measure of the moisture content of the air. If a data center room is too humid, condensation can build up on computer components and cause them to short out. High humidity can also cause condensation to form on the coils of a cooling unit, forcing the unit to work harder to evaporate that moisture. The capacity spent removing moisture rather than lowering air temperature is called latent cooling, and it costs money.
The traditional way of measuring humidity in the data center is to look at relative humidity. Relative humidity is given as a percentage and measures the amount of water in the air at a given temperature compared to the maximum amount of water that air can hold. A technical committee of the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) that focuses on computer rooms once recommended that relative humidity be within the 40% to 55% range, but has since said that humidity should be measured by dew point rather than relative humidity (more on that later).
Meanwhile, if humidity is too low, data centers can experience electrostatic discharge (ESD), akin to giving someone a shock after shuffling along a carpeted floor in stocking feet. That sort of event can shut down electronic equipment and possibly damage it.
That's what happened to Paul Henderson shortly after he started working as the head of systems and network engineering at the Princeton Plasma Physics Laboratory. A systems operator picked up a static charge and touched a system, which tripped an internal thermal sensor on the server and caused it to power off.
To prevent this from happening again, Henderson's group installed a humidifier in the data center as its older air-conditioning units didn't have internal humidity controls.
Henderson found the lower end of ASHRAE's recommended humidity to be ideal.
"The optimal humidity, from my experience, is about 40% in a computer room," he said. "If it is lower than that, you can generate static electricity rubbing into a door or cabinet or just crossing a long floor."
Data center humidity range too strict?
But some people in the industry think that range is too narrow and restrictive, and ASHRAE should expand the boundaries of what is recommended. ASHRAE did say that a range between 20% and 80% is "acceptable," but still recommended the 40% to 55% range.
However, keeping humidity in that range can be difficult because of the frequently changing conditions in the data center, such as higher temperatures due to increased load. Conditions within different parts of the data center can also vary, causing cooling units to behave differently and making humidity control more difficult. At the same time, failing to maintain humidity in that range means data center operators cannot claim to be designing and operating within the ASHRAE recommendations.
"This is a controversial subject with differing opinions," said Robert Sullivan, a consultant with data center consultancy The Uptime Institute Inc. "People use the telecommunication industry, the TIA organization, and say we should look at what they recommend, which is really wide, like 35% to 65%. And so the data center people are saying, 'Why are we so tight?'"
Sullivan is against loosening the range for data centers. "I am one of the advocates of being tight because I firmly believe that ESD at the low relative humidities causes problems."
Sullivan said he bases his opinion solely on anecdotal data, adding that trying to get "failure data" from manufacturers has been impossible. "If they've done the analysis, they're not willing to share," he said. But what Sullivan has found is that when relative humidity falls below 20%, computer parts start failing even without receiving a shock from an operator. Sullivan said this could be the result of the triboelectric effect, in which sufficiently dry air simply passing over a surface can build up a static charge that affects computer components.
"In more than one case, when relative humidity was brought under control in the server room, the failure stopped," he said.
But Coy Stine, a simulation engineer at electronic cooling company Degree Controls Inc., said that bringing humidity into the correct range isn't necessarily going to solve static problems. Though he acknowledges that low humidity is a contributing factor to ESD, there is debate in the scientific community as to how the water in moist air prevents ESD.
"You can be fine in your moisture range and still have an ESD event," he said. "There are other contributing factors, as well."
Sullivan agreed, acknowledging that the triboelectric effect doesn't occur with perfectly clean air; particles in the air, such as dust or dirt, can help to create the surface charge. Stine added that other factors, such as how well grounded your building and data center are and the design of the server case (which can provide grounding within the box), can also affect the likelihood of ESD.
Don Beaty, a consultant with DLB Associates Inc. and member of the ASHRAE technical committee, said that telecommunications central offices, for example, are not usually humidified at all, but that staffers commonly wear grounding wrist straps to reduce ESD. In the end, Beaty said, data centers have to weigh the pros and cons of having low humidity.
"The lower limit (of the humidity range) should probably be set based on a total cost of ownership (TCO) analysis," Beaty wrote in an email. "What is the operating cost of humidification vs. what would be the cost of increased equipment failure due to a lower relative humidity limit?"
Relative humidity vs. absolute humidity
Further muddling the issue is the fact that many people in the industry -- including some in the ASHRAE technical committee itself -- think that absolute humidity should be measured instead of relative humidity. Absolute humidity, also expressed as dew point, is a measure of the amount of water in the air independent of temperature. So while relative humidity drops when temperature goes up in a data center, absolute humidity stays the same. ASHRAE now recommends that data centers measure humidity by dew point and fall within 5.5 to 15 degrees Celsius (41.9-59 degrees Fahrenheit).
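As a rough illustration (not an ASHRAE formula), dew point can be estimated from temperature and relative humidity using the common Magnus approximation; the coefficients below are one standard parameterization, and the example room conditions are assumptions:

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point in deg C via the Magnus formula."""
    a, b = 17.625, 243.04  # Magnus coefficients (deg C range)
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# A hypothetical 72 F (22.2 C) server room at 45% relative humidity:
td = dew_point_c(22.2, 45.0)
print(f"Dew point: {td:.1f} C")  # ~9.7 C, inside ASHRAE's 5.5-15 C band
```

The point of the dew-point reading is that it stays put when room temperature changes, whereas a relative humidity reading at the same moisture content would swing.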
Degree Controls' Stine gave an example of why dew point, or absolute humidity, is a better measure. He said that if the air entering a server was 60 degrees Fahrenheit and had a relative humidity level of 40%, its absolute humidity would be about 0.0045 pounds of water per pound of dry air. That figure can then be set as the low humidity limit for the data center.
But here's where the problem with relative humidity comes in, Stine said. As air goes through a server, it heats up. That causes the relative humidity to drop, possibly down to as low as 20%, even though absolute humidity stays the same.
"You can have perfectly good air going into a server and what seems to be bad air (according to the relative humidity level) coming out, but the absolute humidity hasn't changed," Stine said.
He added that a data center manager will then increase the relative humidity control on the air-conditioning unit to get it within the 40% to 55% range. That then causes condensation to form on the cooling coils, which causes the unit to work harder to evaporate that moisture.
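Stine's numbers can be checked with standard psychrometric relations. The sketch below assumes sea-level pressure and uses the Magnus approximation for saturation vapor pressure together with the humidity-ratio formula W = 0.622 * pw / (P - pw); the 80 F exit temperature is an assumed value, not from the article:

```python
import math

P_ATM = 101325.0  # assumed sea-level atmospheric pressure, Pa

def sat_vapor_pressure(temp_c: float) -> float:
    """Saturation vapor pressure in Pa (Magnus approximation)."""
    return 610.94 * math.exp(17.625 * temp_c / (243.04 + temp_c))

def humidity_ratio(temp_c: float, rh_pct: float) -> float:
    """Absolute humidity: lb of water per lb of dry air (same value in kg/kg)."""
    pw = (rh_pct / 100.0) * sat_vapor_pressure(temp_c)
    return 0.622 * pw / (P_ATM - pw)

def rel_humidity(temp_c: float, w: float) -> float:
    """Relative humidity (%) of air with humidity ratio w at a given temperature."""
    pw = w * P_ATM / (0.622 + w)
    return 100.0 * pw / sat_vapor_pressure(temp_c)

# Server inlet: 60 F (15.6 C) at 40% relative humidity
w = humidity_ratio(15.6, 40.0)
print(f"Absolute humidity: {w:.4f} lb water / lb dry air")  # ~0.0044, close to Stine's 0.0045

# The same air leaving the server at roughly 80 F (26.7 C): moisture unchanged, RH drops
print(f"Exit RH: {rel_humidity(26.7, w):.0f}%")  # ~20%, though absolute humidity is identical
```

This is the crux of the argument: the server added no moisture, yet a relative-humidity sensor in the exhaust stream would report air that appears too dry.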
Uptime Institute's Sullivan agrees that controlling absolute humidity or dew point is best.
"What you want to do is control the moisture content of the air in the room rather than trying to control the relative humidity," he said.
Beaty, meanwhile, said there probably needs to be more research on absolute humidity and the dew point temperature, especially as liquid cooling becomes more popular in the data center.
"From a practical standpoint, the dew point temperature limit is needed to avoid potential water condensation in datacom equipment as the industry introduces more liquid-cooled equipment. As more liquid cooling products hit the market, it may be necessary to tweak the currently allowable dew point limits … based on industry experience."
Separate data center humidity from cooling units?
There is also some momentum in the industry behind using a separate central air handler to control humidity in the data center, rather than having each individual air-conditioning unit manage it with its own setting.
R. Stephen Spinazzola, vice president for RTKL Associates Inc., a data center design company, said that having multiple air cooling units can lead to problems when each one starts performing different functions depending on the conditions in that localized area. For example, Spinazzola said he's been in data centers where three air-conditioners are doing three different things: One is cooling, one is dehumidifying and the third is humidifying.
"You go into data centers and you have all these units fighting each other, and they're probably out of calibration," Spinazzola said.
What Spinazzola recommends -- and what he helped insurance company Highmark Inc. do with its LEED-certified data centers -- is to implement central humidity control with a separate, dedicated air handler. He has published a paper with a cost analysis showing that this is a cheaper and more effective way of controlling humidity in a data center.
"We've showed that it's a lower first cost to do it, as well as a humongous lower cost over time," he said.
Beaty added that humidity control is not a new issue, citing a 1988 article from the ASHRAE Journal that talked about the problem with CRAC units "fighting" each other over humidity. He said a possible compromise would be to allow the CRACs to handle humidification but to have a centralized dehumidification system. This would reduce the "fighting" between adjacent CRACs and allow the cooling coils on the data center floor to run dry.
Let us know what you think about the story; e-mail: Mark Fontecchio, News Writer.