The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) may expand its recommended ranges for data center humidity and temperature to help users save energy without compromising IT equipment safety.
"I think it's really important from an energy standpoint if we can widen that range," said Roger Schmidt, a distinguished IBM technologist and chairman of ASHRAE Technical Committee 9.9 (TC 9.9), which focuses on computer rooms, in a recent podcast. "Obviously it depends on what the IT manufacturers will come back and say what they feel is acceptable from a reliability standpoint."
In most modern data centers, air-cooling units — whether computer room air conditioners (CRACs) in a raised-floor environment, overhead coolers, or units nestled in a row of server racks — display a relative humidity reading that data center managers can adjust. Keeping humidity from climbing too high prevents condensation from forming on equipment, while keeping it from dipping too low prevents electrostatic discharge; both can damage computer equipment.
Pushing the envelope
ASHRAE does have a wider "allowable" range; its "allowable" relative humidity, for example, is 20% to 80%. This range is more in line with manufacturers' recommendations for servers. Dell Inc., for example, recommends 20% to 80%; HP recommends 15% to 80%; IBM, 8% to 80%; and Sun, 10% to 90%, according to various corporate Web sites. Maximum wet-bulb temperature recommendations can also get as high as 84 degrees Fahrenheit.
"The end users we've spoken to, they're OK with expanding the range a little bit," said Herb Villa, a TC 9.9 member and a field technical manager at Springfield, Ohio-based Rittal Corp. "They can see the tangible benefits for system and facility performance and energy savings."
Where do the savings come from? With a wider humidity range, a data center's air conditioners can focus on simply blowing cool air. Trying to control humidity manually can be counterproductive, causing cooling units to fight one another when humidity levels differ throughout a room. If one unit, for example, is humidifying a room while another dehumidifies, the units cancel one another out and use extra electricity in the process.
Earlier this year, Shands HealthCare in Gainesville, Fla., shut off the humidity controls in its 3,500-square-foot data center. It's not unprecedented; Intel Corp. started doing the same three years ago and now doesn't control humidity in about 50,000 square feet of data center space. Brad Kowal, Shands' associate data center director, said he would like to stay in the 30%-to-70% relative humidity range. As the drier winter months of December, January and February approach, he'll keep an eye on it.
Kowal said he's glad to hear that ASHRAE will expand the recommended range, which he thinks is too tight right now.
"Yes, I want the ASHRAE data to keep pushing the logical envelope," he said. "It was not logical to me to have data center boundaries between 40[%] and 55[%] humidity in a 72-degree room when the manufacturers have it much wider."
"Do I want my data center to be 85 degrees? Probably not," Kowal continued. "But I think we can get energy savings and save the company some costs by taking a 72-degree room to 78. At the end of the day, I don't want to be arguing with IBM about my room being 78 degrees."
Villa agreed, adding that the ASHRAE move would be less about changing the range and more about changing mindsets concerning data center temperature and humidity.
"They're making sure the message is consistent," Villa said. "They're moving that box to make it more consistent with equipment manufacturers. They're not changing anything, perhaps, besides what people think."