
Increase efficiency with data center temperature monitoring

Data centers must track multiple temperature components. Organizations can use ASHRAE standards and on-premises hardware for easier monitoring.

A data center's operating environment is as important to system availability as reliable power and software. Good data center temperature and humidity controls are an essential part of facility monitoring and upkeep. By following the latest recommendations, admins can preserve their hardware while also saving energy.

In 2004, ASHRAE published its landmark Thermal Guidelines, establishing recommended data center temperature and humidity ranges for hardware. Now in its fourth edition, Thermal Guidelines is the foundation of the ASHRAE Datacom Series -- 14 books on data center design and operation. Despite multiple revisions, the fundamentals remain unchanged from the original publication of the guidelines.

Developed in collaboration with major hardware manufacturers, Thermal Guidelines established that equipment inlet temperatures can be as high as 80.6 degrees Fahrenheit on a continuous basis without impairing service life, reliability or performance, or voiding warranties. This was revolutionary in an industry that had considered 55 degrees the standard since the early days of mainframes.

The cooling unit failure caveat

Operating at the high end of the recommended thermal range saves energy, but there's always the possibility a cooling unit could fail and the data center temperature could go higher than 80.6 degrees. What then? ASHRAE allows temperature to safely rise to 89.6 degrees for several days.
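The two thresholds above lend themselves to a simple monitoring check. Here is a minimal sketch in Python; the function name and the labels it returns are illustrative, not part of any ASHRAE publication:

```python
# ASHRAE figures cited in this article: 80.6 F is the top of the
# recommended inlet range; 89.6 F is the top of the short-term
# allowable range (acceptable for several days, e.g. after a
# cooling unit failure). Labels and names are hypothetical.

RECOMMENDED_MAX_F = 80.6
ALLOWABLE_MAX_F = 89.6

def classify_inlet_temp(temp_f: float) -> str:
    """Classify a single equipment inlet temperature reading."""
    if temp_f <= RECOMMENDED_MAX_F:
        return "recommended"   # normal continuous operation
    if temp_f <= ALLOWABLE_MAX_F:
        return "allowable"     # tolerable for several days
    return "alarm"             # beyond the allowable envelope

print(classify_inlet_temp(78.0))  # within the recommended range
print(classify_inlet_temp(85.0))  # allowable during a cooling failure
```

A real deployment would feed this check with live inlet readings rather than constants, but the threshold logic is the same.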

These guidelines alleviate short-term cooling failure concerns and allow more hours of free cooling, even if outside air temperature climbs above the recommended maximum for part of a day. As long as the data center remains within ASHRAE's allowable range, it's not necessary to revert to mechanical refrigeration for short time periods; operating at allowable temperatures maximizes energy savings and avoids potential cooling system transition problems.

However, pushing air into a room at the maximum recommended temperature can result in excessive temperatures for much of the equipment. It's impossible to maintain uniform air temperature in a data center because all the device fans cause random air mixing.

The potential energy and cost savings from operating at higher data center temperatures are so great that cooling system controls have become increasingly sophisticated. Individual servers now monitor dozens of internal temperatures and make that data available for highly granular hardware and facility control.

Hardware for data center temperature monitoring

But even if you aren't Google, Facebook or Amazon and don't have the resources for sophisticated levels of control, it's still possible to improve energy efficiency without overinvesting in monitoring equipment.

The traditional approach is to control air conditioning units via return air temperature (RAT) calculations. But with the newer practice of hot/cold aisle segregation and hot/cold aisle containment, hot aisles run at higher temperatures than legacy setups. RAT tells admins nothing about actual equipment inlet temperatures -- the measurement on which ASHRAE's guidelines are based.

Monitoring inlet temperature is fairly easy. Intelligent power strips -- or power distribution units (PDUs) -- can connect multiple temperature and humidity probes and collect data. Affixing probes to cabinet doors -- at the top and middle of the cabinet for under-floor air delivery, or the middle and bottom for overhead delivery -- provides good control data.

Wireless sensors are easy to install and move to find the optimal sensor locations. Regardless of the technology, admins can make data available to air conditioners that control discharge temperature, as well as to data center infrastructure management software for monitoring and reporting. This helps admins get more accurate data center temperature measurements and make finer adjustments as computing workloads change.
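One way to make those probe readings actionable is to aggregate them per cabinet and flag any cabinet whose hottest inlet exceeds the recommended maximum. A minimal sketch, assuming readings arrive as (cabinet, probe position, temperature) tuples -- the data shape and names here are assumptions, not a specific DCIM product's API:

```python
from collections import defaultdict

RECOMMENDED_MAX_F = 80.6  # ASHRAE recommended inlet maximum cited above

def hot_cabinets(readings):
    """Return cabinets whose hottest inlet probe exceeds the recommended max.

    `readings` is an iterable of (cabinet_id, probe_position, temp_f) tuples,
    e.g. as collected from PDU-attached probes or wireless sensors.
    """
    max_per_cabinet = defaultdict(lambda: float("-inf"))
    for cabinet, _position, temp_f in readings:
        max_per_cabinet[cabinet] = max(max_per_cabinet[cabinet], temp_f)
    return {c: t for c, t in max_per_cabinet.items() if t > RECOMMENDED_MAX_F}

readings = [
    ("A01", "top", 79.2), ("A01", "middle", 77.5),
    ("A02", "top", 82.4), ("A02", "middle", 80.1),
]
print(hot_cabinets(readings))  # only cabinet A02 exceeds 80.6 F
```

Tracking the hottest probe per cabinet, rather than an average, matches the point made above: air mixing makes room temperature nonuniform, so the worst-case inlet is what matters.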

Factoring in humidity

ASHRAE also addresses humidity control. In 2014, the standards group published a radical change, stating that relative humidity can drop to as low as 8% without concern about static discharge harming rack-mounted hardware. However, at any humidity level, technicians must wear grounded wrist straps when working inside equipment.

With few exceptions, data center flooring and technician clothing and footwear don't create static concerns at humidity levels far below the legacy standard of 50% relative humidity. This means admins don't need to account for flooring, clothing or shoes when setting humidity targets.

To help measure data center moisture content, ASHRAE's guidelines also established a dew point temperature, which is an absolute measure of moisture in the air. Because humidity is best controlled on dew point data while static discharge is associated with relative humidity, the ASHRAE recommendations correlate the two. By paying attention to dew point temperatures, admins can prevent water damage and condensation on server racks.
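Dew point can be estimated from air temperature and relative humidity. The sketch below uses the Magnus approximation -- a widely used meteorological formula, not an ASHRAE-published equation -- to show how the two humidity measures relate:

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point in degrees Celsius via the Magnus formula.

    Uses the common Magnus constants a=17.62, b=243.12 C. This is a
    standard approximation, accurate to a few tenths of a degree over
    typical data center conditions.
    """
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# At 20 C air temperature and 50% relative humidity, the dew point
# works out to roughly 9.3 C.
print(round(dew_point_c(20.0, 50.0), 1))
```

Keeping the computed dew point comfortably below the temperature of chilled surfaces is what prevents the condensation the article warns about.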

To properly operate a data center at higher temperatures and reap the associated energy and cost savings, be sure to evaluate any monitoring hardware and cooling systems. With the right data, tracking dew point, humidity and inlet temperatures will keep any data center as efficient as possible.
