Data center metrics and standards guide
A comprehensive collection of articles, videos and more, hand-picked by our editors
A data center isn't a meat locker, though many are cooled so much you'd expect to see Rocky in there punching a side of beef.
The concept of elevated operating temperatures is simple: The computer room air conditioner system doesn't need to run as long or work as hard; cooling uses less energy and therefore costs the company less money.
You don't need to run the data center at a warmer temperature, but you do need to ask, "How warm is cool enough?"
What is a safe server temperature at the air inlet? What about safe temps for other data center equipment?
These questions have been the subject of enormous debate and consternation over the years. At some point, the inlet air entering the server becomes too hot to effectively carry away the excess heat from the central processing unit, memory modules, power supplies and other devices, such as graphics processing units, resulting in system overheating and premature failures.
The exact temperature at which this happens varies dramatically: the effectiveness of airflow cooling depends on the design of the server's heat sinks, fan selection, component placement and overall airflow patterns through the system.
The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) conducts extensive research into data center cooling, yielding an ever-evolving set of recommendations through Technical Committee 9.9. The 2011 version of the ASHRAE recommendations defines four classes of data center equipment (A1 through A4), each with higher allowable levels of temperature and humidity.
Class A1 systems allow temperatures from 59 degrees to 89.6 degrees Fahrenheit (15 degrees to 32 degrees Celsius), while Class A2 systems allow 50 degrees to 95 degrees F (10 degrees to 35 degrees C). Both classes support 20% to 80% relative humidity (RH).
ASHRAE recommendations also define two extended classes of equipment: Class A3 devices support 41 degrees to 104 degrees F (5 degrees to 40 degrees C) at 8% to 85% RH, and Class A4 systems support 41 degrees to 113 degrees F (5 degrees to 45 degrees C) at 8% to 90% RH.
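The four class envelopes above lend themselves to a simple lookup check against a measured inlet reading. A minimal sketch in Python: the temperature and humidity limits come from the ASHRAE figures quoted above, while the table structure and function name are illustrative, not part of any standard API.

```python
# ASHRAE TC 9.9 (2011) allowable inlet envelopes, in degrees Celsius and
# percent relative humidity, per the class ranges quoted above.
ASHRAE_CLASSES = {
    "A1": {"temp_c": (15, 32), "rh_pct": (20, 80)},
    "A2": {"temp_c": (10, 35), "rh_pct": (20, 80)},
    "A3": {"temp_c": (5, 40), "rh_pct": (8, 85)},
    "A4": {"temp_c": (5, 45), "rh_pct": (8, 90)},
}

def inlet_within_class(equipment_class, temp_c, rh_pct):
    """Return True if a measured inlet temperature and humidity fall
    inside the allowable envelope for the given equipment class."""
    limits = ASHRAE_CLASSES[equipment_class]
    t_lo, t_hi = limits["temp_c"]
    rh_lo, rh_hi = limits["rh_pct"]
    return t_lo <= temp_c <= t_hi and rh_lo <= rh_pct <= rh_hi

# A 34 C (about 93 F) inlet exceeds the Class A1 envelope
# but is acceptable for Class A2 equipment.
print(inlet_within_class("A1", 34, 50))  # False
print(inlet_within_class("A2", 34, 50))  # True
```

The same check applies unchanged to humidity excursions, since both bounds are tested together.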
The trick for determining the maximum server temperature for your data center is to understand the maximum allowable inlet temperature for the equipment used. Determine the class of each server, storage array or other device.
Most current server designs accommodate Class A1 or A2 temperature ranges, but extended temperatures into Class A3 or A4 ranges require careful attention to system design and support. Legacy systems won't survive long in a Class A3 or A4 environment, and it can take several technology refresh cycles before a data center can safely adopt the highest temperature ranges. It's best to approach elevated temperatures in gradual phases and verify system integrity at each temperature goal.
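Because the room must honor the most restrictive device it houses, the class survey described above reduces to finding the lowest allowable maximum across the inventory. A hedged sketch, again in Python; the device names and the inventory mapping are hypothetical, and the per-class maximums restate the Celsius limits given earlier.

```python
# Allowable maximum inlet temperature (degrees C) per ASHRAE class,
# from the ranges quoted earlier in the article.
CLASS_MAX_C = {"A1": 32, "A2": 35, "A3": 40, "A4": 45}

# Hypothetical inventory: each server, storage array or other device
# tagged with the ASHRAE class its vendor supports.
inventory = {"web-01": "A2", "db-01": "A1", "san-01": "A2"}

# The binding constraint is the class with the lowest allowable maximum;
# one legacy Class A1 box caps the whole room at 32 C.
binding_class = min(inventory.values(), key=lambda c: CLASS_MAX_C[c])
room_max_c = CLASS_MAX_C[binding_class]
print(binding_class, room_max_c)  # A1 32
```

This also shows why elevated setpoints arrive only over several refresh cycles: the room limit rises only once the last lower-class device is retired.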
Current technologies allow IT teams to perform a broad range of management and monitoring operations remotely, but people need to get into the data center for hands-on work, and extreme temperature ranges and humidity levels can be hazardous to human health. Careful system containment approaches mitigate human exposure to higher heat.