
What's the highest server temperature you can handle?

There's no correct inlet air temperature for all servers. Age, performance and design help determine how much heat your server can handle.

A data center isn't a meat locker, though many are cooled so much you'd expect to see Rocky in there punching a side of beef.

The concept of elevated operating temperatures is simple: The computer room air conditioner system doesn't need to run as long or work as hard; cooling uses less energy and therefore costs the company less money.

You don't need to run the data center at a warmer temperature, but you do need to ask, "How warm is cool enough?"

What is a safe server temperature at the air inlet? What about safe temps for other data center equipment?

These questions have been the subject of enormous debate and consternation over the years. At some point, the inlet air entering the server becomes too hot to effectively carry away the excess heat from the central processing unit, memory modules, power supplies and other devices, such as graphics processing units, resulting in system overheating and premature failure.

The exact temperature where this happens varies dramatically. The actual effectiveness of airflow cooling depends on the design of the server's heat sinks, fan selection, component placement and overall airflow pattern through the system.

The American Society of Heating, Refrigerating and Air Conditioning Engineers (ASHRAE) conducts extensive research into data center cooling, yielding an ever-evolving set of recommendations through Technical Committee 9.9. The 2011 version of ASHRAE recommendations defines four classes of data center equipment (A1 through A4), each with higher allowable levels of temperature and humidity.

Class A1 systems allow temperatures from 59 degrees to 90 degrees Fahrenheit (15 degrees to 32 degrees Celsius), while Class A2 systems allow 50 degrees to 95 degrees F (10 degrees to 35 degrees C). Both classes support 20% to 80% relative humidity (RH).

ASHRAE recommendations also define two extended classes of equipment: Class A3 devices support 41 degrees to 104 degrees F (5 degrees to 40 degrees C) at 8% to 85% RH, and Class A4 systems support 41 degrees to 113 degrees F (5 degrees to 45 degrees C) at 8% to 90% RH.
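The Fahrenheit and Celsius figures above are just unit conversions of the same limits. A quick sketch (using the class ranges as listed in this article) confirms they line up:

```python
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

# Allowable inlet ranges in degrees C, per the ASHRAE classes above.
classes = {"A1": (15, 32), "A2": (10, 35), "A3": (5, 40), "A4": (5, 45)}

for name, (lo, hi) in classes.items():
    # 32 C is 89.6 F, which rounds to the 90 F figure quoted above.
    print(f"{name}: {c_to_f(lo):.0f} to {c_to_f(hi):.0f} F")
```

Running this prints 59 to 90 F for Class A1, 50 to 95 F for A2, 41 to 104 F for A3 and 41 to 113 F for A4, matching the ranges in the text.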

The trick for determining the maximum server temperature for your data center is to understand the maximum allowable inlet temperature for the equipment used. Determine the class of each server, storage array or other device.
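In practice, a room shared by mixed equipment is limited by its least tolerant device: the safe inlet temperature is the minimum of the per-class maxima. A minimal sketch of that lookup (the fleet list is hypothetical):

```python
# Maximum allowable inlet temperatures in degrees C for the ASHRAE
# TC 9.9 equipment classes described above.
CLASS_MAX_C = {"A1": 32, "A2": 35, "A3": 40, "A4": 45}

def max_safe_inlet_c(equipment_classes):
    """Given the ASHRAE class of every device sharing an air supply,
    return the highest inlet temperature (C) safe for all of them."""
    return min(CLASS_MAX_C[c] for c in equipment_classes)

# Hypothetical mixed fleet: one legacy Class A1 server caps the room
# at 32 C, even though the A3 gear could tolerate 40 C.
fleet = ["A3", "A2", "A1", "A3"]
print(max_safe_inlet_c(fleet))  # 32
```

This is why a single legacy device can hold back an elevated-temperature initiative until it is refreshed or isolated.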

Most current server designs accommodate Class A1 or A2 temperature ranges, but extended temperatures into Class A3 or A4 ranges require careful attention to system design and support. Legacy systems won't survive long in a Class A3 or A4 environment, and it can take several technology refresh cycles before a data center can safely adopt the highest temperature ranges. It's best to approach elevated temperatures in gradual phases and verify system integrity at each temperature goal.

Current technologies allow IT teams to perform a broad range of management and monitoring operations remotely, but people need to get into the data center for hands-on work, and extreme temperature ranges and humidity levels can be hazardous to human health. Careful system containment approaches mitigate human exposure to higher heat.


Join the conversation



I think the best place to run a data centre is Norway, mainly because temperatures in that region stay cool throughout the year, so very little cooling is needed. There is also a data centre operating in a cave below the mountains that's been called the most eco-friendly data centre in the world today. It's entirely self-sufficient in meeting its energy needs: water flowing down from a mountain lake generates all the electricity required to run the computers, and the same cold water keeps the interior cool, so no AC repair technicians are needed there at all.
For those interested, here's a tour inside the green data center in Norway:
My suggestion is to use an infrared thermometer or camera and measure temps. Also check or do an air balancing test/survey, and check the speed of the air handler. I am sure you will find the problem.
What many users don't realize is that server manufacturers spend a large portion of the server design and validation cycle on thermal analysis. Most current and one-generation-old Wintel servers have a built-in CPU limiter that reduces CPU performance as temperatures inside the box increase. So you may still have operation, but suffer performance hits if the server CPUs have to "throttle down." Best bet: don't scrimp on cooling expense, and don't assume that data center ambient temperature is an accurate indication of what's happening inside the server box.
My server room was poorly cooled and regularly hit triple-digit temperatures. We had very little failure.