While battery cooling cabinets sound logical at first, there are very practical reasons why they are rarely implemented in data centers.
Data centers that use valve-regulated lead-acid (VRLA) batteries could put the sealed cells into a battery cooling cabinet held at 22 to 25 degrees Celsius (C) (about 72 to 77 degrees Fahrenheit [F]) rather than cooling the entire data center to that temperature, but it isn't an economical choice. If you're using flooded lead-acid batteries, you must keep them isolated from the main data center space because of the potential for acid spills and hydrogen explosion, so separate cooling is automatically required.
The ideal battery temperature is 25 degrees C (77 degrees F). The 2008 ASHRAE recommended maximum continuous operating temperature for computing equipment is 27 degrees C (80.6 degrees F). Even with excellent cold aisle containment, it is virtually impossible to hold exactly 27 degrees C from top to bottom of every cabinet in every row, so most data center designers set a slightly lower target, such as 24 degrees C (75 degrees F), to allow for some variation. A cooling design like that makes a self-cooled battery cabinet redundant.

Exceeding 27 degrees C won't damage equipment, as the 2011 ASHRAE TC 9.9 thermal guidelines show. That temperature was chosen as the upper limit of the recommended thermal envelope because most server fans speed up significantly above it, potentially using more energy than higher temperature operation could save. Running the rack space continuously at a higher temperature and cooling the batteries separately is therefore not recommended.
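To put rough numbers on that temperature sensitivity: a widely cited industry rule of thumb, not stated in this article, holds that VRLA battery service life roughly halves for every 8 to 10 degrees C of continuous operation above 25 degrees C. The short Python sketch below illustrates that relationship; the five-year design life and the 10-degree halving interval are illustrative assumptions, not figures from any battery datasheet.

```python
# Rough illustration of the rule of thumb that VRLA battery life
# roughly halves for every ~10 degrees C above 25 degrees C.
# The design life and halving interval are illustrative assumptions.

DESIGN_LIFE_YEARS = 5.0    # assumed rated life at 25 degrees C
REFERENCE_TEMP_C = 25.0    # temperature at which the rated life applies
HALVING_INTERVAL_C = 10.0  # assumed degrees C per halving of life

def estimated_battery_life(ambient_c: float) -> float:
    """Estimate VRLA battery life in years at a constant ambient temperature."""
    excess = max(0.0, ambient_c - REFERENCE_TEMP_C)
    return DESIGN_LIFE_YEARS / (2.0 ** (excess / HALVING_INTERVAL_C))

for temp in (25, 30, 35, 38):
    print(f"{temp} C -> ~{estimated_battery_life(temp):.1f} years")
```

Under those assumptions, batteries held at a 38 degrees C hot aisle temperature would last roughly two years instead of five, which is why battery temperature matters even when the IT equipment is unharmed.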
There are two possible exceptions.
The first would be a facility using cold aisle containment while operating near the ASHRAE 27 degrees C upper limit. Most modern computing hardware runs with a temperature differential of at least 11 degrees C (20 degrees F) between intake and discharge air, which means the rest of the room could sit at or above the hot aisle temperature of 38 degrees C (100.4 degrees F). That's not good for the batteries, and it's also around the upper operating limit of most uninterruptible power supply (UPS) systems, so it could push the whole UPS into a self-protective thermal shutdown. In this situation, a separate UPS and battery room with its own cooling would be strongly advised.
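The arithmetic behind that scenario is simple: add the server delta T to the supply air temperature to get the hot aisle temperature, then compare it with the UPS limit. The sketch below is a minimal illustration of that check; the 40 degrees C UPS thermal limit is a hypothetical example figure, not a vendor specification.

```python
# Minimal sketch: hot aisle temperature from supply air temperature plus
# the server intake-to-exhaust differential (delta T). The 40 degrees C
# UPS limit below is a hypothetical example, not a vendor specification.

def hot_aisle_temp(supply_c: float, server_delta_c: float = 11.0) -> float:
    """Discharge (hot aisle) temperature given supply air and server delta T."""
    return supply_c + server_delta_c

UPS_THERMAL_LIMIT_C = 40.0  # hypothetical upper operating limit

supply = 27.0  # running at the ASHRAE recommended upper limit
discharge = hot_aisle_temp(supply)
print(f"Hot aisle: {discharge:.0f} C")
if discharge >= UPS_THERMAL_LIMIT_C - 2.0:
    print("Within a few degrees of the UPS thermal limit; "
          "a separately cooled UPS/battery room is advisable.")
```

At a 27 degrees C supply and an 11-degree delta T, the hot aisle lands at 38 degrees C, only a couple of degrees below the assumed UPS limit, which is the margin problem the paragraph above describes.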
The second possible exception would be a short-term power or cooling failure in which the data center temperature rises above the ASHRAE allowable limit. That would shorten battery life, but battery life is probably a secondary concern in such an emergency. If UPS reliability is paramount, the right design is a true Uptime Institute Tier IV data center, in which the UPS and batteries are duplicated and separated along with their cooling, with 2N backups behind them. In that design, the small additional energy savings from a self-cooled battery cabinet would be of little advantage.
Unless you are running your data center well above the top of the ASHRAE recommended thermal envelope, it is doubtful that self-contained battery cooling would save enough energy to offset the cost of the special cooling unit.
About the author:
Robert McFarlane is a principal in charge of data center design for the international consulting firm Shen Milsom and Wilke LLC. McFarlane has spent more than 35 years in communications consulting, has experience in every segment of the data center industry and was a pioneer in developing the field of building cable design. He also teaches the data center facilities course in the Marist College Institute for Data Center Professional program, is a data center power and cooling expert, is widely published, speaks at many industry seminars and is a corresponding member of ASHRAE TC 9.9, which publishes a wide range of industry guidelines.