
Efficient data center cooling system design you can pull off

Businesses can slash data center energy consumption with a combination of updated technologies and technical practices.

This tip was originally published in the Efficient Data Center Designs handbook.

IT staff roles and workflows aren't the only things changing as IT evolves and budgets tighten -- data center cooling system designs also need an update.

IT managers should re-evaluate how their data centers run. Lower energy use doesn't have to mean sacrificing hardware reliability and performance. Replacing aging hardware and updating data center cooling systems can deliver substantial energy savings. Explore virtualization and consolidation options, and consider outsourcing certain tasks to other facilities or to the cloud to further reduce energy costs.

Consolidate server hardware

One of the most effective ways to improve energy efficiency is to compute with fewer servers, virtualizing multiple applications to run simultaneously on each server. Virtualized servers host as many virtual machines (VMs) as the physical server hardware -- CPU cores, memory, network I/O and so on -- allows.

The potential impact of server consolidation is stunning. A traditional physical server running a single enterprise application might use 5% to 15% of the server's total computing resources. That same server could host 10 VMs, each using an average of 8% of the server's total computing capacity, replacing 10 physical servers while still leaving 20% headroom in its total computing capacity.
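The arithmetic behind that example can be sketched in a few lines. This is an illustrative calculation using the article's round numbers, not vendor sizing data:

```python
# Consolidation math from the example above: 10 VMs, each averaging
# 8% of one host's total computing capacity (illustrative figures).
def consolidation_headroom(num_vms, avg_util_per_vm):
    """Return (total host utilization, remaining headroom) as fractions."""
    used = num_vms * avg_util_per_vm
    return used, 1.0 - used

used, headroom = consolidation_headroom(10, 0.08)
print(f"Host utilization: {used:.0%}, headroom: {headroom:.0%}")
# Host utilization: 80%, headroom: 20%
```

Ten lightly loaded physical servers collapse onto one host that still runs at only 80% capacity, which is where the energy savings come from.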

Server consolidation is not an all-or-nothing proposition; the right level varies with the types of workloads and the technology and team supporting them. Organizations that are new to virtualization may start with limited consolidation using noncritical applications and then gradually increase consolidation levels and virtualize more important workloads. Virtualization raises the importance of systems management tools and practices to track, monitor and control VMs.

Adopt energy-efficient servers

Server upgrades and consolidation are often approached as independent projects, yet the two initiatives both increase overall energy conservation. Businesses can easily virtualize the existing server farm, then systematically upgrade servers during subsequent technology refresh cycles. Server upgrades give IT teams the chance to optimize consolidation and balance the distribution of VMs across servers. Taken together, virtualization on more energy-efficient server platforms can make significant improvements in energy conservation.

New server designs provide greater computing capacity while reducing data center energy consumption. A new Intel Xeon processor dissipates 65 W of heat, compared to 150 W just a few years ago, even with many more cores and memory. Part of the energy savings comes from slower processor clock speeds, offset by processor performance enhancements. These include hyper-threading, which lets each physical core run two instruction threads concurrently, and processor throttling, which adjusts clock and voltage settings to match computing demand.
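A back-of-envelope comparison shows how those TDP figures compound at rack scale. The 65 W and 150 W numbers come from the article; the servers-per-rack and CPUs-per-server counts are assumptions for illustration:

```python
# Rack-level CPU heat load using the TDP figures cited above
# (65 W new vs. 150 W older processors). The rack density below
# (20 servers, 2 CPUs each) is an assumed configuration.
def rack_heat_watts(servers, cpus_per_server, cpu_tdp_watts):
    """Total CPU heat dissipated per rack, in watts."""
    return servers * cpus_per_server * cpu_tdp_watts

old = rack_heat_watts(servers=20, cpus_per_server=2, cpu_tdp_watts=150)
new = rack_heat_watts(servers=20, cpus_per_server=2, cpu_tdp_watts=65)
print(f"CPU heat per rack: {old} W -> {new} W ({old - new} W less to cool)")
# CPU heat per rack: 6000 W -> 2600 W (3400 W less to cool)
```

Every watt of heat not generated is a watt the cooling plant never has to remove, which is why processor upgrades and cooling savings go hand in hand.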

New server designs with energy-efficient processors and advanced power-conservation capabilities can in turn increase virtualization and consolidation, further reducing energy demand.

Improve data center cooling systems

Consolidating enterprise workloads into fewer, more energy-efficient systems has a welcome side effect: The servers produce less heat, and that means the data center requires less cooling.

Emerging standards from the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) also support server operation at elevated temperature and humidity levels. For example, class A4 servers carry an allowable upper limit of 113 degrees Fahrenheit and 90% relative humidity.
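A monitoring script could flag readings against that envelope. This minimal sketch checks only the upper limits the article cites (113 degrees Fahrenheit, 90% relative humidity); ASHRAE classes also define lower bounds, which are omitted here:

```python
# Check a sensor reading against the ASHRAE class A4 allowable upper
# limits cited above. Upper bounds only; real A4 compliance also has
# lower temperature and humidity bounds not modeled here.
A4_MAX_TEMP_F = 113.0
A4_MAX_RH_PCT = 90.0

def within_a4_upper_limits(temp_f, rel_humidity_pct):
    """True if the reading is at or below the A4 allowable maximums."""
    return temp_f <= A4_MAX_TEMP_F and rel_humidity_pct <= A4_MAX_RH_PCT

print(within_a4_upper_limits(104, 75))   # inside the envelope
print(within_a4_upper_limits(118, 75))   # too hot for class A4
```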

The biggest challenge for most organizations is using the existing cooling infrastructure in the most efficient way and introducing alternative cooling methods where appropriate.

Take advantage of elevated operating temperatures and explore smaller computer room air conditioning (CRAC) designs and other efficient cooling systems; alternative cooling technologies continue to evolve, and the savings can be large.

One avenue to better cooling efficiency is the aggressive use of containment, such as a hot aisle/cold aisle strategy. This approach directs cooling airflow to the server rack spaces rather than to an entire room, so the CRAC system does not need to work as long to achieve the desired operating temperature. When ASHRAE thermal guidelines are followed, containment can also limit IT staff exposure to uncomfortable temperatures and humidity.

Unfortunately, more efficient cooling can actually harm existing mechanical refrigeration systems. As cooling demands fall, CRAC cycle times get shorter, and frequent compressor starts and stops will wear down the large, building-grade cooling system.

Moving cooling closer to the IT equipment is a more efficient cooling alternative for data centers. One option is to move the mechanical refrigeration from the building's roof to the server area, using smaller, high-efficiency, in-row air conditioners.

An increasing number of enterprise data centers are supplementing mechanical refrigeration (or eliminating it entirely) through the use of economizers. Air economizers pass cool outside air across a heat exchanger to remove heat from the data center, while water economizers pump cool lake or river water through a heat exchanger. Data centers in extremely arid climates can achieve good cooling performance with an indirect evaporative cooling system. These approaches eliminate the energy-hungry compressors needed for CRAC systems and instead rely on fans to move cooling air.

When the organization has taken all viable steps to reduce energy usage in the data center, the next step is to outsource workloads either to a highly efficient colocation center or cloud hosting platform.

