Computational fluid dynamics helps cut cooling costs

As racks are packed more densely with servers, more data center personnel turn to computational fluid dynamics to fight escalating energy bills.

As server density grows, so does the need for better cooling. That's where computational fluid dynamics comes in.

The first data centers housed large, highly energy-inefficient mainframes that emitted tremendous heat. Ways to cool the machines and so maintain their integrity were devised. In most cases, water was used -- an unfortunate choice whenever there was a leak, as computers, electricity and water do not mix.

When distributed computer architectures came about, the problem eased. Each server came in its own tower case, and fans could be placed inside to blow air over the critical components -- mainly CPUs and storage devices -- and the hot air could then be vented to the outside. As more of these servers were put in the data center, the need to provide cooler input air grew, leading to the use of computer room air conditioning (CRAC) units and raised floors to pump the air through. Newer form factors, such as the move toward blades and "pizza box" servers, have replaced large axial fans that shift large volumes of air with smaller radial fans that move far less air.

Meanwhile, energy prices have shot up, and what was once a necessary but relatively hidden part of the data center has become more of a focus. Power usage effectiveness (PUE), which compares the total energy used by the data center facility with the energy used by the IT equipment alone, shows that an "average" facility uses more than 1 watt on cooling and other peripheral systems for every watt used for purely IT purposes.
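As a rough illustration of the PUE arithmetic, the short sketch below uses made-up figures, not measurements from any real facility:

```python
# Rough illustration of the PUE calculation with hypothetical figures.
# PUE = total facility energy / IT equipment energy.
it_load_kw = 500.0          # assumed power drawn by servers, storage and network
facility_load_kw = 1050.0   # assumed total draw, including cooling, UPS losses, lighting

pue = facility_load_kw / it_load_kw
overhead_per_it_watt = pue - 1.0  # watts of overhead per watt of IT load

print(f"PUE: {pue:.2f}")
print(f"Overhead per IT watt: {overhead_per_it_watt:.2f} W")
# A PUE above 2.0 means more than 1 W of cooling and other overhead
# for every watt of purely IT load, as described above.
```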

Equipment density and economic drivers

While the energy efficiency of IT equipment is improving rapidly, the massive growth in equipment density is making it more difficult to ensure systems are running economically.

Sure, there are many ways of approaching the overall cooling needs of a data center -- for instance, using hot and cold aisles, running the facility at a higher overall temperature and using free air cooling -- but the problem remains. Higher densities of equipment are leading to hot spots within the equipment that are difficult to cool effectively.

Lurking in the dark has been an approach that engineers have used for many years. Computational fluid dynamics (CFD) is a means of visualizing the heat map of an environment and then playing "What if?" scenarios to try to optimize a system. In areas such as turbine and boiler design, CFD is a proven and useful approach, but it's not often seen in data centers.

To understand why not, we have to look at where CFD and data centers have already touched. The facilities team has a strong interest in ensuring a data center does not burn down. Therefore, using infrared and temperature sensors in a facility could highlight issues before they become major problems. A CFD system can minimize false positives by obtaining a view of what is normal throughout a data center and producing a log of places where hot spots are acceptable.
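A minimal sketch of that idea follows, using hypothetical sensor names, baselines and thresholds rather than any particular CFD or monitoring product's API:

```python
# Minimal sketch: flag only readings that deviate from a learned baseline,
# ignoring locations already logged as acceptable hot spots.
# Sensor names, baseline values and thresholds are hypothetical.

baseline_c = {"rack_a1_top": 27.0, "rack_a1_mid": 24.0, "rack_b3_rear": 35.0}
acceptable_hot_spots = {"rack_b3_rear"}   # known and documented as normal
deviation_limit_c = 5.0                   # alert only on this much drift from baseline

def alerts(readings_c: dict[str, float]) -> list[str]:
    """Return sensor locations that are hotter than baseline by more than the limit."""
    flagged = []
    for location, temp in readings_c.items():
        if location in acceptable_hot_spots:
            continue  # logged as an acceptable hot spot; skip to cut false positives
        if temp - baseline_c.get(location, temp) > deviation_limit_c:
            flagged.append(location)
    return flagged

print(alerts({"rack_a1_top": 33.5, "rack_a1_mid": 24.5, "rack_b3_rear": 41.0}))
# -> ['rack_a1_top']  (rack_b3_rear runs hot by design, so it is not flagged)
```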

Until recently, the IT team didn't see this data. Facilities believed sharing this information would be of little use to IT because the sensors and systems that used the data belonged to facilities systems, such as building information modeling/management (BIM) systems. However, data center infrastructure management (DCIM) systems came along from the likes of Eaton, Nlyte Software, Romonet, Intel, Schneider Electric and others, and these vendors saw how computational fluid dynamics could help visualize and predict the effectiveness of a particular layout of IT equipment.

But, at that time, facilities teams were still the predominant purchasers of DCIM. Now, IT groups know what DCIM can do for them and are seeking out the vendor offerings that make the most sense.

How IT uses CFD

First, IT finds the baseline in an existing data center that shows where cooling is required. Identifying the hot spots means IT can apply targeted cooling rather than using a scattershot approach to keep the average temperature of the data center within certain limits. Many organizations are surprised by how much they can save just by targeting the cooling and removing any focus from components that run well within a recommended thermal envelope without any cooling.
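A simplified sketch of that triage step is shown below; the rack names, temperatures and recommended inlet limit are all assumptions for illustration:

```python
# Simplified triage sketch: split racks into those needing targeted cooling
# and those running comfortably within the recommended envelope.
# Rack names, temperatures and limits are hypothetical.

RECOMMENDED_MAX_C = 27.0   # assumed upper bound of the recommended inlet range
COMFORT_MARGIN_C = 5.0     # racks this far below the limit need no extra cooling focus

inlet_temps_c = {
    "rack_01": 21.0, "rack_02": 31.5, "rack_03": 19.0,
    "rack_04": 29.0, "rack_05": 20.5,
}

hot_spots = [r for r, t in inlet_temps_c.items() if t > RECOMMENDED_MAX_C]
low_priority = [r for r, t in inlet_temps_c.items()
                if t <= RECOMMENDED_MAX_C - COMFORT_MARGIN_C]

print("Target cooling at:", hot_spots)           # -> ['rack_02', 'rack_04']
print("No extra cooling focus:", low_priority)   # -> ['rack_01', 'rack_03', 'rack_05']
```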

Next comes finding the know-how for optimizing the environment. For example, it may be beyond the capabilities of the existing system to effectively cool a certain rack full of spinning disks. However, splitting the disks across two racks and placing low-energy network equipment in the space cleared could enable the existing cooling to deal with the heat load while avoiding additional expense.
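A back-of-the-envelope sketch of that rack-splitting idea follows; all wattages and the per-rack cooling capacity are assumed figures, not vendor specifications:

```python
# Back-of-the-envelope sketch of splitting a hot rack across two racks.
# All wattages and the per-rack cooling capacity are hypothetical.

COOLING_CAPACITY_PER_RACK_W = 6000

disk_shelf_w = 450        # assumed draw per shelf of spinning disks
network_switch_w = 150    # assumed draw of low-energy network gear
shelves = 16

full_rack_load = shelves * disk_shelf_w
print(f"Single rack load: {full_rack_load} W "
      f"(over capacity: {full_rack_load > COOLING_CAPACITY_PER_RACK_W})")

# Split the shelves across two racks and fill the freed space with switches.
split_load = (shelves // 2) * disk_shelf_w + 2 * network_switch_w
print(f"Per-rack load after split: {split_load} W "
      f"(over capacity: {split_load > COOLING_CAPACITY_PER_RACK_W})")
```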

After that, IT will look into introducing new equipment. Whether existing cooling can deal with the addition of 100 new servers here or a new storage system there can be explored in a virtual environment, allowing IT to choose the least expensive, most effective option before the equipment is even on-site.
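The sketch below gives a flavor of that kind of what-if check; the server draw, CRAC capacities and current zone loads are hypothetical, and a real CFD model would account for airflow, not just total wattage:

```python
# What-if sketch: can the existing cooling absorb 100 new servers?
# Server draw, CRAC capacities and current loads are hypothetical figures.

new_servers = 100
watts_per_server = 350                     # assumed average draw per new server
added_heat_w = new_servers * watts_per_server

crac_capacity_w = {"zone_a": 40_000, "zone_b": 40_000}
current_heat_w = {"zone_a": 33_000, "zone_b": 28_000}

spare_w = {z: crac_capacity_w[z] - current_heat_w[z] for z in crac_capacity_w}
print("Spare cooling per zone (W):", spare_w)

# Place the new load in the zone with the most headroom, if it fits.
best_zone = max(spare_w, key=spare_w.get)
if spare_w[best_zone] >= added_heat_w:
    print(f"{new_servers} servers ({added_heat_w} W) fit in {best_zone}")
else:
    print("No single zone has enough spare cooling; a fuller CFD study is needed")
```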

Last -- and probably most important -- is dealing with new architectures. How well is the facility likely to handle a move from build-your-own racks to modular computing platforms, such as Dell vStart, IBM PureFlex or Cisco Unified Computing System? How will cooling need to change if the network is flattened to make the most of a fabric approach? And what if the existing CRACs are replaced with free air cooling or other lower-cost systems? Where will cooling need to be focused to ensure the data center still provides high continuous availability?

CFD is a hidden technology that IT and facilities groups need to bring into more use. To be slightly corny, CFD should be a hotter topic than it currently is.

About the author:
Clive Longbottom is the co-founder and service director at Quocirca and has been an ICT industry analyst for more than 15 years. Trained as a chemical engineer, he worked on anti-cancer drugs, car catalysts and fuel cells before moving into IT. He has worked on many office automation projects, as well as control of substances hazardous to health, document management and knowledge management projects.

This was first published in January 2013
