Will water-cooled servers make another splash in the data center?

The heat is on data center managers to keep their facilities cool as effectively and efficiently as possible. And while water in the data center can be a nightmare, using water to cool server processors may soon be the most practical way to handle the heat from the ever-increasing number of processors in high-performance data center servers. This tip will show you why there's no need to fear the water.

What’s old is new again
Big mainframe computers were water-cooled from 1964 until about 1990, when complementary metal-oxide-semiconductor (CMOS) transistor fabrication technology took hold. That development dramatically lowered the heat produced by chips and allowed a return to air-cooled equipment. Water cooling has also been used on high-performance personal computers for many years, and many laptop systems use liquid cooling because moving enough air through such tight quarters would not be practical.

Although water cooling was more effective, the IT industry switched back to air cooling because it proved cheaper at the lower heat loads of the time. Since then, chip densities have increased dramatically and clock speeds have skyrocketed, both of which drive up the power drawn by, and the heat produced by, each processor.
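To see why faster clocks translate so directly into heat, consider the standard approximation for CMOS switching power, P ≈ C × V² × f. The short sketch below is purely illustrative; the dynamic_power function and the capacitance, voltage and frequency values are assumptions made up for this example, not figures from the article or from any real chip.

```python
# Classic CMOS dynamic-power approximation: P ~ C * V^2 * f
# (switched capacitance x supply voltage squared x clock frequency).
# All values below are illustrative placeholders.

def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Approximate CMOS switching power in watts."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

base = dynamic_power(1e-9, 1.0, 2e9)     # hypothetical part: 2 GHz at 1.0 V
faster = dynamic_power(1e-9, 1.2, 4e9)   # doubled clock plus a modest voltage bump

print(f"baseline: {base:.1f} W, faster: {faster:.1f} W "
      f"({faster / base:.2f}x the heat to remove)")
```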

Air cooling is noisy; additional air cooling requires more and faster fans. It's not uncommon for a simple 1U rack server to contain 10, 12 or more variable-speed fans. When a fan fails in a blade center, the remaining blowers ramp up, sounding like a jet engine because of the enormous amount of air movement required. The noise makes the data center a difficult environment for people to work in for any length of time.

Air cooling can also waste a lot of power. Fan energy follows a cube-law function, so when the fan speed doubles, its energy consumption is 2³, or 8 times, what it was. Energy efficiency alone tells us moving massive amounts of air is not a viable way to cool high heat loads.
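To put numbers on that cube law, here is a minimal sketch. The exponent of 3 is the textbook fan-affinity approximation; the fan_power function and the baseline wattage and fan speeds are assumptions for illustration only.

```python
# Fan affinity law: power draw scales with the cube of fan speed.
# Baseline figures below are illustrative, not taken from any real server.

def fan_power(base_power_w, base_rpm, new_rpm, exponent=3.0):
    """Estimate fan power at a new speed from a known baseline."""
    return base_power_w * (new_rpm / base_rpm) ** exponent

print(fan_power(10, 6000, 12000))   # double the speed: 10 W -> 80 W
print(fan_power(10, 6000, 9000))    # 1.5x the speed:   10 W -> 33.75 W
```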

If the airflow stops, even for a brief time, expensive processors can be destroyed unless protective sensors shut down the afflicted servers quickly. With water cooling, even a little circulating water bridges that time gap.
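As a rough illustration of that bridge, the sketch below estimates how long the water already sitting in a loop can absorb a processor's heat before warming appreciably. The water mass, heat load and allowable temperature rise are made-up assumptions, not figures from the article.

```python
# Illustrative only: how much time does the water in a loop buy if circulation stops?
# The water mass, heat load and allowable rise below are assumed values.

WATER_SPECIFIC_HEAT = 4184.0   # J/(kg*K)

def ride_through_seconds(water_kg, heat_load_w, allowed_rise_k):
    """Seconds until stationary loop water warms by allowed_rise_k."""
    return water_kg * WATER_SPECIFIC_HEAT * allowed_rise_k / heat_load_w

# 0.5 kg of water in a cold-plate loop, a 200 W processor,
# and a 10 K allowable temperature rise:
print(f"{ride_through_seconds(0.5, 200, 10):.0f} seconds of ride-through")
```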

Over the course of computing history, we have often identified system performance limits as "compute bound" or "I/O bound." Processors today are "heat bound." Water is significantly more effective at carrying heat than air: volume for volume, water carries roughly 3,500 times as much heat as air.
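A quick back-of-the-envelope check with standard room-temperature property values (a sketch for illustration, not a calculation taken from the article) shows where a figure of that size comes from:

```python
# Compare how much heat a cubic metre of water vs. air can carry
# per degree of temperature rise. Property values are standard
# room-temperature approximations.

WATER_DENSITY = 997.0          # kg/m^3
WATER_SPECIFIC_HEAT = 4184.0   # J/(kg*K)
AIR_DENSITY = 1.2              # kg/m^3
AIR_SPECIFIC_HEAT = 1005.0     # J/(kg*K)

water_volumetric = WATER_DENSITY * WATER_SPECIFIC_HEAT   # ~4.2 MJ/(m^3*K)
air_volumetric = AIR_DENSITY * AIR_SPECIFIC_HEAT         # ~1.2 kJ/(m^3*K)

print(f"Water carries ~{water_volumetric / air_volumetric:.0f}x more heat "
      "than the same volume of air per degree of temperature rise")
```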

With water-cooled processors, chip and server manufacturers have virtually unlimited opportunities to develop tremendous computing power. Imagine the cost savings, and the growth potential for your data center, from a far more efficient cooling method.

Getting the pieces into place
The big worry is leakage: Water and expensive electronics don’t mix. If manufacturers want us to accept water-cooled servers, the setup needs to be quick and foolproof. If the first generation of widely used hardware with water cooling generates reports of leaks, the industry will suffer.  

The water-supply infrastructure is another area of concern. We’ve been using in-row coolers and rear-door coolers for some time now; many of them are served by water lines. The infrastructure for water-cooled computers is pretty much the same.

One way to avoid leaks is to run direct piping. For example, cross-linked polyethylene (or “PEX”) piping has been successful with water-cooled devices, such as in-row coolers, as well as in home and commercial radiant heating systems. It’s also been used in Europe for many years. But the number of “hoses” required for many direct server connections would create a bigger jumble than cabling has ever been. For larger or more flexible installations, a system of piping and connection points will be necessary.

If you don't have a raised floor, the pipes will need to run overhead, which is even more frightening to most people. But overhead piping, done right, is really no more dangerous than piping under the floor. Pipes should be run in aisles rather than over cabinets and should be equipped with drip pans as a "belt and suspenders" precaution, but the arrangement is not inherently dangerous.

For many, the safest approach is to put water pipes under a raised access floor, even if the floor isn't needed for anything else. If you install leak detection and floor drains and run power and cabling overhead, there isn't much for a leak to hurt. The water in these systems is not at high pressure, and pipes are wrapped with insulation, so nothing is going to explode or spray. As a rule, drain traps should either be self-priming or lead back to a primed source, since they will probably never actually get wet.

Whether piping is under the floor or overhead, recognize that well-installed and tested piping rarely leaks, so the best insurance is to specify thoroughly. Even if you're a public entity required to accept the low bid, a solid spec should weed out contractors you wouldn't want. But even with top-quality materials and the best installation, you may want to take additional precautions. One is to run water-detection tape inside the pipe insulation, especially if your only option is overhead piping, so a problem is identified immediately.

Multiple tap points should be provided in header pipes to avoid the need for “wet taps” in the future. Wet taps are not only risky to do, but are logically more likely to leak than those that were welded in and pressure tested as part of the construction. Taps should be fitted with top quality ball valves and securely capped until needed.

Let's not forget about the hoses. The connections between header pipes and computers will need to be made and broken over time, and you don't want water running out of hoses and onto equipment when that happens. High-quality spill-proof connectors are the first line of defense, and they are mandatory.

Water is no stranger to the data center
In many cases, you already have water in the data center — often to run the chillers or air conditioners — so there’s no reason to resist a move to water cooling so long as you know how to get it done right.

ABOUT THE AUTHOR: Robert McFarlane is a principal in charge of data center design for the international consulting firm Shen Milsom and Wilke LLC. McFarlane has spent more than 35 years in communications consulting, has experience in every segment of the data center industry and was a pioneer in developing the field of building cable design. McFarlane also teaches the data center facilities course in the Marist College Institute for Data Center Professional program, is a data center power and cooling expert, is widely published, speaks at many industry seminars and is a corresponding member of ASHRAE TC9.9 which publishes a wide range of industry guidelines.

This was first published in February 2012
