Cool aid or Band-Aid?

IBM's new liquid-based server cooling technology is a first step toward bringing water back into the data center. Some experts say it's inevitable; others call it a Band-Aid.

IBM has hailed Cool Blue, its new liquid-based data center cooling technology, as its most effective offering yet for server farmers battling heat. But by putting a water-based cooling product on the market, Big Blue has ignited a debate over what place liquid has in the data center.

Known officially as the eServer Rear Door Heat Exchanger, Cool Blue is a door that hinges to the back of a rack and connects to a hose that runs from under the floor up through the door. Sealed tubes filled with chilled water remove up to 55% of the heat generated in a fully populated rack, absorbing it into the water and carrying it away so it is not released into the data center.

Tom Condon, a senior consultant for Chicago-based Systems Development Integration, points out that one of the oldest axioms in building a data center is to "keep water away." But he said the proliferation of blade servers, from IBM and other vendors alike, has created major heat problems: blades let data center managers pack multiple servers into rack space formerly reserved for a single box, making liquid cooling an idea vendors can no longer afford to dismiss.

"Any time you bring liquid into the data center, there is always a chance for danger. We tell clients as a general rule to keep water away," Condon said. "But if you're confronted with cooling problems you can't solve, it's inevitable you're going to have liquid cooling in the data center."

Data center managers seem ready to go back to water.

According to Jim Wilson, advisory systems engineer in the information systems department of San Mateo County, Calif., water cooling worked before, so why not now?

"Personally, I don't have a problem using water to cool equipment since the old 3090 mainframe was water cooled and we never had any problems with it. With the way our data center is set up today [air cooled machines and good air conditioning], we don't have any cooling problems. For installations with cooling problems, it could be an option," Wilson said.

Tony Scordino, manager of network services for Westchester Community College in Valhalla, N.Y., agrees that it pays to be open to the idea.

"Mainframes at one time were cooled with water. My choice to water-cool a server would really depend on how the vendor would implement such a system. Issues of how possible rusting or leakage would be addressed are key in my decision making. Also, what infrastructure changes or maintenance procedures would be required, could sway my decision. The bottom line is that I would be open to the idea," Scordino said.

Still, Milpitas, Calif.-based Rackable Systems, a server and storage provider that sells air-cooled equipment, said water isn't the answer. Rackable, whose customer list includes Yahoo, video game giant Electronic Arts, Toshiba and Sony, said Cool Blue is bound to leave data center managers with almost as many headaches as it cures because it brings two risky elements into the environment: water and plumbing.

According to Josh Goldenhar, senior director of product marketing and strategic planning for Rackable, data centers shouldn't need to use water to cool things down because they shouldn't be so hot in the first place. Blade sprawl, he said, has forced Big Blue's hand.

Goldenhar also said Cool Blue raises additional concerns because its architecture requires data centers to install plumbing for the hose that runs from the floor up through the Cool Blue door. That makes racks immobile, Goldenhar said, and adds to a data center's expenses because of the cost of retrofitting HVAC equipment to fit Cool Blue into an existing footprint. And, Goldenhar said, if you have to shut down your chiller to fix a problem with a Cool Blue door, you could affect multiple racks.

"IBM is coming up with this because of the path they've taken," Goldenhar said. "We do view it as a stop-gap measure by IBM. They were required to do it in the way they built the servers and put them into the rack … it's effective, but it's suboptimal."

According to Susan Davis, vice president of marketing and product management for Egenera Inc., a Marlboro, Mass.-based blade server manufacturer, the main flaw in the current blade trend is that vendors never designed in a way for customers to deal with the heat generated by multiple blades running in a single rack.

"The basic problem is that even newer systems were never designed with a way to solve this problem, so [IBM] had to come up with a solution as an afterthought … [and] it is never as good as if had you thought of it from the beginning," Davis said.

IBM strongly disputes claims that Cool Blue comes with its own set of headaches.

"IBM is a leader in the server market, and we have decades of experience with cooling technology," said IBM spokesman Tim Willaford. "This solution is not a requirement. It's an option to help customers deal with hot spots in their data center."

The consensus seems to be that Cool Blue will help box-heavy customers deal with those hot spots effectively, but it's not for everybody.

"I think in a new data center [Cool Blue will work] because they'll design for this kind of infrastructure," said Dan Busby, product design manager for Egenera. "It will be a rough road for pre-existing data centers. But in the end it allows them to use up the whole space."

Let us know what you think about the story; e-mail: Luke Meredith, News Writer
