Bob Sullivan on the future of data center cooling: A Q&A

Dr. Bob Sullivan – the father of hot-aisle/cold-aisle data center design – weighs in on hot- and cold-aisle containment, data center economizers, and variable-frequency drive fans.


Robert Sullivan (aka "Dr. Bob") is known as the father of hot-aisle/cold-aisle design, one of the fundamental best practices in the data center. Since spearheading that movement nearly two decades ago, Sullivan has advocated data center cooling efficiency best practices around the world through his work with the Uptime Institute. In this interview with SearchDataCenter.com, Sullivan discusses hot- and cold-aisle containment, variable-frequency drive fans and other data center cooling technologies. This Q&A is part of a series on five people shaping the data center of the future.

It's been 17 years since you introduced hot-aisle/cold-aisle cooling best practices to data centers. What is your estimate of adoption today?
Robert Sullivan: I'd say over 80%. But the thing that infuriates me is to see computer rooms that have been designed in the last three to five years that still have a legacy layout. Give me a break.

Do you see data center airflow containment as an extension of hot-aisle/cold-aisle design?
R.S.: Hot aisle/cold aisle was the first step of isolating hot air and cold air. And it worked up to 10 kW with all the best practices in place. Sealing the floor, putting in blanking plates, continuous rows -- you could contain 10 kW in every cabinet. Beyond that, you have to start doing something else: bring in supplemental cooling, in-row or overhead, fan-assisted floor tiles. Everybody had their own solution, but ultimately you need to go to isolation.

Once you go to physical separation, then the load you can cool is dependent on how much airflow the cabinet can handle from the cooling unit to the server fans. If it is a chimney cabinet and it is closed in back, there is a finite amount of air that can go up that chimney. If you exceed that, the cabinet inevitably balloons and you get a significant amount of air forced back through the front of the cabinet.

I'm a bigot about hot-aisle isolation. All I have to do is flood the room with enough cold air to satisfy the volume demand of the fans cooling the servers -- I can cool any load. My prediction: if I had a cabinet that could handle the airflow from the cooling fans of 100 kW worth of servers, I could cool 100 kW in a cabinet. I've done 40 kW in a cabinet, and when I attempted to add the next 20 kW, the cabinet ballooned because it couldn't handle the airflow.
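To put rough numbers on that airflow demand, here is a minimal sketch using the standard sensible-heat relation for air, heat (BTU/hr) ≈ 1.08 × CFM × delta-T (°F). The 25-degree server delta-T and the 6,000 CFM chimney limit are illustrative assumptions, not figures from Sullivan.

```python
# Airflow a given IT load needs, from the standard sensible-heat relation:
#   heat (BTU/hr) = 1.08 * CFM * delta_T (deg F)
# The 25 deg F server delta-T and the 6,000 CFM chimney limit are
# illustrative assumptions, not figures quoted by Sullivan.

BTU_PER_HR_PER_KW = 3412.0   # 1 kW of IT load rejects ~3,412 BTU/hr of heat
DELTA_T_F = 25.0             # assumed server inlet-to-outlet temperature rise
CHIMNEY_LIMIT_CFM = 6000.0   # assumed maximum airflow the chimney can pass

def required_cfm(load_kw: float, delta_t_f: float = DELTA_T_F) -> float:
    """Airflow (CFM) the server fans must move to carry away load_kw."""
    return load_kw * BTU_PER_HR_PER_KW / (1.08 * delta_t_f)

for load_kw in (10, 40, 60, 100):
    cfm = required_cfm(load_kw)
    status = "OK" if cfm <= CHIMNEY_LIMIT_CFM else "cabinet balloons"
    print(f"{load_kw:>3} kW -> {cfm:>7,.0f} CFM ({status})")
```

Under those assumptions, 40 kW (about 5,100 CFM) fits within the chimney's capacity, while 60 kW (about 7,600 CFM) overruns it -- consistent with the ballooning Sullivan describes.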

Why do you like hot-aisle containment more than cold-aisle?
R.S.: Because I have one control point. All I have to do is put enough air in the room to satisfy all the cooling fans of the servers. If I isolate the cold aisle, now I have 20 control points, because I have to control the amount of air that goes into every cold aisle and make sure there is sufficient air to satisfy the server fans sucking air out of that cold aisle. I will assure you that, unless I have some very sophisticated instrumentation and a way to modulate the amount of air, I will oversupply the cold aisle, which gives me cold-air bypass, so I become inefficient. I might be able to handle higher loads, but I'm not able to do it with maximum efficiency.

Today, every computer room should run at a 77-degree Fahrenheit inlet temperature. That's another reason for isolation, even if you don't have 20 kW per cabinet. When you start pushing 77-degree air into your room, it's very nice to have hot-aisle isolation.

Is physical isolation a fit for every data center?
R.S.: You need to have a way to get the air back to the cooling system. Whatever that cooling system might be, you need a path, and that path needs to be isolated. If I have an open ceiling, it's going to be hard to do isolation. But you could use extensions. Let's say I have a normal 6-foot-tall air handler in the room. If I put a 4-foot extension on every air handler and hot aisle, now I have raised my level to 10 feet and reduced the chances of getting recirculation over the top (because hot air likes to go up). Essentially, you've isolated the hot air.

Talk to me about variable-frequency drive (VFD) fans. Do they offer big savings?
R.S.: The thing you have to realize is that if you can reduce fan speed to 80%, you use half the power. If you have a VFD system, you should never allow that fan to run higher than 80% of its speed. Personally, I think you're better off adding more cooling capacity to the room and running the fans under 80%.
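Sullivan's figure follows from the fan affinity laws, under which fan power scales roughly with the cube of speed; 0.8 cubed is 51.2%, so running at 80% speed really does cut power nearly in half. A minimal sketch:

```python
# Fan affinity law: fan power scales with the cube of fan speed.
# At 80% speed, power is 0.8 ** 3 = 51.2% of full-load power --
# the "half the power" figure Sullivan cites.

def fan_power_fraction(speed_fraction: float) -> float:
    """Fraction of full-load fan power drawn at a given speed fraction."""
    return speed_fraction ** 3

for pct in (100, 90, 80, 70, 60):
    print(f"{pct:>3}% speed -> {fan_power_fraction(pct / 100):.1%} power")
```

The cube law is also why his advice to add cooling capacity rather than speed up fans pays off: the last 20% of speed costs almost half of a fan's full-load power.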

With many data centers adopting best practices, what's next for more efficient data center cooling?
R.S.: Air-side economizing -- a lot of people have concerns, and I'm one of them. With me, it's the exposure of the fire suppression system, more than just the particulate contaminants. If you keep your computer equipment too long, you build up an excessive layer of dirt on the components inside, and that affects reliability. But I can live with that. I can't live with my fire suppression system being activated and all of the cooling in the room shutting down. Some jurisdictions require that you EPO [emergency power off]. And they all require that you shut down all air movement when you get a fire alarm.

For every megawatt of power, you're circulating 150,000 CFM of air. You're bringing in gaseous and fine particulate contamination, and it's going to set off smoke detectors. I've set off smoke detectors in a computer room [adjusting fans] and blowing dust off the top of cabinets. Fortunately everybody was in the room, and we caught it before anything happened. But if nobody is there and the fire suppression system goes off, it's going to shut the cooling down. It's happened. That's why I am an advocate of the heat wheel. It's air-side economizing without air transfer. That, to me, is the long-term answer.
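The 150,000 CFM per megawatt figure checks out against the same sensible-heat relation if you assume an air temperature rise of roughly 21 degrees Fahrenheit across the IT equipment; the delta-T is my assumption, since Sullivan doesn't state one.

```python
# Sanity check on "150,000 CFM per megawatt", using
#   heat (BTU/hr) = 1.08 * CFM * delta_T (deg F)
# The 21 deg F air temperature rise is an assumed value.
q_btu_hr = 1_000_000 * 3.412             # 1 MW of IT load in BTU/hr
cfm_per_mw = q_btu_hr / (1.08 * 21)      # airflow to carry that heat
print(f"{cfm_per_mw:,.0f} CFM per MW")   # -> 150,441 CFM per MW
```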

Let us know what you think about the story; email Matt Stansberry, Executive Editor.
