
Block those holes!

This column originally appeared on TechTarget's Expert Answer Center as a post in Robert McFarlane's blog. Robert served as the on-demand expert on the Expert Answer Center for two weeks in October to November 2005, during which he was available to quickly answer questions on data center design as well as to write daily blog entries. Keep an eye on the Expert Answer Center for topics that could help your IT shop.

Where does all that air go? One thing's for sure -- in most data centers, much of it never makes it to the equipment it's supposed to cool. Lots of cold air leaks out of a multitude of openings in the floor tiles, doing virtually nothing. And a lot more disappears right in front of the cabinets after it gets out of the floor. Air conditioning is expensive, and that's a lot of wasted energy and a pile of wasted money, to say nothing of the shorter life you get from equipment that overheats.

It wasn't so critical a few years ago. Energy was cheaper and heat loads weren't as high. But with fuel costs going through the roof and heaters being shipped to data centers disguised as computers, we now have to make things a lot more efficient. The fundamentals are actually easier than you might think. In fact, basic remedies are downright simple, and pretty darn cheap compared with installing more refrigeration.

In most data centers, 25% or more of the cold air is probably being lost. There are two major places to look: your raised floor and your equipment cabinets. Let's start with the raised floor.
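To see what that 25% figure means in practice, here is a back-of-the-envelope sketch. The CRAC output and leakage numbers are illustrative assumptions, not measurements from the column:

```python
# Back-of-the-envelope estimate of cooling airflow lost to floor leakage.
# The CRAC output figure is an assumed example; the 25% loss is the
# article's rough estimate for a typical data center.

def delivered_airflow(crac_cfm: float, leakage_fraction: float) -> float:
    """Airflow (CFM) that actually reaches equipment intakes after leakage."""
    return crac_cfm * (1.0 - leakage_fraction)

crac_cfm = 12_000   # assumed total CRAC output, cubic feet per minute
leakage = 0.25      # the article's "25% or more" loss estimate

reaches_equipment = delivered_airflow(crac_cfm, leakage)
wasted = crac_cfm - reaches_equipment
print(f"Delivered: {reaches_equipment:.0f} CFM, wasted: {wasted:.0f} CFM")
# With these assumptions, 3,000 CFM of cold air does no useful cooling --
# capacity you paid to refrigerate and to push through the floor.
```

Every hole you seal moves some of that wasted airflow back into the "delivered" column at essentially no operating cost.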

The biggest holes are usually the ones the cable comes through (although we've seen entire floor tiles removed, which is just complete foolishness). It used to be standard practice to just cut a 6- or 8-inch square hole, or even larger, no matter how many or how few wires needed to go through it. At one time, when mainframes used those huge "bus and tag" cables, large openings were needed to pass the oversized connectors. And since those holes were usually under equipment that was cooled from below anyway, it really didn't matter. Not so today. RJ-45s, and even the largest power plugs, will go through a much smaller hole. But an amazing amount of air will still leak through that opening, around the spaces that aren't filled with wires. Those holes have got to be sealed. There are two ways: make some kind of seal yourself -- out of Masonite and duct tape or some such contrivance -- or use a commercial product made for the job that makes it easy to add or remove cables in the future. Two such products are the KoldLok Brush Grommet and the Sub-Zero Pillow. Take your choice. The Pillow will seal most holes more completely, is less expensive, easier to install and adapts to a wide variety of opening sizes. The Brush Grommet comes in only a few sizes, stops most of the air but not all, and can be a little pricey, but it's a lot neater, and no one can remove it and forget to put it back.

Next, look for all those places where pipes, conduits or anything else penetrates the floor. Unlike cables, which are subject to change, these things aren't going anywhere. Seal them with Fire Stop Putty or any good caulking that won't dry up and shrink. If the openings are too big, the fire stop manufacturers make backer products to go behind the putty. Just don't use fiberglass, mineral wool or any other product that can flake off and get into the air going to your equipment.

Now look all around the room where tiles have been cut to fit against walls, air conditioners or anything else. A good quality, closed-cell weather stripping will usually seal all these openings. Lastly, look for tiles that don't seat tightly. Some air will leak through the seams between the floor tiles. That's inevitable unless the installation has been made with special products and techniques that fully seal these joints, which is highly unlikely in a data center. But the amount of leakage in a normal, well-installed floor is tolerable IF you have sealed all the other holes. If the floor is older, it may be necessary to have a raised floor contractor come in to re-level the tiles and get them as well aligned and seated as possible. After equipment is in place, however, only so much improvement is possible. Tiles trapped under equipment racks can't be moved or re-aligned, so they will determine how well adjacent tiles can be aligned. But every little bit helps.

Now let's get to the easiest, most overlooked and usually most effective way to improve cooling in the whole data center: unused panel spaces in cabinets. We must assume that your layout conforms to the accepted "hot aisle/cold aisle" approach, with cabinets oriented "front-to-front" and "back-to-back." If not, there aren't many things you can do to help except to re-orient your cabinets and change your whole layout, which is obviously not easy. But if your installation is "hot/cold aisle," you just MUST close those unused panel spaces.

If you don't, the air you manage to push through your perforated tiles gets up to the first unused panel space and just flows right through the cabinet to the back. It's called "bypass air," and it does two really bad things. First, it starves all the equipment above the opening of cold air. There's always a temperature gradient from bottom to top that makes the upper equipment run hotter than that closer to the base of the cabinet, but if most of the cold air has escaped through the cabinet before it even gets to the top, that upper hardware is going to run much hotter and will have a much shorter life. Second, the cold air bypassing through the cabinet mixes with the hot air that must return to the air conditioners, cooling it down. That's the air that tells CRACs how much new cold air to put out. If the return air is already cooled down somewhat, it fools the air conditioners into thinking everything is fine, so they stop working so hard. The result? Less cooling to the hardware, higher temperatures, shorter life and some strange cycling of the air conditioners that can also upset the humidity control.
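The "fooled CRAC" effect is just mass-weighted mixing of two air streams. A minimal sketch, assuming equal air density and specific heat, and with all flow rates and temperatures chosen purely for illustration:

```python
# Why bypass air "fools" the CRAC: the return-air sensor sees a blend of
# hot equipment exhaust and cold air that bypassed through open panel
# spaces. Mass-weighted mixing, assuming equal specific heats; all flow
# and temperature figures here are illustrative assumptions.

def return_air_temp(hot_cfm: float, hot_f: float,
                    bypass_cfm: float, cold_f: float) -> float:
    """Temperature of the combined return-air stream, in degrees F."""
    total = hot_cfm + bypass_cfm
    return (hot_cfm * hot_f + bypass_cfm * cold_f) / total

# No bypass: the CRAC sees the true 95 F exhaust and works hard.
print(return_air_temp(10_000, 95.0, 0, 55.0))     # 95.0

# 25% of the supply air bypasses straight through open panel spaces:
print(return_air_temp(7_500, 95.0, 2_500, 55.0))  # 85.0
```

In the second case the CRAC "sees" 85 F instead of 95 F, concludes the load is lighter than it really is, and backs off -- even though the equipment intakes are actually running hotter than before.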

And there's another factor. (Who said this was easy?) Not only can cold air bypass from front to back, but hot air can bypass from back to front. Since warm air rises naturally, this just worsens a bad situation by delivering even warmer air to the upper computers. In short, you're engaging in "computer euthanasia" simply by leaving these openings. Is it any wonder that the servers toward the tops of the cabinets statistically have a higher failure and error rate than those at the bottom? Load cabinets from bottom to top, and then close all the remaining spaces with blank panels. If you make a lot of changes, or you can't get people to pick up a screwdriver to replace the panels, several manufacturers now make "snap-in" panels -- IBM and SMC among them, if you can ever locate the panels on their Web sites. There are probably others, and we know of several cabinet manufacturers who are planning to come out with them. Snap-ins are a little more expensive, but there's simply no excuse for not putting them back in when a change is made.

Even though Robert's stint on the Expert Answer Center is over, he is always ready to answer your questions. Ask him your most pressing data center design question.
