If the servers at the tops of your cabinets are running hot, there are three possible causes:
- You aren't getting enough of your available cooling to these cabinets.
- You don't have enough cooling to support the amount of heat being generated in the room.
- Both of the above.
There are only two ways to address the problem:
A) Modify or properly add to your air conditioning and/or air delivery.
B) If you do have enough total cooling, space out your equipment to reduce the number of servers, and consequently the heat load, in each cabinet.
The first step is to get the most out of the air conditioning you already have. Read my article titled "Block those holes!". If that doesn't help, either you don't have enough cooling capacity, or what you have is improperly designed and/or isn't delivering the available air to the equipment. Make sure you're not simply blocking under-floor air with cables, and that your air conditioners have been properly serviced and are working correctly.
Spacing out the servers does not mean putting 1U between boxes. It's unlikely this will help much, except by slightly reducing the total load in each cabinet. Hot air rises, so the intake air will be warmer by the time it reaches the top of the cabinet than it was when it came out of the floor. And some of the hot air from the rear "hot aisles" will inevitably spill over the tops of the cabinets and re-enter the upper servers. As you've correctly observed, it's no wonder that the servers at the top of the cabinets tend to run the hottest. Therefore, as a general rule, load the cabinets from the bottom up, starting about four or five U from the bottom, and block that gap, along with any remaining unused space, with blank panels.
"Spacing" means leaving two or more lightly loaded cabinets between the more heavily loaded ones. In legacy data centers, we usually suggest no more than fifteen to twenty 1U servers in a cabinet as a rule of thumb, but this is totally dependent on the power demand of each server and the air delivery available. I would suggest starting with the problem cabinets only half full and seeing if there's enough improvement. Also make sure that cables are not blocking the rear exhausts of the servers.
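Since the servers-per-cabinet rule of thumb really comes down to power demand, it can help to run the numbers for your own gear. The sketch below is a minimal example of that arithmetic; the 350 W per-server figure and the function name are illustrative assumptions, not measurements from any particular hardware. Essentially every watt a server draws becomes heat, and 1 W is about 3.412 BTU/hr.

```python
# Rough heat-load check for one cabinet. Per-server wattages are
# assumptions for illustration; substitute your measured draws.
BTU_PER_WATT = 3.412  # 1 W of IT load is roughly 3.412 BTU/hr of heat


def cabinet_heat_load(server_watts):
    """Return the cabinet's total heat output in watts and BTU/hr."""
    watts = sum(server_watts)
    return watts, watts * BTU_PER_WATT


# Twenty hypothetical 1U servers drawing 350 W each:
watts, btu_hr = cabinet_heat_load([350] * 20)
print(f"{watts / 1000:.1f} kW, {btu_hr:,.0f} BTU/hr")  # 7.0 kW, 23,884 BTU/hr
```

Comparing that per-cabinet figure against what your air delivery can actually remove at each cabinet face is what tells you whether half-filling the problem cabinets will be enough.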
If this doesn't help enough, and you've taken all the steps to get the most out of the air conditioning you have, you may need more cooling capacity. There are a number of new solutions on the market, and more coming, but this is an expensive and potentially disruptive undertaking. Read my articles "Let's add an air conditioner" and "Cabinets, bloody cabinets!". In general, I would probably not recommend simply adding another CRAC unit. Rarely can it be placed where it will really solve the problem, and it often creates new ones. Look at localized overhead cooling, in-row cooling, or even self-cooled cabinets. These all require infrastructure of some kind, but if properly selected, designed and used, any of these will likely be more cost-effective and flexible than rolling in another big air conditioner and enduring the disruption of welding and soldering in your data center.
I would be remiss if I didn't also note that there are computerized analysis tools that, in the hands of an expert, can pinpoint problems and help in developing solutions before a lot of money is invested.