The problem you're hearing about results from heat density, not from total heat. Blades may reduce total power consumption by as much as 20% over discrete servers, assuming equivalent quantities and processing capability, but the remaining 80% or more is packed into a much smaller space. For example, instead of 2,000 Watts per cabinet spread over, say, ten cabinets, you might now have 16,000 Watts in just one cabinet. Watts per Square Foot (W/sf) has become a rather meaningless number but, for comparison's sake, this amounts to nearly 2,000 W/sf in a cabinet 28" Wide x 42" Deep, as opposed to about 333 W/sf for your present servers in smaller cabinets. (Don't try to mount high-density devices in small cabinets; the cables will block whatever air you have.)
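The heat-density comparison above is simple footprint arithmetic, and it's worth checking for your own cabinets. Here's a minimal sketch; the 28" x 42" blade cabinet comes from the text, while the 24" x 36" footprint for the conventional cabinet is an assumption chosen to match the ~333 W/sf figure:

```python
def watts_per_sq_ft(watts, width_in, depth_in):
    """Heat density over a cabinet's floor footprint."""
    footprint_sq_ft = (width_in * depth_in) / 144.0  # 144 sq in per sq ft
    return watts / footprint_sq_ft

# Blade cabinet from the example: 16,000 W in a 28" x 42" footprint
blade = watts_per_sq_ft(16_000, 28, 42)    # ~1,959 W/sf -- "nearly 2,000"

# Conventional cabinet: 2,000 W in an assumed 24" x 36" footprint
legacy = watts_per_sq_ft(2_000, 24, 36)    # ~333 W/sf

print(f"Blade cabinet:  {blade:,.0f} W/sf")
print(f"Legacy cabinet: {legacy:,.0f} W/sf")
```

Run it with your actual cabinet dimensions and loads before assuming your cooling can keep up.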
It would take nearly 3,000 CFM (Cubic Feet per Minute) of 55-degree Fahrenheit air to handle this kind of heat density, assuming your air conditioners are set for 72-degree Fahrenheit Return Air. That is impossible to accomplish with under-floor air alone. Even if you could do it, you'd have air moving at over 17 MPH through two standard perforated tiles in front of a cabinet, and you still wouldn't get the air evenly distributed all the way to the top of the cabinet. Using "Grate Tiles" brings it down to about 8 MPH, but that's still far too high, and you'd probably starve other parts of your data center for air in the process. Even increasing aisle widths doesn't do it. There are really only three ways to solve the problem:
- Put only one Blade Chassis in a cabinet and spread them out so the load is distributed across a larger part of the data center. This is what most people are doing these days. It doesn't help much with space, but it keeps these expensive servers from going into thermal shutdown.
- Install "Spot Cooling", such as is available from Liebert (the "XD" System), or install a High Density Room within the data center, such as the one from APC ("InfraStruXure High Density with Integrated Cooling").
- Install Liquid-cooled Cabinets such as the Knurr "Cyberchill", the Rittal "Liquid Cooling", or the Sanmina "EcoBay". (Yes, sooner or later, you WILL have water piping back in your data center!!).
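The airflow and tile-velocity numbers cited above can be sanity-checked with the standard sensible-heat relationship (BTU/hr ≈ 1.08 x CFM x ΔT for air at typical conditions). This is a rough sketch, not an engineering tool; the 25% and 56% open-area fractions are typical values for perforated and grate tiles, not manufacturer specs:

```python
BTU_PER_WATT_HR = 3.412       # 1 watt = 3.412 BTU/hr
SENSIBLE_FACTOR = 1.08        # BTU/hr = 1.08 * CFM * delta_T for standard air

def required_cfm(watts, supply_f, return_f):
    """Airflow needed to carry away a sensible heat load in watts."""
    btu_hr = watts * BTU_PER_WATT_HR
    return btu_hr / (SENSIBLE_FACTOR * (return_f - supply_f))

def tile_velocity_mph(cfm, tiles, tile_sq_ft=4.0, open_fraction=0.25):
    """Face velocity through raised-floor tiles (2' x 2' = 4 sq ft each)."""
    open_area_sq_ft = tiles * tile_sq_ft * open_fraction
    ft_per_min = cfm / open_area_sq_ft
    return ft_per_min * 60 / 5280  # ft/min -> miles per hour

cfm = required_cfm(16_000, supply_f=55, return_f=72)         # ~2,974 CFM
perf = tile_velocity_mph(cfm, tiles=2, open_fraction=0.25)   # ~17 MPH
grate = tile_velocity_mph(cfm, tiles=2, open_fraction=0.56)  # ~7.6 MPH

print(f"Required airflow: {cfm:,.0f} CFM")
print(f"Through 2 perforated tiles: {perf:.1f} MPH")
print(f"Through 2 grate tiles:      {grate:.1f} MPH")
```

Both velocities land right where the text says: "nearly 3,000 CFM," roughly 17 MPH through standard tiles, and about 8 MPH through grates.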
Whatever you do, make sure to close all unused rack spaces and openings with blanking panels to keep hot air from the rear of the cabinets from re-circulating to the front and reducing the effectiveness of the cold air entering the servers. This is known as "Bypass Air" and is one of the worst enemies of effective cooling. And, of course, make sure those cables don't block the air paths.
Cooling these devices can get tricky, but this is the way technology is moving, so don't shy away from it. Learn to handle it, even if that takes expert assistance.