When your data center runs out of space, power or cooling -- or all three -- you have some difficult decisions to make. Those deliberations become more challenging if your business is likely to move within the next several years, or if there are discussions about eventually transferring some computing to the cloud or to a hosting site. These decisions are important, and not ones you want to rush. The choices an organization makes, after all, could be costly -- in both capital outlay and operational effectiveness.
But while organizations ponder their long-term data center strategies, an IT team still bears short-term burdens: It needs to keep that data center running, and must reliably support users.
So, which steps are realistic? Which will provide real benefit for minimal investment? And, just as importantly, which ones will be least disruptive? In short, how can you navigate the short term in the most economic, effective and efficient way possible?
When your facility nears its limits, sound data center strategies involve determining what you can clean up, what you can fix up and, lastly, what you can phase up.
Improve efficiency by cleaning, dumping old equipment
If a data center is already running at the limits of its capacity, the IT staff will likely have to identify comatose servers -- machines that are powered on but performing no useful work -- and shut them down and remove them.
The next step is to challenge the importance of anything that shows low utilization, such as 10%. See if it can be virtualized -- or done away with. If it's supporting a single application that one user thinks is nice to have, it may be time to have a serious conversation. A data center at its limit isn't in a position to accommodate equipment that's not pulling its weight. Plus, those kinds of changes provide the added benefit of reducing power consumption.
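Triage like this is easy to script against an inventory export. The sketch below is a minimal illustration, assuming a hypothetical list of server names with average CPU utilization figures; the names and the 10% threshold are placeholders, not data from any real environment.

```python
# Hypothetical inventory: (server name, average CPU utilization %).
# All names and figures here are illustrative assumptions.
inventory = [
    ("app-01", 62.0),
    ("legacy-db", 4.5),
    ("reports", 9.1),
    ("web-02", 38.0),
]

THRESHOLD = 10.0  # percent; servers below this merit a hard look

# Flag low-utilization machines as virtualization or retirement candidates.
candidates = [name for name, util in inventory if util < THRESHOLD]
print(candidates)
```

Running this against a real capacity-management export would surface the machines worth that "serious conversation" before any capital spending is considered.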
While you're looking at hardware, clean the equipment air filters and make sure cables aren't blocking exhaust airflow. These simple steps cost nothing and can improve cooling effectiveness and extend equipment life. If you have cardboard boxes and other unnecessary items stored in your data center, get rid of them. They create more dirt that will clog filters and degrade cooling.
If you supply air from a raised access floor, remove unused cabling from the underfloor plenum. It's not easy, but it requires no capital budget authorization, and you may be surprised by the difference it makes in cooling performance.
It might be worthwhile to employ a professional data center cleaning service. It's amazing what a clean facility can do for equipment operation and power usage.
Optimize current equipment with minor fixes
When you're running close to your limits, nothing is more important than good preventive maintenance. Schedule it even if that means a special service call beyond the terms of your service contract; confirming that everything runs at maximum performance may justify the extra charge.
Other data center strategies to extend a facility's life span include changing air conditioner filters, checking all belts and bearings, and verifying that everything is clean. Your vendor should have a thorough maintenance checklist, just like a mechanic has for a car. Make sure it's followed completely. And if you're running so close to the edge that you can't afford to shut down an air conditioner for service, rent portable coolers to get you through.
Uninterruptible power supply (UPS) batteries are probably the most failure-prone item in the data center, and they'll fail just when they are needed most. Valve regulated lead acid (VRLA) sealed cells are only good for a few years anyway, so if they're more than 3 to 5 years old, it's a good idea to replace them.
Before calling for preventive maintenance on your UPS, and certainly before replacing a UPS or adding capacity, check the phase balance. You may have more power available than you think.
Large UPS systems -- generally 20 kW and above -- are three-phase. This means there are three hot wires, but nearly all cabinets and equipment connect to only one or two of those phase wires. In the United States, 208-volt circuits draw power from any two of the three phase wires. A 120-volt circuit in the U.S., as well as a 230-volt circuit in Europe, draws power from any one of the phase wires, plus a neutral wire. As a result, it's easy to load one or two of the phases to near capacity, leaving little load connected to the remaining phase or phases.
The front-panel display can tell you the load on each phase, but the general display will show only the percentage load based on the worst-case phase. Therefore, if phases are way out of balance, your display could show 98% utilization -- even though 20% to 30% of your capacity remains available and unused.
Rebalancing phases as closely as possible -- the goal is within 5% -- can unlock significant extra power from an existing UPS, solving potential overload concerns at little to no cost.
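The arithmetic behind that hidden headroom is simple enough to check by hand. The sketch below uses made-up numbers -- an assumed 10 kW rating per phase and illustrative per-phase loads, not readings from any specific UPS -- to show how a worst-case-phase display can report near-full load while a substantial share of total capacity sits idle.

```python
# Illustrative numbers only; not from any specific UPS model.
capacity_per_phase_kw = 10.0                     # assumed per-phase rating
phase_loads_kw = {"A": 9.8, "B": 9.5, "C": 4.0}  # badly unbalanced load

# The front-panel summary typically reports the worst-case phase.
worst = max(phase_loads_kw.values())
display_pct = 100 * worst / capacity_per_phase_kw

# Total headroom across all three phases tells a different story.
total_capacity = 3 * capacity_per_phase_kw
total_load = sum(phase_loads_kw.values())
unused_pct = 100 * (total_capacity - total_load) / total_capacity

print(f"Display shows {display_pct:.0f}% loaded")
print(f"But {unused_pct:.0f}% of total capacity is unused")
```

With these assumed figures, the display reads 98% loaded even though roughly a fifth of the system's total capacity is untouched on phase C -- exactly the situation rebalancing addresses.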
Install blanking panels in any unused rack and cabinet spaces to stop the waste of expensive cooling air. Snap-in panels can make a huge difference in cooling effectiveness. Likewise, expandable panels are available to close gaps between cabinets, and products are now available to seal the space between the bottom of a cabinet and the floor.
If all else fails, add equipment
Other data center strategies to squeeze more life out of a facility include major equipment additions -- but these should be a last resort.
If you need more UPS capacity, consider the use of smaller, in-rack UPS units. These will be helpful, but only if a minimal amount of additional UPS capacity is needed. Even though this is meant to be a short-term solution, use commercial-grade units. Check the batteries in these small units every few months, and heed their alarms.
If you need more cooling capacity, in-row coolers (IRCs) may be a better option than large computer room air conditioners (CRACs) or air handlers, particularly if the existing cooling is via under-floor air delivery. Adding CRACs may force more air under the floor than the plenum space can accommodate, and may exacerbate pressure variations due to under-floor obstacles. Further, the air streams can interfere with each other, actually reducing cooling in some areas of the floor instead of improving it.
Get your data center to chill
IRCs are placed between cabinets and deliver cool air directly in front of the cabinets where the highest heat loads exist. Further improvements can be made by relocating equipment with high heat outputs to cabinets configured for higher density. Another option is rear-door heat exchangers, which neutralize heat before it leaves the cabinets.
These options require chilled water or refrigerant piping out to the floor, which is a significant, potentially disruptive installation. The advantage of these approaches is they can be sized and located to address the specific need.
If cooling remains a problem, consider containment. Cold-aisle containment is generally the better choice for existing facility retrofits, although it can be difficult to control the air balance. Hot-aisle containment avoids the balancing problem, but it requires a return air path back to the air conditioners. That's inherent with an IRC design, but could be difficult if you don't already have a return air plenum ceiling back to your CRACs.
Plastic curtains are easier to implement in an existing space than solid air barrier doors and panels, and they allow air leakage, which can solve the air balance challenge of cold-aisle containment. However, the plastic may not comply with flame-spread and smoke-emission requirements.
Solid containment, using end-of-row doors and above-cabinet panels, is harder to retrofit into an existing space, but it provides more complete containment than curtains. The tighter the enclosure, however, the more challenging the air balance becomes in cold-aisle installations, so the design usually needs to allow some leakage to avoid problems.
Containment also raises important fire-protection considerations for your data center strategies. If sprinkler or gas discharge heads are not located in each aisle, containment may isolate an aisle from the fire suppressant, which violates fire codes. U.S. fire-prevention standards require that containment barriers drop automatically upon smoke detection.
When faced with the need to extend the life of an existing data center for a few years, the first steps likely will be those you should have been taking all along, but weren't forced to until now.
Large budget approvals to fix an end-of-life facility shouldn't be necessary. When they are, the solutions need to be modular in nature, providing only what is necessary, at the lowest cost. Don't add major equipment unless absolutely necessary.