In an effort to avoid expensive and time-consuming data center buildouts, some data center managers are pursuing hot- or cold-aisle containment to reduce power consumption in their facilities.
With containment, cold air used to cool IT equipment is kept physically separate from the hot air exhaust. This reduces mixing and lightens demands on power-hungry computer room air conditioning (CRAC) units, which translates into reduced energy consumption and frees up capacity for additional IT load.
The need to reduce power consumption is very real. More than a third (36%) of data center owners and operators recently surveyed by the Uptime Institute said they will run out of power, cooling or space in the next 12 months. Assuming a minimum of 18 months for a traditional brick-and-mortar data center buildout, it’s easy to understand why reclaiming power is important.
More new data centers are being outfitted for containment from the get-go, but it’s also possible to retrofit older data centers with containment products like doors, curtains and top-of-rack hoods, said Vince Renaud, vice president and managing principal at Uptime.
“Anything you can do to minimize bypass air is a great solution,” he said.
Containment in action
At the Uptime Institute Symposium last week, Agriculture and Agri-Food Canada (AAFC) described how it used containment to eke extra power out of an overloaded 20-year-old Winnipeg, Manitoba, data center.
“This was a phoenix rising from the ashes,” said Eric Swanson, AAFC data center manager. The 1990-era data center was badly congested, experiencing 60% growth in IT load with only a six-inch raised floor and no redundancy whatsoever in its cooling gear.
In an initial pass, AAFC’s data center team was able to improve its efficiency from a power usage effectiveness (PUE) of 2.15 to 1.97 by implementing easy fixes like grommets, blanking panels, return pipes to CRAC unit plenums -- even using duct tape to seal the racks. “I had to use every trick,” Swanson said. “People thought I had lost my marbles.”
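PUE is simply total facility power divided by IT equipment power, so even a modest drop reclaims real capacity. A back-of-the-envelope sketch of what AAFC's improvement means, using a hypothetical 100 kW IT load (not AAFC's actual figure):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_load_kw

# Hypothetical 100 kW IT load, for illustration only.
it_kw = 100.0
before_total_kw = it_kw * 2.15  # facility draw at PUE 2.15
after_total_kw = it_kw * 1.97   # facility draw at PUE 1.97

reclaimed_kw = before_total_kw - after_total_kw
print(f"Overhead power reclaimed: {reclaimed_kw:.0f} kW per 100 kW of IT load")
```

In other words, for every 100 kW of IT gear, the quick fixes shaved roughly 18 kW off the cooling and distribution overhead.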
But AAFC found a longer-term solution to its energy woes by implementing active containment hoods from Opengate Data Systems over the hottest racks, Swanson said. The firm now has five hoods over nine racks of equipment, and has been able to add a four-kilowatt blade enclosure to the environment while maintaining cooler supply temperatures all the way down the row and reducing hotspots, he said.
On a much larger scale, Verizon implemented cold-aisle containment across 12 colocation data centers totaling about one million square feet. Working with containment systems from Polargy, the firm achieved a 7.7% improvement in overall energy efficiency and 18.8 million kilowatt hours (kWh) in annual savings.
Verizon settled on containment after evaluating a host of energy-saving mechanisms, said Mark Capurso, Verizon real estate operations manager. “We get hit by a lot of vendors with miracle cures for energy consumption,” he said. Based on product evaluations at a beta site, “we figured out that containing aisles and blanking panels were the quickest and most effective thing we could do.”
Despite containment’s effectiveness, the technique is still used sparingly in older data centers, said Uptime’s Renaud. Data center operators with energy problems tend to go for the “low-hanging fruit” like blanking panels, blocking bypass air and tile placement. “Then, if you’re still having problems, that’s when you look at containment.”