The trip from nine data centers to a single greenfield facility for one of the country's legacy airlines involved a little midflight turbulence but resulted in a safe landing.
United Airlines' data center consolidation project is one of the most energy efficient in the U.S. and even makes use of a Kyoto cooling wheel.
Combining nine data centers -- some were colocated, one was in a high-rise in Houston and some were barely rated Uptime Tier II -- into one Uptime Tier IV data center just outside Chicago was a complex undertaking, said Tom Songaila, director of IT critical facilities and data center engineering at United Airlines.
United has a second data center in a 28-year-old bunker 23 feet underground, and at some point in the future, the airline will add yet another backup facility.
Construction of a greenfield data center such as the one United built has become less common over the past five years with the growth of cloud computing and colocation providers, according to Jason dePreaux, a data center analyst with IHS, in Austin, Texas.
However, the consolidation of multiple data centers remains common, whether that results in a greenfield data center, colocation or a move to the cloud, dePreaux said.
Inside United's green greenfield data center project
Part of the reason United had so many data centers was its merger with Continental Airlines; Continental's data centers were included in the consolidation.
The data center project was commissioned in 2013 and went live in 2014. The 166,000-square-foot facility sits on 16 acres, with enough spare room to double in size. The building is rated to withstand an EF4 tornado and seismic activity stronger than any the Chicago area has ever experienced. The data center's available power is 4 MW, expandable to 6 MW.
Energy efficiency was a major project goal, and a big part of that was the Kyoto cooling wheel. Songaila estimates the 20-foot KyotoCooling wheel saves $1 million annually in operating costs and eliminates 19,544 metric tons of carbon dioxide output each year by forgoing chiller-based computer room air conditioner (CRAC) units.
The facility averages a power usage effectiveness (PUE) of 1.09. Over the next 10 years, compared with a less efficient operation, the United Airlines data center is expected to save 420 million kWh of electricity, 115 million gallons of water, $35 million in operational costs and 250,000 tons of carbon dioxide.
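For readers unfamiliar with the metric, PUE is simply total facility energy divided by the energy consumed by the IT equipment alone, so a value near 1.0 means almost no overhead for cooling and power distribution. The sketch below illustrates the calculation; the load numbers are purely illustrative, not United's actual figures.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt entering the building reaches
    the IT gear; real facilities are always above 1.0.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative example: a facility drawing 1,090 kW in total to run a
# 1,000 kW IT load -- 90 kW of cooling/distribution overhead -- has a
# PUE of 1.09, matching the average United reports.
print(round(pue(1090, 1000), 2))  # -> 1.09
```

At a typical PUE closer to 1.7, that same 1,000 kW IT load would pull roughly 1,700 kW from the grid, which is where multiyear savings on the order United cites come from.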
The use of free cooling is a trend in greenfield data center construction, reducing energy consumption compared with CRAC units and other power-intensive cooling options.
"That one thing [free cooling] alone is a huge contributor to savings," dePreaux said.
The building received LEED Silver certification for its energy and environmental design from the U.S. Green Building Council.
"We hit that without any problem," Songaila said.
Many of the greenfield data centers built by enterprises today are constructed because organizations want to control environmental impact, said Sophia Vargas, a data center infrastructure and operations analyst with Forrester Research, a Cambridge, Mass.-based IT research firm. For example, Goldman Sachs built its own data centers to meet a 2020 deadline to have zero emissions.
"The ones that are pushing the boundaries to be energy efficient are spending a little more," she said.
Most greenfield data center projects still use OEM cooling systems from major makers such as Schneider Electric, Emerson Network Power and Siemens, but the Kyoto wheel is a great way to use free cooling, Vargas said.
"That might shake things up a bit," she said.
A Liebert DS CRAC unit from Emerson cools the electrical areas of United's data center not served by the Kyoto wheel. A raised floor runs throughout the data center, chosen to make cabling easier, Songaila said.
United also deployed Emerson Network Power's Trellis data center infrastructure management (DCIM) tool, which took more work to get up and running than Songaila had anticipated.
"We do expect to see the fruits of our labor," he added.
A DCIM system is almost always included in greenfield data center projects, according to Vargas.
United's greenfield data center also has 2N power redundancy, with feeds from two different substations. In addition, onsite diesel generators can run the facility for at least 48 hours. The uninterruptible power supply system also has ground-fault protection.
Greenfield project meant all hands on deck
The new greenfield construction was about more than just the physical building, Songaila said. It also involved breaking down the departmental silos and changing the data center's management structure.
That change resulted in the creation of an Integrated Critical Environment team, or the "ICE" team. "That breaks down the silos of who does what," he said.
The project also resulted in the creation of the Critical Infrastructure Services team to manage the facility and all United's reservation centers as well as airport IT infrastructure assets around the globe.
About 80% to 85% of data center outages are traditionally caused by human error, and the new teams are designed to minimize those mistakes. Everyone on the ICE team can stop work and escalate a problem if needed, Songaila said.
For example, all of the IT pros know how to run the generator and do a building check; conversely, all of the facilities workers can rack and stack IT equipment.
Every member of the ICE team also carries an iPad. On those iPads, among other things, are the maintenance and restore procedures for more than 125 tasks in the data center.
The integration of United's data center and IT teams won the recognition of this year's Brill Awards for Efficient IT from the Uptime Institute. The awards highlight projects that "improve the industry's ability to sustainably deliver IT services to end users while minimizing cost and other resources." United Airlines was one of two winners of the Global Leadership Award; the other was the Boeing Company.
About the author:
Robert Gates covers data centers, data center strategies, server technologies, converged and hyperconverged infrastructure and open source operating systems for SearchDataCenter. Follow him @RBGatesTT.