United Parcel Service's Tier 4 data center goes green

The facilities team at United Parcel Service's data center in Alpharetta, Ga., has gotten green data center operations down to a science. Learn how they did it in this two-part case study. Here in part one, we focus on how UPS identified wasted cooling and, in doing so, shut off unneeded computer room air handlers (CRAHs).

Editor's note: This is a two-part case study. Part one focuses on how UPS identified wasted cooling and, in doing so, shut off unneeded CRAHs. Part two covers UPS's water-side economizer use and other power-saving cooling strategies.
Let's face it: A lot of green data center case studies are pretty worthless. Vendors and customers pat one another on the back for buying green products and offer vague promises to save energy in data centers over a period of time.

But the facilities department at United Parcel Service of America Inc.'s Alpharetta, Ga., site is about to save you a lot of money on your data center air-conditioning bill today. Joe Parrino, data center manager at UPS' Windward data center, also explains his organization's load-shedding process and proves that using outside air to cool a data center can work, even in the hot temperatures of the southeastern U.S.

Brown goes green in the data center
UPS' Windward data center bucks the conventional wisdom. Old data center facilities are supposed to be inefficient, and outdated mechanical systems are primarily to blame. Even worse, considering the amount of redundancy designed into the facility to prevent downtime, an Uptime Institute Tier 4-rated data center would have to be a real energy hog.

The Specs: UPS Windward Data Center

United Parcel Service of America Inc. has two data centers, one in Alpharetta, Ga., and the other in Mahwah, N.J.

Together, the two facilities house 15 mainframes, 2.9 petabytes of storage and nearly 3,000 servers.

These data centers handle the logistics for the eighth-largest airline in the world, plus nearly 100,000 ground vehicles.

The Windward facility was completed in 1995 and has 50,400 square feet of raised floor.

Windward achieved the Uptime Institute's Tier 4 rating for availability: It has multiple active power and cooling distribution paths, concurrent maintainability and System + System, N+1 redundant components.

But somehow the 13-year-old, Tier 4 facility in Alpharetta scores a power usage effectiveness (PUE) as low as 1.9 or, in the Uptime Institute's parlance, SI-EER. This ratio divides the power going into the facility at the utility meter by the power delivered to the IT load, measured either at the power distribution unit (PDU) or at the uninterruptible power supply.

In the case of the Windward data center, PUE was measured at the output of the uninterruptible power supply; measuring the output of the PDU was too difficult. For a more detailed discussion of the differences in measuring at the power distribution unit versus at the uninterruptible power supply, listen to the podcast "Where to measure IT vs. infrastructure power use: PDU or UPS?" with Pitt Turner.

According to the Uptime Institute, the average ratio is 2.5. This means that for every 2.5 watts going "in" at the utility meter, only 1 watt is delivered out to the IT load. In this regard, United Parcel Service's Windward data center is way ahead of the curve. But how did the company do it?
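To make that arithmetic concrete, here is a minimal sketch of the PUE calculation; the kilowatt figures are illustrative stand-ins for the ratios quoted above, not UPS's actual meter readings.

```python
# Minimal sketch of the PUE calculation described above.
# The kilowatt figures are illustrative, not UPS's actual meter readings.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: utility-meter power divided by IT-load power."""
    return total_facility_kw / it_load_kw

# Industry average per the Uptime Institute: 2.5 kW in for every 1 kW of IT load.
print(pue(total_facility_kw=2.5, it_load_kw=1.0))  # 2.5
# Windward's measured ratio: roughly 1.9 kW in per 1 kW of IT load.
print(pue(total_facility_kw=1.9, it_load_kw=1.0))  # 1.9
```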

Cutting out the air handling units
Forced-air cooling is one of the least efficient systems in data center infrastructure, and wasting cold air is the most common mistake in data center management. You can set up hot aisle/cold aisle, install blanking panels and seal gaps in the floor, but you're probably still wasting cold air in a place you wouldn't expect: the perforated tops of power distribution units.

Parrino's staff learned this by chance. The team noticed the perforated roof of a PDU as it sat in a hallway awaiting installation. They took airflow measurements on several installed units with a velometer and calculated the cubic feet per minute (CFM) of loss (i.e., the velocity of the air multiplied by the square footage of the opening). United Parcel Service determined that each PDU lost about 2,000 CFM.
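For readers who want to reproduce that arithmetic, a short sketch follows. The velocity and opening size below are hypothetical, since the article reports only the roughly 2,000 CFM result per PDU.

```python
# Sketch of the CFM-loss arithmetic: airflow (CFM) = velocity (ft/min) x open area (sq ft).
# The velocity and area values are hypothetical examples, not UPS's measurements.

def cfm_loss(velocity_fpm: float, opening_sqft: float) -> float:
    """Volumetric airflow through an opening, in cubic feet per minute."""
    return velocity_fpm * opening_sqft

# Example: 500 ft/min measured across a 4 sq ft perforated PDU top.
print(cfm_loss(velocity_fpm=500.0, opening_sqft=4.0))  # 2000.0 CFM
```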

Warning:
Given the expectation of "24 x Forever" reliability, the Windward data center staff carefully evaluated this improvement before implementing it. In addition to load bank testing, they worked with an engineer from the PDU manufacturer to ensure that any increase in transformer temperature would stay well below the specified limits. Temperature measurements were taken for several days after the PDUs were covered, so that a trend could be developed and the maximum increase in transformer temperatures would be understood.
"What heats up inside a PDU that would require 2,000 CFM of cooling?" Parrino wondered. The only component possibility was transformers, which have a high temperature tolerance. So Parrino conducted an experiment. He ran a PDU with a solid Lexan cover at full load (i.e., 180 kW) using a load bank for one hour in an outside location on an 85 degree Fahrenheit day. Measurements of the transformer temperatures were taken with an infrared camera. The transformer temperatures increased 20 degrees from the nominal 115 degrees on the conditioned raised-floor space to about 135 degrees in a non-air-conditioned location. This was well within the manufacturer's stated 300-plus degrees Fahrenheit operating range. "We didn't even come close to the shutdown temperature," Parrino said.

The next step was to seal the tops of the PDUs with Lexan covers. Parrino hired a contractor to install covers on all the units. Each cover has a three-inch opening that ensures the transformers still get airflow but blocks 90% of the undesirable bypass airflow. After the Lexan covers were installed, the average transformer temperature rose only 1 to 2 degrees Fahrenheit.

"After we installed the covers, we looked at the under-floor static pressure and we were amazed at what we got back," Parrino said. The data center had 62 PDUs that were wasting 124,000 CFM of cold air. With the covers installed, Parrino estimated that he could shut off six computer room air handlers [CRAH] based on measured airflow of 19,000 CFM per CRAH unit. In reality, he shut off 10.

Covering the PDUs cost about $6,000, and United Parcel Service estimated the payback period at about 4.3 months. In fact, the project paid for itself in only a month and a half.
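Working backward from those figures implies estimated savings of roughly $1,400 a month versus actual savings of about $4,000 a month. A short sketch; note that the monthly savings are back-calculated from the article's cost and payback numbers, not reported directly.

```python
# Sketch of the payback arithmetic. Monthly savings are back-calculated
# from the article's cost and payback figures, not reported directly by UPS.

project_cost = 6_000.0            # dollars, Lexan covers for 62 PDUs
estimated_payback_months = 4.3
actual_payback_months = 1.5

implied_estimated_savings = project_cost / estimated_payback_months  # ~$1,395/month
implied_actual_savings = project_cost / actual_payback_months        # $4,000/month
print(round(implied_estimated_savings), round(implied_actual_savings))
```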

Parrino said he plans to implement variable frequency drives on some of Windward's CRAH units, and his team is experimenting with variable air volume floor grates controlled by intake temperatures of the racks. "This will slow the consumption of CRAH fan energy even further by delivering the CFM that's needed for each rack instead of delivering based on the worst-case IT load," Parrino said.

Go to part two of this case study.

ABOUT THE AUTHOR: Matt Stansberry is SearchDataCenter.com's senior site editor. Write to him about your data center concerns at mstansberry@techtarget.com.

This was first published in January 2008
