ADC data center aiming for 1.1 PUE, LEED Platinum

Mark Fontecchio

Using a mix of data center energy efficiency measures such as air-side economizers, rotary uninterruptible power supplies (UPSes) and hot-aisle containment, Advanced Data Centers is building a facility in Sacramento, Calif., that may be one of the most energy-efficient data centers ever.

For more on data center design and energy efficiency:

  • SPEC benchmark measures servers' performance-to-power ratio
  • EPA releases draft Energy Star server specification
  • EPA promises data center efficiency metrics for 2008
  • Eight PUE best practices for your data center

And while it won't be up and running until the end of next year, the ADC facility is also on track to become the first data center to be Leadership in Energy and Environmental Design (LEED)-certified Platinum, the highest rating a commercial building can receive under the program.

Assuming all goes according to plan, the 71,000-square-foot facility will feature a power usage effectiveness (PUE) of 1.1, which is equal to a data center infrastructure efficiency (DCIE) of 91%. According to the Uptime Institute, a typical data center has a PUE of 2.5, which is a DCIE of 40%.

In other words, for every 11 watts of energy that enter ADC's facility, 10 will go toward powering the IT equipment. The single remaining watt will be spent lighting and cooling the facility or lost as heat along the power path from the utility feed to the IT equipment.
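
The arithmetic behind those figures is straightforward, since DCIE is simply the reciprocal of PUE. A quick sketch in Python (the 1,100 kW load below is purely illustrative, not ADC's actual draw):

```python
# Worked arithmetic behind the PUE and DCIE figures quoted above.
def dcie(pue: float) -> float:
    """DCIE is the reciprocal of PUE, expressed as a percentage."""
    return 100.0 / pue

print(f"ADC target:  PUE 1.1 -> DCIE {dcie(1.1):.0f}%")   # ~91%
print(f"Typical DC:  PUE 2.5 -> DCIE {dcie(2.5):.0f}%")   # 40%

# Equivalently: at PUE 1.1, every 1.1 kW drawn from the utility
# delivers 1.0 kW to the IT gear, with the remaining 0.1 kW going
# to lighting, cooling and power-distribution losses.
total_kw = 1100                 # hypothetical utility draw
it_kw = total_kw / 1.1
print(f"Of {total_kw} kW entering the building, {it_kw:.0f} kW reaches IT")
```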

"Part of it is just plain good design," said Bob Seese, ADC's chief data center architect. "That's what it boils down to."

Getting PUE to rock bottom
No single technology or technique is responsible for driving PUE at ADC's data center so low; it's a cumulative effect.

Air-side economizers are a main part of the building's design. On the east side of the facility, ADC built a wall to take in outside air. Inside the wall are filters that strip the air of particulates that could harm IT equipment. Dozens of fans blow the outside air into the facility, and if necessary, dozens of water coils cool the outside air to a suitable temperature.
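
In control terms, an air-side economizer amounts to choosing an air source based on outside conditions. Below is a minimal, hypothetical sketch; the temperature thresholds are illustrative assumptions, not ADC's actual control settings:

```python
# Hypothetical sketch of an air-side economizer controller's decision.
# Thresholds are illustrative only.
def cooling_mode(outside_temp_f: float, supply_setpoint_f: float = 75.0) -> str:
    if outside_temp_f <= supply_setpoint_f:
        # Outside air is cool enough: filter it and blow it straight in.
        return "free cooling (outside air only)"
    elif outside_temp_f <= supply_setpoint_f + 15:
        # Slightly too warm: pre-cool the intake air with the water coils.
        return "economizer plus water coils"
    else:
        # Too hot outside: fall back to mechanical cooling.
        return "mechanical cooling"

for t in (55, 82, 98):
    print(f"{t}F outside -> {cooling_mode(t)}")
```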

ADC President Michael Cohen said the data center will use air-side economizing 75% of the year. The federal Environmental Protection Agency and industry groups such as the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommend air-side and water-side economizing, also known as free cooling, to cut data center cooling costs.

Another energy-saving measure is the design of the heating, ventilation and air-conditioning (HVAC) systems. To reduce the power needed to pump water through the HVAC system, the company simply reduced the number of sharp turns in the piping, Seese explained.
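
The physics here is that every sharp elbow adds a "minor loss" proportional to the velocity head of the water, which the pumps must overcome. The sketch below uses illustrative textbook loss coefficients and flow numbers, not ADC's design data:

```python
# Rough sketch of why fewer sharp turns cut pumping power: each elbow
# adds a pressure drop proportional to velocity head. All numbers are
# illustrative textbook values.
RHO = 1000.0      # water density, kg/m^3
K_SHARP = 1.1     # loss coefficient, sharp 90-degree miter elbow (approx.)
K_SMOOTH = 0.3    # loss coefficient, long-radius 90-degree bend (approx.)

def elbow_pump_watts(n_elbows: int, k: float, flow_m3s: float = 0.05,
                     velocity_ms: float = 2.0, pump_eff: float = 0.7) -> float:
    """Extra pump power (W) needed just to push water through the elbows."""
    dp = n_elbows * k * 0.5 * RHO * velocity_ms ** 2   # pressure drop, Pa
    return dp * flow_m3s / pump_eff

print(f"40 sharp elbows: {elbow_pump_watts(40, K_SHARP):.0f} W")
print(f"40 smooth bends: {elbow_pump_watts(40, K_SMOOTH):.0f} W")
```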

On the power side, ADC's data center will use rotary UPS devices based on flywheels, an alternative to the more common battery-based UPS. Seese said these devices are 97% efficient, meaning only 3% of the power put into them is wasted; battery-based UPS systems, by comparison, can have efficiencies in the 80s, depending on their power load. Over the past couple of years, flywheel UPSes have gained greater traction, often as a supplement to battery-based UPS devices. Data center colocation company Terremark also relies solely on flywheels.
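
A back-of-the-envelope comparison puts those efficiency figures in perspective. The sketch below assumes a hypothetical steady 1 MW IT load and an 85% battery-UPS efficiency (a mid-80s value consistent with the range Seese cites); neither number comes from ADC:

```python
# Annual UPS losses at a hypothetical steady 1 MW IT load.
HOURS_PER_YEAR = 8760
IT_LOAD_KW = 1000.0

def ups_loss_kwh(efficiency: float) -> float:
    """Annual energy (kWh) lost in the UPS for a given efficiency."""
    input_kw = IT_LOAD_KW / efficiency
    return (input_kw - IT_LOAD_KW) * HOURS_PER_YEAR

flywheel = ups_loss_kwh(0.97)   # rotary/flywheel UPS, per Seese
battery = ups_loss_kwh(0.85)    # assumed mid-80s battery UPS efficiency
print(f"Flywheel UPS loss: {flywheel:,.0f} kWh/yr")   # ~271,000 kWh
print(f"Battery UPS loss:  {battery:,.0f} kWh/yr")    # ~1,546,000 kWh
print(f"Savings:           {battery - flywheel:,.0f} kWh/yr")
```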

On the data center floor, ADC will contain the hot aisle at the back of the server racks. Hot-aisle/cold-aisle containment has gone big-time, with industry leaders proclaiming it a great way to improve data center energy efficiency.

In ADC's case, the hot aisle will be contained by attaching a ducted plenum with a chimney at the top to exhaust the hot air. This eliminates the problem of hot and cold air mixing with each other and will allow ADC to pressurize the entire room with cool air.

Because the room will feature a uniform temperature, ADC will be able to raise the set point temperature more safely, a measure that can save significant cooling power and money over the course of a year. Containing the hot air also reduces the power used to move air around the room. In a typical raised-floor environment, the computer room air conditioners (CRACs) push air up through perforated floor tiles all the way to IT equipment at the top of the rack, and they often have to blow hard to keep hot and cold air from mixing.
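
How much a higher set point saves depends heavily on the site, but a commonly cited rule of thumb is on the order of 4% of chiller energy per degree Fahrenheit the supply temperature is raised. The sketch below treats that figure, and the annual cooling energy, as loose assumptions rather than ADC's data:

```python
# Illustrative set-point savings estimate. Both constants below are
# assumptions: ~4%/degree F is a common rule of thumb, and the annual
# cooling energy is a made-up example figure.
SAVINGS_PER_DEG_F = 0.04
COOLING_KWH_PER_YEAR = 2_000_000

def setpoint_savings_kwh(degrees_raised: float) -> float:
    """Rough annual cooling energy saved by raising the set point."""
    return COOLING_KWH_PER_YEAR * SAVINGS_PER_DEG_F * degrees_raised

for delta in (2, 5):
    print(f"Raise set point {delta}F -> save ~{setpoint_savings_kwh(delta):,.0f} kWh/yr")
```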

ADC's data center will also have an air-handler system that can recycle the hot air when it's needed, similar to the approach at Canadian media company Quebecor. In ADC's case, the hot air can be mixed back in with the outside air in the colder months, both to warm the intake air and to add humidity.

"In the colder months, we end up with humidity problems, with humidity dropping to 10% or so," Seese said. "It makes sense to re-mix the air with outside air to bring it more in line. We will still have to do humidification at times; we know that. But this will limit the hours of humidification."

Going for LEED Platinum status
All this attention to energy efficiency has positioned ADC to receive Platinum pre-certification from the U.S. Green Building Council's (USGBC) LEED program.

LEED certification has become a way for data centers to prove their environmental friendliness and, in some cases, become more energy efficient in the process. A good example is Digital Realty Trust's Chicago data center, a 20,000-square-foot portion of which is LEED-certified.

Requirements for LEED certification extend beyond data center equipment, however. In ADC's case, the facility was built on a so-called brownfield site, characterized by the presence of environmental contaminants, some of which ADC had to remove. The site will also recapture rainwater runoff for reuse in landscaping, restrooms and the cooling tower. Other features garnering LEED points include the following:

  • use of shade to reduce the need for cooling in the building;
  • recycling of demolition debris and use of local building materials; and
  • designing the building in such a way that much of the facility gets natural light during the day.

"The LEED thing was not on our original radar when we started this project," Seese said. "But when we totaled up the numbers, it appeared that we were not only in position for Platinum but that we had far more points than required for Platinum certification."

Let us know what you think about the story; email Mark Fontecchio, News Writer. You can also check out our Data Center Facilities Pro blog.

