Hosting giant renovates cooling systems, plans to save $1 million

After overhauling its data center cooling systems, hosting giant the Planet expects to save $1 million on its electric bill. Its VP of facilities explains how the company optimized its air distribution.

In February 2007, when we last talked to Jeff Lowenberg, vice president of facilities at Houston-based hosting giant the Planet, the company was saving $10,000 a month on its electric bill by isolating hot and cold air more efficiently in one of its data centers.

Now Lowenberg has taken hot-aisle/cold-aisle principles to the extreme, applying new technologies and best practices in all six of the Planet's data centers. Just by optimizing the way air is distributed, the Planet expects to rack up $1 million in savings in 2008.

Last year, Lowenberg tackled the problem of getting cold air to flow where it's needed most by sending out his crew with caulking guns to seal every crack in the floor, seal the joints where the concrete wall and sheetrock meet, and cover every cable cutout and hole with a grommet or seal.

"You've got big pipes coming into the data center for electrical, Freon and chilled water, and often they don't get sealed very well," Lowenberg explained.

Since then, Lowenberg has taken cooling efficiency to the next level by rearranging floor tiles to better manage cold airflow, installing blanking plates in server cabinets and sealing power distribution units to reduce bypass airflow.

Data center cooling is where most infrastructure energy efficiency is lost. The fundamental rule of energy-efficient cooling is to keep hot air and cold air separate.

Beyond blanking panels: Extending the height of the return-air plenum
The Planet also extends the height of its computer room air conditioning (CRAC) units' return-air plenums to optimize cooling.

Figure 1: The Planet's data center. Extending the CRAC units' return-air plenums has paid off in cooling efficiency and on the electric bill.

"Extending the plenums closer to the ceiling helps in two ways: One, hot air naturally rises, so the CRAC units are sucking air in from the hottest part of the data center," Lowenberg explains. "Two, by extending the plenums higher, it ensures that the CRAC units are not sucking in any cold air from the cold aisles, as it allows for the hottest air to be sucked into the units. In this scenario, the top of the plenums must be at least 2 feet from the ceiling."

Essentially, the plenum is extended by bolting a sheet metal box to the top of a CRAC unit. Lowenberg says it is an option you can buy from Liebert Corp., but it was a lot less expensive to have a contractor come in and do it. The retrofit also makes the data center a bit quieter by moving the mechanical noise higher in the room.
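
As a rough illustration of the sizing rule, the math reduces to a quick calculation, sketched below in Python. The article gives no actual room or unit dimensions, so all heights here are hypothetical; the only constraint taken from Lowenberg is the 2-foot gap between the top of the plenum and the ceiling.

    # Rough sizing sketch for a CRAC return-air plenum extension.
    # All dimensions here are hypothetical; the only rule taken from
    # the article is that the top of the plenum must sit at least
    # 2 feet below the ceiling.

    CEILING_CLEARANCE_FT = 2.0  # minimum gap required above the plenum

    def max_plenum_extension(ceiling_height_ft, crac_height_ft):
        """Return the tallest extension that preserves the 2 ft gap."""
        extension = ceiling_height_ft - crac_height_ft - CEILING_CLEARANCE_FT
        if extension <= 0:
            raise ValueError("Not enough headroom to extend this plenum.")
        return extension

    # Example: a 14 ft slab-to-ceiling height and a 6 ft CRAC unit
    # leave room for a sheet metal extension up to 6 ft tall.
    print(max_plenum_extension(14.0, 6.0))  # -> 6.0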

For the Planet's next data center, Lowenberg plans to extend the plenums through the ceiling, using the area above the ceiling tiles as a return air path.

"There will be ceiling tiles that are 90% open -- almost like grates -- positioned above the hot aisles," he said. "The plenums that extend above the ceiling will then pull the hot air that is rising via natural convection through the open ceiling tiles and back down into the CRAC units. For this method to be successful, the open area between the ceiling tiles and the roof must be completely sealed to prevent cold air from seeping in."

Raise the set point to ASHRAE specs
Lowenberg also applied another tactic: raising equipment intake temperatures to the standard of the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), which recommends that air intakes for data center equipment stay below 77 degrees Fahrenheit and 45% relative humidity. "Most data centers are much cooler than that. We bought Upsite Technologies' temperature strips, scattered them throughout the data center on the grills of the racks and played with the set points to bring temperatures up to 77 degrees."

The Upsite temperature strips were SearchDataCenter.com's 2007 data center cooling product of the year. The thermometer stickers use color-coded indicators to show whether temperatures fall within ASHRAE's optimal range, within the acceptable range, or outside the acceptable operating range. ASHRAE is considering expanding the recommended temperature and humidity ranges for IT equipment, which would let data center managers raise set points even higher.
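
For a sense of the check those strips make visible, here is a minimal sketch in Python. Only the 77-degree and 45%-humidity limits come from the article; the rack names and readings are made up, and the real strips use a finer color-coded scale than this two-way test.

    # Minimal sketch of the check the temperature strips make visible.
    # Only the 77 F and 45% relative humidity limits come from the
    # article; rack names and readings below are hypothetical.

    ASHRAE_MAX_TEMP_F = 77.0
    ASHRAE_MAX_RH_PCT = 45.0

    def intake_status(temp_f, rh_pct):
        """Flag a rack-intake reading against the ASHRAE limits."""
        if temp_f <= ASHRAE_MAX_TEMP_F and rh_pct <= ASHRAE_MAX_RH_PCT:
            return "within ASHRAE guidance"
        return "outside ASHRAE guidance"

    # Hypothetical rack-intake readings: (temperature F, relative humidity %)
    readings = {"rack-A01": (72.5, 40.0), "rack-B07": (79.1, 48.0)}
    for rack, (temp_f, rh_pct) in readings.items():
        print(rack, intake_status(temp_f, rh_pct))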

Going green for profit, not PR
Recently the Planet piloted these methods over a six-month period in one of its Houston data centers. During that time, the server load (measured at the uninterruptible power supply, or UPS) increased 5%, while the power to cool the data center decreased 31%.
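
Taken together, those two figures imply an even bigger drop in cooling power per unit of server load. A quick back-of-the-envelope calculation, using only the percentages quoted above:

    # Back-of-the-envelope check on the pilot figures quoted above:
    # IT load at the UPS rose 5% while cooling power fell 31%, so
    # cooling power per unit of IT load dropped by roughly a third.

    it_load_factor = 1.05        # server load: +5%
    cooling_power_factor = 0.69  # cooling power: -31%

    per_load = cooling_power_factor / it_load_factor
    print("Cooling power per unit of IT load: "
          "%.2fx (%.0f%% lower)" % (per_load, (1 - per_load) * 100))
    # -> 0.66x, i.e. about 34% less cooling power per unit of load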

"I'm not sure how much better we can do," Lowenberg said. He attributes a lot of his success to the Uptime Institute Inc. and Site Uptime Network and to working with the Institute's Dr. Bob Sullivan.

This spring the Uptime Institute will launch a green data center awards program at its 2008 Symposium, and I asked Lowenberg whether the Planet would be a contender. He said that Uptime approached the company, but the Planet didn't have time to get the paperwork together.

"We didn't do this to get the press; we did it to save money. Our big thing is cutting costs. Electricity is our third or fourth highest operating expense," Lowenberg said. "I think some companies will find the PR value [in going green]. Buying carbon credits, planting trees, it's all fine and dandy -- but those companies aren't doing anything different from before -- they're throwing money at the problem."

Matt Stansberry is SearchDataCenter.com's senior site editor. Write to him about your data center concerns.

Also, check out our news blog at serverspecs.blogs.techtarget.com.
