At colo data center, air mixing happens up high

Fortune Data Centers' new facility uses hot-aisle/cold-aisle containment and evaporative cooling towers to boost efficiency.

Using computer modeling to test the efficiency of a new data center in San Jose, John Sheputis found something interesting in the mixing of hot and cold air.

Sheputis, CEO of Fortune Data Centers, saw that most mixing of hot and cold air -- a no-no in data center design -- happened at the top of racks, rather than around the sides of rows or through empty slots in racks. These discoveries came through the use of computational fluid dynamics (CFD) software, which many data centers use to visualize airflow throughout their facilities.

So when the company began building its 8 MW data center -- with 40,000 square feet of IT floor space -- it decided to contain the hot and cold aisles at the top. Fortune Data Centers did so by dropping vinyl and Lexan curtains down about 5 feet from the ceiling to the top of racks.

"It's extremely practical and inexpensive," he said. "We're just trying to prevent uncontrolled airflow."

The company also pumps in cold air from above through a plenum, so it looks like an upside-down data center. Ducts poke through the plenum in the aisle behind the servers, sucking up the hot air and returning it to air conditioners. The company says this takes advantage of the natural properties of air -- hot air rises -- and doesn't require as much fan horsepower to push the air through a raised floor and to the front of the IT racks. It also doesn't have to worry as much about whether the floor can handle the weight of the equipment.

Reducing PUE
Sheputis claims the data center will have a power usage effectiveness (PUE) number of 1.37. PUE is the ratio of total facility power to the power delivered to IT equipment, so lower is better; according to the Uptime Institute, the average PUE is about 2.5, which makes 1.37 low. Fortune also uses water-side and air-side economizing, and Sheputis estimated that the evaporative cooling towers probably cut the amount of mechanical cooling needed in half.
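
As a rough back-of-the-envelope sketch -- not Fortune's own math, and assuming for illustration that the full 8 MW is IT load -- here is what each PUE figure implies for total facility power:

# Back-of-the-envelope sketch of what a PUE figure implies.
# Assumption (not stated in the article): the full 8 MW is treated as IT load.
def facility_power_mw(it_load_mw, pue):
    # PUE = total facility power / IT equipment power
    return it_load_mw * pue

it_load = 8.0  # MW of IT load, per the load-bank testing
for pue in (1.37, 2.5):
    total = facility_power_mw(it_load, pue)
    overhead = total - it_load
    print(f"PUE {pue}: {total:.1f} MW total, {overhead:.1f} MW of cooling/power overhead")
# PUE 1.37: 11.0 MW total, 3.0 MW of cooling/power overhead
# PUE 2.5: 20.0 MW total, 12.0 MW of cooling/power overhead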

But the company arrived at that low PUE using 8 MW worth of load banks rather than real-time data from operating IT equipment. Load banks are devices that simulate IT load and are often used in data centers to test power capacity and cooling infrastructure.

Still, Sheputis says the PUE could go even lower. When the company ran the load-bank tests, it didn't drop the Lexan all the way to the floor, so the hot-aisle/cold-aisle containment wasn't complete and, he said, there was probably some air mixing.

That said, Fortune Data Centers is a colocation company looking for clients to rent space in the facility. Those customers could presumably leave gaps between racks, under racks, or even within racks through open slots, all of which would lead to air mixing. Sheputis said there are ways to deal with that.

"If you were my tenant and you didn't use blanking panels, we could hit you with energy surcharges," he said. "There are certainly ways to enforce what we want to see in terms of the energy policy of the building. But most of the value (of the hot-cold aisle containment) is from the top of the rack."

Let us know what you think about the story; email Mark Fontecchio, News Writer.
