Facebook IT 'likes' facilities, shares PUE status

Date: Jul 25, 2014

Turns out, a low-PUE, environmentally conscious data center can exist in a humid subtropical climate.

Western North Carolina presents a generally inhospitable environment for IT equipment -- hot summers and muggy weather year-round. Nevertheless, when Facebook built its data center campus in Forest City, North Carolina, the company chose outside air cooling for its data halls.

Open Compute Project hardware withstands fairly high heat and humidity, but that's just one facet of the intricate power, cooling and management dance that keeps the data center performing to webscale IT standards.

Facebook developed a data center infrastructure management (DCIM) platform in-house to monitor the complex relationship between load and cooling. The platform interfaces with a heavily customized Siemens AG building management system.
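As a rough illustration of the kind of load-versus-cooling correlation such a platform tracks, the hypothetical Python sketch below pairs IT load readings with cooling power readings and flags any sample where cooling energy climbs out of proportion to load. The metric names, data values and threshold are assumptions made for illustration, not details of Facebook's DCIM or the Siemens system.

```python
# Hypothetical sketch: watch the relationship between IT load and cooling power.
# All names, numbers and thresholds here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Reading:
    it_load_kw: float    # power drawn by servers, storage and network gear
    cooling_kw: float    # power drawn by fans, pumps and dehumidification

def cooling_ratio(r: Reading) -> float:
    """Cooling power per kW of IT load; lower is better."""
    return r.cooling_kw / r.it_load_kw

def flag_anomalies(readings, max_ratio=0.10):
    """Yield readings where cooling energy is out of proportion to IT load."""
    for r in readings:
        if cooling_ratio(r) > max_ratio:
            yield r

if __name__ == "__main__":
    samples = [Reading(10_000, 600), Reading(9_500, 1_400), Reading(11_000, 700)]
    for bad in flag_anomalies(samples):
        print(f"check cooling: {bad.cooling_kw} kW cooling on {bad.it_load_kw} kW IT load")
```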

Power usage effectiveness (PUE) was 1.08 on the day SearchDataCenter toured the campus this month, close to the site's average. Facebook's custom power and load dashboard, which anyone can track on its website, keeps everyone at the facility mindful of its two goals: stable compute infrastructure and no wasted energy.
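For reference, PUE is total facility power divided by the power delivered to IT equipment, so a reading of 1.08 means the overhead for cooling, lighting and power distribution amounts to just 8% of the IT load. A minimal sketch of that arithmetic, using made-up meter values rather than Facebook's numbers:

```python
# Minimal PUE arithmetic with made-up meter readings -- not Facebook's data.
total_facility_kw = 10_800   # everything entering the building
it_equipment_kw = 10_000     # servers, storage and network gear only

pue = total_facility_kw / it_equipment_kw
overhead_kw = total_facility_kw - it_equipment_kw

print(f"PUE = {pue:.2f}")   # 1.08
print(f"Overhead = {overhead_kw} kW, or {overhead_kw / it_equipment_kw:.0%} of the IT load")
```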

"You can't get to that level of efficiency without constant communication between IT and facilities," said Keven McCammon, data center manager for the Forest City site.

The location has about 80 full-time Facebook employees, plus a host of contract workers who support the data center. One such contractor is Kyle Jessup, a fiber integration manager who has been at the facility since Facebook began deploying servers in 2011.

"Everything moves at a fast pace; you're never stuck in a rut," Jessup said. "Everyone's focused on the bigger picture, not on controlling their jobs."

For example, to provision new cabinets, Jessup's team works with facilities to move the equipment into place and check light levels through the fiber. IT's networking team has all the fiber locations mapped and ready for installation, with servers loaded onto the rack ahead of time.

"Things go smoothly," Jessup said.

There's also room to grow professionally, he added.

"With a few certifications, I could go into IT administration," Jessup said. "Some electrical training and I could move to facilities. Or I could keep learning and moving up with fiber and networking."

Under the hood at the Facebook data center

There is no mechanical cooling in the Forest City data halls, which operate at about 83 degrees Fahrenheit in the cold aisle. Pressurized airflow management and the buildings' overall design take advantage of the local climate -- the area sits in a thermal belt -- to move air with as little energy as possible across the 350,000-square-foot main buildings and the 90,000-square-foot Cold Storage Facility. The outside air economizer includes a hot-air recycling system, which puts the very dry hot-aisle air to work, according to McCammon. The buildings run dehumidifier systems only when necessary.
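The decision an outside-air economizer with hot-air recirculation makes on any given day can be sketched in a few lines. The hypothetical Python below is a simplified illustration of that kind of control loop; the setpoints, humidity bound and mode names are assumptions, not Facebook's or Siemens' actual control sequence.

```python
# Hypothetical sketch of an economizer decision loop. The setpoints, bounds
# and mode names are illustrative assumptions, not an actual control sequence.

COLD_AISLE_TARGET_F = 83.0   # approximate cold-aisle operating point
MAX_SUPPLY_RH = 80.0         # assumed upper bound on supply-air relative humidity

def choose_mode(outside_temp_f: float, outside_rh_pct: float) -> str:
    if outside_temp_f < COLD_AISLE_TARGET_F - 20:
        # Cool outside air: blend in dry hot-aisle exhaust to warm the supply
        # air, which also pulls its relative humidity down.
        return "recirculate-hot-aisle-air"
    if outside_rh_pct > MAX_SUPPLY_RH:
        # Muggy day: run dehumidification only when it is actually needed.
        return "outside-air-plus-dehumidify"
    # Otherwise push filtered outside air straight through the data hall.
    return "outside-air-economizer"

print(choose_mode(55.0, 60.0))   # recirculate-hot-aisle-air
print(choose_mode(85.0, 90.0))   # outside-air-plus-dehumidify
```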

To maintain this cooling regime, Facebook removes all extraneous pieces from the servers that might impede airflow. Rather than relying on raised floors, Facebook floods cold air down along the racks from the ceiling. Blanking panels ensure that only server exhaust air enters the hot aisle.

The company also finessed its airflow strategy: Send air down too forcefully, and it will bounce off the concrete floor, creating an air silo. Facebook designed and built deflection plates so that air moves down the aisle to the servers.

Even the facility lighting is on motion sensors to reduce energy consumption. (McCammon said you can outrun the motion sensors on the scooters that Facebook has for employees to dash around the buildings.)

For data center power, Facebook uses a high-voltage distribution system, stepping down from 480 V three-phase power. That approach is less lossy than a traditional power chain, according to McCammon, contributing to a lower PUE. Facebook worked with several vendors to develop an auto-switching power supply for its servers.
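The case for higher-voltage distribution comes down to compounding: every conversion stage in the power chain wastes a little energy, so removing a stage or running each one more efficiently shows up directly in PUE. The comparison below uses hypothetical, round-number stage efficiencies purely to show the compounding effect; they are not Facebook's measured figures.

```python
# Illustrative comparison of cumulative power-chain efficiency.
# Stage efficiencies are hypothetical round numbers, not measured values.
from functools import reduce

def chain_efficiency(stages):
    """Multiply per-stage efficiencies into the end-to-end delivered fraction."""
    return reduce(lambda a, b: a * b, stages, 1.0)

chains = {
    "traditional": [0.97, 0.94, 0.97, 0.92],  # transformer, UPS, PDU step-down, server PSU
    "streamlined": [0.97, 0.96, 0.95],        # fewer conversions, higher-voltage feed to the rack
}

for name, stages in chains.items():
    eff = chain_efficiency(stages)
    print(f"{name}: {eff:.1%} delivered to IT, {1 - eff:.1%} lost along the way")
```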

In addition, the data halls do not have batteries in a separate room as a typical data center would. Facebook installs the batteries in the cabinets right beside the server racks.

Meredith Courtemanche is the site editor for SearchDataCenter.com. She edits tips and other content for the site, writes news stories and creates editorial guides.
