From recycled toilet water to chilly desert night air, the data center world has been abuzz with news of Facebook's and Google's latest environmentally conscious cooling endeavors.
The Massachusetts Green High Performance Computing Center (MGHPCC), under construction in Holyoke, Mass., got creative with location, power distribution and cooling techniques not only to minimize its carbon footprint but also to save money — approximately $19,000 a year based on current energy costs — once the facility is operational at the end of 2012.
Executive director John Goodhue said building the MGHPCC on an existing industrial site, a former fabric manufacturing plant, keeps the project from "contributing to urban sprawl. That's an aspect that a lot of people don't think about, but it's a huge impact." Goodhue said he hopes the project will spark further business growth in town.
The facility consists of a 90,300-square-foot data center with 10 megawatts of available power, 78% of which will come from local renewable energy. The MGHPCC will draw most of its energy as hydroelectric power from the nearby Connecticut River, take solar power from a newly built installation, and get the remainder from Holyoke Gas and Electric's traditional power grid.
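The power-mix figures above can be turned into a quick back-of-envelope calculation. A minimal sketch: the 10 MW capacity and 78% renewable share come from the article, but the split between hydro and solar is not given, so this only derives the renewable versus grid portions.

```python
# Back-of-envelope split of the MGHPCC's available power, using the
# figures reported in the article (10 MW total, 78% renewable).
TOTAL_CAPACITY_MW = 10.0
RENEWABLE_SHARE = 0.78  # hydro + solar combined; individual split not reported

renewable_mw = TOTAL_CAPACITY_MW * RENEWABLE_SHARE
grid_mw = TOTAL_CAPACITY_MW - renewable_mw  # from Holyoke Gas and Electric's grid

print(f"Renewable: {renewable_mw:.1f} MW, grid: {grid_mw:.1f} MW")
```

In other words, only about 2.2 of the 10 available megawatts would ever come from the traditional grid.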
Designing the power distribution system was a bit of a challenge, said Goodhue. To keep both costs and the carbon footprint low, the team is using 400/320 V transformers placed underneath the computer room floor, keeping wire runs short and delivering power as efficiently as possible.
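The efficiency logic behind short runs and higher distribution voltage follows from basic electrical physics: resistive loss in a feeder is I²R, and for a fixed load, current falls as voltage rises. The sketch below is illustrative only; the load, voltages and wire resistance are hypothetical numbers, not MGHPCC specifications.

```python
# Illustrative I^2 * R feeder-loss comparison (all numbers hypothetical).
def feeder_loss_watts(load_w: float, volts: float, resistance_ohms: float) -> float:
    """Resistive loss in a feeder carrying load_w at the given voltage."""
    current = load_w / volts                # I = P / V
    return current ** 2 * resistance_ohms  # P_loss = I^2 * R

# Same 50 kW load over the same 0.05-ohm run, at two distribution voltages:
loss_400v = feeder_loss_watts(50_000, 400, 0.05)
loss_208v = feeder_loss_watts(50_000, 208, 0.05)
print(f"400 V loss: {loss_400v:.0f} W; 208 V loss: {loss_208v:.0f} W")
```

Doubling the voltage roughly quarters the loss for the same load, which is why higher-voltage distribution with short runs to the racks wastes less power as heat.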
The MGHPCC employs some innovative cooling techniques for further power savings. Enterprise data centers often use air-side economizers to keep cooling costs low, but the servers used in an enterprise are less power-hungry than those needed for heavy computational research. Many of the servers in this high-performance data center use integrated water cooling, so the team turned to cutting-edge close-coupled water cooling techniques to keep temperatures under control and prevent hot spots.
The MGHPCC's water-based approach places chilled-water cooling units on the computer room floor rather than relying solely on air cooling. The team took it a step further and uses free cooling to chill the water, which is then fed into grouped server rack "pods."
"Taking advantage of the cool weather in New England fall, winter and spring, we will only need to power the chillers about 30% of the time," said Goodhue.
In another example of cooling efficiency, Goodhue said, the innovative server grouping technique siphons hot air directly into the air conditioners.
Creative power distribution and cooling techniques resulted from following LEED certification guidelines, which was a goal from the start. Developed by the U.S. Green Building Council, the LEED checklist provides a set of guidelines for green construction — getting certified as a data center is difficult, since the rules encompass all types of construction. Because the list covers everything from landscaping to waste management, the MGHPCC team had to constantly evaluate available green technology in all aspects of construction.
Following the LEED list, while "challenging technically, helps us focus our thoughts," said Goodhue. The biggest technical hurdles were power and cooling, but the benefits in energy savings and the team's desire for a low carbon footprint outweighed the difficulty. The MGHPCC felt going for LEED certification was the best way to objectively measure the team's success in building an energy-efficient facility at a lower cost than a traditional data center build, and the team hopes to obtain certification once the facility is complete.
Five schools — MIT, University of Massachusetts, Boston University, Northeastern University and Harvard University — have put up half the money for the $95 million MGHPCC facility, with the rest donated by the Commonwealth of Massachusetts, Cisco and EMC Corp. Why donate to such a facility? "[T]hey saw the importance of computationally intensive research and wanted to help get this project going," said Goodhue.