
Uptime conference-goers mull data center efficiency

Last week's Uptime Institute conference in Orlando showed data center managers how to make facilities and IT equipment more efficient. Attendees are taking the advice to heart.

ORLANDO, Fla. -- Attending conferences like The Uptime Institute's last week is all well and good, but what data center managers take back from them is what matters.

Much of the talk at the conference was about so-called "gold nuggets" -- actionable pieces of advice on how data center managers can make their facilities and IT equipment more efficient and save money.

More on data center best practices
Uptime Institute mines golden nuggets in the data center

Data center battles the power bill with help from Uptime Institute

Ken Brill: Tune your data center engine

But Ken Brill, Uptime's executive director, said he would like to see more users implementing changes rather than just talking about them.

"Users are lagging behind, and why is that?" Brill asked. "We identified a year ago that between 10 and 30 percent of servers are dead. What is it that prevents us when we go home from doing things that we know should be done? The answer is that you're probably not going to be thanked for improved energy efficiency by turning off 100 servers, if just one of them has an undocumented but critical legacy application that only runs once a month. In fact, you could get into serious career trouble."

Uptime outlines data center efficiency options

Robert Sullivan, a senior consultant with Uptime, has written a draft document of efficiency opportunities and sent it to the federal Environmental Protection Agency (EPA). The EPA is working on a report for Congress, due in June, that will examine data center energy consumption and some of the opportunities for better efficiency. The Uptime Institute is one organization helping the EPA write the report.

In his document, Sullivan divides the opportunities into three categories: easy and free; labor-intensive but cheap; and longer term and more expensive. Some of the opportunities are:

  • Turning off unused servers and storage. Sullivan has anecdotal data showing that 7 to 20 percent of servers and storage devices aren't being used, even though they're still on. (A rough estimate of what that can cost appears after this list.)
  • Tuning computer room air-conditioner (CRAC) units. A survey by Uptime found that 10 percent of CRACs weren't providing any cold air at all, and up to one-quarter weren't working properly.
  • Eliminating legacy servers by migrating their applications to a new platform or finding a replacement application.
  • Improving airflow in the data center by sealing openings in the raised floor. Only 40 percent of cold air from CRACs is reaching its intended target because of unsealed openings in the raised floor and bad airflow design in the data center, according to Uptime.
  • Using virtualization software to raise CPU utilization rates and either retire older servers or repurpose them.
  • Adopting more-efficient hardware, from servers to processors to power supplies, and using larger systems when possible to encourage virtualization and reductions in space, power and cooling.
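
To put the idle-server figure in context, here is a back-of-the-envelope estimate of what leaving unused machines powered on can cost. This is a minimal sketch: the fleet size, per-server power draw, cooling overhead and electricity rate below are illustrative assumptions, not figures from Uptime or the EPA.

    # Rough estimate of the cost of leaving idle servers powered on.
    # All inputs are illustrative assumptions, not Uptime or EPA figures.
    servers = 1000            # hypothetical fleet size
    idle_fraction = 0.10      # low end of the 7 to 20 percent range cited above
    watts_per_server = 300    # assumed average draw of an idle server
    cooling_overhead = 0.5    # assume 0.5 W of cooling for every 1 W of IT load
    dollars_per_kwh = 0.08    # assumed utility rate

    idle_kw = servers * idle_fraction * watts_per_server / 1000.0
    total_kw = idle_kw * (1 + cooling_overhead)   # IT load plus cooling load
    kwh_per_year = total_kw * 24 * 365
    print(f"Idle IT load:        {idle_kw:.0f} kW")
    print(f"Including cooling:   {total_kw:.0f} kW")
    print(f"Annual electricity:  ${kwh_per_year * dollars_per_kwh:,.0f}")

Under those assumptions, 100 idle servers waste roughly 45 kW around the clock, on the order of $30,000 a year in electricity before counting hardware, maintenance and floor space.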

Users prioritize data center efficiency

End users at the conference last week said they planned on taking some of those opportunities back to their companies. Some, like Aaron Andersen from the National Center for Atmospheric Research (NCAR) in Boulder, Colo., have already implemented the easier changes and are looking into more advanced energy-saving techniques.

Andersen, the head of enterprise services in the computing and information systems lab, said NCAR will begin designing a new data center this summer, to be built next year, and it will include energy-efficient technologies. It will use waterside economizers, which chill water by evaporation in outdoor cooling towers and circulate that naturally chilled water to cool data center equipment. The new facility will also use high-efficiency uninterruptible power supplies (UPS), which can save money by reducing the amount of energy lost as power passes through the UPS components and on to the power distribution units (PDUs).
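
The arithmetic behind the UPS decision is simple to sketch. The comparison below is only an illustration; the IT load, efficiency figures and electricity rate are assumptions chosen for round numbers, not NCAR's actual values.

    # Compare annual energy lost in two UPS systems carrying the same IT load.
    # Load, efficiencies and rate are assumed values for illustration only.
    it_load_kw = 500            # assumed steady IT load behind the UPS
    legacy_efficiency = 0.88    # assumed efficiency of an older unit
    high_efficiency = 0.96      # assumed efficiency of a newer unit
    dollars_per_kwh = 0.08      # assumed utility rate

    def annual_loss_kwh(load_kw, efficiency):
        # Input power is load / efficiency; the difference is dissipated as heat.
        return (load_kw / efficiency - load_kw) * 24 * 365

    saved_kwh = (annual_loss_kwh(it_load_kw, legacy_efficiency)
                 - annual_loss_kwh(it_load_kw, high_efficiency))
    print(f"Energy saved per year: {saved_kwh:,.0f} kWh")
    print(f"Cost saved per year:   ${saved_kwh * dollars_per_kwh:,.0f}")

On those assumed numbers the more efficient UPS saves roughly 400,000 kWh a year, and the real savings run higher because every kilowatt-hour lost in the UPS also has to be removed by the cooling plant.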

But Andersen is taking something back from the keynote speech Sunday night given by Amory Lovins, co-founder and chairman of the Rocky Mountain Institute (RMI): a slush pile. The idea is to build a hill of soft slush and use the cold water that melts off to chill your data center. Building a slush pile is by no means a foregone conclusion, but Andersen said NCAR will consider it.

"We're going to apply a significant number of green technologies," he said. "Because we're in a federal research lab, we don't have the same type of risk aversions that some corporations have."

Meanwhile, Joe Checchi, assistant vice president for critical systems at JPMorgan Chase, said he was going to look into reworking ducts to better direct hot exhaust air from his racks back to his CRACs. As for future technologies, he's interested in using direct current (DC) power to run his data center, which has 90,000 square feet of raised floor space.

The idea of powering data center equipment with DC has been generating interest in the industry, especially among larger data centers that can justify the initial investment to possibly save money down the line.

How can it save money? When the utility company sends electricity to a customer, it arrives as alternating current (AC) because AC is easier to distribute over long distances. In a typical data center, that AC is converted to DC inside the uninterruptible power supply, converted back to AC on its way to the servers, and finally converted to DC once more by each individual server's power supply.

In a DC system, there is only one conversion at the beginning, from AC to DC. Fewer conversions translate into fewer opportunities for energy loss. DC servers also don't need the built-in power supplies that perform that extra conversion, so they can take up less space in the data center.
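
A quick multiplication shows why the number of conversions matters. The sketch below uses assumed, round efficiency figures for each stage; real designs vary, and a DC plant still steps voltage down at the rack or server, but the point about compounding losses holds.

    # End-to-end efficiency is the product of the stage efficiencies.
    # All stage efficiencies are assumed round numbers for illustration.
    ac_path = [0.94, 0.95, 0.90]   # AC->DC in the UPS, DC->AC out, AC->DC in each server
    dc_path = [0.96]               # a single facility-level AC->DC rectification

    def end_to_end(stages):
        efficiency = 1.0
        for stage in stages:
            efficiency *= stage
        return efficiency

    print(f"AC distribution path: {end_to_end(ac_path):.1%}")   # about 80 percent
    print(f"DC distribution path: {end_to_end(dc_path):.1%}")   # about 96 percent

With three conversions, even good stages compound to roughly 80 percent end to end; a single conversion keeps most of that loss out of the building.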

Let us know what you think about the story; e-mail: Mark Fontecchio, News Writer; and check out the new Data Center blog.
