At a recent Association of Information Technology Professionals data center panel discussion, a seasoned group of IT admins discussed meeting customer power demands, with the consensus that demand is insatiable. Even as budgets seesaw from abundant to sparse, the appetite for power keeps growing.
The panel’s first warning was that virtualization is not a cure-all for reducing data center power consumption. High-density computing -- cramming many virtual machines (VMs) into a single server -- has clear advantages, but power and cooling demands still grow with each VM. In many cases, the cost simply shifts from distributing power across lots of small servers to powering and cooling a few red-hot VM-hosting systems.
It’s not just about CPUs cranking out BTUs, which raises the next issue. The power needed for cooling, lighting, battery backup (UPS systems) and other environmental factors usually accounts for 35% or more of a data center’s total energy consumption, no matter how efficiently the building is designed. Servers gobble watts, and keeping them happy is major overhead.
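That 35% overhead figure maps directly onto the common Power Usage Effectiveness (PUE) metric, which divides total facility power by the power delivered to IT equipment. A quick sketch of the arithmetic (the 35% figure is the article's; the rest is standard PUE math):

```python
# PUE = total facility power / IT equipment power.
# If overhead (cooling, lighting, UPS losses) is 35% of total consumption,
# the IT gear gets the remaining 65%.
overhead_fraction = 0.35
it_fraction = 1 - overhead_fraction

pue = 1 / it_fraction
print(f"PUE at 35% overhead: {pue:.2f}")  # about 1.54
```

A perfectly efficient facility would score 1.0; every point above that is energy spent on something other than computing.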
Another important panel point was shortening the return on investment. Returns have to show up fast -- within days or weeks. Never mind three-to-five-year returns -- in this economy, strained budgets can’t wait. All of the panelists insisted that IT managers have to show fast results before selling long-term solutions.
So what are fast turnaround projects that deliver results quickly? The suggestions below are somewhat small, mostly single project efforts, or at least quick changes without high infrastructure costs. Combining them could create a synergy where the sum is greater than the parts, but doing so isn’t required to make efficiency gains. There were four main ideas presented for reducing data center power consumption, each of which can be implemented separately.
Switch to variable-speed fans
Recent research found that power consumption drops roughly 30% for every 10% reduction in fan speed. As the name implies, variable-speed fans run only as fast as conditions require, governed by fairly sophisticated thermostatic controls. Because fan power falls off steeply as speed drops, slowing fans during long stretches of low CPU utilization cuts power use quickly. And don’t stop with servers; check the cooling features of UPS devices, the power supplies of appliances on the same power grid, and any other hot spot with a fan spinning constantly.
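The roughly-30% figure is consistent with the standard fan affinity laws, under which a fan's power draw scales with the cube of its rotational speed. A minimal sketch of that relationship (the cube law is textbook fluid dynamics, not a claim from the panel):

```python
def fan_power_fraction(speed_fraction):
    """Fan affinity law: power scales with the cube of rotational speed."""
    return speed_fraction ** 3

# A 10% speed reduction leaves the fan drawing only ~73% of full power,
# i.e. a ~27% saving -- close to the ~30% the panel cited.
remaining = fan_power_fraction(0.90)
print(f"Power at 90% speed: {remaining:.1%}")  # about 72.9%
```

This is also why the savings compound: halving fan speed cuts fan power to roughly one-eighth.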
Raise the air temperature
According to data center infrastructure suppliers, modern servers perform well at temperatures up to 77 degrees Fahrenheit. Yet many data centers have cooled servers to the mid-60s F for years. Raising the ambient air temperature even a few degrees produces an immediate drop in cooling-system power usage with no impact on server performance. There’s no overhead or investment needed, although close monitoring and a solid pilot program are advisable to avoid unpleasant surprises. Granted, a slightly warmer server room can be a disconcerting change; the dress code may have to be adjusted to allow for lighter clothes in warmer conditions.
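To put a rough number on the payoff, a commonly cited rule of thumb is that each degree Fahrenheit of setpoint increase saves on the order of 4% of cooling energy. That figure is an assumption for illustration, not from the article, and should be validated against your own facility's metering during the pilot:

```python
def cooling_savings(degrees_raised, savings_per_degree=0.04):
    """Estimate fractional cooling-energy savings from raising the setpoint.

    savings_per_degree is a rule-of-thumb assumption (~4% per degree F),
    not a measured value; compound it per degree raised.
    """
    return 1 - (1 - savings_per_degree) ** degrees_raised

# Raising the room from 68 F to 72 F:
print(f"Estimated savings: {cooling_savings(4):.0%}")  # roughly 15%
```

Even at half that rate, the change pays back immediately because there is nothing to buy.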
Use bigger, slower drives
Of course, this should not be done for high-demand transactional workloads, such as financial databases or critical 24-hour systems. But by moving the large share of mostly unused files to a lower storage tier, big, low-power drives can replace many small, fast units. Fewer drives burn less energy and create less heat. This can be an expensive undertaking, but since most shops build out more storage every quarter anyway, they should see it as a worthwhile investment.
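The arithmetic behind the tiering argument is straightforward. The drive counts and wattages below are hypothetical figures chosen for illustration (a 15K RPM enterprise drive typically draws more power than a 7,200 RPM high-capacity drive); substitute your own hardware's spec-sheet numbers:

```python
# Hypothetical arrays: many small fast drives vs. a few big slow ones.
small_fast = {"count": 24, "capacity_tb": 0.3, "watts_each": 12}
big_slow   = {"count": 4,  "capacity_tb": 2.0, "watts_each": 8}

def total_watts(array):
    return array["count"] * array["watts_each"]

def total_tb(array):
    return array["count"] * array["capacity_tb"]

for name, array in (("small/fast", small_fast), ("big/slow", big_slow)):
    print(f"{name}: {total_tb(array):.1f} TB drawing {total_watts(array)} W")
```

With these illustrative numbers, the big/slow tier delivers slightly more capacity for roughly a tenth of the drive power, before even counting the reduced cooling load.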
Use hosted services
Although moving IT workloads to a cloud or colocation provider shifts the carbon footprint to the host site, many will concede that big vendors are experts at squeezing the most out of a kilowatt. By using hosted services, you can focus on delivering better value at a lower cost for your customers.
The risks of data center power consumption projects
IT organizations need to acknowledge the inherent risks in energy-efficiency projects. As one power company director put it, in a high-density, highly efficient environment, the data center can go thermal in seconds. Several recent high-profile outages started as a partial interruption but cascaded into bringing down the entire facility. The catalyst: overheating that spread from rack to rack until every system shut down to protect itself.
The final warning: Spell out any risks before implementing changes to the data center, and make sure to get executive support before pursuing any of these tactics for reducing data center power consumption.
What did you think of this feature? Write to SearchDataCenter.com's Matt Stansberry about your data center concerns at email@example.com.
This was first published in November 2010