When it comes to energy, data centers have found themselves stuck in the ultimate catch-22.
Over the last decade -- and especially in the past two years -- growing IT demands have forced data center managers to crowd their server farms with more and more technology, driving energy demands through the roof. To make matters worse, the heat generated by added hardware forced companies to expend even more juice cooling them off.
And if all that wasn't bad enough, global oil concerns and natural disasters, such as Hurricane Katrina, have squeezed the energy industry, pushing usage costs skyward and leaving data center managers paying through the nose trying to solve a three-pronged dilemma many of them never saw coming.
And experts believe it's a dilemma that's only going to get worse in the near future.
Escaping the vicious cycle of energy has become the data center manager's top concern, but industry analysts said there are ways of getting out of an energy crisis, as long as you realize the gravity of the problem and take steps immediately to address it.
Bob McFarlane, Interport division president of New York-based IT consulting firm Shen, Milsom & Wilke Inc., has spent more than 30 years in the field of communications consulting and is considered one of the data center industry's leading building design experts. Of all the cutting-edge solutions vendors pass off as answers to the energy-cost crisis, such as more power-efficient chips and hydrogen fuel cells, he said none come close to addressing the real reason data centers are suffering under the weight of rising energy costs -- inefficient cooling.
Chill your bill
In many ways, your data center and your home are a lot alike. When summer hits, homeowners often use multiple air conditioners to cool off their house, only to suffer sticker shock when the energy bill comes. But it's always summer in a data center and increased energy demands have had the effect of a heat wave that never ends.
Problem is, installing extra air conditioning to cool down your data center hot spots is like replacing one headache with another. Air conditioning uses as much, if not more, energy than the hardware it's intended to chill, and problems such as holes in a raised floor, loose tiles and needless vents lead McFarlane to believe that as much as 25% of the conditioned air is wasted.
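McFarlane's 25% figure translates into real money. A minimal back-of-envelope sketch, using hypothetical numbers (a 100 kW cooling plant and a $0.10/kWh rate are assumptions, not figures from the article):

```python
# Rough cost estimate for bypass airflow -- illustrative figures only.
# Assumed: a 100 kW cooling plant, $0.10/kWh, and McFarlane's estimate
# that up to 25% of conditioned air is wasted through floor holes,
# loose tiles and needless vents.

cooling_load_kw = 100.0   # power drawn by the cooling plant (assumed)
rate_per_kwh = 0.10       # utility rate in dollars (assumed)
bypass_fraction = 0.25    # share of conditioned air doing no useful work

hours_per_year = 24 * 365
annual_cooling_cost = cooling_load_kw * hours_per_year * rate_per_kwh
wasted_cost = annual_cooling_cost * bypass_fraction

print(f"Annual cooling bill: ${annual_cooling_cost:,.0f}")
print(f"Cost of bypass air:  ${wasted_cost:,.0f}")
```

At those assumed rates, roughly $22,000 a year goes to cooling air that never reaches the hardware.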
"Overdone air conditioning in an attempt to cool isolated hot spots, by throwing air into a room, is an extremely inefficient and costly way of not accomplishing your goal, because in most cases it simply doesn't work," McFarlane said.
McFarlane said there are several new data center cooling technologies that are both cheaper and more efficient than air conditioning. The leading cooling alternative on the market is internal fan booster technology, which, instead of blowing through a raised floor, is installed directly into the cabinet. Internal fans draw the hot air away from critical hardware and displace it, a method industry experts agree is much more effective than simply trying to blow cold air on a hot spot. More and more of these units, manufactured by vendors such as Adelphi and APC, have found their way into data centers lately, with encouraging results.
"If your cabinet manufacturer offers this kind of accessory, give it a try," McFarlane said. "Just don't believe everything your sales guy tells you."
Liquid-cooled cabinets, such as IBM's recently released Cool Blue, have generated controversy because they put liquid in the room, a prospect that makes data center managers unfamiliar with past water-based cooling technologies nervous. Critics charge that liquid cooling brings unneeded plumbing headaches and reduces floor-plan flexibility. Still, even the solution's most ardent naysayers will agree that it does what it sets out to do -- cool down your room's hottest spots much better than air conditioning can.
Tony Lock, chief analyst for U.K.-based Bloor Research, points out that liquid cooling was once a widespread solution for mainframes, mostly because older processors ran so hot that organizations were willing to take the good with the bad.
However, as technology evolved, the industry moved toward the safety of air-based cooling.
"Plumbing was expensive," Lock said. "And water and energy didn't make good matches."
But thanks to the energy-cost crunch, many companies are gambling that what went around will come around again.
Fresh squeezed juice?
If your company is large enough and uses enough energy, you might be able to negotiate a lower rate with your energy supplier. Why? Because if you're big enough, you might be able to afford to use generators to power much of your data center, and there is hardware available that can make generator power less expensive than power supplied by the utility. And if you can whip up more juice than you need, it's possible to enter into a "cogeneration" agreement with your energy supplier, in which you sell your excess energy back to them.
Utilities are willing to buy commercially produced energy because peak usage raises the risk of a brownout. That has led some of the world's largest companies that generate their own power to engage in "peak-usage shaving," in which they sell their excess juice to public energy concerns.
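The peak-shaving decision boils down to comparing your cost of self-generated power against the utility's rate hour by hour. A minimal sketch, with all rates hypothetical:

```python
# Sketch of the peak-shaving arithmetic -- every number here is an
# assumption, not a figure from the article. A site runs its generators
# whenever self-generated power is cheaper than the utility's rate, and
# can sell excess capacity back under a cogeneration agreement.

fuel_cost_per_kwh = 0.07     # cost to run on-site generators (assumed)
utility_peak_rate = 0.15     # utility price during peak hours (assumed)
utility_offpeak_rate = 0.06  # utility price off-peak (assumed)

def cheaper_source(utility_rate: float) -> str:
    """Return which supply is cheaper at a given utility rate."""
    return "generator" if fuel_cost_per_kwh < utility_rate else "utility"

print(cheaper_source(utility_peak_rate))     # generators win at peak
print(cheaper_source(utility_offpeak_rate))  # utility wins off-peak
```

The same comparison, run against a real tariff schedule, is what tells a large site which hours are worth shaving.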
Negotiating a lower rate with the power company isn't for everyone. But if your organization can bring enough to the bargaining table, your energy supplier will be forced to listen rather than lose you as a customer. And if rates keep skyrocketing, more and more companies are likely to explore this option.
Lock visited a supercomputing data center in Spain this summer that took a novel approach to its cooling problems. Instead of recycling energy, it recycled heat.
The company had built its entire data center around an aerodynamic concept designed to maximize airflow. The racks and the hardware inside them were spaced throughout the room specifically to create channels that cold air from the air conditioning system could move through, with plenty of vents for funneling hot air out of the room. And once that hot air left the room, it was passed through a heat exchanger into a community grid, where it was used to warm tap water for the local town's supply.
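The amount of heat an exhaust-air heat exchanger can recover follows from the standard relation Q = mass flow x specific heat x temperature difference. A back-of-envelope sketch, with the airflow and temperature figures assumed rather than taken from the Spanish site:

```python
# Back-of-envelope heat-recovery estimate for an exhaust-air heat
# exchanger like the one Lock describes. The airflow and temperature
# drop are assumptions for illustration.

air_density = 1.2      # kg/m^3, air at roughly room conditions
cp_air = 1005.0        # J/(kg*K), specific heat of air
flow_m3_per_s = 10.0   # exhaust airflow (assumed)
delta_t = 15.0         # exhaust temp minus heat-sink temp, K (assumed)

mass_flow = air_density * flow_m3_per_s          # kg/s
heat_recovered_w = mass_flow * cp_air * delta_t  # Q = m * cp * dT, watts

print(f"Recoverable heat: {heat_recovered_w / 1000:.0f} kW")
```

Even at these modest assumed figures, the exhaust carries on the order of 180 kW of low-grade heat -- enough to make warming a town's tap water plausible.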
Lock said he hadn't seen such an approach before. The fact that the company took great pleasure in building a "green" server farm -- one that is environmentally sound -- while cutting energy costs illustrates how deeply this issue has entered the industry's collective consciousness, and how far some forward-thinking IT shops will go to escape the heat.
Lock, like McFarlane, sees more efficient cooling as the best means companies can deploy to reduce energy costs.
"It's probably the easiest thing people can do," Lock said. "Assuming you can't just throw everything out and start from scratch. There are a lot of people that would like to do that but it's just not a possibility."
The data center of the future
According to McFarlane, only data centers built in the past two to three years have been constructed with the demands of the recent energy cost crunch in mind. And most businesses have neither the budget nor the support from management needed to undertake anything more than a stopgap solution.
But those with the luxury of starting a server farm from scratch need to follow one simple rule when designing a new room, McFarlane said.
"Extreme flexibility," McFarlane said. "The ability to put power and cooling where you need it, when you need it for years to come."
That extends all the way to the walls -- and even the windows. McFarlane isn't keen on the idea of a data center with windows, but if an organization is forced to put its room in a space with windows, he recommends reducing the solar load, because sunlight can cause the temperature in a server farm to rise dramatically.
And don't forget vapor barriers. Often overlooked, vapor barriers are a critical component of a data center because they regulate humidity -- McFarlane suggests 45% relative humidity is best -- which otherwise can wreak havoc on air conditioning units forced to deal with excess evaporation.
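One way to see why the humidity target matters is to look at the dew point it implies. A minimal sketch using the Magnus approximation, with the 22 C room temperature an assumption:

```python
import math

# Dew-point check for McFarlane's suggested 45% relative humidity,
# using the Magnus approximation. The 22 C room temperature is an
# assumption for illustration.

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Magnus-formula dew point, valid roughly 0-60 C."""
    a, b = 17.62, 243.12  # Magnus coefficients (Sonntag 1990)
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# Surfaces colder than this temperature will condense moisture -- the
# extra latent load an AC unit fights when the vapor barrier leaks.
print(f"Dew point at 22 C / 45% RH: {dew_point_c(22.0, 45.0):.1f} C")
```

At 45% relative humidity the dew point sits well below typical supply-air temperatures, which is part of what keeps condensation and latent cooling load under control.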
In the past six months, vendors have begun to tout technology not normally associated with causing energy headaches as more "power-efficient." Many of these, such as carbon nanotubes and hydrogen fuel cells, are too far from mainstream use to be considered real solutions for the near future. But dual-core chips, in which two processors share the same thermal envelope to cut down on power consumption, are considered one of the server industry's biggest trends for the next five years.
Lock said while cooler chips can help, they're far from a panacea, and they were developed more from a performance standpoint than for saving energy.
"Chips were hot and grabbing too much power. One of these alternatives is multi-core processors," Lock said. "Energy benefits are a side issue, but they're not the prime issue."
McFarlane, who works closely with companies planning new data centers, said in theory, fine-tuning a data center with power-efficient hardware in an attempt to reduce power consumption could result in significant savings. But McFarlane said such a plan isn't realistic in an industry so susceptible to change, and warns that anyone thinking that way is missing the point.
"You can't fine-tune most data centers. In fact, I haven't been in one yet," McFarlane said. "It's the cooling … that's what it really comes down to."
Let us know what you think about the story; e-mail: Luke Meredith, News Writer