NewEnergy Associates, an Atlanta-based power consulting firm, analyzes power supply data for utilities. Recently, the company began using the Monte Carlo method, a statistical technique that forecasts energy usage by running large numbers of randomized simulations, and its computational power needs increased dramatically.
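Monte Carlo forecasting is compute-hungry because it repeats a randomized simulation many thousands of times and summarizes the resulting distribution. A minimal sketch of the idea, using a made-up demand model (the function names and numbers here are illustrative, not NewEnergy's actual model):

```python
import random
import statistics

def simulate_daily_demand(rng, base_mwh=500.0, volatility=0.15):
    """One random draw of a day's energy demand (hypothetical toy model)."""
    return base_mwh * rng.gauss(1.0, volatility)

def monte_carlo_forecast(trials=100_000, seed=1):
    """Run many randomized trials and summarize the distribution."""
    rng = random.Random(seed)
    draws = [simulate_daily_demand(rng) for _ in range(trials)]
    mean = statistics.mean(draws)
    p95 = statistics.quantiles(draws, n=100)[94]  # ~95th percentile
    return mean, p95

mean, p95 = monte_carlo_forecast()
print(f"expected demand: {mean:.1f} MWh, 95th percentile: {p95:.1f} MWh")
```

Each trial is independent, which is why workloads like this scale well across many servers but also multiply a data center's computational load.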
The company tripled its server count over six months, and the data center staff found itself facing a problem it had never anticipated would become its most pressing -- heat.
Packing 50 servers into a space that had held just 17, NewEnergy soon found that its ideal temperature goals for the front and back ends of its racks -- 78 and 85 degrees Fahrenheit, respectively -- were nothing more than a pipe dream. Back-end temperatures reached as high as 105 degrees during peak usage, not good news for a data center with a thermal sensing system designed to shut down the grid at 90 degrees.
With an overloaded data center, it would have only taken a little nudge to spark a crisis.
Then, dirty water shut down the air conditioning systems.
With its cooling capabilities on the fritz, the data center was hitting 130 on the thermometer within an hour. IT management flung the doors to its highly secure server farm wide open, flanked the entrance with extra security guards and set up large blower fans not much more sophisticated than the ceiling fan in a local diner.
At that point, NewEnergy knew that heat had replaced rack space as its primary data center concern -- a concern that had become too expensive to ignore.
"If our data center is off line, we're losing money," said Neil Tisdale, NewEnergy's vice president of software development. "I always knew air conditioning was a concern, but you never thought of it as your most expensive first boundary condition."
The massive spike in the number of servers helped cause NewEnergy's cooling headache, and the company turned to Sun Microsystems Inc. for an alternative. The deal is one of the first major customer wins for Sun's new dual-core Opteron processor servers.
NewEnergy will deploy Sun Fire V20z and Sun Fire V40z servers running the Solaris 10 operating system rather than standard Linux distributions as part of its high-performance grid infrastructure. It has already brought in eight Sun servers to complement 42 Dell boxes, and the company plans to send the Dells packing and replace them with 30 Sun boxes equipped with virtualization technology in the next 12 months.
The key component in NewEnergy's decision to switch to Sun lies in the dual-core processor. The new chip, released in mid-April by Advanced Micro Devices Inc. (AMD), was built to increase computational power and maximize system efficiency while generating up to 30% less heat than competing alternatives, largely because both cores share a single thermal envelope. The chip can also run at a lower clock frequency, because the workload is spread across two cores rather than one.
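The link between lower clock frequency and lower heat follows from the standard first-order model of CMOS dynamic power, which scales with capacitance times voltage squared times frequency. A rough illustration of the trade-off (the voltages, frequencies, and capacitance below are made-up round numbers for the sake of the arithmetic, not specifications of any real AMD part):

```python
# Illustrative only: first-order CMOS dynamic power model, P ~ C * V^2 * f.
# All numbers are invented for illustration, not real chip specs.

def dynamic_power(capacitance, voltage, frequency_ghz):
    """First-order dynamic power estimate: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency_ghz

# One fast core vs. two slower cores. Lowering frequency usually permits
# lowering supply voltage too, and power scales with voltage squared.
single_core = dynamic_power(capacitance=1.0, voltage=1.4, frequency_ghz=2.6)
dual_core = 2 * dynamic_power(capacitance=1.0, voltage=1.1, frequency_ghz=1.8)

print(f"single fast core: {single_core:.2f} (arbitrary units)")
print(f"two slower cores: {dual_core:.2f} (arbitrary units)")
# In this idealized model, the two slower cores supply more aggregate clock
# cycles (2 x 1.8 = 3.6 GHz vs. 2.6 GHz) while dissipating less total power.
```

The voltage-squared term is what makes the trade worthwhile: a modest drop in voltage buys a disproportionate drop in heat output.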
Gordon Haff, an analyst with Illuminata Inc., said chip makers are setting the trend toward cooler processors.
"[Cooling] is one of the most important concerns facing chip designers, which in turn is a matter of great interest to CIOs and so forth," Haff said. "[Dual core processors] are clearly the direction things are going, and AMD has given us the first instance of that [in the x86 space]."
Dell is among the few major vendors that do not support the AMD offering.
Tisdale said NewEnergy's cooling nightmare has to be tackled on two fronts -- the servers have to generate less heat, and the air conditioning system has to control that heat more efficiently. To that end, NewEnergy is looking into a redundant air conditioning system that runs independently of the company's water system.
Data centers that think they can get away with a shaky cooling system are skating on thin ice, and, according to Tisdale, the issues causing heat waves in server farms are only going to be exacerbated by the current trend toward smaller yet more powerful boxes.
"Density accelerates the problem. You can get more in a data center now," Tisdale said. "Over the past two years, in talking with CTOs and CIOs, it's the same story. AC is the problem all over."
Let us know what you think about the story; e-mail: Luke Meredith, News Writer