But that location presented some serious problems; opening the loading dock doors caused radical temperature changes that affected equipment in the 2,000-square-foot data center. Making matters worse were the mere three feet of space between the data center's drop ceiling and the building's roof, where the sun beats down all day, and the data center's west-facing wall, which absorbs the sun's rays all afternoon.
In the summer of 2005, all of those issues, combined with poor airflow, forced the company to shut down some systems to prevent them from burning up. One Compaq Unix system, for example, flirted with triple-digit temperatures.
"What was happening was in August (of 2005) this room would get very warm," said Dan Wilson, IT manager at the company. "The air-conditioning units were working hard but weren't cooling some of the systems."
Nor did Sylvania get any advance warning. With no monitoring system in place, IT learned that there was a problem when CPU temperature alarms in the servers sounded.
So, IT and other company officials huddled up to weigh their options. Move the backup facility to headquarters in Danvers, Mass.? That would put the main data center, already in Danvers, and the backup facility too close together. Next.
Buy another CRAC unit and throw more cooling at the problem? At about $75,000 a whack, the chief financial officer wasn't about to sign off on that. Next.
Spend about $10 per square foot (roughly $20,000 for the 2,000-square-foot room) to analyze the airflow in the data center, mitigate power and cooling issues, and get a return on investment within a year? Now we're talking.
Small airflow changes make big difference
Now, a year and a half after that dreadful summer, a dozen temperature sensors sit on top of various cabinets. Every few hours, Wilson or another IT staffer records the temperature at each sensor, allowing them to nip potential heating problems in the bud. Sylvania has also made seemingly simple changes to improve airflow: installing perforated floor tiles, putting in ceiling ducts to carry hot air back to the air conditioners and removing Plexiglas from cabinet doors. Wilson is also gradually moving the data center to a hot-aisle/cold-aisle layout, rearranging the rows of cabinets, which hold about 150 systems, to sit perpendicular to the CRAC units instead of parallel, and ordering blanking panels.
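The manual routine described above -- walk the room, record each sensor, flag anything running hot -- could be sketched in a few lines of Python. Everything here is hypothetical: the article doesn't say how Sylvania's sensors are read or what its alarm setpoints are, so the sensor IDs, the warning threshold and the `read_sensor` stand-in are illustrative only.

```python
from datetime import datetime

# Hypothetical warning threshold; the article doesn't give Sylvania's setpoints.
WARN_TEMP_F = 85.0

def read_sensor(sensor_id):
    """Stand-in for reading a cabinet-top sensor; real code would do hardware I/O."""
    readings = {"cab-01": 72.4, "cab-02": 88.1, "cab-03": 76.0}
    return readings[sensor_id]

def log_temperatures(sensor_ids):
    """Record a timestamped reading for each sensor and return any hot cabinets."""
    hot = []
    for sid in sensor_ids:
        temp = read_sensor(sid)
        stamp = datetime.now().isoformat(timespec="minutes")
        print(f"{stamp} {sid}: {temp:.1f} F")
        if temp >= WARN_TEMP_F:
            hot.append((sid, temp))
    return hot

if __name__ == "__main__":
    for sid, temp in log_temperatures(["cab-01", "cab-02", "cab-03"]):
        print(f"WARNING: {sid} at {temp:.1f} F exceeds {WARN_TEMP_F} F")
```

The point of even this crude loop is the same as Sylvania's clipboard rounds: a regular, timestamped record lets staff spot a cabinet trending upward before a server's own CPU alarm sounds.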
"It really was just a matter of them teaching us as we went along," Wilson said.
The "them" he is referring to is Degree Controls Inc., an airflow management company out of Milford, N.H. Eleven years old, the company originally focused on airflow management for processors and small enclosures. Around 2002, the company realized that data centers were encountering similar problems, and in the past year, Degree Controls has been actively working in data centers and has assisted about a half-dozen customers.
Degree Controls starts by auditing the data center to find the hotspots. The result is a thermal map, an X-ray-like drawing that shows blue for cold areas and red for hot ones. The company then makes suggestions, such as installing sensors, ducts, fans or any other device that would push air in the correct direction.
"A lot of data centers aren't laid out the way they should be," said Walter Phelps, product manager at Degree Controls. "But data center managers don't have time to shut things down."
As a result of the changes, Wilson said Sylvania will probably save 10% to 15% in electricity costs at the Manchester plant. This summer, he wants to replace some older Digital and Compaq Unix systems with some IBM System p gear he bought late last year. That will reduce the total number of systems and, he hopes, further reduce power and cooling charges.
"We figured it would take us about a year to recoup that initial cost (from Degree Controls)," Wilson said. "We're also doing a much better job of managing our data center, and that is something that's hard to measure."
Let us know what you think about the story; e-mail: Mark Fontecchio, News Writer.