Sometimes you have to play the hand you're dealt. Take Columbia, S.C.-based AgFirst Farm Credit Bank. It has a small data center located in the basement of an 81-year-old building with an aging power infrastructure. And yet the company was able to make significant improvements and forgo building another facility.
AgFirst is a $30 billion company that provides funding and financial services to 23 agricultural credit associations, which in turn provide financing to more than 80,000 farmers, agricultural businesses, and rural homeowners. Over the past decade, the company has seen tremendous growth, which has placed additional pressure on AgFirst's aging data center. Seven years ago, the data center had about 100 servers; now it has 600, according to AgFirst's IT operations director, Chad Toney.
"It's an older building," Toney said. "Our data center was traditionally in the basement and is still there. It has moved from being mainframe-centric to more of a client/server environment."
To tackle these problems, AgFirst called on SunGard Availability Services to assess and redesign its data center. SunGard also offers disaster recovery (DR) services, which AgFirst already used. "We use them for our DR location," Toney said. "We figured that since they're good at building their own data centers, who better to ask?"
And so, in late 2006, SunGard began evaluating the basement data center, an assessment it completed in 2007. Since then, AgFirst has implemented some of SunGard's suggestions, including installing power distribution units (PDUs) in place of breaker panels on the wall, replacing old computer room air conditioner (CRAC) units, and setting up hot and cold aisles in its data center.
Unpredictable data center power outages – often caused by a circuit breaker tripping – led AgFirst to move from traditional power panels on the wall to redundant, mobile PDUs.
"The ability to connect power was based on wall-breaker panels," Toney said. "It's similar to what you have in your house. You have no means of monitoring it. We could easily overload those things depending on how much we put into the racks. When we put servers in, we would make estimates. As systems changed, the load went up, especially with high-density blades. It was a struggle to determine how much load we were going to have and what capacity we were at."
Installing power distribution units gave AgFirst mobility, redundancy and the ability to monitor power usage – features the wall-mounted breaker panels could not provide. In addition, the overheating power panels had posed a potential fire hazard; the PDUs do not.
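The capacity guesswork Toney describes can be made concrete. The sketch below is a hypothetical illustration – not AgFirst's actual tooling or figures – of the arithmetic a metered PDU automates: summing the estimated draw of the servers in a rack and checking it against a branch circuit derated to 80% of its breaker rating, per common U.S. electrical practice.

```python
# Hypothetical sketch of the load math a metered PDU automates.
# All figures below are illustrative, not AgFirst's actual numbers.

def circuit_capacity_watts(breaker_amps: float, volts: float,
                           derate: float = 0.80) -> float:
    """Usable capacity of a branch circuit; continuous loads are
    commonly limited to 80% of the breaker rating."""
    return breaker_amps * volts * derate

def rack_load_watts(server_draws: list[float]) -> float:
    """Total estimated draw of the servers in one rack."""
    return sum(server_draws)

# Example: a 30 A / 208 V circuit feeding three 1U servers and a blade chassis.
capacity = circuit_capacity_watts(30, 208)      # 4992.0 W usable
load = rack_load_watts([350, 350, 350, 2500])   # 3550 W estimated
headroom = capacity - load

print(f"capacity {capacity:.0f} W, load {load:.0f} W, headroom {headroom:.0f} W")
```

As the quote above notes, the load creeps up as systems change – especially with high-density blades – which is why a static estimate made at install time eventually stops matching reality.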
But the lack of PDUs was just one of many data center problems that AgFirst had to address.
For one thing, AgFirst's data center was not designed with a hot-aisle/cold-aisle configuration in mind. Dr. Mickey Zandi, a managing partner at SunGard's consulting services, discovered that some aisles were arranged incorrectly, which had created inefficient cooling.
"Sometimes you had the exhaust aisle pushing into the back of the cold aisle," he said. "We had to do a lot of those cleanups."
AgFirst also had some CRAC units as old as 17 years, and they were fairly unsophisticated, with no temperature controls other than on and off switches.
To boot, AgFirst's CRAC units weren't bringing enough cool air into the room. "Because the room had grown into this reactive mode [with some CRACs blowing really cool air to compensate for the air mixing], there was insufficient cooling for the space," Zandi said. The problem was fixed by adding a 30-ton CRAC unit with remote control capability.
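For a sense of what a 30-ton unit adds: one ton of refrigeration corresponds to roughly 3.517 kW of heat removal, a standard conversion. The sketch below uses that factor; the IT heat load shown is a made-up figure for illustration, not AgFirst's actual load.

```python
# Rough CRAC sizing arithmetic. The kW-per-ton conversion is standard;
# the example IT heat load is illustrative, not AgFirst's real figure.

KW_PER_TON = 3.517  # 1 ton of refrigeration ≈ 3.517 kW of heat removal

def crac_capacity_kw(tons: float) -> float:
    """Sensible heat a CRAC of the given tonnage can reject, in kW."""
    return tons * KW_PER_TON

capacity_kw = crac_capacity_kw(30)   # ≈ 105.5 kW
it_load_kw = 90                      # hypothetical IT heat load to cool

print(f"30-ton CRAC ≈ {capacity_kw:.1f} kW against a {it_load_kw} kW load")
```

In practice nearly every watt a server draws ends up as heat in the room, so the electrical load estimates and the cooling capacity have to be sized against each other.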
A final challenge was the miles of cable clogging the underfloor space and preventing chilled air from reaching its destination. AgFirst decided to move to an overhead cabling configuration, but first the old cables had to be cleared out. So far, AgFirst and SunGard have collected some six truckloads of extraneous cable – that is, cable that wasn't connected at all or was connected to one piece of equipment but not another.
"We have some interesting photos of old bundled copper," Toney said. "It was definitely obstructing the movement of air, and we still have a lot of work to do."
In the end, AgFirst will have a much more energy-efficient data center, according to Toney, although the firm has yet to put a dollar figure on the savings, in terms of either energy costs or the ability to hold off on building a new data center.
Still on the to-do list: continuing to clean up cabling and move it overhead, installing a new uninterruptible power supply with redundancy, and eventually adding more security cameras in strategic areas of the data center, Toney said.
"I think this has brought a lot of things to life," he said. "I feel we have some expansion room to do certain things, but having the visibility in terms of the power and cooling is big. We're able to better communicate when we're running out of capacity and need to do something else."