Recently, Harley-Davidson built a new 27,000-square-foot data center facility near its headquarters in Milwaukee, Wis. But the company traveled a long and winding road to build its data center, and along the way, Harley-Davidson dispelled several prevailing data center myths.
At AFCOM's Data Center World conference in Orlando, Fla., Harley-Davidson's data center planning team described its situation: Before the new data center was built, the company's IT operations were scattered across 13 data centers, including one in Milwaukee housed in a 100-year-old building with wood-core floors, a material that is far from ideal for fire prevention.
"We needed a facility that was resilient and with minimal to no single points of failure," said Tom Hardin, Harley-Davidson's manager of architecture integration.
The team concluded that consolidating data centers could also save the company money.
"We found that we were going to save $358,000 a year in electrical bills just by consolidating," Hardin said. Further, reducing the number of contractors by one staff person at each site could save Harley-Davidson up to $1.1 million each year. Finally, removing duplication in maintenance and services could save almost $900,000 a year.
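Those line items can be tallied in a quick sketch. The figures are the approximate amounts Hardin cited; the "up to" and "almost" values are treated here as stated upper bounds:

```python
# Approximate annual savings Harley-Davidson cited for consolidating
# 13 data centers into one facility (figures as reported in the article;
# "up to $1.1 million" and "almost $900,000" are taken at face value).
savings = {
    "electrical bills": 358_000,
    "contractor reduction (one person per site)": 1_100_000,
    "duplicate maintenance and services": 900_000,
}

total = sum(savings.values())
print(f"Total annual savings: ${total:,}")  # $2,358,000 -- almost $2.4 million
```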
The total savings of consolidating into a single data center was almost $2.4 million a year.

Buy vs. build, own vs. lease
In the spring of 2006, Harley-Davidson executives approved the need for a new, consolidated data center. But it would take another year for the company to explore all options and another year to build and prepare the data center for operation.
Hardin and Mark Dereberry, Harley-Davidson's technical services lead for data center facilities, said that executives first advocated outsourcing the company's data center. But in the process of considering data center outsourcing, the company encountered data center myth No. 1.

Myth No. 1: Move-in ready space is available
Harley-Davidson consulted three companies in the Milwaukee area that had data center space to share, Hardin said, but ran up against the following problems:
- No guarantees. If the partner company needed to expand, Harley-Davidson would be pushed out of the space.
- Underdeveloped facilities. Space that was available would have required Harley-Davidson to dip into its own pockets to get it data center-ready. As Hardin put it, "I was going to have to bring my bag of $8 million with me and finish it myself."
- Power capacity. While capacity was sufficient for current needs, it couldn't accommodate future expansion.
So Harley-Davidson crossed the idea of sharing data center space off its list, and moved on to the prospect of leasing professional data center space, which introduced Myth No. 2.

Myth No. 2: Owning a data center is more costly than leasing
Harley-Davidson's leadership assumed that data center space is a commodity and colocation space is cost-effective. The company thought it could avoid investing in building a new facility by leasing space from an established data center provider.
But Hardin and Dereberry discovered that leasing options weren't ideal. They couldn't find colocation options in the area with the requisite power and cooling resources in place. Rather, the prospective sites sought an anchor tenant so they could start breaking ground.
"We're talking about a production data center here," Hardin said. "If I'm talking about a couple racks that I'm going to fail over to, no big deal. But this was a production data center. I don't feel comfortable with that."
So in the end, Harley-Davidson opted not to share or lease data center space, but rather to build a facility itself. The premise was that it would be less expensive in the long run.
That brought Harley-Davidson to the next question: What kind of data center should we build? Well, a green one, of course -- right?

Myth No. 3: Every green measure saves money
Harley-Davidson's motivations for building a green data center were twofold: to save money and to earn kudos for being environmentally friendly. But Hardin and Dereberry discovered that while some green measures are good, others simply cost too much. Ultimately, Harley-Davidson decided to implement more efficient uninterruptible power supplies (UPSes) and generators, free cooling using economizers, a hot-air return plenum in the hot aisle, and server virtualization, among other things.
But the company decided against a rotary UPS and trying to earn Leadership in Energy and Environmental Design (LEED) certification because of the cost and what it considered a lack of return on investment.
"We looked at the rotary UPS. But it was expensive, and we were worried about the runtime," Dereberry said. He added later that "LEED is very difficult to get and can be very expensive, so we didn't go that route."
With some of the data center's design principles laid out, Hardin and Dereberry set out to find a contractor to build it and encountered yet another data center misconception.

Myth No. 4: Anyone can build a data center
Harley-Davidson executives wanted to use the architect the company already had on staff and its current union contractors, and intended to buy used equipment to reduce costs.
But the data center team argued that building a data center is different from architecting other commercial buildings. The right design and building firm would save Harley-Davidson time and money.
"The company that did the design and build gave us one throat to choke with our issues," Dereberry said. "It went really well."

Myth No. 5: Vendors adequately test their equipment
Once the data center walls had been built, it was time to bring in equipment. But Hardin and Dereberry discovered that it's not enough to rely on vendors to factory-test their wares. Most components are tested only as individual pieces, not under substantial load or in conjunction with the other equipment found in a data center.
Harley-Davidson spent $100,000 on commissioning its data center, ensuring that all equipment performed as advertised.
Hypothetically, that's $100,000 the company could have saved rather than spent, but Dereberry stood by the decision.
"It's worth the money," Dereberry said. "It's worth every dime. It's going to give you testing that you need, real-world testing, to evaluate all your loads and as a single unit rather than individual components."