The state of Texas is soliciting proposals to consolidate nearly two dozen data centers into two. Again.
This quest for new proposals comes after what the state’s Department of Information Resources (DIR) characterizes as botched work by IBM, which won a bid five years ago to consolidate 27 state agencies’ IT operations into two data centers while updating the infrastructure and implementing server virtualization.
This painful saga illustrates how the intricacies of the planning process for data center consolidation and transformation, particularly through outsourcing, can be cast aside in the name of achieving quick cost savings. The Texas DIR led negotiations that resulted in the signing of an $863 million contract with IBM Global Technology Services in 2006.
IBM used its own hardware in the converged data centers, but also incorporated legacy third-party gear from several vendors. Multiple subcontractors also had fingers in the converged data center, including Xerox for the state’s printing jobs and Unisys for network and data center management. The project was to have been completed by December 2009, but after five years it remains unfinished. To date, just five of the agencies have fully migrated, and the state has also publicly alleged that IBM failed to meet service-level agreements (SLAs).
Last November, DIR posted a call for new bids from other suppliers to finish the data center consolidation project on its website, and Tom Johnson, public information officer for DIR, said last week that the project “is in open procurement.” Johnson declined to comment on whether any new bidders have submitted proposals, but said the new procurement process will be done as two separate projects -- one for infrastructure support and one for services.
Texas eyewitness shares gory details of a ‘cautionary tale’
The initiation of the re-procurement process follows a “Notice to Cure” that the state issued to IBM in July 2010, giving IBM 30 days to respond with a remediation plan for the floundering project. IBM responded in August, blaming the project’s failure on the state’s antiquated systems, lack of internal IT expertise and resistance among the agencies. The state then declared IBM’s response insufficient and asserted that “DIR has full legal right and authority to terminate the MSA for cause.”
It’s unclear if IBM got that memo. Last week, an IBM spokesperson emailed a statement to SearchDataCenter.com. “IBM remains the main provider of Data Center services to the State of Texas and continues to work with the Texas Department of Information Resources to move the project forward.”
Meanwhile, a former IT executive with a Texas state agency who was involved with the IBM project for five years starting in mid-2005 said the clinical language of the official letters between the two parties belies the pain experienced by staffers as the two sides struggled to achieve the original consolidation goals.
This insider confirmed that there was indeed resistance to the consolidation within the agencies, though for reasons not discussed in detail during last summer’s back and forth. Security, for example, was a huge sticking point. “Architecturally, they wanted to treat the data center as one enterprise and the agencies pushed back, saying that they had differing legal requirements that they needed to meet, and that they needed to keep their data separate.”
At the time, there was no such thing as a virtual firewall; the insistence on an “air gap” between ESX hosts holding data regulated by state and federal statutes conflicted with the planned consolidation ratio for the project, as did the number of VLANs required for the multi-tenant architecture.
“They were going to collapse at least five to 10 server-based VLANs down to two VLANs for the whole agency,” said the insider. “That included our DMZ and everything else. They did not expect that an agency would need more partitioning than that [or] plan for that scale when they bought the physical switches.”
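The partitioning problem the insider describes can be illustrated with a minimal sketch. On a Linux host, each network segment a tenant needs lives on its own tagged 802.1Q VLAN subinterface, so a single agency that must keep its internal servers and its DMZ separate already consumes two VLANs by itself. (All interface names, VLAN IDs and subnets below are invented for this sketch, not drawn from the Texas deployment.)

```shell
# Hypothetical illustration of per-tenant VLAN separation on a Linux host.
# Interface names, VLAN IDs and subnets are invented for this sketch.

# Tag two 802.1Q subinterfaces onto the physical NIC: one for the agency's
# internal server segment, one for its DMZ.
ip link add link eth0 name eth0.110 type vlan id 110   # internal servers
ip link add link eth0 name eth0.120 type vlan id 120   # DMZ

# Give each segment its own subnet so traffic is also separated at layer 3.
ip addr add 10.0.110.1/24 dev eth0.110
ip addr add 10.0.120.1/24 dev eth0.120

ip link set eth0.110 up
ip link set eth0.120 up
```

Multiply that by 27 agencies, each with its own legal separation requirements, and two VLANs per agency is quickly exhausted, which is exactly the scale the insider says was not planned for when the physical switches were bought.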
But, said the inside source, the biggest issues were not technical. IBM’s staffing of the central state data center was seen as insufficient. “IBM was trying to work with a pooled model of admins who were trying to cover all 27 state agencies and they were not at all familiar with the servers. We ended up having maybe two system administrators, the same two who did all the work for our agency because they were the only ones who understood all the systems.”
This source said administrative staff at the agency had no access below the operating system layer of the applications it owned, but repeatedly had to train IBM’s sysadmins or fill in their knowledge gaps, including sending screenshots and step-by-step instructions with help desk tickets so that problems would be remediated correctly.
“IBM and DIR were working from the assumption that agencies would not be allowed to have administrative access on these servers anymore. Of course, if you’re supporting an application, it makes it very hard to restart things, look at log files, do upgrades of the application software, do deployments to production and so on if you don’t have root access.”
“I don’t believe DIR had any experience doing something on this scale … they weren’t running a very large network themselves, much less having to manage a multi-tenanted, distributed computing environment,” the inside source said. While there were certainly problems on all sides, there were also deliberate actions by IBM “that I still don’t understand ... the number of support resources started out low and then was deliberately cut.”
By the time this person left the position with the state agency mid-last year, procurement and time to deployment of new servers remained painful even for workloads that had been successfully migrated to the consolidated data center, they said, with time to spin up new VMs at the central data center measured in months rather than hours or days.
Expert advice: Planning, planning, and more planning
It does seem like a truism to say that users considering similar data center consolidation, particularly through a service provider, need to explore all the details of their IT environment before embarking, said Jim Dries, director of sourcing advisory services for Align. Align is a consulting firm out of New York City that specializes in data center outsourcing.
But in his experience, Dries said, age-old axioms that emphasize the importance of proper planning and clear communication can still fall on deaf ears on both the supplier and the buyer side of such deals. “Because this is a state affair, certainly the exposure to the dirty laundry is much greater,” he said. But “we’ve got a recessionary climate where corporations and states are faced with budget shortfalls and are looking to outsourcing as a way to achieve savings … the financial savings was a bit of a bright light for the fireflies … where it pushed some of the more prudent aspects of doing the transaction to the side. Which can be typical.”
Consolidation and virtualization of data center resources are sweeping IT, but Dries pointed out that another trend is popping up alongside them: an increase in the restructuring of service contracts over time. For example, according to outsourcing data and advisory services firm TPI’s most recent research into outsourcing contracts, released in October 2010, restructuring activity on existing contracts accounted for 48% of new revenue in the outsourcing industry during last year’s third quarter.
Just as the right of way in a crosswalk won’t protect pedestrians from bodily harm if they are hit by a car while crossing the street, a project failure like this always hurts the buyer most, regardless of the contract situation. “You may be able to get money back, but the effect [project failure] has on the people, the culture, the pocketbook, maybe, for businesses in terms of lost revenue, you can’t recover,” Dries said.
Beth Pariseau is a senior news writer for SearchServerVirtualization.com. Write to her at email@example.com.