Waiting to virtualize: What's stalling data center virtualization

While cost and complexity concerns influence the decision to virtualize, the arguments against data center virtualization are getting weaker.


Virtualization is arguably one of the most important data center technologies to emerge in the last decade. In fact, virtualization technology has permeated IT so quickly and thoroughly that it’s hard to imagine a business without it.

Still, deploying and managing virtualization presents a variety of challenges. For data centers that aren't on board yet, it’s time to examine the principal concerns stalling the decision to virtualize and seek ways to address those issues.

The cost to virtualize as a deal-breaker

Perhaps the biggest sticking point to virtualize is the associated costs. In a TechTarget survey conducted from July to September 2009, more than 900 IT professionals were questioned about data center virtualization. Twenty-eight percent of respondents said they avoided virtualization because it’s just too expensive.

That’s no big surprise. Virtualization often requires newer and more powerful server hardware that can adequately support the performance of numerous virtual workloads. Doing that may require a server with four, eight or even 16 CPU cores, along with 16 GB of memory or more. More than 20% of respondents indicated that their existing servers weren’t adequate, but buying new servers wasn’t an option.
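To see why host sizing matters, a rough back-of-envelope calculation helps. The sketch below is illustrative only: the per-VM resource figures and the hypervisor overhead reserve are assumptions, not numbers from the survey or any vendor's sizing guide.

```python
# Back-of-envelope host sizing for consolidation (illustrative numbers only).
# Assumes each VM averages 2 GB of RAM and half a CPU core under typical load,
# with ~20% of host resources held back for the hypervisor and headroom.

def vms_per_host(host_cores, host_ram_gb,
                 vm_cores=0.5, vm_ram_gb=2.0, reserve=0.20):
    """Return how many VMs a host can hold, limited by CPU or memory."""
    usable_cores = host_cores * (1 - reserve)
    usable_ram = host_ram_gb * (1 - reserve)
    return int(min(usable_cores / vm_cores, usable_ram / vm_ram_gb))

# A 16-core, 16 GB server like the one described above:
print(vms_per_host(16, 16))   # memory-bound: only 6 VMs fit
```

Under these assumed numbers, a 16-core box with only 16 GB of memory runs out of RAM long before it runs out of CPU, which is why memory is usually the first line item in a virtualization hardware upgrade.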

Network and storage architectures may also need updating before adopting virtualization. Virtualization deployments are best served with a fast Fibre Channel SAN for virtual machine (VM) images and periodic snapshots.

The costs of virtualization are further exacerbated when high-availability (HA) and disaster recovery (DR) strategies are added to the mix. Both require additional servers and storage that can boost costs. For example, even though consolidation can reduce the total number of physical servers, implementing an HA workload requires a second server to host duplicate VM instances.

It’s a similar consideration for off-site DR facilities. Storage requirements increase and so do server counts -- if it’s a warm or hot site. The bandwidth that’s needed to move huge volumes of data across significant distances can have a major effect on WAN costs.
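The WAN cost of replication comes down to simple arithmetic: how much data changes per day, and how long a window you have to move it. The numbers below are hypothetical examples, not figures from the article.

```python
# Rough estimate of the sustained WAN bandwidth needed to replicate a day's
# changed data to an off-site DR facility (hypothetical figures).

def replication_mbps(changed_gb_per_day, window_hours=24):
    """Sustained megabits per second needed to move the daily change set."""
    bits = changed_gb_per_day * 8 * 1000**3      # decimal GB -> bits
    seconds = window_hours * 3600
    return bits / seconds / 1e6                  # -> Mbit/s

# Replicating 500 GB of daily changes within an 8-hour overnight window:
print(round(replication_mbps(500, window_hours=8), 1))   # ~138.9 Mbit/s
```

Even this modest change rate demands well over 100 Mbit/s of sustained throughput, which in the timeframe of this article was a substantial recurring WAN expense.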

The cost of virtualization software can vary, but it must be included in any cost evaluation. For example, Hyper-V is part of Microsoft Windows Server 2008 R2, but you pay for each license of Windows Server. You can get Citrix XenServer for free, but Citrix Essentials for XenServer, which includes support and virtualization management tools, still carries a hefty price tag.

“The [Citrix XenServer] Platinum is going to cost you $2,000 to $5,000 per server,” said Ty Hacker, director of technical sales at I-Business Network, an application service provider in Marietta, Ga. “And that feature set basically includes storage management and high availability.”

It’s a similar story with VMware -- starting with VMware Server for free and then moving to paid versions with enterprise features and tools.

Then there’s the cost of labor to install and set up new hardware, retire or reallocate old hardware, install and configure the virtualization platform, and provide ongoing management of that platform over time -- all on top of the cost of the platform itself.

“It’s not just the licensing itself. It’s the support and maintenance over time as well,” said Mark Bowker, senior analyst at Enterprise Strategy Group, an IT analyst and business strategy firm in Milford, Mass. “It’s not something that goes away. You have that for life.”

For smaller shops and IT groups that have had layoffs and budgets slashed over the last few years, the burden of adopting data center virtualization may simply be too much to handle right now.

And finally, budgetary issues underscore the sensitivity to cost. Twelve percent of respondents said that non-IT executives won’t approve expenditures to virtualize, while an additional 12% of respondents are waiting for the price of all virtualization options to be reduced.

Part of the problem is poor communication. The move to virtualize is ultimately a business decision, so IT must be able to express the goals of a virtualization initiative in a way that makes immediate sense to management. For example, making the case for virtualization from the standpoint of enhancing availability or enabling superior DR and recoverability may be just the argument needed.

Timing is another issue that can delay virtualization approval. Hacker said that the strongest case for data center virtualization is often made when infrastructure hardware is reaching its end of life or lease. The prospect of acquiring fewer next-generation servers through consolidation can be a compelling motivator for management. If there’s still life left in the equipment, virtualization’s need for newer and more powerful equipment might just get the initiative shelved.

Virtual deployment and management complexity

TechTarget survey respondents who had not yet virtualized also disclosed a range of complexity worries when it came to deployment and management of virtual servers. More than 22% of respondents said that virtualization was just too complex. Another 11% said they were waiting for Microsoft options to mature, while more than 7% were waiting for XenServer options to mature.

Respondents who were concerned about product maturity should take a fresh look at the three major virtualization platforms, which show signs that they’re ready for enterprise-class deployment.

Perhaps the biggest area of development for virtualization platforms is ensuring security between VMs.

“Right now the technology is probably ahead of where most people are with their deployments,” Bowker said, adding that organizations virtualizing 30% to 40% of their infrastructures would probably benefit from better integration. But he said organizations that are still virtualizing 20% to 30% of their infrastructures should find virtualization platforms well-suited for deployment today.

Survey respondents also said they are holding off on data center virtualization because of a lack of virtualization skills. Almost 10% of respondents cited a lack of in-house management skills, while almost 9% worried about a lack of in-house installation skills. Although there is no specific skill set or certification level that would lend itself to a successful virtualization rollout, experts say that virtualization administration is often a cross-platform or cross-discipline endeavor.

Data center managers don’t need to be recognized experts in any one area, but should understand the implications and relationships among their servers, applications, storage and networking. They may also find value in fresh evaluations of current platforms.

“XenServer has been simplified to the point where -- even if you didn’t have that skill set or that exposure back in the early adoption days -- you would easily be able to get a deployment done,” Hacker said.

Comprehensive proof-of-principle testing is one way to manage the perception that virtualization is complex. Testing is a valuable way to overcome initial adoption jitters in areas such as setup, provisioning and management. Testing is also crucial when planning to expand the deployment of virtualization to more mission-critical workloads such as Exchange.

Some organizations with limited IT resources may address the complexity of virtualization by hiring an outside contractor or a value-added reseller (VAR) that can perform the evaluations and proof-of-principle testing. The VAR can also handle the initial rollout and management and then provide training and support so the organization can eventually assume full control of the project.

In addition, survey respondents expressed concern about their applications, and almost 11% said they decided to wait because they felt the applications were not a good fit for virtualization. Homegrown or custom-developed applications may still prove problematic when virtualized, but experts have said that the concern has largely fallen by the wayside for commercial Windows applications.

“I have not seen anything that works in Windows Server 2003 that doesn’t work in a Windows Server 2003 virtualized environment,” said Scott Roberts, director of IT for the Town of South Windsor, Conn. “We have not experienced any major failure of an application because it was virtualized.”

Although the success rate for virtualized commercial applications is generally quite high, vendor support for virtualized applications is still spotty.

Bowker said that plans to virtualize should include the application vendor and a serious discussion about the vendor’s support policy if the application is virtualized. A critical vendor reneging on support can have a huge impact on the success of a virtualization initiative.

Justifying the need to virtualize

Survey respondents also expressed significant worries about justifying the move to virtualization. More than 20% of respondents said that the move didn’t make sense for the number of servers or applications that were in service. Another 20% said that server utilization was not a big enough problem to prompt the use of virtualization.

At first glance, it might seem like small- and medium-sized businesses (SMBs) would have the most trouble justifying virtualization -- but that’s only true to a point.

It’s true that it’s hard to justify virtualization when there is little hardware to consolidate, no service-level agreements to guarantee, no critical uptime requirements or no concrete recovery goals. But times are changing, and the needs of SMBs are more urgent than ever.

For example, an SMB with five servers may depend on 24/7 e-commerce resources to generate revenue. As a result, the justification for SMBs is not just a matter of hardware consolidation, although the prospect of running three workloads on one server can still be big for a small business. But, today, it’s often more about embracing the business value of availability, backup and recovery that virtualization makes possible.

A major speed bump for data center virtualization is not the initial cost. Rather, it’s failing to recognize the return on that investment. Bowker said that most companies aren’t able to achieve the promises of virtualization right away, and this can skew the perception of the return.

With virtualization, it’s about the expectation of the technology versus the reality of the deployment. “We still see low consolidation ratios out there -- three-, four- and five-to-one VMs per physical server,” he said. “But from a technology perspective, you see the virtualization vendors promising 10, 20, 40 VMs per physical server.”
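The gap Bowker describes between real-world consolidation ratios and vendor promises translates directly into server counts, and therefore into ROI. The sketch below runs the arithmetic for a hypothetical fleet of 120 workloads; the fleet size is an assumption chosen only to make the comparison concrete.

```python
# How the consolidation ratio drives the physical server count
# (hypothetical 120-workload fleet, for illustration only).
import math

def hosts_needed(vm_count, vms_per_host):
    """Physical servers required at a given consolidation ratio."""
    return math.ceil(vm_count / vms_per_host)

for ratio in (4, 10, 20, 40):
    print(f"{ratio:>2} VMs/host -> {hosts_needed(120, ratio)} servers")
```

At the four-to-one ratio Bowker observed in practice, 120 workloads still need 30 physical servers; at the 40-to-one ratio vendors promised, only three. A business case built on the optimistic figure will look badly skewed when the deployment lands closer to the realistic one.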

Research and testing are vital for any project justification. Take time to evaluate the available platforms and frankly discuss the results. The decision to virtualize -- and your selection of platforms -- will have a long-lasting effect on your organization.

“Once you get involved in it, you also realize that it’s not easy to just change midstream when you invest all that time, effort, labor and money,” Roberts said, adding that the time and labor are often more expensive investments than the outright cost of the platform.

Finally, don’t overlook the value of VARs. The services of a VAR can often overcome a lack of time or in-house skills. An IT department grappling with day-to-day operations can engage a VAR to perform the initial planning, deployment and setup of a virtualized environment and then provide training that allows the staff to take over management. The organization may approach the deployment in phases, using the VAR for each subsequent planning and deployment phase, and then transfer control to the in-house staff.

Let the VAR make a strong case for ROI. The trick is to do your homework and find a VAR that is well-versed in virtualization technology, familiar with the needs of your vertical market and experienced enough to handle projects that match your scope.  

Stephen J. Bigelow, a senior technology editor in the Data Center and Virtualization Media Group at TechTarget Inc., has more than 15 years of technical writing experience in the PC/technology industry. Bigelow holds a bachelor of science in electrical engineering, along with CompTIA A+, Network+, Security+ and Server+ certifications, and has written hundreds of articles and more than 15 feature books on computer troubleshooting, including Bigelow’s PC Hardware Desk Reference and Bigelow’s PC Hardware Annoyances. Contact him at sbigelow@techtarget.com.

This was last published in February 2011
