The maturity of server virtualization technologies, particularly VMware Inc.'s ESX Server software for data center architectures, is prompting more companies to place virtualized servers directly into production, or to pair them with data replication to secondary or tertiary sites for disaster recovery. As more companies go this route, experts suggest they hammer out best practices, while noting potential pitfalls along the way.
Preparing for server virtualization
1. Optimize before you virtualize
Phil Dawson, a London-based analyst with Gartner Inc., said virtualization should be part of a "three-to-five year game plan" that includes retiring hardware you no longer need. The aim: simplify architecture before collapsing physical workloads onto virtual machines.
He proposes starting by separating storage from servers (either through network devices or SANs), making strategic investments along the way in network administration tools -- such as those provided by CA Inc., IBM's Tivoli unit and other vendors -- to manage both physical and logical hardware.
"If you move old, unmanaged, distributed resources to a new consolidated, managed environment, you're going to get operational and cost efficiencies of two-to-three times the benefit," Dawson said.
2. Consider the impact on applications
Many vendors require that their applications run on dedicated hardware, according to specified configurations. Should problems arise with applications running on virtual machines, a vendor may not provide support unless users can replicate the problem on physical machines.
"The upstream impact is that end users won't be able to perform the business functions for which that application is responsible," said Kris Domich, a consultant with Dimension Data in Reston, Va.
Domich recommends reviewing support contracts -- including vendors' service-level agreement obligations and chargeback specifications -- before running applications on virtual machines. Also, be sure virtualized applications perform well on single boxes before attempting to manage them across multiple environments.
"If your systems and applications are not running optimally on hardware, then don't expect them to run any better in a virtualized environment," Domich said.
3. Pay attention to emerging technologies
EMC Corp. subsidiary VMware gained market dominance by applying mainframe virtualization processes to x86 machines. The company boasts the most x86 virtualization customers, including Philadelphia-based GMAC Commercial Holding Corp., which is using VMware to consolidate about 1,000 servers.
"We went down the path of virtualizing the hardware side of servers about two years ago and have cut our internal hardware costs about 30% to 35% so far," said Niraj Patel, the company's chief information officer.
VMware's chokehold on the market, however, means commodity pricing has yet to arrive for virtualization products. But competitors are gathering.
Analysts expect open source vendor XenSource Inc. to emerge as one of VMware's biggest rivals. In April, the company released XenEnterprise, its first commercial version, which enables virtualization of Windows, Linux and other guest operating systems. Novell Inc., Intel Corp. and Advanced Micro Devices Inc. have contributed to the Xen code base, while Microsoft Corp. is licensing its virtual hard disk format to Xen.
"Having Xen in your test-dev environment puts you in a stronger negotiating position to beat up VMware on price," Dawson said.
VMware is taking notice, offering its virtual hard disk specification free of charge and outfitting its ESX 3 server with paravirtualization features. Also looming on the horizon is Microsoft's Windows Vista operating system, which is expected to feature a hypervisor architecture that parallels VMware's.
Pitfalls to avoid
1. Virtual machine overload
The blessing of virtualization can also be a curse. Rick Villars, an analyst with IDC in Framingham, Mass., said he has heard of instances of developers going overboard.
"They set up virtual servers to test new applications and keep setting them up and setting them up -- until the server blows up because they didn't put a limit on how many virtual servers could be placed on a physical machine," Villars said.
Applications that suffer from low utilization usually are good candidates for virtualization, but I/O-intensive workloads demand more care: pay attention to "what happens to I/O when scaling out the disk, the network and things like that, because sometimes you could be restricted by the physical limits of the platform," Dawson said.
Avoid overzealous consolidation efforts, too. Dawson advises clients to "optimize the best Web server, the best application server, the best database server, etc. -- because if you try putting all three workloads on the same platform, you're going to cause performance issues."
2. Failure to manage
Although GMAC netted savings by not having to purchase new machines, Patel quickly realized during virtualization that he still faced many familiar administrative headaches.
"We virtualized servers but didn't get rid of them. We still have all the management and maintenance issues you have to contend with when not really consolidating the (physical) servers," Patel said.
"There's no point in going to a platform like VMware if your cost of operations is [going to be] increased," Dawson said.
Also, the ability to quickly add capacity necessitates changes in IT management practices and processes, Villars said.
"One of the challenges is putting in a system that makes it easier to provision applications in a rational and logical way so that when you virtualize assets, you can do so without disrupting the environment," Villars said.
3. Cultural resistance
The management challenges point to another obstacle: overcoming pushback from users caught in the "one application, one box" syndrome.
Enterprises tend to neglect soliciting input from key stakeholders, including storage, networking, information security, facilities managers, application developers and, ultimately, end users.
IT should lead the way in educating users about the power and the limitations of virtualization platforms. Education is especially crucial as technologies evolve to enable broad deployment of virtual servers in production settings, Domich said.
"It's not solely a technical decision and it's not an architecture that should be developed in a vacuum without the involvement of the business units," Domich said.