A common adage is that technology evolves a few steps faster than the monitoring tools needed to manage it. Consequently, IT managers often find themselves playing catch-up when searching for tools to manage new systems that offer them tremendous potential benefits. Such is the case with virtualization management tools. As businesses have expanded the reach of their virtualized systems, a raft of vendors has developed products to oversee those systems.
While companies have plenty of choices, the tools' feature sets often fall short of customer expectations. "Vendors have made significant progress in delivering tools to manage virtualized servers, but more work needs to be done," said Dave Bartoletti, senior analyst at Forrester Research Inc.
The gap exists because the requirements for virtualization management tools have been changing. Years ago, servers came one to a box, and systems administrators spent their time moving from one machine to the next, with each server dedicated to its own workload.
Performance bottlenecks in physical servers were relatively easy to pinpoint because processing functions were associated with specific components. If an application was continually running out of storage, companies would just add more.
However, that management paradigm does not mesh with how virtualized systems function. Workloads are not tied to specific systems. Instead, jobs move dynamically among a number of different machines, so it becomes more difficult to track what is happening with each component because so many different elements are interconnected. On a virtual server, a performance issue could stem from storage spindle contention or an oversubscription of server RAM.
Physical management tools provided rough indications about whether components were up or down and how much raw capacity was used. Corporations want the same information from their virtualized systems and the tools that manage them.
"With virtual servers, however, the question is not how much processing power resides on one server, but instead how much capacity is available in the pool of systems," said Bartoletti. This level of visibility was not needed in the physical world and is still a work in progress with virtual systems.
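The shift Bartoletti describes can be sketched in a few lines. This is a hypothetical illustration only -- the host names, counter names and numbers below are invented, not drawn from any real product:

```python
# Invented host inventory: per-host CPU (GHz) and RAM (GB), used vs. total.
hosts = {
    "esx-01": {"cpu_ghz": 32.0, "cpu_used": 24.5, "ram_gb": 256, "ram_used": 198},
    "esx-02": {"cpu_ghz": 32.0, "cpu_used": 11.0, "ram_gb": 256, "ram_used": 120},
    "esx-03": {"cpu_ghz": 48.0, "cpu_used": 40.2, "ram_gb": 512, "ram_used": 470},
}

# Physical-era question: is this one box maxed out?
for name, h in hosts.items():
    print(f"{name}: {h['cpu_used'] / h['cpu_ghz']:.0%} CPU busy")

# Virtual-era question: how much headroom is left in the whole pool?
pool_cpu_free = sum(h["cpu_ghz"] - h["cpu_used"] for h in hosts.values())
pool_ram_free = sum(h["ram_gb"] - h["ram_used"] for h in hosts.values())
print(f"Pool: {pool_cpu_free:.1f} GHz CPU and {pool_ram_free} GB RAM free")
```

One host here is nearly saturated while the pool as a whole has ample headroom; only the pooled view tells an administrator whether another workload can be placed.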
Potential solutions have emerged in a hodgepodge manner from vendors with divergent vantage points. Virtualization suppliers, such as Citrix Systems Inc., Microsoft, and VMware Inc., have moved into the management space and experienced some success.
In business for 75 years, Columbia Sportswear Co. is at the forefront of the virtualization movement. In 2009, the company decided to build a new data center to support its 4,000 users and 700 terabytes (TB) of information. "We needed to improve our disaster recovery capabilities," said Michael Leeper, director of global technical infrastructure at Columbia Sportswear, which had experienced power outage problems at its Tokyo data center after a tsunami hit in March 2011.
After an initial examination, the company determined that moving the old servers to the new data center would require as much work as migrating them to a virtualized environment. So the company, which had virtualized about 25% of its applications, decided to have just about every application run on a virtualized server.
With the bulk of its infrastructure running in a virtual environment, the sportswear supplier needed more visibility into those systems. After evaluating its options, Columbia Sportswear chose VMware's vCloud Suite. When the company took a quick look at the system a few years earlier, there were questions about its scalability. However, the vendor had made significant strides in that area, according to Leeper. He said another plus was the suite's level of visibility into the server hypervisors.
In addition to virtualization suppliers, established systems management vendors -- BMC Software Inc., CA Technologies, Hewlett-Packard Co., IBM and Dell Software -- have broadened their product lines with new virtualization management tools. In some cases, these suppliers developed the features in-house; in other instances, the capabilities came through acquisitions of startups focused on virtualization management.
Familiarity is a benefit in using products from an established supplier; IT administrators already understand how to operate the user interface. The downside is the integration between the physical and virtual management tools can be a bit cumbersome since they started from different design foundations.
Tools also are available from third parties. Independent software vendors such as Embotics Corp., Netuitive Inc. and ToutVirtual Inc. have focused on building highly functional solutions for virtualized systems. These tools deliver robust features when companies have specific virtualization problems, but they have a niche rather than a broad focus.
While the vendors have been making progress, more work remains. "Right now, we can see how an application runs in a virtualized environment, but it is not easy to get a complete picture of how all of our system resources are functioning," said Leeper.
As a result, customers can struggle to design optimal configurations. Delivering such features is not easy. Monitoring adds overhead to system resources, so running it constantly saps system performance. Also, enterprises can collect so much data that making sense of the information becomes a challenge. Consequently, better analytics is on customers' wish lists.
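A back-of-the-envelope sketch shows why the data-volume problem grows so quickly. The figures below are invented for illustration, not measurements from any real monitoring product:

```python
# Hypothetical estate: 500 VMs, each reporting 40 counters (CPU, RAM,
# disk, network, etc.), 16 bytes per stored sample. All numbers invented.
vms = 500
metrics_per_vm = 40
bytes_per_sample = 16

def daily_volume_mb(interval_s: int) -> float:
    """Megabytes of raw metric data collected per day at a given interval."""
    samples_per_day = 86_400 // interval_s
    return vms * metrics_per_vm * bytes_per_sample * samples_per_day / 1e6

for interval in (300, 60, 10):
    print(f"{interval:>3}s interval -> {daily_volume_mb(interval):,.1f} MB/day")
```

Tightening the sampling interval from five minutes to ten seconds multiplies the daily haul thirtyfold, which is why customers end up asking vendors for analytics to distill the data rather than simply more of it.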
Another emerging need is integration between virtual servers and cloud systems. "As more and more workloads are deployed into production cloud environments, the need for real-time performance and availability monitoring increases," said Mary Johnston Turner, research vice president of enterprise systems management software at IDC.
About the author
Paul Korzeniowski is a freelance writer who specializes in cloud computing and data center-related topics. He is based in Sudbury, Mass., and can be reached at email@example.com.
This was first published in April 2013