This is the fourth tip in our series on benchmarking best practices in the data center. You can also read the third installment on avoiding server benchmarking mistakes.
Virtualization is at the forefront of technology, but the benefits of virtualization also place a burden on computing resources in the data center. A proliferation of virtual machines (VMs) can sap a server's capacity or choke critical traffic across a network bottleneck before you even realize it. Administrators can use benchmark tools to keep tabs on server resources, utilization, performance and other factors over time, allowing organizations to plan and grow their infrastructure proactively before resources become scarce.
Virtualization capacity planning is a long-term investment
VMware, XenServer and Hyper-V are now deployed with very sophisticated graphical user interfaces (GUIs) that provide a great deal of information. For example, an administrator can see the connected storage repository, how it is being utilized and the space requirements for each VM. The downfall here is these tools' limited forecasting and metric-planning capabilities. For the most part, they generate reports that are "in the now," while other tools will log data over time and help you plan for the future.
Experts agree that planning is everything, and that it's particularly important to start early in the development of your environment. "Much depends on the environment," said Timothy O'Brien, a virtual environment engineer and industry expert. "Prior to rolling out a benchmarking tool, we have to answer some very important questions."
There are two principal questions to consider.
- The business needs to understand the scope of virtualization in the environment. Will it have a majority of its systems virtualized? Or will it be just a few VMs running?
- More users, more services and more applications will all affect computing resources that the environment will need to accommodate. Where do you want to go with the environment? That is, have you planned for the future?
One attribute of capacity planning that is often overlooked is end-user performance. As an IT manager, I know that a project will go south very quickly if the end users are unhappy. For example, a SAN is a powerful and necessary tool in a virtual environment. However, just because you have a lot of storage does not mean you can get the performance you expect. If you over-utilize your SAN, users will experience poor performance in applications and services. It's critical to select tools that can benchmark and track behaviors relevant to the user experience on an ongoing basis. Any unexpected variations in this data can then correlate back to changes, faults or other issues in the data center.
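The ongoing tracking described above can be sketched as a simple baseline check -- a minimal illustration, not any particular product's method, and the sample response times below are invented. The idea is to keep a recorded baseline of a user-facing metric and flag samples that deviate sharply from it, so the variation can be correlated back to changes in the data center:

```python
import statistics

def is_anomalous(history, latest, k=3.0):
    """Flag a new latency sample that sits more than k standard
    deviations from the historical baseline (simple z-score check)."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) > k * stdev

# Hypothetical application response times in milliseconds, sampled daily:
baseline = [42, 40, 45, 43, 41, 44, 42, 43]
is_anomalous(baseline, 44)   # ordinary fluctuation
is_anomalous(baseline, 95)   # worth correlating with data center changes
```

In practice the history would be fed by whatever benchmark tool is in place; the point is simply that a recorded baseline turns raw numbers into something you can alert on.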
But it's not just a matter of tracking the attributes of storage and servers -- benchmark tools should provide network-level capabilities as well.
"Network traffic management plays a huge role in virtual capacity planning," said Cameron Christo, a virtualization architect at consulting firm MTM Technologies. "It's crucial to understand where you are going and where you are starting. A user must have parity with what the user is currently experiencing."
Practical tools for virtualization capacity planning
Data collected from within your virtual environment can provide a wealth of information. Using it, IT managers can predict increases in capacity, stabilize user performance and even find problems before they occur. Each tool you use will collect data differently, but these metrics are extremely valuable. For example, if you have a physical server running several VMs, you can gauge the capacity you're using over a period of time. After a few months of collecting this information, you can make educated predictions about how much storage and network capacity you'll be using a week, month or even a year from now. This sort of data prepares your environment for growth and long-term savings. Capacity requirements within a virtual environment can be extremely dynamic. That's why having every piece of data at your disposal can really make your virtualized infrastructure go a long way.
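As a rough illustration of the kind of prediction described above -- not tied to any specific product, and with invented sample figures -- a least-squares trend fitted over periodic utilization snapshots can estimate when a resource will hit a threshold:

```python
def linear_forecast(samples):
    """Least-squares fit of utilization samples (one per period);
    returns (slope, intercept) of the trend line."""
    n = len(samples)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(samples) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, samples))
             / sum((x - x_mean) ** 2 for x in xs))
    return slope, y_mean - slope * x_mean

def periods_until(samples, limit):
    """Estimate how many more periods until the trend crosses `limit`."""
    slope, intercept = linear_forecast(samples)
    if slope <= 0:
        return None  # usage is flat or shrinking; no crossing ahead
    return (limit - intercept) / slope - (len(samples) - 1)

# Hypothetical monthly storage utilization of a host, in percent:
history = [40, 44, 47, 52, 55, 60]
months_left = periods_until(history, 90)  # months until ~90% full
```

A straight-line fit is the crudest possible model, but even this much turns a pile of monthly snapshots into a concrete "buy more disk by roughly month N" conversation.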
One tool, VKernel Capacity Analyzer, can help if you have an existing VMware or Hyper-V environment. Administrators can log into their environments and monitor performance as well as resolve some serious issues revolving around their storage and SAN infrastructure. VKernel looks at capacity bottlenecks that are impacting performance and helps resolve them: It specifically says how to fix the current problem, how to avoid future ones and where to place VMs to maximize utilization without impacting performance. Going back to data analysis, using capacity forecasting tools such as VKernel will allow you to plan for the future and make very educated decisions on where your environment needs to be both short- and long-term. Setting log metrics and monitoring the statistics can show you where your capacity requirements are being met and where they are lacking.
For example, VKernel's Capacity Analyzer will provide detailed visibility for factors such as NFS throughput, iSCSI and Fibre Channel disk latency and I/O throughput. Capacity Analyzer continuously collects and analyzes CPU, memory, disk throughput and disk I/O latency statistics for every virtual machine, host, resource pool, and data center. The analysis includes server-side and storage-side metrics. The net result of the analysis is a specific list of performance problems in application VMs, hosts, clusters and resource pools with a clear explanation of what's causing performance problems and how to fix it.
Of course, VKernel is just one tool in the belt. Since every server workload and virtual host has utilization peaks and valleys, another great tool is PlateSpin Recon. This software builds scenarios based on the peaks and valleys gathered from server workload snapshots, and the ability to use forecasted data ensures that plans are built to accommodate future growth. A tool that automatically generates computing capacity reports will go a long way toward saving you both time and money.
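The peaks-and-valleys idea can be sketched independently of any product: compare the average of workload snapshots against a high percentile. Sizing to the average starves the peak hours, while sizing to a percentile absorbs bursts without paying for the absolute worst case. The CPU figures below are hypothetical:

```python
def peak_and_average(samples, percentile=95):
    """Return (average, p-th percentile) of workload snapshots,
    using nearest-rank selection for the percentile."""
    ordered = sorted(samples)
    avg = sum(samples) / len(samples)
    idx = min(len(ordered) - 1,
              int(round(percentile / 100 * (len(ordered) - 1))))
    return avg, ordered[idx]

# Hypothetical hourly CPU snapshots (%) for one workload over a day:
cpu = [20, 22, 25, 30, 85, 90, 88, 35, 28, 24, 22, 21]
avg, p95 = peak_and_average(cpu)
# avg sits far below p95: a host sized to the average would be
# saturated for the handful of hours that actually matter to users.
```

Nearest-rank is the simplest percentile definition; tools like Recon apply far richer scenario modeling, but the average-versus-peak gap is the quantity they are ultimately reasoning about.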
Know your tools
Before you go out and buy the tools discussed above, download some demos and really get to know your current hypervisor. "Newer virtualization tools allow for dynamic memory allocation and more granular capacity planning," O'Brien said. "For example, XenServer's 5.6 release allows the user to allocate memory live to a running VM. VMware has this capability as well." When using these capacity planning tools (or others), there are some important considerations to keep in mind.
- Always think about the end user. Performance at the user level is crucial. If there is a bottleneck, or a lack of personalization for the user, the entire project may be scrapped.
- Always be aware of the applications you are running. That is, 32-bit and 64-bit applications require different memory resources and utilize the virtual capacity in different ways.
- Never underestimate the applications in your environment. Try to be honest with your setup. If you are trying to save a few dollars now, it may end up costing you later. Be prepared for growth. For example, if you're setting up a new SQL VM, be prepared to add more RAM or even an additional CPU in the near future. As SQL can be very intensive, not forecasting the proper resources can cost your IT department more in the future.
- Always have an expert ready! Before you pull off an implementation, the vendor you're using is usually more than happy to help out. Proper planning from industry experts is crucial to making the best choices upfront rather than restructuring your environment later. Using a consulting engineer with capacity planning experience can save you a headache later.
Having the appropriate tools for virtualization capacity planning will get you only halfway. The rest relies on planning and truly understanding what is demanded of your environment. Always plan ahead and be ready for the dynamic capacity challenges in your virtual infrastructure.
This was first published in July 2010