Monitoring and optimizing host server performance
This article is part of the Virtual Data Center issue of February 2010, Vol. 19
When server virtualization first became popular among IT professionals, the technology was marketed as a way to cut costs by making better use of existing server hardware. The reasoning was that many servers used only 10% to 20% of their available resources, so it made sense to consolidate those workloads onto a single box rather than purchasing dedicated hardware for each server. But server virtualization has undergone a fundamental shift in recent years. Today, virtualization is more about achieving flexibility and resiliency within the data center. Rather than virtualizing only lightly utilized servers, some organizations have begun virtualizing entire data centers, which lets them treat host servers as a pool of resources that can be allocated on an as-needed basis. Although this approach ...
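The consolidation argument above can be made concrete with a bit of arithmetic. The sketch below estimates how many lightly loaded servers could share one host of equal capacity; the 10% to 20% utilization figures come from the article, while the 25% host headroom is an illustrative assumption.

```python
# Rough consolidation estimate. Utilization figures (10%-20%) are from
# the article; the 25% headroom reserve is an assumption for illustration.

def consolidation_ratio(avg_utilization, host_headroom=0.25):
    """Estimate how many lightly loaded servers fit on one host of
    equal capacity while keeping the given fraction of the host free."""
    usable = 1.0 - host_headroom
    return int(usable // avg_utilization)

# At 10% average utilization, leaving 25% headroom:
print(consolidation_ratio(0.10))  # 7 servers per host
# At 20% average utilization:
print(consolidation_ratio(0.20))  # 3 servers per host
```

Even with conservative headroom, the math explains why consolidation was such an easy sell: a single host can absorb several underused servers.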
Features in this issue
The abstraction of virtualization makes resource provisioning tricky. Without proper capacity planning, over-allocation can affect performance and waste resources.
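One simple capacity-planning guardrail is to compare the resources promised to VMs against what the host can actually deliver. The check below is a hypothetical sketch: the VM sizes and the 1.2 overcommit limit are illustrative assumptions, not figures from the article.

```python
# Hypothetical capacity-planning check: flag a host whose VMs' combined
# memory allocations exceed a chosen overcommit limit. The VM sizes and
# the 1.2 overcommit ratio are assumptions for illustration only.

def is_overallocated(vm_memory_gb, host_memory_gb, max_overcommit=1.2):
    """Return True if total VM memory exceeds the host's capacity
    multiplied by the allowed overcommit ratio."""
    allocated = sum(vm_memory_gb)
    return allocated > host_memory_gb * max_overcommit

vms = [16, 16, 8, 8, 4]           # per-VM memory allocations, in GB (52 GB total)
print(is_overallocated(vms, 40))  # True: 52 GB requested vs. 48 GB allowed
print(is_overallocated(vms, 48))  # False: 52 GB requested vs. 57.6 GB allowed
```

Running a check like this before each new VM is provisioned helps catch the over-allocation the article warns about before it degrades performance.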
The benefits of Windows server virtualization can quickly turn to nightmares if you choose the wrong backup strategy. Learn to avoid the most common missteps.
Application performance is even more important on physical servers that host multiple VMs, so ongoing monitoring and optimization of virtual host servers should be routine in any virtual data center.