It's no secret that hardware prices continue to plummet. It's also no secret that all of this ever-cheaper hardware...
is increasingly powerful. It's almost mind-boggling that every 18 months or so, regular as clockwork, you can get twice as much computing power.
Expert: Bernard Golden, CEO, Navica, Inc.
In fact, it is mind-boggling, and these high-octane servers can cause some hassles in the data center. In this tip, I explore the standard fixes applied when server shoppers go overboard: server consolidation and grid computing. Then I explore another option: virtualization with Xen.
The fact that you can now purchase a 64-bit machine for around $500 is absolutely incredible; however, being able to buy brawny computers for beer money can cause problems. It's so easy to buy a new server and load it with a new application that many organizations have a cornucopia of computers in their data centers. You see so many 1U pizza boxes in there, you'd think you were in a Domino's training facility.
What's the problem, you say? They're all cheap, so there's not much capital tied up. Cheap hardware is great. That's true, until you remember that all that cheap hardware is managed by expensive people. Every new machine imposes an incremental cost in network management, hardware management and system administration. All of which looks worse in light of the fact that many of these machines run at no more than 10% or 15% load.
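The waste implied by those utilization numbers is easy to quantify. Here is a back-of-envelope sketch; the server count is an illustrative assumption, and the 12.5% figure is simply the midpoint of the 10-15% range mentioned above:

```python
# Back-of-envelope sketch of the underutilization problem described above.
# The server count is an assumed, illustrative number, not real data.

def wasted_capacity(servers: int, avg_utilization: float) -> float:
    """How many whole servers' worth of purchased compute sits idle."""
    return servers * (1.0 - avg_utilization)

# 40 pizza-box servers, each running at roughly 10-15% load:
idle = wasted_capacity(40, 0.125)
print(f"Idle capacity: the equivalent of {idle:.0f} whole servers")
# prints "Idle capacity: the equivalent of 35 whole servers"
```

In other words, a rack of forty lightly loaded boxes buys you roughly the useful work of five, while you pay administration costs for all forty.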
Consolidation, grid computing: Band-Aids
Server consolidation and grid computing are the most common Band-Aids applied to the management and utilization injuries associated with server glut. Let's take a look at these approaches.
Server consolidation is a fancy name for moving multiple applications onto fewer machines. Running more apps on fewer machines raises hardware utilization rates and reduces costs.
What's the fly in the ointment with server consolidation? Well, many times these disparate applications conflict with one another, rely on different versions of lower-level components and pose challenges in problem isolation. In other words, server consolidation can consume even more of those expensive people's time. Server consolidation often sounds good in theory but causes problems in execution.
There's so much hype around grid computing, you'd think it was the second coming of the Apple Newton.
The notion of grid computing is that applications are no longer tied to specific machines, but exist somewhere out of sight in a sea of machines. All of these servers are tied together with grid computing software that delivers computing power on demand, which raises hardware utilization rates.
At least, I think that's what the notion of grid computing is. I'm always suspicious of a technology that requires multiple presentation slides to communicate what it does. I'm also suspicious of solutions that require additional software to solve a problem. It seems to me that if we're having trouble tracking problems down on individual machines, problem solving in the grid sea is going to be significantly more difficult.
Virtualization offers data centers more
There is another way to get better use of an organization's computing infrastructure, one that neither leads to application conflict, as server consolidation may, nor requires a new, expensive and complex software infrastructure, as grid computing does. It is virtualization.
Virtualization is the ability to run multiple instances of an operating system (OS) on a single box, with each instance appearing to the programs it runs as if it were the sole OS that owned the entire machine. Put another way, a virtualized OS is a logical, rather than a physical, presentation of resources to running applications.
Virtualization is a terrific technology for IT organizations. It enables them to take advantage of very powerful hardware, yet reduce the number of machines they have to administer and track.
How does virtualization work? A software application runs on the physical machine and, in turn, enables that machine to host multiple copies of an operating system. The virtualization application has the ability to run a number of these virtual operating systems, each of which is isolated from the other virtual operating systems that the virtualization application is executing.
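The arrangement just described can be sketched as a toy model: a virtualization layer carves one physical machine's resources into isolated guests, each of which believes its allocation is the whole box. This is purely conceptual; the class names and memory figures are my own illustrative assumptions, not Xen's or any vendor's actual API:

```python
# Toy model of the idea above: a virtualization layer ("hypervisor")
# partitions one physical machine into isolated guest OS instances.
# Names and sizes are illustrative assumptions, not a real API.

class Guest:
    def __init__(self, name: str, memory_mb: int):
        self.name = name
        self.memory_mb = memory_mb   # resources this guest believes it owns

class Hypervisor:
    def __init__(self, total_memory_mb: int):
        self.total_memory_mb = total_memory_mb
        self.guests: list[Guest] = []

    def create_guest(self, name: str, memory_mb: int) -> Guest:
        # The layer enforces isolation: no guest can claim resources
        # already promised to another, or more than physically exists.
        allocated = sum(g.memory_mb for g in self.guests)
        if allocated + memory_mb > self.total_memory_mb:
            raise MemoryError("not enough physical memory left")
        guest = Guest(name, memory_mb)
        self.guests.append(guest)
        return guest

host = Hypervisor(total_memory_mb=4096)
host.create_guest("dev-suse-9.3", 1024)
host.create_guest("qa-suse-9.3", 1024)
print([g.name for g in host.guests])
# prints ['dev-suse-9.3', 'qa-suse-9.3']
```

The key point the model captures is the enforced boundary: each guest sees only its own slice, which is why a crash or misbehaving application in one virtual OS cannot disturb its neighbors.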
This ability to host multiple virtual operating systems, including different versions of the same OS or even completely different operating systems, is intriguing. This capability makes virtualization a great tool for development or operations groups that need to run multiple versions of an application. So, for example, a development group can be developing an upcoming release on SUSE 9.3, while a QA group can be testing the current beta release on another instance of SUSE 9.3. The operations group can simultaneously be reproducing a problem in the released product on SUSE 9.3. Of course, the real scenario might be more like several different patch levels of the same major version. Virtualization allows each group to run its exact environment, right down to the patch level.
Virtualization offers the benefits of server consolidation, in that multiple applications can be run on a single machine, while avoiding the issues of application conflict and problem isolation. And, of course, it does not solve the machine resource problem by introducing another in the form of a new software infrastructure, as in grid computing.
The Xen path to successful virtualization
Virtualization is a very popular technology, but its wider adoption has been held back by two factors: cost and hardware load.
The most popular virtualization product is VMware from EMC. It is widely regarded as an excellent product, but it is expensive. Its cost has kept it out of reach for many organizations.
Setting cost aside, most previous virtualization products have imposed performance penalties on the virtual operating systems. Put more simply, an application executing within a virtual OS performs poorly compared with the same application running on a native OS. This performance hit is caused by the intervening levels of software interposed between the application and the underlying native OS.
There is a product that addresses both of these problems: Xen. Xen is a small (under 50,000 lines of code) virtualization product available under an open source license. Thanks to its small size and clever engineering, Xen imposes far less of a performance penalty on running applications than other virtualization products.
Best of all, since Xen is released under an open source license, it is free.
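To give a flavor of what running Xen looks like in practice, each guest (a "domain" in Xen terminology) is described by a short, Python-syntax configuration file. The sketch below is hypothetical: the kernel path, image path and names are assumptions you would replace with your own:

```python
# /etc/xen/suse-build -- hypothetical Xen guest ("domU") configuration.
# All paths and names below are assumed examples.
name   = "suse-build"                       # label shown in domain listings
kernel = "/boot/vmlinuz-2.6-xenU"           # Xen-aware guest kernel (assumed path)
memory = 256                                # MB of RAM carved out for this guest
disk   = ['file:/var/xen/suse.img,sda1,w']  # loopback file serving as the guest's disk
root   = "/dev/sda1 ro"                     # root device as the guest sees it
```

With a file like this in place, Xen's management tool boots the guest with a command along the lines of `xm create suse-build`, and each of the development, QA and operations environments described earlier becomes just another such file.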
You'll be hearing a lot more about Xen in the future. It offers real relief to the pizza box overload syndrome while still allowing current system administration practices to continue unchanged. If you want to simplify your server farm, take a look at Xen.