Comparing cloud risks and virtualization risks for data center apps

An IT manager's traditional role has included lowering data center costs, becoming more responsive to business needs and reducing complexity. In the past few years, new application delivery models and technologies, enabled primarily by virtualization, have become available to simplify IT managers' tasks. According to research by Antonio Piraino of The 451 Group, more than 70% of enterprises use server virtualization to reduce costs and increase agility. Piraino also indicates that 10% of corporations use or plan to use cloud computing, and potential use among other enterprises is growing rapidly.

However, if poorly managed, the risks of cloud computing and server virtualization could outweigh the benefits. This tip discusses how to handle applications in various execution environments (virtualized servers, internal clouds, public clouds). The main question is how and where IT managers should run their applications and workloads to gain the most benefit while controlling risk.

Integrating new technologies with existing data center infrastructure can lower costs, reduce data center complexity and increase business agility. But without proper planning, you shouldn't expect success. Integration must be as seamless as possible. The results should be transparent to the end user without requiring special efforts by IT staff.

Determining cloud computing and virtualization risks: Where to start
Start by determining the processing and operational requirements of your applications. Look at requirements for storage usage, availability and security as well as regulations and service-level agreements (SLAs). You also need to understand the challenges and risks of cloud computing and server virtualization. There are some tools that can help analyze application requirements. For example, Novell's PlateSpin Recon can help create a list of applications in your data center and provide some of their characteristics and processing requirements.


When the list of applications and their characteristics is available, use it to help determine which execution environments (on-premises virtualization hosts, internal clouds, public clouds) are most cost-effective for your applications while managing risk and complexity. In some cases, applications should not be moved to another execution environment at all. Compute-intensive applications running on special hardware are not good candidates for cloud computing or virtualization. For example, mainframes often run business applications that manage large amounts of I/O, and these should not be moved.
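The kind of triage described above can be captured as a simple set of rules. The sketch below is purely illustrative: the attribute names (`special_hardware`, `regulated_data`, `bursty_demand`) and the rules themselves are hypothetical stand-ins for the requirements data a tool like PlateSpin Recon would give you, not a prescribed methodology.

```python
# Hypothetical placement triage for an application inventory.
# Attribute names and rules are illustrative assumptions only.

def candidate_environments(app):
    """Return the execution environments an application could reasonably target."""
    candidates = {"virtualized server", "internal cloud", "public cloud"}
    # Apps tied to special hardware (e.g., mainframe I/O subsystems)
    # stay where they are.
    if app.get("special_hardware"):
        return {"stay on current platform"}
    # Regulated or highly sensitive data rules out multi-tenant public clouds.
    if app.get("regulated_data"):
        candidates.discard("public cloud")
    # Highly bursty demand favors elastic, on-demand environments.
    if app.get("bursty_demand"):
        candidates.discard("virtualized server")
    return candidates

inventory = [
    {"name": "payroll",       "regulated_data": True},
    {"name": "web frontend",  "bursty_demand": True},
    {"name": "batch billing", "special_hardware": True},
]

for app in inventory:
    print(app["name"], "->", sorted(candidate_environments(app)))
```

In practice each rule would be backed by the measured characteristics from your inventory, and the output would feed a cost comparison rather than a final decision.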

Both cloud computing and virtualization technologies focus on better utilization of resources. Server virtualization consolidates workloads running on multiple physical servers onto one, generally larger, physical server that hosts hypervisor software. A server running a hypervisor can host multiple independent virtual servers, each running its own guest operating system and application stack. Virtualization provides the capability to move applications from server to server (physical or virtual), provision servers rapidly, and more. Virtualization supplies the economics and flexibility that underpin clouds; cloud computing, however, adds on-demand, elastic, service-oriented delivery on top of that foundation.
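The utilization argument behind consolidation is simple arithmetic. This back-of-the-envelope sketch, with entirely illustrative figures (a 16-core host, VMs averaging 1.5 cores of demand, 25% capacity held back for spikes), shows the kind of estimate involved:

```python
# Back-of-the-envelope consolidation estimate.
# All figures are illustrative assumptions, not measurements.

def vms_per_host(host_capacity, avg_vm_load, headroom=0.25):
    """How many VMs of a given average load fit on one host,
    keeping `headroom` (a fraction of capacity) free for spikes."""
    usable = host_capacity * (1.0 - headroom)
    return int(usable // avg_vm_load)

# A 16-core host, VMs averaging 1.5 cores, 25% reserved for peaks:
print(vms_per_host(16, 1.5))  # 8
```

Real sizing must also account for memory, I/O and peak-coincidence, which is why inventory tools that measure actual workload profiles matter.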

Public, private and internal clouds
With a public cloud, a cloud service provider makes resources such as servers, storage, networking and possibly applications available to the user over the Internet. Public clouds are generally multi-tenant, and a customer's applications can run on a physical server shared with another customer. Public cloud services are usually offered on a pay-per-usage model. Amazon Elastic Compute Cloud (EC2) is by far the most popular public cloud today.
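The pay-per-usage model makes public clouds attractive mainly for intermittent workloads. The comparison below is a hypothetical sketch: the hourly rate, server price and operating cost are invented for illustration, not actual EC2 or hardware pricing.

```python
# Illustrative break-even comparison: pay-per-usage public cloud
# instance vs. an owned server. All prices are hypothetical.

def monthly_cloud_cost(hours_used, hourly_rate):
    """Usage-based billing: pay only for hours actually run."""
    return hours_used * hourly_rate

def monthly_owned_cost(purchase_price, amortization_months, monthly_opex):
    """Owned hardware: amortized purchase price plus fixed running costs."""
    return purchase_price / amortization_months + monthly_opex

cloud = monthly_cloud_cost(hours_used=200, hourly_rate=0.10)   # light, bursty use
owned = monthly_owned_cost(3000, 36, monthly_opex=50)          # always-on box
print(f"cloud ${cloud:.2f}/month vs. owned ${owned:.2f}/month")
```

A workload that runs a few hundred hours a month favors the cloud's metered model; one that runs flat-out around the clock can cross the break-even point in the other direction.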

There are two forms of private clouds: internal clouds (internal to the enterprise data center, hence inherently private) and external private clouds. You have complete control over the resources in an internal cloud because it's inside your data center. An important difference between server virtualization and internal clouds is that an internal cloud requires that resource requests and provisioning interact as service requests and responses in an automated environment, avoiding manual intervention. Automation and orchestration are key characteristics of internal clouds. An internal cloud relies on the security measures available within the cloud and your data center. In addition, internal clouds avoid certain privacy issues that arise with public clouds. Ubuntu Enterprise Cloud (UEC) is an example of packaged software for creating internal clouds; Microsoft Azure, by contrast, is a public cloud platform.

External private clouds, such as Amazon Virtual Private Cloud, have characteristics of both internal and public clouds. Like public clouds, they reside outside the enterprise data center. But unlike public clouds, applications run on dedicated servers, and the provider walls off the cloud's resources for enhanced security. The level of control you have over your data makes external private clouds similar to internal clouds. However, data privacy issues with external private clouds are not as clear-cut as they are with internal clouds.

With the brief descriptions of server virtualization and clouds (referred to here as execution environments) in mind, look at Table 1 below. Table 1 summarizes the concerns and risks of cloud computing and virtualization with respect to application requirements. As an IT manager, you can use it to get a high-level view of the most appropriate environment(s) in which to run many types of applications in your data center. The list of concerns and risks is not necessarily complete, but it does include the most important ones.

Table 1: Comparing application concerns/risks of execution environments

ABOUT THE AUTHOR: Bill Claybrook is a marketing research analyst with over 30 years of experience in the computer industry, with the last 10 years in Linux and open source. From 1999 to 2004, Bill was Research Director, Linux and Open Source, at Aberdeen Group in Boston. He resigned his competitive analyst/Linux product marketing position at Novell in June 2009 after spending over four and a half years at the company. He is now President of New River Marketing Research in Concord, Mass. He holds a Ph.D. in Computer Science.


This was first published in February 2010
