The emergence of new data center infrastructure technology, such as virtualization, automation and consolidated devices, has transformed data centers from monolithic, static systems into diverse, dynamic computing hubs capable of "big data" analytics.
In response, several vendors have developed data analysis packages -- think of it as big data analytics for IT -- that sort through various configuration possibilities and present corporations with their best design options.
Because they are the hubs of a company's computing infrastructure, data centers were often treated gingerly. Corporations did not want to risk blips or outages, so once these expensive computing palaces were put in place, they operated as-is. In some cases, they were upgraded every few years; in others, changes stretched out over as much as a decade.
Recent technical advances have made data centers much more dynamic.
"Daily, the number of users, response times and even application design changes -- and often quite dramatically," said John Stanley, analyst for data center technologies at 451 Research, based in New York.
As a result, according to a 2011 IBM survey, an estimated 50% to 75% of data centers operate with obsolete system configurations.
While data centers should be under near-continual review, many corporations have found performance monitoring and capacity planning quite overwhelming. Traditionally, businesses tried to determine if they needed additional servers, storage or network capacity by running what-if scenarios on spreadsheets. But the list of potential design options has grown too large and complex for that approach. Newer tools are needed, and vendors have begun delivering them.
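The spreadsheet-driven approach described above can be sketched in a few lines. The following is a minimal, hypothetical example (the figures and the 20% safety margin are illustrative assumptions, not from any vendor): project peak CPU demand under a few growth scenarios and check whether current capacity still holds.

```python
# Hypothetical figures for a spreadsheet-style what-if capacity check.
CURRENT_PEAK_CPU = 500      # assumed current peak demand, in cores
CAPACITY = 800              # assumed total installed cores
HEADROOM = 0.20             # keep 20% free as a safety margin (assumption)

def needs_more_capacity(peak: float, growth_rate: float, years: int = 1) -> bool:
    """Return True if the projected peak exceeds usable capacity."""
    projected = peak * (1 + growth_rate) ** years
    usable = CAPACITY * (1 - HEADROOM)
    return projected > usable

# Run three what-if growth scenarios, as a planner's spreadsheet might.
for scenario, growth in [("low", 0.05), ("medium", 0.15), ("high", 0.30)]:
    print(scenario, needs_more_capacity(CURRENT_PEAK_CPU, growth))
```

With three scenarios this is manageable; once the variables multiply across hundreds of servers, storage tiers and network links, the combinatorics outgrow a spreadsheet, which is the gap the newer tools fill.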
Like other areas of IT, data center performance has been influenced by the continued improvement in data analytics tools. These products collect mountains of raw information and help companies draw conclusions from the data. Data analytics is used in many industries to help companies and organizations make better business decisions; it is now being applied to help IT managers determine how to configure their data centers.
The emerging data center tools solve various problems.
"One IT challenge has been being able to quickly visualize the health of the data center infrastructure," said Stanley.
With the number of virtual devices increasing quite dramatically, just putting all of the relevant data on a single console can be difficult.
Making sense of what is happening in the data center can also be a tedious process. For instance, Tieto is a 20,000-employee managed service provider based in Finland.
"We have tens of thousands of physical and virtual servers that constantly are being reconfigured," stated Geoffrey Ekman, technology specialist at the company. Technicians do not have the time to dive deeply into the performance of various elements and need solutions that quickly and clearly illustrate whether virtual machines, hosts or host groups have too many resources, too few resources or have been configured optimally.
Identifying the problem is only the first step in the planning process. The IT staff needs to understand what action to take by determining how many servers are truly required, versus how many are in use, based on various constraints, such as current utilization, company policy and projected growth. The tools must provide specific actions to resolve and prevent shortfalls for host groups, resource pools, hosts, virtual machines (VMs) and virtual I/O servers (VIOS), such as rebalancing and allocation changes. They can make recommendations for resizing hosts in order to accept new workloads. Recommendations can be pushed to third-party ticketing, orchestration or management systems to automate key processes and changes.
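The required-versus-in-use comparison described above can be illustrated with a short calculation. This is a simplified sketch of the general idea, not any vendor's algorithm; the 70% policy ceiling, 10% growth rate and per-VM figures are all assumptions made for the example.

```python
import math

def hosts_required(vm_demands, host_capacity, policy_ceiling=0.70, growth=0.10):
    """Estimate how many hosts projected demand truly requires.

    vm_demands     -- measured per-VM resource demand (e.g., GB of RAM)
    host_capacity  -- usable capacity per host, in the same units
    policy_ceiling -- company policy: max fraction of a host to commit
    growth         -- projected demand growth over the planning horizon
    """
    projected = sum(vm_demands) * (1 + growth)
    usable_per_host = host_capacity * policy_ceiling
    return math.ceil(projected / usable_per_host)

# Hypothetical environment: 40 VMs averaging 8 GB each on 128 GB hosts.
needed = hosts_required([8] * 40, host_capacity=128)
in_use = 6
print(f"required: {needed}, in use: {in_use}")  # a gap signals rebalancing room
```

When the required count comes in below the in-use count, as in this hypothetical case, the tool's recommendation would be consolidation or rebalancing rather than new hardware.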
A number of vendors have developed such tools. Cirba offers the Cirba Control Console, enabling IT organizations to optimize capacity decisions, VM placements and resource allocations for AIX-based IBM PowerVM environments. The product supports advanced PowerVM capabilities, such as Logical Partitions (LPARs), VIOS, shared processor pools, Capacity on Demand and high availability (HA) failover nodes.
IBM's Systems Workload Estimator provides companies with a series of recommendations for sizing their CPUs, RAM and I/O. The tool's financial scenario software deduces if firms need to build, consolidate or maintain the status quo.
Lumina's Analytica is a visual tool for creating, analyzing and communicating capacity decision models. Its Intelligent Arrays can model uncertain schedules so companies have the right capacity whenever new applications come online.
Ravello's Compute Service Planner relies on a SaaS model to provide companies with data center configuration information. The product identifies when specific servers in a pool should be removed and how many new servers should be added in order to minimize the number of required software licenses for the entire pool.
Reflex Systems' vCapacity module integrates monitoring, performance, capacity and configuration management functions. It delivers real-time visibility, correlation and control across a virtual or private cloud environment.
The Sentilla Data Center Performance Management solution provides granular insight into data center performance in areas like physical devices, virtual systems and facilities equipment. It relies on a "manager of managers" approach to illuminate available head room, usage/consumption, peak demand and load limits for items like CPU, memory, storage, network and power.
VKernel's Capacity Analytics Engine can be connected to external systems, such as help desks, ticketing systems, billing applications, self-service portals, reporting frameworks and systems management consoles. More than 50,000 system administrators use the company's products, including Tieto. Ekman said the company selected the product because its algorithms provided a clearer picture of how the managed service provider's servers were operating than alternatives. He said that by quickly identifying underutilized systems, the tool paid for itself within 12 months.
Veeam Software Inc.'s Veeam One provides real-time unattended monitoring of virtual infrastructures. The product's alarm dashboards provide single-click access to objects and performance graphs.
VMware Inc. has VMware vCenter Operations, which integrates performance, capacity and configuration management functions. The product's real-time performance dashboards help companies meet service-level agreement guidelines by illustrating current and potential performance issues.
While helpful, the tools do have some downsides. They can be expensive; pricing starts at a few thousand dollars but can quickly pass the six-figure mark. Because each data center is unique, the tools require a lot of customization. Also, staff often must spend a great deal of time learning how to get the most out of the tools.
Nevertheless, their use is spreading, because data center planning has become quite cumbersome. The emergence of new data analytics tools promises to help IT managers simplify the process and ensure right-sized configurations, but time and money may be needed before they realize such results.
ABOUT THE AUTHOR: Paul Korzeniowski is a freelance writer who specializes in data center issues. He is based in Sudbury, Mass., and can be reached at firstname.lastname@example.org.