
Capacity planning no longer a guessing game

Johanna Ambrosio

Be prepared to step away from the spreadsheet you've likely been using to track the growth of existing applications -- if you have -- and get ready for a whole new way of thinking about capacity planning.

Experts said the dawn of new data center technologies -- including utility computing and capacity on demand -- will re-frame the thinking behind how to plan for unforeseen demands on IT. It will likely take a full 10 years for this scenario to reach the majority of shops -- and that's a good thing, because today's planning tools will need the time to catch up.

Traditionally, capacity planning has been part art, part science and part willpower. At its best, capacity planning can help predict when a given application might outgrow its existing server or let IT staffers see what will happen when another 500 users are added to the network due to the most recent corporate merger. Done correctly, and done proactively, capacity planning can help avoid downtime and keep the business humming.

Problem is, it's rarely done at all. Capacity planning has a reputation for being time-consuming, costly and difficult, and many IT staffers consider it another source of delay in any given project's rollout. As a result, most shops don't engage in the process except for the most mission-critical applications or highest-profile new projects -- or when performance has slowed to a crawl.

Instead, most shops tend to ballpark future hardware needs -- basing their guesstimates on existing applications' performance and past growth patterns.

"But the problem is, nobody really knows if the new application is going to take 40% fewer or 5% more resources," or how quickly an existing system will run out of oomph, said Cheryl Watson, founder of Watson & Walker, an independent systems management consultancy based in Sarasota, Fla.

So companies tend to buy more horsepower than they need, based on the peak usage that's possible -- even if that peak only exists for a week to close the books at the end of the fiscal year, say, or for two days during the holiday rush at a retailer.

But this methodology will have to change, according to Watson and others. Because of a need for increasing levels of IT efficiency, there isn't as much unused capacity around to tide the company over in case capacity needs grow more quickly than planned. "There's been a push for server consolidation and making sure the utilization of existing servers is as high as possible," said Audrey Rasmussen, a vice president at Enterprise Management Associates in Boulder, Colo. "There won't be as much over-provisioning."

As Watson sees it, the coming capacity-on-demand world means that people will have to start thinking about daily average workloads and not the absolute highest performance that could ever be demanded, no matter how infrequently. "You'll need to plan for the machine size that meets most of the work," she said. "For those short bursts of really peak loads, you can get capacity on demand."
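To make that concrete, here is a back-of-the-envelope comparison -- a Python sketch with made-up hourly demand figures, purely for illustration: size for the daily average or a high percentile of the workload rather than the absolute peak, and leave the rare bursts above that line to capacity on demand.

    import statistics

    # Hourly CPU demand for roughly a month, in generic "capacity units" -- illustrative data only.
    samples = [42, 45, 51, 48, 47, 55, 60, 58, 49, 46, 44, 43] * 60 + [95, 98, 97]  # brief year-end spike

    peak_size = max(samples)                              # traditional sizing: buy for the worst case
    avg_size = statistics.mean(samples)                   # daily average workload
    p95_size = sorted(samples)[int(0.95 * len(samples))]  # a size that covers most of the work

    burst_hours = sum(s > p95_size for s in samples)
    print(f"Size for absolute peak:     {peak_size:.0f} units")
    print(f"Size for average workload:  {avg_size:.0f} units")
    print(f"Size for 95th percentile:   {p95_size:.0f} units")
    print(f"Left to capacity on demand: {peak_size - p95_size:.0f} units, "
          f"needed in only {burst_hours} of {len(samples)} hours")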

Another difference is that in the past, IT infrastructure may have changed every once in a while, but it's mostly been a static environment with periodic updates. In the capacity-on-demand model, "my infrastructure is really doing second-by-second shape-shifting," Rasmussen explained. In this world, resources are constantly being reconfigured to accommodate changing needs. "That makes it much more difficult to do capacity planning for specific applications," she said.

Companies are just starting to consider capacity on demand, and the tools are just beginning to become available, according to industry watchers. For instance, Watson said, IBM provides a specialized processor for its mainframes that offloads Java processing. The issue, Watson said, is "how much Java is being used out of a large WebSphere application? Nobody really knows, so it becomes a capacity-planning issue -- and there are no tools for it yet."

Milind Govekar, a research vice president at Gartner Inc., said IT staffs have a lot of time to work out these issues. He sees it as a slow trek, particularly because the technology's not ready to go quite yet. Capacity on demand is most applicable to server-based applications, not to distributed systems with middleware components and a plethora of moving parts, he said.

And for its part, utility computing will work, but it requires what Govekar calls a "service governor" to keep track of business policies and then align those policies with the available IT infrastructure. It's a matchmaker, of sorts, to make sure the most important applications -- as defined by the business -- have the IT resources they need. These governors, though, won't really be available until 2008 -- and even then, they will be vendor-specific until heterogeneous governors become available a few years later.

In the meantime, customers will need to make do with more traditional approaches to capacity planning. Most of the methodologies fall into three broad categories. The first is trend analysis, using one or more metrics to track things like CPU utilization and response time. The problem here is that this is limited to the current configuration of the device, so "you can't do what-if analysis," Rasmussen explained. Linear trending and multivariate linear trending fall into this camp.
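The simplest of these, linear trending, can be sketched in a few lines: fit a straight line to historical utilization readings and project when the current box crosses a planning threshold. The weekly figures and the 85% ceiling below are assumptions for illustration, not numbers from any real system.

    def linear_fit(xs, ys):
        """Ordinary least-squares fit; returns (slope, intercept)."""
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
                 / sum((x - mean_x) ** 2 for x in xs))
        return slope, mean_y - slope * mean_x

    weeks = list(range(12))                                      # 12 weeks of history
    cpu_util = [38, 40, 41, 44, 46, 47, 50, 52, 53, 57, 58, 61]  # percent busy each week

    slope, intercept = linear_fit(weeks, cpu_util)
    threshold = 85.0                                             # assumed planning ceiling
    weeks_to_threshold = (threshold - intercept) / slope

    print(f"Growth rate: {slope:.1f} utilization points per week")
    print(f"Projected to cross {threshold:.0f}% busy around week {weeks_to_threshold:.0f}")
    # Caveat: this only extrapolates the existing configuration. It cannot answer
    # "what if we add 500 users" or "what if we move the workload to a bigger box."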

A step up from that are simulations and load testing, which predict performance in a production environment. Of course, "you can't duplicate the production environment" in a test lab, but this does give a rough idea of performance, Rasmussen said.
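A rough load test can be as simple as driving a test endpoint at increasing concurrency and recording response times, as in the sketch below. The URL and user counts are hypothetical, and -- per Rasmussen's caveat -- a lab run like this only approximates what production will do.

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor
    from statistics import mean

    TEST_URL = "http://test-lab.example.com/app/health"  # assumed test-environment endpoint

    def one_request(_):
        """Time a single request against the test system."""
        start = time.perf_counter()
        with urllib.request.urlopen(TEST_URL, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start

    for users in (10, 50, 100, 200):
        with ThreadPoolExecutor(max_workers=users) as pool:
            timings = list(pool.map(one_request, range(users * 5)))
        print(f"{users:>3} concurrent users: "
              f"avg {mean(timings) * 1000:.0f} ms, worst {max(timings) * 1000:.0f} ms")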

The most expensive and time-consuming, but most accurate, options involve modeling. This allows IT managers to model the existing infrastructure and then track a new, or existing, application from beginning to end to identify problems or potential bottlenecks. The downside is that the models take longer to build and populate than other types of tools do.
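Even a toy analytic model shows what that effort buys. The sketch below uses a basic M/M/1 queueing formula -- a textbook simplification, not any particular vendor's tool -- to show how response time degrades sharply as a server approaches saturation, which is exactly the kind of bottleneck a capacity model is meant to surface before users feel it.

    # Assumed numbers: 50 ms of work per transaction, and a traffic growth scenario.
    service_time = 0.050                 # seconds of work per transaction
    arrival_rates = [5, 10, 15, 18, 19]  # transactions per second

    for rate in arrival_rates:
        utilization = rate * service_time                # fraction of the server kept busy
        if utilization >= 1.0:
            print(f"{rate:>2} tps: saturated -- the queue grows without bound")
            continue
        response = service_time / (1.0 - utilization)    # M/M/1 mean response time
        print(f"{rate:>2} tps: utilization {utilization:.0%}, mean response {response * 1000:.0f} ms")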

Still, out of all the planning methodologies, modeling perhaps has the most promise. Some of the newer tools pull real-time data feeds from applications and pop the information directly into the model. That makes building models easier and less time-consuming, so they can be built more often -- and for systems beyond just the most resource-hogging or highest-profile ones.

Most important, Rasmussen said, is the concept of having to mix-and-match tools and technologies to get a true and accurate capacity-planning picture. "Different tools give different types of information and have different accuracies. Selecting the right kind of technology to do the type of capacity planning you want is critical."

