Computing when you need it

As the concept of utility computing takes hold, data center managers should consider how this model can work its way into future infrastructure designs.

How nice would it be to tell your CEO, with confidence, that when it comes to your data center you know not only exactly how much computing power you use, but also how much it costs? How great would it be, if you've done things right, to give specifics about how much you saved and how much you'll save the following year?

Unfortunately, if you're like many data center managers, the best you can do now is give a vague idea about usage, cost and possible efficiencies. Clearly, this just isn't good enough to bring to your CEO. In fact, it could be grounds to send you packing.

This is one of the most compelling factors driving the demand for utility computing.

Think electric bill

In its simplest form, utility computing delivers applications and computing capacity to customers on an as-needed basis, much the same way consumers draw electricity as needed. So when you have lots of users running an application and more sign on, you simply allocate more resources to that application and route the work to the appropriate location. And, while you're at it, you can automatically bill those users for the application and the processing time, so you know what users are doing and with which applications. Just like an electric company.
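
To make the electric-meter analogy concrete, here is a minimal Python sketch of that kind of metering and chargeback. Everything in it -- the UsageMeter class, the flat rate per CPU-hour -- is invented for illustration, not taken from any vendor's product.

    # A minimal sketch of utility-style metering and chargeback. The names
    # and the flat rate are illustrative assumptions, not a vendor API.
    from collections import defaultdict

    RATE_PER_CPU_HOUR = 0.50  # assumed flat rate, akin to cents per kilowatt-hour

    class UsageMeter:
        """Records computing use per user and application, like an electric meter."""

        def __init__(self):
            self.cpu_hours = defaultdict(float)  # (user, app) -> hours consumed

        def record(self, user, app, hours):
            self.cpu_hours[(user, app)] += hours

        def bill(self, user):
            """Total charge for one user across all applications."""
            used = sum(h for (u, _), h in self.cpu_hours.items() if u == user)
            return used * RATE_PER_CPU_HOUR

    meter = UsageMeter()
    meter.record("finance", "payroll", 12.0)
    meter.record("finance", "reporting", 3.5)
    print("finance owes $%.2f" % meter.bill("finance"))  # finance owes $7.75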

A slew of vendors, including IBM, HP and Sun, see utility computing as a cash cow. The problem for managers is that each approaches utility computing in a slightly different way -- which may be slowing its adoption in data centers.

IBM, for example, touts its on-demand computing, which critics say is essentially a way for the company to sell computing services on an outsourced basis. According to Jasmine Noel, principal of Ptak, Noel and Associates, a New York analyst firm that tracks converging IT trends and how to leverage them, the people she's talked with understand the concept as buying the hardware themselves, which IBM would then manage.

An IBM Global Services Division spokesman said that isn't quite accurate, however. The company offers a wide range of services, from selling computing time as if it were a utility to hardware/software/consulting combinations that will help organizations get into the utility computing business themselves.

A model to meet your demand

Bottom line: You don't have to think of utility computing as just a way to outsource your computing. Other solutions are available, and they seem to be pretty much agnostic about the underlying OS. Linux is a good choice because of its cost advantages, but other operating systems are in the fray.

For example, Bernard Golden, CEO of Navica, a system integrator in San Carlos, Calif., notes that there is an open source route to utility computing: the Globus Alliance, whose Globus Toolkit project will let you develop your own customized utility computing solutions. (It should be noted that utility computing is almost synonymous with grid computing, in that the latter should let you shift computing loads and processes between users on the fly.)
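
To give a feel for that load-shifting idea, here is a rough Python sketch that routes each job to whichever node has the most spare capacity. This is not the Globus Toolkit API; the Node and Scheduler names are made up for the example.

    # Illustrative grid-style scheduling: send work wherever there is headroom.
    class Node:
        def __init__(self, name, capacity, load=0.0):
            self.name = name
            self.capacity = capacity  # total compute units available
            self.load = load          # compute units currently in use

        def spare(self):
            return self.capacity - self.load

    class Scheduler:
        def __init__(self, nodes):
            self.nodes = list(nodes)

        def submit(self, job_name, demand):
            """Route the job to the least-loaded node that can take it."""
            candidates = [n for n in self.nodes if n.spare() >= demand]
            if not candidates:
                raise RuntimeError("no node has %s units free for %s" % (demand, job_name))
            target = max(candidates, key=lambda n: n.spare())
            target.load += demand
            return target.name

    grid = Scheduler([Node("blade1", 8.0), Node("blade2", 8.0, load=6.0)])
    print(grid.submit("nightly-batch", 4.0))  # blade1 -- it has the most headroom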

Is anyone doing this? Some people are moving that way, according to Noel. She said several users with whom she's talked "are experimenting."

The big install

So how can one go about instituting utility computing? Noel cites four different ways. First, buy a big mainframe or similar computer and hook everyone up to that, and time-share the operation of the computer. That way, you know exactly who is using what, you can reassign computer resources as needed and you can even charge back if you want.

Second, "virtualize everything," Noel said. "Then you can allocate resources as required. Or you can buy a bunch of small computers that are all the same, and virtualize across all of them. People are doing this with blade servers," she added. Finally, she suggested that you can do some sort of resource provisioning to meet computing loads as required. "Theoretically, you can do some version of all those today. But I think that, for example, going to a full grid would take some leap of faith, although I'm sure that with some group of consultants you can do it."

Doesn't sound like a ringing endorsement. Nevertheless, the idea is a sound one. Golden said, "It's much more of a bleeding-edge trend. Who wouldn't want it? But the challenge is setting it up, getting applications that can run in that environment."

But he does think it can save money in the long run, and Noel agreed. "It's worth doing," she said, "if you can use the hardware that you have today. Then, over the long term, you'll use only the hardware you need. Now many organizations are using only 60% of the hardware they have. That's where people are excited about utility computing." As for charging users, she noted there's no universal measure of cost for computing power, so people have so far been leery of going that route.
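
Noel's 60% figure lends itself to some back-of-the-envelope math. In the sketch below, the fleet size, per-server cost and achievable target utilization are assumed numbers, included only to show the shape of the savings argument.

    # Rough savings math built on Noel's 60% utilization figure.
    import math

    fleet = 100                # servers owned today (assumed)
    utilization_now = 0.60     # Noel: many shops use only 60% of their hardware
    utilization_target = 0.85  # what a pooled, utility-style setup might sustain (assumed)
    cost_per_server = 5000     # dollars per year to own and run one server (assumed)

    work = fleet * utilization_now                 # 60 "server-equivalents" of real work
    needed = math.ceil(work / utilization_target)  # 71 servers at the higher utilization
    savings = (fleet - needed) * cost_per_server
    print("retire %d servers, save $%d per year" % (fleet - needed, savings))
    # retire 29 servers, save $145000 per year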

Utility computing has been called "back to the future," Golden said, because it's so akin to mainframes and time-sharing. But it's not exactly that, and it does offer more efficiency and therefore lower IT costs. So if you want to be a hero, you probably ought to take a look at it. Maybe it will work for you.

David Gabel has been testing and writing about computers for more than 25 years.
