
What is the difference between utility computing and grid computing?

What is the difference between utility computing and grid computing? What applications are appropriate for either one? Is this something just for large companies?
The basic principle behind utility computing and grid computing is the same: providing computing resources as a service. The difference lies in how that principle is applied.

"Utility computing" refers to the service-oriented delivery of computing resources by a small number of suppliers...

of computing power to a relatively large number of consumers of computing power in a manner similar to how utility companies provide power and water to consumers. Utility computing occurs when a supplier-owned or controlled computing resource is used to perform a computation to solve a consumer-specified problem.
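The supplier/consumer relationship described above can be sketched in a few lines of code. This is a minimal, purely illustrative model, not any real provider's API: the class, method, and rate names are all hypothetical. The key idea it shows is metered, pay-per-use compute, analogous to a power company billing per kilowatt-hour.

```python
# Minimal sketch of the utility-computing model: a supplier-owned
# resource runs consumer-submitted jobs and meters the compute time
# used, so the consumer pays only for what they consume.
# All names (UtilityProvider, run_job, bill) are illustrative.
import time


class UtilityProvider:
    def __init__(self, rate_per_second=0.05):
        self.rate = rate_per_second
        self.usage = {}  # consumer name -> seconds of compute used

    def run_job(self, consumer, job, *args):
        """Run a consumer-specified job on supplier hardware, metering time."""
        start = time.perf_counter()
        result = job(*args)
        elapsed = time.perf_counter() - start
        self.usage[consumer] = self.usage.get(consumer, 0.0) + elapsed
        return result

    def bill(self, consumer):
        """Charge per second of compute actually used."""
        return self.usage.get(consumer, 0.0) * self.rate


provider = UtilityProvider()
total = provider.run_job("alice", sum, range(1_000_000))
```

Here "alice" never owns the hardware that computes `total`; she submits the job and is billed by `provider.bill("alice")` for the time it took, which is the essence of the utility model.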

Grid computing, on the other hand, aggregates computing power distributed across a network of computers to create the massive computing resource required to solve complex scientific and engineering problems.

Utility computing is expected to bring the power of computing and the Internet within the reach of billions of people across the globe. It is certainly not just for large companies!

In addition, utility computing is expected to give rise to a whole range of consumer electronics devices and appliances. These intelligent devices will be able to take advantage of utility computing to perform a range of activities that require computing power and access to information over a network.

In the future, you may own a purchase-planning device that takes advantage of utility computing. It is not unreasonable to expect a "utility grid" that makes grid computing itself available as a utility, one that might someday power your own personal weather-forecasting device!
