
workload

By Cameron Hashemi-Pour

What is a workload?

In computing, a workload is typically any program or application that runs on a computer. A workload can be as simple as an alarm clock or contacts app running on a smartphone, or as complex as an enterprise application hosted on one or more servers, with thousands of client or user systems connected to and interacting with the application servers across a vast network. The terms workload, application, software and program are often used interchangeably.

Workload can also refer to the amount of work -- or load -- that software imposes on the underlying computing resources. Broadly stated, an application's workload is related to the amount of time and computing resources required to perform a specific task or produce an output from inputs provided.

A light workload accomplishes its intended tasks or performance goals using relatively little computing power and limited resources, such as processor (CPU) clock cycles, memory, storage input/output (I/O) and so on. A heavy workload demands significant amounts of those same resources.

A workload's tasks vary widely depending on the complexity and intended purpose of the application. For example, a web server application might gauge load by the number of webpages the server delivers per second, while other applications might gauge load by the number of transactions accomplished per second with a specific number of concurrent network users. Standardized metrics used to measure and report on an application's performance or load are collectively referred to as benchmarks.
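As a rough illustration of the benchmarking idea above, a throughput-style load metric -- operations completed per second -- can be sketched in a few lines of Python. The "transaction" here is a stand-in computation, not any standardized benchmark:

```python
import time

def measure_throughput(task, duration_s=1.0):
    """Run `task` repeatedly for about `duration_s` seconds and return
    completed operations per second -- a crude throughput measurement."""
    count = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        task()
        count += 1
    elapsed = time.perf_counter() - start
    return count / elapsed

# Example "transaction": a small in-memory computation standing in for
# real work such as serving a webpage or committing a database record.
ops_per_sec = measure_throughput(lambda: sum(range(1000)), duration_s=0.5)
print(f"~{ops_per_sec:,.0f} ops/sec")
```

Real benchmarks add warm-up runs, concurrency and statistical reporting, but the core idea -- work completed per unit of time -- is the same.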

Types of workloads

Workloads are created to perform many tasks in countless ways, so it's difficult to classify all of them with one set of uniform criteria. Workloads are commonly classified by how they run and what they do -- for example, batch, transactional, analytical, real-time and database workloads.

The emergence of cloud computing over the last decade has also driven the development of more workload types, including software as a service (SaaS), microservices-based applications and serverless computing.

Choosing where to run workloads: Cloud vs. on premises

Workload deployment -- determining where and how the workload runs -- is an essential part of workload management. Today, an enterprise can choose to deploy a workload on premises, in the cloud or in a combination of the two.

Traditionally, workloads are deployed in the enterprise data center, which contains all the server, storage, network, services and other infrastructure required to operate the workload. The business owns the data center facility and computing resources and fully controls the provisioning, optimization and maintenance of those resources. The enterprise establishes policies and practices for the data center and workload deployment to meet business goals and regulatory obligations.

With the rise of the internet, cloud computing has become a viable alternative to on-premises workload deployments. Public cloud computing is essentially computing as an on-demand utility. An organization uses a provider's computing resources and services to deploy workloads to remote data center facilities in locations around the world, yet pays for only those resources and services it actually consumes over a given timeframe -- typically, per month. The cloud provider deploys complex software-defined technologies that let users provision and use its resources and services to architect suitable infrastructures for each workload on the cloud platform.

The challenge for any business is deciding just where to deploy a given workload. Most general-purpose workloads can operate successfully in the public cloud, and applications are increasingly designed and developed to run solely in a public cloud.

However, the most demanding workloads might struggle in the public cloud. Some workloads require high-performance network storage, while others are constrained by internet throughput. For example, database clusters that need high throughput and low latency might be unsuited to the cloud, although the cloud provider might offer high-performance database services as an alternative. Applications that rely on low latency or aren't designed for distributed computing infrastructures are usually kept on premises.

Technical issues aside, a business can decide to keep workloads on premises for business continuity or regulatory reasons. Cloud clients have little insight into the underlying hardware and other infrastructure that hosts the workloads and data. That can be a problem for businesses that must meet data security and other regulatory requirements, such as auditing and proof of data residency. By keeping those sensitive workloads in the local data center, a business can control its own infrastructure and implement the necessary auditing and controls.

Cloud providers are also independent businesses that serve their own business interests and might not be able to meet an enterprise's specific uptime and resilience expectations for a workload. Outages happen and can last for hours and even days, adversely affecting a client's business and customer base. Consequently, organizations often opt to keep critical workloads in the local data center where dedicated IT staff can maintain them.

Some organizations implement a hybrid cloud strategy that mixes on-premises, private cloud and public cloud services. This provides flexibility to run workloads and manage data where it makes the most sense, for reasons ranging from costs to security to governance and compliance. This presents tradeoffs: for example, an organization might keep sensitive data and workloads in its own data center to preserve direct control over them, but it also then takes on more security responsibilities for them.
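The placement tradeoffs discussed above can be sketched as a simple rule-of-thumb function. The trait names and the uptime threshold below are illustrative assumptions, not a formal decision framework:

```python
def choose_deployment(workload):
    """Return 'on premises' or 'public cloud' for a workload described by a
    dict of traits, following the tradeoffs discussed in the text.
    All trait names here are illustrative, not an industry standard."""
    if workload.get("data_residency_required") or workload.get("strict_audit"):
        return "on premises"   # regulatory control over local infrastructure
    if workload.get("latency_sensitive") and not workload.get("distributed_design"):
        return "on premises"   # low-latency, non-distributed applications
    if workload.get("business_critical") and workload.get("uptime_slo", 0) > 99.99:
        return "on premises"   # avoid dependence on provider outages
    return "public cloud"      # general-purpose workloads default to the cloud

print(choose_deployment({"latency_sensitive": True}))  # -> on premises
print(choose_deployment({}))                           # -> public cloud
```

In practice these decisions weigh cost, staffing and governance as well, but encoding even crude rules like this makes the tradeoffs explicit and reviewable.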

Benefits and drawbacks of running cloud workloads

Businesses deploy workloads to the public cloud for an array of potential benefits, including rapid scalability, pay-as-you-go pricing, global reach and freedom from hardware procurement and maintenance.

There are also serious risks involved with public cloud computing that every cloud user should consider, such as outages beyond the client's control, unpredictable costs, limited visibility into the underlying infrastructure and potential vendor lock-in.

Benefits and drawbacks of running on-premises workloads

Many businesses continue to build and maintain more traditional on-premises data centers, which can provide business benefits such as full control over infrastructure, predictable performance and easier compliance with data security and residency requirements.

However, on-premises data centers are also subject to important drawbacks that can affect business operations, such as high capital costs, the need for dedicated IT staff and a limited ability to scale quickly.

Workload management tools

Software tools are vital elements of workload management. Tools can report on the availability, health and performance of important workloads within local, cloud and multi-cloud environments.

Tools typically track the resources and services available within the infrastructure and report on the behaviors of desired applications. Administrators use tools to quickly determine whether an application is online, which resources it consumes and other metrics related to its activity, such as transactions and concurrent users. Some tools can support both on-premises workloads and workloads in major public clouds in a single pane of glass.
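As a minimal sketch of the availability checks such tools perform, the following Python probe reports whether a workload's HTTP endpoint is up and how long it takes to respond. The health-check URL in the usage note is a hypothetical assumption:

```python
import time
import urllib.request

def check_health(url, timeout=5):
    """Issue a single HTTP probe against `url` and return a tuple of
    (is_up, latency_ms). Any network error counts as 'down'."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            ok = 200 <= resp.status < 300   # any 2xx response counts as healthy
    except OSError:                          # DNS failure, refusal, timeout, etc.
        ok = False
    latency_ms = (time.perf_counter() - start) * 1000
    return ok, latency_ms
```

Usage might look like `up, ms = check_health("https://example.com/health")`, where the `/health` path is assumed to exist. Production monitoring tools layer scheduling, alerting, dashboards and historical trending on top of probes like this one.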

There are many workload management tools available. Often, these fall under the categories of application performance monitoring (APM) or application performance management tools. Examples include Datadog, Dynatrace, New Relic and Cisco AppDynamics.

Cloud providers generally offer dedicated tools designed to report on the resources and services a business consumes, as well as the health and performance of applications running in the cloud environment. Examples include the AWS Management Console and Microsoft Azure portal.


20 Feb 2024

All Rights Reserved, Copyright 2000 - 2024, TechTarget