A workload's variability also factors into the public cloud vs. on-premises decision. Determine application usage patterns, such as how often users access an app and whether that access is steady and predictable or spikes at certain times.
Mobile front-end applications, for example, can be notoriously difficult to manage on premises because the load is difficult to predict on any given day, said Eastwood.
"If you're a retailer and you're headed towards Black Friday, you could have five times the load on the systems on that Friday compared to any other Friday," he said. "It's hard to plan and build for something you're only going to see once a year."
The cloud's scalability and self-service nature accommodate these unpredictable usage patterns. Users can click a few buttons and spin up an instance in a matter of minutes to respond to workload needs, with built-in automation at every step. Traditional processes require developers to submit a formal request to IT for an additional server or VM, justify the CPU and memory resources, and then wait.
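The economics behind Eastwood's Black Friday example can be sketched with a rough back-of-envelope comparison. All figures below are hypothetical: an on-premises data center must be built for the once-a-year peak and carry that capacity year-round, while cloud capacity can scale up only on the peak days.

```python
# Back-of-envelope comparison: provisioning for peak load on premises
# versus scaling with demand in the cloud. All figures are hypothetical.

BASELINE_SERVERS = 10    # servers needed on an ordinary day
PEAK_MULTIPLIER = 5      # Black Friday load vs. a normal Friday
PEAK_DAYS = 1            # days per year at peak load
SERVER_DAY_COST = 20.0   # cost to run one server for one day

def on_prem_annual_cost():
    # On premises, you build for the peak and pay for it all year.
    peak_servers = BASELINE_SERVERS * PEAK_MULTIPLIER
    return peak_servers * SERVER_DAY_COST * 365

def cloud_annual_cost():
    # In the cloud, the extra instances run only on the peak days.
    normal = BASELINE_SERVERS * SERVER_DAY_COST * (365 - PEAK_DAYS)
    peak = BASELINE_SERVERS * PEAK_MULTIPLIER * SERVER_DAY_COST * PEAK_DAYS
    return normal + peak

print(f"on-prem: ${on_prem_annual_cost():,.0f}")
print(f"cloud:   ${cloud_annual_cost():,.0f}")
```

Under these toy numbers, building on premises for a 5x peak costs roughly five times as much per year as scaling up in the cloud for a single day, which is the "hard to plan and build for" problem in concrete terms.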
In addition to application usage, enterprises must track data growth patterns. Organizations typically see about 50% data growth each year; if that data suddenly quadruples, determine the resulting economic effect and rethink the infrastructure to accommodate it.
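That growth arithmetic compounds quickly. A short sketch, using a hypothetical 100 TB starting point, shows how steady 50% annual growth compares with a sudden quadrupling followed by the usual growth rate:

```python
def project_storage(initial_tb, annual_growth, years):
    """Project storage footprint under steady compound growth."""
    sizes = [initial_tb]
    for _ in range(years):
        sizes.append(sizes[-1] * (1 + annual_growth))
    return sizes

# Steady 50% annual growth from a hypothetical 100 TB base
steady = project_storage(100, 0.50, 3)     # 100 -> 150 -> 225 -> 337.5 TB

# A sudden quadrupling to 400 TB, then the usual 50% growth after that
spike = project_storage(100 * 4, 0.50, 2)  # 400 -> 600 -> 900 TB
```

Even at the "typical" rate, data more than triples in three years; after a quadrupling event, the same growth rate pushes the footprint toward a petabyte in two, which is the kind of shift that forces an infrastructure rethink.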
Customer-facing applications that demand frequent changes might also be a better fit for the cloud. An application that an IT team uses only twice a year, for instance, can stay on premises, but for applications that need frequent updates, such as bug fixes or new features, the cloud might be the better option.
"For [some] workloads, we would like to have changes implemented as fast as possible," said Baillargeon. "And the only way to get to that point is the cloud approach."