
How traditional IT shops can be more like public cloud providers

Want your data center to be more like an agile public cloud? Here’s how cloud providers handle apps, "big data," data center hardware and IT staffing.

LAS VEGAS – Enterprise IT shops that want to become more agile should take some cues from public cloud providers, which have raised the bar for efficiency over the past decade.

In a keynote speech at the Gartner Data Center conference here this week, research vice president Ray Paquet outlined the key differences between how enterprise IT and hyperscale cloud providers do business, and then offered some advice on how enterprises can be more cloud-like.

"If we did things similarly, we would drive down our costs dramatically," Paquet said.

Start with the application
"Not every application can scale out, and not every application needs to scale out, but you should identify the applications that are necessary to scale out," Paquet said, preferably starting with an application that isn't mission-critical. Further, push commercial vendors to do the same for their wares. But don't take a vendor's word that its app is cloud-ready. Instead, find out whether the application uses a shared-nothing design, asynchronous stateless communications and hardware fault tolerance.
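Those three traits can be sketched in a few lines of Python. This is an illustrative toy, not anything Paquet presented; the node names and request shape are made up. The point is that when a handler's result depends only on the request itself, any worker can serve it and a failed node's request can simply be retried elsewhere.

```python
# Toy sketch of the three cloud-readiness traits: shared nothing (no
# worker-local state is consulted), stateless communications (all context
# travels with the request) and fault tolerance (retry on another node).

WORKERS = ["node-a", "node-b", "node-c"]  # hypothetical node names

def handle(request):
    """Stateless handler: the result depends only on the request itself,
    so any worker can serve it and retries are safe."""
    return {"user": request["user"], "total": sum(request["items"])}

def dispatch(request, failed=frozenset()):
    """Send a request to any healthy worker; skip nodes known to be down."""
    for node in WORKERS:
        if node not in failed:
            return node, handle(request)
    raise RuntimeError("no healthy workers")

req = {"user": "alice", "items": [3, 4, 5]}
node1, result1 = dispatch(req)
node2, result2 = dispatch(req, failed={node1})  # simulate node1 failing
assert result1 == result2  # statelessness makes the retry transparent
```

An app that keeps session state on one box, or writes to shared mutable storage inside the handler, fails this test and won't scale out cleanly.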

Explore "big data"
When poking around for an application to get started with on the cloud, consider big data analytics. "If you want to start with an app with a true cloud architecture, big data is the app," Paquet said.

Hadoop, in particular, uses a scale-out, shared-nothing asynchronous communication model, and is designed to scale out linearly, thanks to its use of MapReduce. It also runs on x86 and upcoming low-energy servers, and uses Linux and the HDFS distributed file system. For these reasons, "Hadoop is the actual killer app for cloud implementations," he said.
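The MapReduce model behind that linear scale-out can be illustrated with a toy word count in plain Python. This is not Hadoop code, just a sketch of the pattern: each map task sees only its own chunk of data (shared nothing), so adding nodes adds capacity; a reduce step then combines the partial results.

```python
# Toy MapReduce word count. In Hadoop, the chunks would be HDFS blocks
# and the map tasks would run in parallel across commodity servers.

from collections import Counter
from itertools import chain

def map_task(chunk):
    """Each mapper sees only its own chunk -- no shared state."""
    return [(word, 1) for word in chunk.split()]

def reduce_task(pairs):
    """Sum the counts emitted by all mappers."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return counts

chunks = ["big data big", "data cloud", "cloud big"]  # stand-in for HDFS blocks
mapped = [map_task(c) for c in chunks]                # parallel in a real cluster
totals = reduce_task(chain.from_iterable(mapped))
assert totals["big"] == 3
```

Because the mappers never coordinate with each other, doubling the number of nodes roughly doubles throughput, which is the linear scaling Paquet is pointing to.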

What about big data vendors that aren't Hadoop-based? "Be very worried," Paquet advised.

Buy good enough and cheap
When it comes to designing and buying infrastructure, "Do not buy the most expensive things. Do not over-architect and engineer," Paquet said.

Whenever possible, use open-source software. Prepare for extreme low-energy servers and avoid SANs in favor of software-based storage with a distributed file system or object-store model. For networking, use inexpensive top-of-rack switches and get ready for software-defined networking and OpenFlow. In the data center, use free cooling and remove unnecessary redundancies.
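The object-store idea behind "software-based storage" can be sketched briefly. This is a hypothetical illustration, not any product's API: objects are mapped to inexpensive commodity nodes by hashing the key, so capacity grows by adding cheap boxes rather than buying a SAN.

```python
# Hypothetical object-store placement: hash the key to pick a node.
# Node names are made up; real systems add replication and rebalancing.

import hashlib

NODES = ["storage-1", "storage-2", "storage-3"]  # commodity servers

def place(key, nodes=NODES):
    """Deterministically map an object key to a node."""
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return nodes[digest % len(nodes)]

store = {}  # stand-in for the nodes' local disks

def put(key, blob):
    store.setdefault(place(key), {})[key] = blob

def get(key):
    return store[place(key)][key]

put("vm-image-001", b"disk bytes")
assert get("vm-image-001") == b"disk bytes"
```

Because placement is computed rather than looked up on a central controller, there is no expensive shared array in the path, which is what makes "cheapest, then good enough" workable for storage.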

"Figure out what's cheapest and then make it good enough," Paquet said.

Hire the right people
Being more cloud-like isn't just about what infrastructure to buy; it's also about the people who use it.

On this front, Paquet told attendees to be on the lookout for people with Linux and open-source skills, notably Hadoop, MySQL and Hive. But more important than specific open-source tools is an understanding of the open-source community. "Hire people that know these communities, because these are community-driven exercises." In other words, "it's more important to know who can fix it, than to know how to fix it yourself."

