Six data center trends to expect in 2015

These six data center trends will move from leading edge to mass adoption in 2015. Experts and IT organizations that already took the plunge explain why.

This past year, IT pros matched their plans with business objectives, integrated cloud architectures and did everything for the sake of the application.

We asked industry experts, leaders and IT professionals which trends they expect to shape the data center field in 2015. Flash storage, DevOps, new infrastructure possibilities and the rise of enterprise open source options topped the list.

1. Open source

Open source communities ensure a fast rate of development and make new ideas freely available to all, even if the process is chaotic at times. Linux is the exemplar of open source for enterprise IT, with options ranging from free distributions to fully supported, stable distributions on a paid model.

Static, specific hardware is no longer the norm.
Adam Jollans, Linux and open source strategy manager, IBM

*The vast majority (95%) of IT organizations will use key open source elements in mission-critical workloads by 2016, according to Gartner, up 20% from 2010. That means enterprise IT shops must support both open and proprietary technologies. Gartner warns that open source adopters routinely undermanage those assets.

"An increasing proportion of workloads are built on top of open technology," said Adam Jollans, Linux and Open Source Strategy Manager at IBM. "Not just Linux, Apache -- now OpenStack, CloudFoundry, Hadoop, Docker, and the list goes on."

Open source will spread into IT hardware over the coming year, led by the Open Compute Project (OCP). So far, the OCP has had an "inspirational nature" for enterprise IT shops below Web-scale, said Jason Taylor, vice president of infrastructure at Facebook and an OCP board member. He expects open hardware to move into enterprise shops to solve real-world problems, and to grow among software providers that want to offer an "appliance option."

*Experts advise choosing open source options based on whether the technology fits its purpose better than proprietary or homegrown offerings, and based on quality and risk management.

*Information added following initial publication.

2. DevOps

DevOps is a strategy for modernizing and aligning IT to the business. It can help save traditional enterprises threatened by "born-on-the-Web" companies, said Shannon Poulin, vice president in the Data Center Group and general manager of Intel's Enterprise IT Solutions Group and Datacenter Marketing Group.

What's holding back your IT organization? It isn't just legacy infrastructure, which can be remediated fairly quickly over three- to four-year refresh cycles, he said.

"The problem is your processes, methods, business models and personnel organization -- people that are in silos that aren't tied to the business unit bottom line," Poulin said.

IT gets stuck in a rat race of delivering more capabilities, faster, for lower cost, without ever knowing whether it's delivering the right capabilities or cutting the right costs. With DevOps, by contrast, IT organizations can rapidly deliver a new service the business needs, at reasonable expense and with the agility to react to future unknowns.

"We started moving [from all on-premises data centers] into cloud to facilitate rapid innovation and scaling," said Justin Franks, lead cloud engineer at Lithium Technologies, Inc., a social software provider in San Francisco. "The old way of doing things wouldn't work well so we had to adopt a DevOps model: configuration management, automation, Chef, a lot of frameworks."

With service discovery and configuration tools such as HashiCorp's Consul, Lithium cut its cloud and on-premises server deployments by 15%. But new technology, new thought processes and new ways of operating mean a lot of training and change. Lithium isn't at continuous integration and deployment yet, and Franks hopes to see the company fully embrace the DevOps flow in 2015.
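Lithium's setup is its own, but the basic pattern of a service-discovery tool like Consul is straightforward: each host registers its services with a local agent, and any node can then look them up by name. Below is a minimal sketch against Consul's HTTP API; the service name, port and health-check URL are illustrative, not Lithium's configuration.

```python
# Minimal sketch: register a service with a local Consul agent over its
# HTTP API, then discover instances by name. Service name, port and
# health-check URL are illustrative placeholders.
import requests

CONSUL = "http://localhost:8500"  # default local Consul agent address

# Register a "web" service, with an HTTP health check polled every 10s
requests.put(f"{CONSUL}/v1/agent/service/register", json={
    "Name": "web",
    "Port": 8080,
    "Check": {"HTTP": "http://localhost:8080/health", "Interval": "10s"},
}).raise_for_status()

# Any node in the cluster can now resolve the service by name
for inst in requests.get(f"{CONSUL}/v1/catalog/service/web").json():
    print(inst["ServiceAddress"] or inst["Address"], inst["ServicePort"])
```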

Gartner Inc.'s researchers recommend operating IT organizations in two modes: Traditional cycles for stable systems of record, and fast-paced DevOps collaboration on systems of innovation and engagement.

Agile methods like DevOps and standardized frameworks like ITIL are both effective mechanisms that can butt heads or can facilitate success together, said David Jones, principal consultant at ITIL training and certification provider Pink Elephant in the U.K. The two don't have to be mistrustful or competitive; rather, they serve complementary functions.

"Agile means you want specific deliverables in place by a certain date," Jones said. "IT service management controls come along and say, 'You can't just put that in, mate -- it's got to be tested.'"

3. Software-defined for flexibility

Software-defined hardware was initially about consolidation -- getting 20 physical servers virtualized onto three host servers running 20 virtual machines. The software-defined infrastructure of 2015 and beyond is less about utilization and more about rapid service delivery and a flexible back-end system.

"I foresee more and more bandwidth at reasonable cost, which will transform locality of data and data storage tradeoffs like compression," Facebook's Taylor said.

Software-defined network and storage won't catch on uniformly across all data centers. Sectors that see immediate payoff -- telcos with software-defined networks, for example -- will adopt the technologies in 2015 with enterprise adoption staggered by need.

"Static, specific hardware is no longer the norm," Jollan said.

Staff changes will come along with DevOps and flexible infrastructure. While the number of servers per administrator scales up, these trends also bring in workload mobility and dynamic workload management.

"You need people to manage that, and probably more people than you expect," said Enzo Greco, VP and GM of software, Data Center Solutions at Emerson Network Power.

Try pilot projects in 2015, IBM's Jollans suggested, and get used to automated orchestration and reshaping platforms whenever workloads demand it.

4. Modular hardware

New and established hardware vendors are presenting more modular, configurable approaches, from hyper-converged infrastructure boxes to OCP's disaggregated racks that make upgrades modular.

"For 20 years we've been stuck in the mind-set of software tied to one server. ... [Disaggregated hardware brings together] several computers to present a service," Taylor said. "When you need more of something, you add in a sled of RAM. Don't open 1,000 servers and add in RAM to each one."

That concept of modularity extends to the processor level. Processors are expected to take on specific tasks and offload others to co-processors. With single cores reaching the compute limits of today's fabrication technologies, and workloads diversifying and intensifying simultaneously, servers will rely on processor heterogeneity: x86, GPU, IBM Power and so on, according to IBM's Jollans. Open source hardware at the chip level, as with the OpenPOWER Consortium, is enabling this related trend.

"The notion that you buy compute in a certain building block -- 1U, 2U rack-based system with X number of processors, X number of drives -- is going to come under pressure in the coming five years," Intel's Poulin said.

It will be the year of converged infrastructure (CI) adoption, according to 451 Research, an IT research firm based in New York. Enterprise IT will gain efficiencies from the integration of compute, storage and networking, although it might require adjustments.

"If you look at the cost of a CI box compared to that of a traditional server, it seems like a huge difference," said Steve Schaaf, CIO of Francis Drilling Fluids Ltd., an oil and gas transportation logistics services company that switched from IBM blade servers to the SimpliVity OmniCube in the past year. "But when you factor in the cost of backing up and disaster recovery, of upgrading storage, and huge maintenance costs that I don't have on CI, it adds up."

Schaaf got rid of offset server and storage upgrade cycles, and now invests in more OmniCube capacity when projects, such as adding a virtual desktop infrastructure, demand it.

5. Put a flash drive on it

Data center storage has evolved from all hard disk drives to incorporate solid-state drives and now non-volatile memory (NVM). In 2015, expect NVM to slot in anywhere you experience a bottleneck. Rapid flash adoption will push more organizations to segment data based on performance needs.

"Flash was a transformer for our databases, making them far more responsive with fewer outliers and lower latency," Facebook's Taylor said. "It changes how applications perform."

Smaller IT shops also see the benefits of flash. Thomas, Judy and Tucker P.A., an accounting firm in Raleigh, N.C., switched this year from direct-attached spinning disk storage to a StorTrends 3500i storage array that combines flash and disk, connected over iSCSI on 1 GbE links.

"[The direct-attached capacity] was preventing us from expanding," said Drew Green, the firm's IT director. "And snapshots on critical virtual machines slowed applications down to the point of unusability."

Green now hosts some critical VMs in the flash tier and assigns the most-used blocks of data to a caching layer of solid-state storage.

"We take snapshots all day long and there are no noticeable slowdowns," he said.

Hardware vendors are changing configurations to support flash drives on PCI-e interfaces for faster storage, Intel's Poulin said, which helps deliver the large amounts of fast storage that fields like big data analytics require.

Taylor expects better PCI-e cards as adoption scales. But it doesn't stop at PCI-e, or flash. New types of NVM are needed, and should connect directly into the processor, experts said.

6. Facility and IT accountability

Whether it's an owned data center or space in a colocation facility, your IT operations should be monitored and measured at every turn.

Data centers should better understand their consumption, right down to the individual IT asset, to align the physical layer with the IT layer, said Phil Fischer, data center segment manager at Eaton Corporation, which provides power infrastructure.

IT and facilities want better visibility and more clarity. In 2015, companies will improve how they integrate device-level monitoring with data center infrastructure monitoring and building management systems for end-to-end deployments.

"It's begging for integration," said Emerson's Greco. "The data center is still a very hardware-oriented facility with little monitoring and management."

Next Steps

Dig deeper into 2015 trends in server virtualization, including the new wave of containerization

See how the SDN hype will play out in the next year -- and budget cycle

It's all about space: Watch these six storage trends in 2015

Here's the new mobile tech about to bombard your data center

New year, new you? The changing role of IT

IT predictions from 451 Research, Gartner, Forrester
