Data center experts predict 2016 IT trends

Software-defined everything, security, IoT, APIs and other technologies top our experts' predictions for 2016 data center trends.

Data center experts touted 2015 as the year of change, but which 2016 IT trends will dominate?

Talk of hyper-convergence and the cloud dominated 2015 headlines. Some big vendors got bigger, as Dell announced its acquisition of EMC -- and, with it, majority ownership of VMware -- while HP split in two. IT organizations leaned on their DevOps teams to become more nimble. The surge of the Internet of Things (IoT) in consumer and business applications also promised major stresses on data centers.

With 2015 nearly over, the data center experts of SearchDataCenter predicted what 2016 IT trends will stick around for the foreseeable future.

Software-defined everything

The software-defined everything train will continue rolling into 2016, experts said. Whether it's software-defined storage (SDS), networking (SDN) or data centers (SDDC), IT consultant Jim O'Reilly sees 2016 as the year when software-defined fundamentally changes data centers.

"There's a point at which legacy [infrastructure] can't hold up against economics, and we're very close to it and even past it in some ways," O'Reilly said. "Next year could be the tipping point -- SDS and SDN will push this question very hard."

Industry analyst Joe Clabby is a bit more measured in his software-defined predictions. He sees storage as an area where businesses can save a significant amount of money by taking a chance on an unknown vendor with decent SDS functionality.

"2016 will be a year with a leveling effect from SDS," Clabby said.

SDS will allow new vendors with new functionality to compete with established players -- such as EMC and NetApp -- for business.

Not everyone thinks this will be a purely software-defined era; it may instead be a "hardware-assisted, software-defined" era, said Clive Longbottom of analyst firm Quocirca.

Even in a software-defined era, "there's still an opportunity for hardware innovation" when it comes to networking, O'Reilly said. He also noted the influence of hardware advances, such as solid-state drives replacing hard disk drives.

"The talk around [software-defined data centers] will begin to go away," Longbottom said. "It's holistic platform, SDDC."

IoT and security

The rise of IoT is real.

IoT is all about sensors capturing data to feed programs that find problems or create useful information from the data. Whether that's your Fitbit capturing sleep-cycle information, a device attached to the OBD-II port of a UPS truck to track mileage and location, or smart meters on the electric grid, all of that data needs to be stored and analyzed in a data center somewhere, said Paul Korzeniowski, a freelance writer covering cloud and data centers.
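As a rough, hypothetical illustration of the kind of payload those sensors produce, the Python sketch below posts a single telemetry reading to a back-end ingest endpoint for storage and later analysis. The endpoint URL, device ID and field names are all assumptions for illustration, not any vendor's actual API.

# Hypothetical sketch: one IoT sensor reading sent to a back-end ingest
# service for storage and analysis. Endpoint, device ID and field names
# are illustrative assumptions, not a real vendor API.
import json
import time
import urllib.request

reading = {
    "device_id": "truck-0042",      # e.g., an OBD-II tracker on a delivery truck
    "timestamp": int(time.time()),  # Unix epoch seconds
    "metrics": {
        "odometer_km": 182344.7,
        "latitude": 40.7484,
        "longitude": -73.9857,
    },
}

req = urllib.request.Request(
    url="https://ingest.example.com/v1/telemetry",  # placeholder endpoint
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    print("Ingest service responded:", resp.status)

Multiplied across thousands of devices reporting every few seconds, readings like this one are what drive the storage and analytics load Korzeniowski describes.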

More companies will jump on the IoT bandwagon, Clabby said. All of these companies -- enterprise, financial and healthcare -- use data centers to support their current applications. The question is how companies prepare for the inevitable onslaught of IoT services. The answer, he said, is to not use x86 for everything.

"Use the right machine for the right job," Clabby said.

That means using a mainframe if you're doing analytics, or clustering IBM Power systems to get faster results, since IoT is all about finding relevant information fast.

Security of all that data becomes the next order of business, especially since many IoT-enabled companies store the data they collect in the public cloud. IT pros know that security is a big issue, but executives are reluctant to spend more money on IT. However, several high-profile data breaches in 2015 have pushed the issue to the forefront.

Infrastructure vendor market

The data center experts were unanimous about changes in the infrastructure-vendor landscape that IT buyers must navigate. The shift of workloads to the cloud affects traditional data center vendors, as seen in Dell's deal for EMC and VMware, HP's split in two and the Symantec/Veritas breakup.

Amazon Web Services is competing with IBM and Oracle, convincing businesses to migrate modern and legacy systems to AWS. Microsoft's Azure has also made an impact with legacy systems. In the shadow of the cloud, business leaders don't see the point in spending a lot on IT infrastructure when they can just shift their systems to a provider.  

"Companies are looking to spend as little as possible," Korzeniowski said. "So what is more valuable: cost or features?" 

And incumbent vendors are unsure how to evolve in a growing market that is feeling the impact of globalization and the agility of smaller companies.

"Midsize service providers are getting acquired by the bigger ones," Clabby said. "[Hewlett Packard Enterprise] is out of public cloud, while IBM [went] all in on it with SoftLayer. This confusion from the major players won't be over in 2016."

Importance of APIs

To stay relevant, IT pros need more experience with software and APIs, which allow them to automate the data center.

"The people involved need to gain new skills," Longbottom said. "They need to learn APIs [and] virtualization management systems. Every company comes up with its own API -- they need to talk across back-end systems."

Clabby sees 2016 as the year when vendors come out with API tools.

"You don't want to have to support all these connections -- it would be best to have a vendor do it -- programmatic interface, send and receive and execute request," he said. "What does the application require, and how does the API tool improve it?"

But it isn't just about managing APIs with software from the likes of Apigee, CloudBees and Varnish. Longbottom stressed the importance of knowing what sits above the API level, and of understanding programming rather than just scripting.

"Everything [IT pros] do in that software layer will have a bigger impact than hardware advances."

About the data center building

Robert McFarlane, who heads up data center design for the international consulting firm Shen Milsom & Wilke, thinks that data center infrastructure management (DCIM) will settle down into the background in 2016, while retaining its importance as an essential tool for IT and facilities teams.

"I don't want to say that [the DCIM market] will have matured, because I don't think it will have," McFarlane said. "It's just going to fall away from being the forefront industry buzzword."

With IT systems increasingly a hybrid of virtualized, cloud, on-premises and colocation environments, having the right monitoring and information management tools is essential, Longbottom said.

Another big topic for facilities will be the new ASHRAE efficiency standard 90.4, slated for release in 2016. The new standard will correct what some might call mistakes in the previous 90.1 standard, which subjected data centers to prescriptive requirements rather than performance-based ones -- an approach that would have had serious consequences for reliability. McFarlane, who is a voting member of the ASHRAE SPC 90.4 standards committee, noted that there is a difference between publishing a standard and the point at which it is widely adopted and enforced.

Next Steps

See the latest IT trend predictions from Gartner

Top skills to develop in 2016

Determining data center design requirements
