The idea of a full software-defined data center is surrounded by hype, but IT experts remain hesitant to accept widespread implementation as a reality. Even as large cloud providers, such as Google and Facebook, continue to pave the way, most of the industry trails behind, with a variety of obstacles standing in the way of a full software-defined data center.
The mystery of the SDDC
For starters, the definition of a software-defined data center (SDDC) remains unclear. Some vendors alter the definition of SDDC to align with their own products and services, adding to the confusion, said Richard Villars, vice president of data center and cloud at IDC, a research firm based in Framingham, Mass.
When most people talk about SDDC architecture, they refer to the IT equipment itself -- a way to more easily configure compute, storage and network elements. But that's not necessarily the whole picture.
"Those are all a big part of what's inside a data center, but data centers also include the power, cooling systems and a lot more of the structural things … If you want to talk about the whole data center itself, you have to include other facilities elements, as well," Villars said.
Other experts contend that SDDC is still a concept and not an existing product ready for purchase.
"It's such a difficult and expensive endeavor to actually virtualize -- or software-define, if you will -- an entire data center," said Christian Perry, principal manager and analyst at Technology Business Research Inc. in Hampton, NH. "It requires a lot of expertise and it's a process that takes time, and we haven't quite seen that yet."
A full SDDC architecture also requires too much movement of data between the hardware and software environments to work effectively, said Clive Longbottom, founder of Quocirca and a TechTarget contributor. Instead, companies may embrace a hardware-assisted, software-defined environment.
"As overall platform performance improves -- via flash memory, for example -- more can be moved to the software level," Longbottom said. "For now, the key is to have a software-level interface that can understand and manage the mix of software-defined and hardware-focused environments."
Adoption in increments
One way companies can make the complex SDDC process more approachable is to attack individual segments and focus first on their most crucial issue, Villars said. For example, IT teams that run into network flow and security issues with their existing compute environment should adopt software-defined networking (SDN). Other companies dealing with an explosion in new content and data must make their storage more scalable and efficient, and software-defined storage helps with that.
The technology has matured enough to allow people to build SDDCs, said Cliff Grossner, a research director at IHS Technology Inc. Large cloud providers such as Google and Amazon lead the way with sophisticated SDDCs, and the Open Compute Project provides further innovation, he said. Most organizations, however, deploy only one piece of their software-defined environment at a time.
More than 10% of data center network ports will be controlled by SDN by late 2017, Grossner predicted. And where SDN is implemented in a data center, it's likely that the compute and storage elements are already software-defined.
"We've seen server virtualization for a long time now, and that's well established," Perry said. "All of the different elements that would reside in an SDDC are out there, and they're being deployed sort of on a standalone basis."
For many companies, the challenges of adopting an SDDC architecture are organizational in nature. Two of the biggest barriers to SDDC adoption are the separation of various IT infrastructure groups such as networking, storage and servers, and an inability to form a coordinated vision of the organization's goals, Villars said.
To combat that challenge, many companies turn to a managed service option for SDDC, at least for the first six months, but often for the first year -- and sometimes forever. That strategy is fraught with its own political and business challenges "but it's a reflection of what the business units are really asking for: a much more agile delivery of services," Villars said.
Another obstacle is the lack of standards around SDDCs. Over time, the industry will settle on a consolidated set of standards, which will drive more momentum around SDDC use, Perry said.
But even with standards in place, widespread SDDC adoption won't happen instantaneously. Data center personnel must become familiar with the standards, as well as the technology. And once the standards are implemented, organizations will rush to find the talent needed to deploy SDDC environments.
"The amount of knowledge and experience that an administrator needs to have to deal with current IT infrastructure is already amazing," said Sander van Vugt, a consultant and TechTarget contributor. "There's a shortage in skilled staff around the globe."
For SDDCs to be widely adopted, data center administrators' skill sets must evolve, extending beyond the data center walls -- and this is already happening. "When you shift to this model, you move to a much more lights-out type operation," Villars said. "The only time[s] you're going to touch [a] device is when you install it, and when you move it because it's either broken or you're retiring it."
The day-to-day jobs of admins will become less about configuration and more about management -- analyzing pools of compute and storage resources and determining whether the organization is making full use of them, Villars said.
Moreover, virtualization will erode the siloed approach to IT and drive convergence. "We'll see more of an infrastructure administrator -- that's who we would expect to enable an SDDC," Perry said.