No matter how well standardized your servers and storage are, the whole IT platform won't work properly unless the network shapes up. Together, NFV and SDN squeeze out the proprietary vendor differentiators that hold networking back.
Network standardization is not a new problem, dating back to Ethernet's march over proprietary upstarts like Token Ring and ARCNET. Even where different speeds of Ethernet are involved, any new version is backwards-compatible with previous versions: A 10 Mbps Ethernet network interface card operates with a 1 Gbps local switch, which connects to a 10 Gbps core switch.
While LANs ubiquitously rely on Ethernet, wide area networks took longer to standardize. Today, full, end-to-end Ethernet transport capability is the norm as more optical fiber goes down and telecom operators support more data. So, why aren't we happy yet?
The problem is that IT departments handle mixed data workloads differently. For example, video and voice are real-time traffic: Any lag in the system can lead to major user experience problems. To cope, the IEEE created additional standards for traffic priority and quality of service -- 802.1p priority markings, carried in the 802.1Q VLAN tag -- and many telecommunications companies support Multiprotocol Label Switching. However, fully integrating every piece of network equipment in a chain isn't as easy as it should be.
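To make the tagging concrete: an 802.1Q tag is four bytes, with the 802.1p priority occupying the top three bits of the tag control information. The sketch below packs such a tag in Python; the field layout follows the standard, but the function name and the example values are illustrative only.

```python
import struct

TPID = 0x8100  # Tag Protocol Identifier that marks a frame as 802.1Q-tagged

def build_vlan_tag(priority: int, vlan_id: int, dei: int = 0) -> bytes:
    """Pack an 802.1Q tag: 3-bit PCP (the 802.1p priority), 1-bit DEI, 12-bit VLAN ID."""
    assert 0 <= priority <= 7 and 0 <= vlan_id <= 4095
    tci = (priority << 13) | (dei << 12) | vlan_id
    return struct.pack("!HH", TPID, tci)

# Voice traffic commonly maps to priority 5; here it rides on VLAN 100.
tag = build_vlan_tag(priority=5, vlan_id=100)
print(tag.hex())  # 8100a064
```

Because only three bits carry the priority, every switch in the path must agree on what each of the eight classes means -- which is exactly where multi-vendor chains start to fray.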
Many network equipment vendors build extra functionality into their products via network operating systems, such as Cisco IOS Software or Juniper Junos OS. Vendors often include functions that are incompatible with competitors' equipment, or adopt standards differently to ensure market differentiation. Problems arise once a data center moves from a homogeneous to a heterogeneous mix of equipment -- network traffic starts to suffer.
SDN and NFV to the rescue
Software-defined networking separates the three planes of a network switch. The data plane, responsible for moving packets of data from point A to point B, remains a hardware function. The control and management planes -- responsible for identifying and prioritizing traffic, defining the actions taken on it, and managing the device -- come out of the network equipment and run on standard servers.
OpenFlow is a standardized means of implementing and managing SDN. The Open Networking Foundation gathers technology vendors to drive adoption of the OpenFlow standard and SDN concepts.
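In OpenFlow terms, the controller pushes match-action rules down to switches, and the switch simply applies the highest-priority rule that matches each packet. The following is a hedged, library-free sketch of that flow-table model -- the field names loosely echo OpenFlow match fields, but the classes are invented for illustration, not a real controller API.

```python
from dataclasses import dataclass

@dataclass
class FlowRule:
    match: dict      # e.g. {"ip_proto": 17, "udp_dst": 5060} for SIP voice traffic
    actions: list    # e.g. ["set_queue:1", "output:2"]
    priority: int = 0

class FlowTable:
    def __init__(self):
        self.rules = []

    def add(self, rule: FlowRule):
        self.rules.append(rule)
        self.rules.sort(key=lambda r: -r.priority)  # highest priority matches first

    def lookup(self, packet: dict) -> list:
        for rule in self.rules:
            if all(packet.get(k) == v for k, v in rule.match.items()):
                return rule.actions
        return ["drop"]  # table-miss behaviour chosen for this sketch

table = FlowTable()
table.add(FlowRule({"ip_proto": 17, "udp_dst": 5060}, ["set_queue:1", "output:2"], 100))
table.add(FlowRule({}, ["output:CONTROLLER"], 0))  # catch-all: punt to the controller

print(table.lookup({"ip_proto": 17, "udp_dst": 5060}))  # ['set_queue:1', 'output:2']
print(table.lookup({"ip_proto": 6, "tcp_dst": 80}))     # ['output:CONTROLLER']
```

The point of the abstraction is visible here: the prioritization logic lives in software on the controller, while the hardware only needs to match fields and execute actions.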
The SDN ecosystem allows network deployments to adapt and change without operating systems blocking interoperation with other vendors' equipment. Functions are written and rapidly deployed at the software level, and the network equipment itself becomes a dumb box.
In practice, SDN leaves a little to be desired. SDN sends a lot of traffic from the physical to the abstract layer, which increases network chatter and latency. You could end up with degraded, not improved, network performance. Service providers, in particular, find SDN alone a poor fit.
The service provider sector has introduced a more aggregated approach: network functions virtualization (NFV). A series of functions rolls up into a single action, minimizing network chatter. If you want to try out new capabilities, NFV lets you deploy within the shell of a nominal standard, without worrying about possible conflicts with existing standards at the physical level.
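The "series of functions rolled up into a single action" idea is essentially service function chaining. As a hedged sketch, assuming each virtualized function is modeled as a plain Python function that transforms a packet (the VNF names and packet fields below are invented for illustration):

```python
# Each "VNF" takes a packet (a dict) and returns a modified packet, or None to drop it.
def firewall(pkt):
    return None if pkt.get("dst_port") == 23 else pkt  # block telnet

def nat(pkt):
    return {**pkt, "src_ip": "203.0.113.1"}  # rewrite the source address

def monitor(pkt):
    return {**pkt, "monitored": True}  # flag the packet for traffic analytics

def chain(*vnfs):
    """Roll a series of functions into a single action, NFV-style."""
    def run(pkt):
        for vnf in vnfs:
            if pkt is None:
                break  # a drop anywhere in the chain ends processing
            pkt = vnf(pkt)
        return pkt
    return run

pipeline = chain(firewall, nat, monitor)
print(pipeline({"src_ip": "10.0.0.5", "dst_port": 80}))
# {'src_ip': '203.0.113.1', 'dst_port': 80, 'monitored': True}
print(pipeline({"src_ip": "10.0.0.5", "dst_port": 23}))  # None (dropped)
```

One chained call replaces several round trips between the physical and abstract layers, which is the chatter-reducing win the service providers were after.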
The right approach probably involves traditional networking, NFV and SDN -- SDN and NFV work well hand in hand, and not everything can be fully abstracted. There will still be instances where the network requires administration at a tactical level, forcing you to use a vendor's highly specific functions.
Networks still need intelligence at the hardware level, but it must be as standardized as possible. To safeguard your network roadmap, watch out for vendors that are going in a different direction than the rest of the industry. They might be taking you down a dead-end road. In the new world of highly hybridized, connected systems, there is little place for vendors' special functions.
About the author:
Clive Longbottom is the co-founder and service director of IT research and analysis firm Quocirca, based in the U.K. Longbottom has more than 15 years of experience in the field. With a background in chemical engineering, he's worked on automation, control of hazardous substances, document management and knowledge management projects.