As open source technology has evolved over the years, a community of developers and IT teams from around the world has shared software code and hardware specs. But a lack of unification and vendor support has, in some cases, stalled its adoption in the enterprise.
Initiatives like the Open Compute Project (OCP), however, have increased the interest in open source, as organizations push the concept in their data centers. Here are frequently asked questions that examine recent advances in open source technology, as well as the benefits and challenges of using these technologies in the data center.
How does open source technology benefit data center admins?
Open source technology can bridge the gap between common data center issues in various locations around the world -- a problem that occurs in one location is likely to occur in another. The sharing of specs and code, along with the flexibility to edit or manage that code based on the needs of specific users, allows administrators to fix problems based on similar situations in other data centers.
With open source technology -- be it software, hardware or a cloud platform -- users can minimize the costly IT purchases and upgrades associated with proprietary offerings, and instead utilize code already available through open source sharing. This can also prevent vendor lock-in, as users can pick and choose the features or services they want, based on their needs.
What are some of the latest open source projects from the OCP?
Google and major telecom companies, such as AT&T Inc. and Verizon Communications, are making waves in the OCP, an open source initiative that aims to push more efficient server and data center designs.
Google teamed up with Rackspace to develop Open Compute servers that will use IBM's Power9 CPU after its release in 2017. The move signals Google's challenge to Intel's dominance in the CPU market.
Google also released blueprints of its 48V server architecture, marking the first time Google has publicized its designs through open channels. The design focuses primarily on the power mechanisms Google uses, shifting away from alternating current lines to an approach that reduces cooling requirements and boosts efficiency.
In the telecom industry, AT&T and Verizon recently joined the OCP Telco Project with the goal of sharing information on data center hardware and networking.
Adoption on a wider scale remains slow -- despite financial and oil exploration companies using the technology -- but cost considerations are pushing companies toward open source channels.
What other open source data center initiatives are there besides the OCP?
A new data center open standard has emerged that offers an alternative to some of the more complicated designs associated with the OCP.
The Open19 project, launched by LinkedIn Corp., attempts to appeal to a wider base than the OCP, focusing on servers and related equipment in a 19-inch data center rack. Support for multiple vendors helps avoid vendor lock-in -- a big appeal of Open19, particularly for smaller organizations seeking open source opportunities. The project also looks to shrink shared hardware in the rack and remove power supply units from servers. LinkedIn hopes to push the standard to a wider base, while coexisting with the OCP as another option for data center standards across different-sized racks.
VMware's new Project Photon -- a container-friendly platform to support cloud-native applications -- was also recently added to the open source mix. In conjunction with the new Photon OS, Project Photon aims to smooth the transition from server virtualization to managing cloud-native apps on an open source operating system. The move signifies a push from VMware toward container workloads, and could be a draw for developers.
What's the biggest challenge data center admins face with open source technology?
When deploying or running open source technology, the lack of professional support can leave IT scrambling. Even after combing through search engine results and discussion boards, admins still might not have an answer for an urgent question.
Professional support is lacking with open source tools, and although some vendors offer support services, they often come at a cost. When a primary driver to switch to open source is financial, spending money on the necessary support can create a dilemma. Some larger companies have the resources -- both from a financial and staffing standpoint -- to support open source hardware and software in the data center, but smaller organizations often struggle to do so.
When will open source become more prominent in IT, and what's holding it back?
Since Facebook launched the OCP in 2011, larger companies have made the shift toward open source adoption -- but smaller ones linger behind. There's no telling when, exactly, open source and OCP will become the go-to for data center hardware and software infrastructure, as problems with staffing and support still hamper adoption for small-scale operations.
However, the market for open source platforms such as OpenStack continues to grow. OpenStack revenues will exceed $2 billion by 2017, according to analyst firm 451 Research. Much of that growth will be in the service provider market, but OpenStack distributions and IT training services are projected to drive growth in the enterprise, as well.
With over 600 companies in attendance at the 2016 OCP Summit, including Microsoft, Apple, Google and larger financial services firms, momentum for open source products is clearly building at big-name companies. The leaders of the pack are making waves in the field, but adoption remains low in smaller data centers, where production-scale OCP deployments are still off in the future.