Open standards model for VNFs is a boon to open source networking

The model will drastically streamline the compliance and verification process of bringing virtual network functions to market

Open Networking Summit Keynote

Linux Foundation Networking, together with the GSMA, has created the first standardised compliance and verification model to help network operators and equipment vendors approve networking apps and shorten time to revenue.

The model, created by the Common NFVI Telco Taskforce (CNTT), replaces the pre-existing approach whereby vendors bring virtual network functions (VNFs) to network operators, which must then test them before they can be deployed. Because the tests required vary by operator, this could be a very lengthy process; the new open model instead provides a single top-line test applied across the whole industry.


The new model will allow operators and vendors to profit more quickly from their VNFs and then re-invest that profit back into the open source life cycle, ultimately fuelling more rapid industry growth.

"The speed with which this group has been established and produced its first tangible results are a testament to the close cooperation and collaboration of its industry members," said Alex Sinclair, CTO of GSMA. "A common framework and approach will accelerate adoption and deployment in the 5G era and we look forward to aligning further with our partners on this important project."

While there are many excellent partners and vendors of networking equipment in the market right now, more must be added to diversify the supply chain if the industry wants to progress in a fully open and scalable way, according to Henry Calvert, head of future networks at GSMA.

Speaking at a panel discussion at Open Networking Summit 2019 following the announcement, Calvert also said key figures need to set aside competition and instead work collaboratively, adhering to and further building out the standardised set of testing procedures that will help drive the industry forward.


"Competition is always our problem, it is the downfall that we actually have," said Calvert during a panel discussion. "So let's see if we can just take that whilst we develop the testing and the conformance plans to get things right. Let's just put competition to the side and work collaboratively together."

The panel also discussed the lack of interoperability in enterprise networking and how moving towards a more open approach can benefit both vendors and network operators.

According to Beth Cohen, NFV/SDN strategist at Verizon, telecom companies don't actually compete over infrastructure because their customers simply don't care about it: they just want it to work. In order for that equipment to work well for Verizon's customers, vendors and operators need to commit to the CNTT model and decrease the time it takes to innovate.

When equipment goes wrong and services are disrupted, it's not the vendor's name that gets soured in the press, it's the network operator's. So, competition between vendors who often work in silos must be eliminated if the industry wants to improve the testing process and ultimately keep these equipment faults to a minimum.


If all vendors committed to open platforms which embrace continuous integration (CI) for upstream projects, standardised tests could be carried out to ensure all network equipment operates at acceptable standards.

Calvert said that committing to greater levels of collaboration can reduce the number of test cases that need to be carried out and improve time-to-market.

"It reduces the number of test cases that have to be done on the boxes, or on the system's old software, on the stacks, all the distros that you're actually pulling out there, but it reduces the number of case studies because they become learned and they become well understood," said Calvert. "People can drive through that very quickly. But can only happen through collaboration.

"If you're actually doing it in a silo, you're doing it for yourself, you're not doing for others."

A lack of collaboration isn't the only thing stifling the adoption of open network function virtualisation platforms. Following in IT's footsteps, networks are fast becoming cloud-native, and that shift presents some difficulties for operators.


Cohen said she thinks "cloud-native is going to make everything that much more difficult", particularly when it comes to testing. Lincoln Lavoie, senior engineer at the University of New Hampshire InterOperability Lab, added that while the industry must accept a cloud-native approach, it must also ensure a "consistent CI pipeline and testing strategy [or] it's going to be difficult".

