According to a Light Reading article on Open RAN, “The virtualized, modular RAN will be here sooner rather than later and vendors will be tripping over each other as they try to get on board.” I agree with that statement, and with much of the article too. That raises the question of just what the success of an open-model RAN (O-RAN in particular) will mean for the marketplace, for buyers and for sellers alike.
There is no question that the relationship between hardware and software has changed dramatically, and the changes go back well before the dawn of Linux, where Light Reading starts its discussion. Back in the 1970s, we had a host of “minicomputer” vendors, names like Data General, DEC, CDC, Perkin-Elmer, and more. You don’t hear much about those players these days, do you? The reason is software. In the early days of computing, companies wrote their own software, but that limited computing growth. Third-party software was essential in making computing pervasive, and nobody was going to write software for a system that hardly anyone had. The result was a shift to an open-model operating system that could make software portable. At the time that was UNIX, not Linux, but Linux carries the water for open-model hosting today.
What we’re seeing now, with things like O-RAN and even white-box networking, is the application of that same principle to the networking space. 5G is demonstrating that hosted functions can play a major role in mobile networks, and they already play a major role in content delivery. Security software, which is an overlay on basic IP networking, demonstrates that same point. How long will it be before we see the same kind of shift in networking that we’ve already seen in computing? This is the question that Cisco’s software-centric vision of the future (which I blogged about yesterday) should be addressing. Short answer: not more than a couple of years.
The O-RAN model is particularly important here, not because it’s a new thing (as I just noted, it’s just the latest driver toward openness), but because it’s a bit of a poster child for what it takes for something that’s clearly in the buyer’s best interest to overcome seller resistance.
O-RAN as a standards-setter is dominated by operators, something that vendors have always hated and resisted. Past efforts to let network operators define their own infrastructure model have been met with resistance in the form of (at minimum) vendor manipulation and (at worst) threats of regulatory or antitrust intervention. While the O-RAN Alliance has recently had its share of tension, it seems to have navigated through it.
Why is this important? Well, Linux was the brainchild of Linus Torvalds, a legendary, visionary software architect who did the early work, building on the APIs that UNIX had already popularized. Other open-source efforts have been collaborative projects, increasingly under the umbrella of organizations like the Linux and Apache foundations. In short, we evolved a model of cooperative design and development, and one of the most important things about O-RAN is that it’s making that model work in the telecom space, where other attempts have failed.
It’s also important because of the unique role that 5G and O-RAN are likely to play in edge computing. Any salesperson will tell you that the first test of whether someone or some organization is a “good prospect” is whether they have money to spend. 5G has a budget and budget momentum, which means that a big chunk of carrier capex for the next three years or so will be focused on 5G infrastructure. What will that infrastructure look like? O-RAN’s goal is to ensure it doesn’t look like a traditional network, a vendor-proprietary collection of boxes designed to lock in users. Open-model 5G, including O-RAN, could bring us closer to the point where software is what’s important in networking, and devices are just what you run the software on.
What does this have to do with the edge? The answer is that if O-RAN, and 5G in general, delivers a “middleware” or “PaaS” that can support not only 5G elements but also elements of things like CDNs or general-purpose edge computing, or (dare we suggest!) IoT, then that set of software tools becomes the Linux of networking.
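To make the “common middleware” idea concrete, here is a purely hypothetical sketch. The names (`EdgeFunction`, `EdgeHost`) and the capacity-based placement logic are invented for illustration; no real O-RAN, 3GPP, or vendor API is implied. The point is only that one hosting abstraction could admit a RAN element and a CDN cache alike:

```python
# Hypothetical illustration only: a minimal "edge hosting" abstraction.
# EdgeFunction and EdgeHost are invented names, not any real O-RAN/3GPP API.
from dataclasses import dataclass, field


@dataclass
class EdgeFunction:
    name: str
    cpu_cores: int            # capacity the function needs
    latency_budget_ms: float  # how close to the radio it must run


@dataclass
class EdgeHost:
    site: str
    free_cores: int
    deployed: list = field(default_factory=list)

    def can_host(self, fn: EdgeFunction) -> bool:
        # A real platform would also check latency, accelerators, etc.
        return self.free_cores >= fn.cpu_cores

    def deploy(self, fn: EdgeFunction) -> None:
        if not self.can_host(fn):
            raise RuntimeError(f"{self.site}: no capacity for {fn.name}")
        self.free_cores -= fn.cpu_cores
        self.deployed.append(fn.name)


# The same abstraction hosts a 5G RAN element and a CDN cache alike:
metro = EdgeHost(site="metro-pop-1", free_cores=8)
metro.deploy(EdgeFunction("o-ran-cu", cpu_cores=4, latency_budget_ms=10.0))
metro.deploy(EdgeFunction("cdn-cache", cpu_cores=2, latency_budget_ms=50.0))
print(metro.deployed)  # ['o-ran-cu', 'cdn-cache']
```

If the deployment interface is the same regardless of what the function does, third parties can target the platform rather than the box, which is exactly the UNIX/Linux portability lesson applied to networking.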
The rub here, of course, is that Linux had the UNIX APIs (actually, the POSIX standard set derived from them) to work from, while for networking we’re going to have to build the APIs from the tools, designing the framework for edge hosting based on (at least initially) a very limited application like 5G/O-RAN. Not only is that a challenge in itself, but 5G in its 3GPP form also mandates Network Function Virtualization (NFV), which is IMHO not only unsuitable for the edge mission overall, but unsuitable for 5G itself.
O-RAN has at least somewhat dodged the NFV problem by being focused on the RAN and the RAN Intelligent Controller (RIC), which is outside the 3GPP specs. This happy situation won’t last, though, because much of the RAN functionality (the CU piece of O-RAN) will likely be metro-hosted, and so will 5G Core, and the latter is defined in NFV terms by the 3GPP. Will the 3GPP change direction and treat 5G as an edge application? Doubtful, and even if it did, that would likely take five years, and thus be irrelevant from a market perspective.
It also seems unlikely that the O-RAN Alliance will expand its scope (and change its name?) to address either 5G Core or edge computing in general. There’s little sign that the operators, who drive the initiative, are all that interested, likely because they’ve supported NFV and see no need to extend themselves into the edge at a time when they’re trying out cloud provider relationships to avoid that very thing. All these factors would tend to make another operator-driven alliance to address the edge issue unlikely to succeed as well.
So are we to wait for Linus Torvalds to rescue us? Well, maybe sort of, yes. It may be that a vendor, or perhaps a few vendors in concert, will have to step up on this one. The obvious question is which vendors could be candidates. Software-side players like Red Hat or VMware have 5G credentials and understand cloud computing, but they also seem wedded to NFV, which is useless for generalized edge computing. Network vendors have generally not been insightful in cloud technology. Cloud providers would surely have the skills, but would surely be trying to lock operators into their own solutions rather than create an open model, and that’s not likely to be accepted.
The big lesson of O-RAN may be that we’re only going to get effective progress in new applications of technology when users rather than vendors dominate the efforts. The best of open-source has come from executing on a vision from a visionary. We need to figure out how to turn buyer communities into visionaries, and that’s the challenge that we’ll all confront over the coming years.