Some religious historians will likely recognize the title “Contra Apionem”. My high-school Latin teacher would surely be amazed (were he still alive) by the fact that I sprinkle Latin into many conversations and writings. Well, in his honor, let’s introduce another similar-sounding phrase: contra aperturam, which translates as “against openness”. This probably sounds heretical, but I am not against openness; the phrase is a reflection of a trend that’s become very clear this year, one that impacts both enterprises and telcos, but the latter perhaps most decisively.
If you ask, very few will come out and say that they’re against openness in technology. Vendor lock-in in various manifestations has been a fear of tech buyers for decades, and even today well over three-quarters of buyers in any vertical will say that their tech vendors are trying to lock them in, are engaged in manipulative selling and pricing strategies, etc. But where the rubber meets the road, meaning where purchase orders are signed, there are forces that have changed the “open standards, open networks!” battle cry into lip service for many.
The essential idea behind openness in tech is that competition keeps vendors honest and prices at a realistic market-set level. That’s true, of course. Study after study, going back into the 1970s, has shown that you need at least three competitors in a market for fair-market dynamics to work. Over that same period, there have been few markets where the requisite three competitors haven’t existed, despite the fact that one essential step in ensuring we have at least that number has created its own issues and often succeeded only in principle.
The only way for open tech to work is for tech in a given area to be somewhat interchangeable at the device level. If it isn’t, then once a giant vendor gets their nose under your infrastructure tent, they can adopt interfaces and protocols that prevent you from introducing other gear. They can also gain an advantage in any area that has to interface with the stuff they’ve installed, spreading out like an evil stain to cover your budget, or so the theory goes.
Interchangeability means either de facto or de jure standards. Interfaces are often formally standardized; protocols sometimes are in the enterprise market, and often are in the telecom world. However, formal standards are increasingly problematic, for a number of reasons.
First and foremost, buyers of technology are under-represented in the standards bodies. The vendors have a lot of incentive to send teams to standards meetings and publicize their favored approaches, but buyers often see themselves as budget-constrained in their participation. I’ve been involved in a lot of standards initiatives over my career, and I’ve never seen one where the buyers of the target technology outnumbered and outspent the sellers. This problem is formidable in itself, but it leads to all the other problems, too.
The second problem is that standards-writing is a competitive playing field for the vendors. Every vendor is watching for something that helps, or hurts, their interests. Consensus is difficult to achieve in this environment, and often the compromises needed weaken the technology choices. Sometimes they derail the process completely. In about half the initiatives I’ve been closely involved with, one or more vendors were actively trying to defeat the whole standardization process.
The third problem is that standards take a very long time. With everyone jockeying for position, forward progress often occurs more by having everyone throw up their hands in disgust than by actually achieving a positive result. To win is to outlast, so the longer-lasting the process, the more likely it is that a given vendor can win. So, everyone dallies around.
Which leads to the final problem, which is that standards often lag actual market deployment needs. This is true not only because the processes are drawn out by the competitive dynamic, but also because the compromises made in the creation of standards often mean that it’s necessary, at least in the eyes of some, to go back and add to them. Open RAN is an example of this in the telco world, a follow-on by an industry group to a formal standard, 5G, set by the recognized standards body, the 3GPP.
Even if you get standards, though, there’s a final issue that has to be faced, and faced by buyers. It’s this issue that has done more to create the “contra” movement than anything. It’s integration, and it creates problems at every phase of a tech project.
Issue an RFP for a tech system and you likely get, from each vendor who receives it, a complete solution. Issue an RFP for a series of tech devices that you say will be used as a shopping list for building a tech system, and guess what? You’re the general contractor who has to put all the trades together to create something habitable. Then, you have to accept that if anything goes wrong, you get the problem resolution equivalent of the favored criminal defense, SODDI, meaning “some other dude did it.”
Over the last twenty years, both enterprises and telcos have been shifting their procurements to demand complete solutions rather than components of those solutions, and have been gradually reducing the number of vendors involved. In the 1990s, the average telco wanted five sources for any given area of their network. By the 2010s, that went to three, and in 2025, telcos tell me that they’re happy with two vendor options, and will more often than not pick only one. All to avoid the pitfalls of integration.
So why demand “openness”, and in particular the brand of openness that, in the name of encouraging innovation, works to create more and more players to integrate? That’s the question that operators have been confronting, as Vodafone’s decision to award the biggest piece of its Vodafone Three network to a single vendor, Ericsson, shows. That’s the wrong question, though. The right one is “what’s the goal of openness?”, or more correctly, “what is not the goal?”
An open technology deployment isn’t necessarily a multi-vendor deployment. A decision to buy from one vendor is not necessarily a repudiation of openness at all. What has happened in the open movement, in telecom in particular, is that it’s been seen as a way of breaking down incumbencies, of letting others get a piece of the pie. That’s fine for the “others” but it flies in the face of the buyers’ desire to minimize the chores of integration. Just because I want to shop five stores to fill my tool chest doesn’t mean I don’t end up buying all my power tools from one source.
There’s a lesson for AI here too. We have had, and we still largely have, only one broad-capability approach to AI, which is the hyperscaler-hosted approach, one that has the necessary number of competitors, each with a full-sized toolkit to offer. However, the actual market need, as enterprise buyers express it, demands a modular, software-component-like, self-hosted version of AI. OK, say many of the enterprise AI sources, we have a solution. True, but it’s not a complete solution. We have chips, servers, models, and tools, and all we need as buyers is the skill set required to build applications from that combination. Good luck having those skills in your workforce!
I think we can expect the “real” enterprise AI, and the real telco network technology, to succeed only if they’re offered by three or four giants who have all the pieces and can present a complete solution. I also think that the de facto, rather than the standardized de jure, approach is the only way we’re going to get there. That means that the real battle in AI is who, for enterprise AI, those three or four giants will be. We already know who they are for telecom.
