Here’s a seemingly obvious truth for you: there’s no such thing as an infinite TAM. Any market can be saturated, and as saturation approaches, every market can expect to see its growth rate slow. So it is with wireline broadband in general, and cable broadband in the US in particular, according to a Light Reading piece. The thing I find interesting about these discussions is that people are happy to talk about the trend and seemingly reluctant to accept the cause. Is increased competition behind a decline in subscriber growth when everyone is reporting a decline? A zero-sum game with losers needs winners too, so the truth here should be obvious, as I said.
The wireline broadband market TAM, the total addressable market, tends to grow as the number of financially viable target households grows. That, in turn, is driven largely by population growth, perhaps even more by the growth of the middle class, and by the impact of any broadband subsidy programs. But the TAM doesn’t just cap a market; growth is almost always shaped by how far along you are in the adoption curve. In the early 2000s, broadband Internet was in its infancy and growth rates were high. As the market matured, we not only worked through more and more of the incremental prospects, we picked the low-hanging fruit first: the households with the highest tech literacy and the greatest willingness to pay.
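To see why the adoption curve alone guarantees a slowdown, here’s a minimal sketch of a logistic (S-curve) adoption model in Python. The TAM, steepness, and midpoint values are hypothetical illustrations, not market data; the point is only that annual growth peaks early and then falls toward zero as penetration nears the TAM.

```python
# Illustrative only: a simple logistic (S-curve) adoption model showing why
# subscriber growth slows as a market approaches its TAM. The TAM, growth
# steepness, and midpoint below are made-up parameters, not real data.

import math

TAM = 100_000_000   # hypothetical addressable households
k = 0.45            # hypothetical steepness of adoption
t_mid = 12          # hypothetical year at which adoption hits 50% of TAM

def subscribers(year: int) -> float:
    """Logistic adoption: subscribers approach the TAM asymptotically."""
    return TAM / (1.0 + math.exp(-k * (year - t_mid)))

prev = subscribers(0)
for year in range(1, 25):
    cur = subscribers(year)
    growth_pct = (cur - prev) / prev * 100.0
    print(f"Year {year:2d}: {cur/1e6:6.1f}M subs, {growth_pct:5.1f}% annual growth")
    prev = cur

# Growth rates are high early on, then fall toward zero as penetration
# nears the TAM, even though nothing about demand or competition changed.
```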
Where we are today in broadband reminds me of where we were with voice in the 1980s. Technology was making bits cheaper, and so there was growing competition for the long-distance calling space because it dodged the high cost of providing access. Voice didn’t consume many bits, so transporting it in aggregate was cheaper by the day. To break out of this, what was needed was a broadly used service that consumed a lot of bits, something that people would value. The Internet gave us that, but since then what’s happened? We’ve run out of opportunity for revenue growth there too.
What competition has really done in broadband is eliminate any easy path to increasing average revenue per user by charging more for service. I had a special rate on my home broadband, negotiated by threatening to change providers, and as its expiration approached, the operator contacted me unprompted and extended the offer. So no new households spring up, and there’s no willingness to pay a higher price for services either. Consumers cluster at the low end of the service plan inventory because they don’t need more. I’d bet that my home Internet is twice as fast as I actually need, and I’m at the low end of the price/capacity spectrum. Along the way, in fact, I got a 50% increase in my speed and didn’t even notice a change.
This is the issue operators face, the thing we call “commoditization”. Almost everyone does things online today that they didn’t do before; I stream all the content I consume at home, for example, and yet this increase in usage hasn’t driven me to buy top-tier services. This, I believe, means that operators have missed an important truth: the potential revenues from the increased bandwidth needed for new online services will never offset competitive- and technology-driven reductions in revenue per bit. If you want to make money on new services, you can’t just carry them, you have to build and offer them.
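A bit of back-of-the-envelope arithmetic makes the point. The growth and decline percentages below are hypothetical, not measured figures; the sketch just shows that when revenue per bit falls faster than traffic grows, total revenue shrinks no matter how much more the network carries.

```python
# Illustrative arithmetic only (made-up numbers): rising traffic doesn't
# rescue revenue when the price per bit falls faster than usage grows.

traffic_growth = 0.25         # hypothetical 25% more bits carried each year
price_per_bit_decline = 0.30  # hypothetical 30% drop in revenue per bit each year

revenue = 100.0               # index revenue at 100 in year 0
bits = 1.0
price_per_bit = revenue / bits

for year in range(1, 6):
    bits *= (1 + traffic_growth)
    price_per_bit *= (1 - price_per_bit_decline)
    revenue = bits * price_per_bit
    print(f"Year {year}: revenue index {revenue:5.1f}")

# Each year revenue is multiplied by 1.25 * 0.70 = 0.875, a 12.5% annual
# decline, even though the network carries 25% more traffic every year.
```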
This, I think, is why operators’ superficial strategies for things like IoT have failed to do what they hoped, and why most 5G hype turned out to be just that…hype. Connectivity cannot be made valuable in the abstract; it’s what you do with it that matters. Operators who want to focus on the former and let others handle the latter surrender all their pricing power and all opportunity to benefit from new applications, as they’ve been doing for decades.
So does that mean the separate-subsidiary community is right? There’s a conference on it today, but I don’t think it can come to any useful conclusion. First, would regulators allow telcos to fund such a subsidiary fully? That was forbidden in the past. Even if they did, all that does is let a new tortoise enter a race against a bunch of experienced hares. The OTT community knows how to do value-added services. If there are people in the telcos who also know, where have they been hiding? Who staffs the subsidiary? Will they try to hire everyone from the outside? No, they’ll pull key people from inside, from the pool of connectivity enthusiasts. Will they then be competitive? You tell me; I know my answer, and I’ll bet you do too.
This is the classic dilemma of the “smart versus dumb” network. I debated this with David Isenberg at a BCR event back in 2004. I took the “smart” side and won the debate in a setup that was designed to go the other way. But we ended up with dumb networks, so did I lose in the end? In one sense, yes, because we went the dumb-network route. In another sense, no. What I argued was that there was not enough profit in dumb networks to sustain investment, and to roughly quote my comment then: “There is no chance whatsoever in my lifetime that we’ll re-regulate. If we can’t change fundamental policies and regulations, what chance do we have to repeal basic economics? We’d have to re-regulate to retain capital credibility for the carrier industry in the dumb network scenario.” Isn’t that a pretty good description of the threat facing telecom today?
I also had an impromptu debate with an FCC Chairman at another event, and he said that the FCC was responsible, overall, for the health of the industry. I agree, and I also agree that the dumb network approach helped create the pace of innovation we’ve seen from the Internet. But could regulation have kept the whole industry healthy and still innovative? I think so. I still believe that barring settlement for premium services and handling, an element in most net neutrality policies, hurt the industry overall, and I know this was being discussed almost a decade before my 2004 debate. But most of all, I believe the damage comes from the fact that regulatory policy shifts with the political winds.
The health of the industry should have been the regulatory goal, and it was not. It can’t be now; you can’t stuff the deregulation genie back in the bottle. Could the industry work it out? There was, a couple of decades ago, a “standards” initiative that I think is the model that could work. The IPSphere Forum (IPSF) had no membership fees, was run by a “service provider council” whose meetings were open only to the providers themselves, and addressed services first and infrastructure second, rather than the other way around. The climate of the time wasn’t in favor of the kind of revolutionary thinking it generated, but now? It might work. In fact, it might be the only sort of thing that could.