Figuring out what’s going on with the cloud is starting to look more complicated every day. A recent story quotes Gartner as saying that Amazon has 40% of the cloud market, beating rivals Microsoft and Google, who hold 21.5% and 7.7% respectively. Another story cites documents in an anti-trust case saying that Microsoft’s cloud revenues are lower than most analysts, including Wall Street analysts, had been forecasting. Google Cloud, though, had its first profitable quarter, and Wall Street is still saying that AWS is only “stabilizing” with a mere 12% growth in revenue. Almost every report on enterprise cloud usage says that users have moved more than 50% of their stuff to the cloud, and yet we’re not seeing a slowdown in server sales to enterprises. Are we doomed to cloud mysticism forever? Maybe.
The big problem here is that the cloud is (no pun intended) a really fuzzy thing to analyze. Nobody reports cloud revenues in a consistent way, and everyone who interprets the numbers seems to take their own slant on what they’re reporting on. The Gartner report talks about IaaS, for example, but it’s clear that what Gartner means by IaaS isn’t the original definition. Then there’s the fact that cloud revenues for all the major providers include both enterprise sales and sales to social-media and other online startups. This, according to what I’m hearing, is actually the majority of Amazon’s sales. Then we have assertions that Microsoft’s numbers might or might not include the Microsoft 365 stuff. We have no common ground to base comparisons on.
What I hear from enterprises has its own twists and turns. There are enterprise planners who do say that everything is moving to the cloud, but in the same companies there are others who say that nothing much has really “moved” at all, and that the sense of everything moving comes from the fact that most new application development is focused on cloud-hosted pieces of applications. Enterprises also rate IBM and Microsoft as being more strategically influential in their cloud evolution than Amazon, and rate Google much closer to Amazon in influence than they are in market share. Half of enterprises say that they are still “consolidating” or “stabilizing” their cloud spending, and the other half say that they’re adding new work to the cloud.
Can we make any sense of this? I think it may be possible to get at the truth by wading through data beyond the obvious: activities, commitments, and plans rather than spending and cost management. Here’s what I’ve dug out.
Let’s baseline with enterprise views of broad cloud trends. About four-fifths of enterprises say, as of August 1st, that they spend more on cloud services than they expected. Slightly less than that say that they spend more than they should, and about two-thirds say that they are looking to reduce cloud spending, on average, for the applications/components they now run in the cloud. But almost exactly that same two-thirds say that they have current projects under development that would lead to higher cloud spending, and only a quarter of companies say that they expect their cloud spending in 2023 to be “significantly lower” than in 2022, with (yes, again) almost exactly the same number saying it’s likely to be higher.
Now let’s try to see why companies think they spend more on the cloud than they should. A third of the group say that they did things in the cloud that should have been done in the data center. Some of this was moving applications or data that should not have been moved; some was shifting front-end handling of users to the cloud when it should have stayed where it was. Slightly over a third said that they implemented their cloud projects wrong, and the errors led to higher costs, and slightly under a third said they planned for the use of basic services and ended up using more advanced or managed services.
Enterprises who adopted serverless computing were twice as likely to say they paid too much as those who used more traditional services, and those who used pure IaaS were less than half as likely to think they overpaid for their cloud. If you try to strip out service decisions and look at application design, users who employed a lot of componentization for scaling and redundancy were one and a half times as likely to report their cloud costs were higher than expected. Overall, any decision made to reduce the operations burden on the enterprise increased the chance the user would say their cloud costs were higher than expected. The base model of a car is always a better buy than the higher trim levels, I guess.
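To see why usage-priced services like serverless can surprise people, it helps to sketch the shape of the math. Here’s a rough Python illustration; the prices and traffic figures are made-up placeholders, not any provider’s actual rates, but the pattern is the point: a flat-rate instance costs the same whether it’s busy or idle, while usage-based pricing scales with traffic and can cross over it.

```python
# Illustrative only: hypothetical prices, not any provider's real rate card.

def vm_monthly_cost(hourly_rate: float, hours: float = 730.0) -> float:
    """Flat-rate VM: you pay for the instance whether it's busy or idle."""
    return hourly_rate * hours

def serverless_monthly_cost(requests: int, gb_seconds_per_request: float,
                            price_per_million_requests: float,
                            price_per_gb_second: float) -> float:
    """Usage-based serverless: cost scales directly with traffic."""
    request_cost = (requests / 1_000_000) * price_per_million_requests
    compute_cost = requests * gb_seconds_per_request * price_per_gb_second
    return request_cost + compute_cost

# Assumed figures for the comparison:
vm = vm_monthly_cost(hourly_rate=0.10)  # fixed ~$73/month
low_traffic = serverless_monthly_cost(1_000_000, 0.25, 0.20, 0.0000167)
high_traffic = serverless_monthly_cost(50_000_000, 0.25, 0.20, 0.0000167)

print(f"VM: ${vm:.2f}")
print(f"Serverless at 1M requests:  ${low_traffic:.2f}")
print(f"Serverless at 50M requests: ${high_traffic:.2f}")
```

Under these assumed numbers, serverless is far cheaper at low traffic but several times the VM’s price at high traffic, which is consistent with teams discovering the overrun only after their usage grew.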
What’s a bit surprising is that almost three-quarters of the buyers of advanced features supposedly targeted at facilitating development, the so-called “web services”, report that their cloud costs were higher than expected. That’s significantly more than the share making that report among cloud users at large. If you ask whether advanced features ease development burdens and result in fewer application bugs, the users overwhelmingly say that’s true. Still, they also admit that it costs more, and a majority say that if they’d stuck with the basic IaaS services and done more development work, they’d have saved money overall, considering both operating and development costs.
Another interesting point is that enterprises who use a third-party platform (OpenShift, Tanzu, etc.) reported fewer issues with cloud costs; in fact, only two-thirds as many as the overall cloud user base did. One reason some in this group cite to explain this is that they use the same platform in their data center, so cloud planning is an extension of data center planning. Best of all, users of these platforms said that they believed the cost of basic cloud service (IaaS primarily) plus their platform licenses was still lower than the cost of obtaining equivalent higher-layer services from the cloud providers.
There are some interesting takeaways from all of this. First, enterprises aren’t yet fully comfortable with cloud planning at the operations or development levels. They attribute this to their difficulties in acquiring and retaining cloud-skilled personnel. Second, the best cloud users are those who use familiar tools to create their cloud on top of basic IaaS, not those who use specialized cloud features. Users say that familiarity doesn’t breed contempt, it breeds success. Third, hype is still “educating” more users in the cloud than we’d like, in part because almost all users say that cloud documentation tends to be polarized into either high-level gloss or a low-level slog for experts only. This in turn is generating user mistakes that are costing them money and time.
But the biggest takeaway I got was something that was more implicit than explicit in most responses, though a few thoughtful types brought it out in words. According to them, the cloud tends to disconnect developers from the tension between technology practices and costs. In a data center, many enterprises treat resource costs as a kind of paperwork, something developers never confront explicitly. In the cloud, developers are encouraged to think of an application as something that evolves as cloud-think evolves, exploiting new techniques and features, but they are less aware of what those things cost, and only a fifth of enterprises said that “cost testing” was an explicit part of their application development cycle. Those who used third-party platforms, which make resource utilization explicit, did a better job of cost assessment because they could apply the same techniques in the cloud. Those who didn’t have prior experience to draw on, lacking the kind of specific education they’d need to assess the trade-off between technique and cost, didn’t make that trade well.
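For a sense of what “cost testing” might look like in practice, here’s a minimal sketch: a budget check that could run as a step in a development pipeline, failing the build when a design change would blow the monthly budget. The resource names and rates are hypothetical placeholders, not tied to any real provider or tool.

```python
# A minimal sketch of "cost testing": a monthly cost estimate treated as a
# pass/fail check in the development cycle. Rates are hypothetical.
HYPOTHETICAL_RATES = {          # $/unit/month, assumed values
    "vm_small": 30.0,
    "managed_db": 120.0,
    "object_storage_gb": 0.02,
}

def estimate_monthly_cost(plan: dict) -> float:
    """Sum the plan's resource quantities against the rate table."""
    return sum(HYPOTHETICAL_RATES[res] * qty for res, qty in plan.items())

def cost_test(plan: dict, budget: float) -> bool:
    """Pass/fail check, suitable as a CI pipeline step."""
    return estimate_monthly_cost(plan) <= budget

plan = {"vm_small": 4, "managed_db": 1, "object_storage_gb": 500}
print(round(estimate_monthly_cost(plan), 2))  # 250.0
print(cost_test(plan, budget=300.0))          # True
```

The point isn’t the arithmetic, which is trivial; it’s that making the estimate an explicit, automated gate forces the technique-versus-cost trade into view at design time rather than on the first bill.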
My final interesting data point is this one. Only a tenth of enterprises said the cloud was “cheaper” than they thought it would be. Only a third said that cloud computing overall was likely to be cheaper than in-house hosting. The benefits they cited to justify what they admit is likely a more costly approach range from availability to scalability, but even here a narrow majority of enterprises admit that they never really tested or validated those claims. Were they all caught up in a “cloud moment”? It seems possible.
I think things go back to that third point about hype doing the education. We depend on technology more and more, and yet the number of truly qualified technologists is growing far more slowly than the number of firms that need them. Users rarely can match the pay levels of vendors, and even vendors are having a hard time filling jobs and retaining employees. This is a problem I’ve blogged about for decades, and I have to admit that I don’t see much progress toward a solution. Until we have one, we may be stuck with cloud trial-and-error to get to a good result.
