I blogged at the end of last year that the question about Oracle’s cloud service wasn’t whether it was hot, but why it was hot. I provided some answers then, and I want to point out another initiative that also demonstrates that Oracle is determined not to be just another cloud vendor. In fact, they may want to define the cloud and data center relationship of the future.
One of the big overlooked truths about the cloud is that everything isn’t moving there. If you compare the cost of a VM in the cloud with the cost of a comparable VM in a decent-sized enterprise data center, you’d find that the cloud is more expensive. Cost advantages for the cloud won’t pull applications out of the data center, but some applications, and even more components of applications, are and will remain candidates. That means that “hybrid cloud” is really what the focus of cloud computing has been all along, in a factual sense, and that’s particularly true when you consider the “edge computing” applications that drive IoT and that many believe are the cloud’s future.
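The cost comparison above can be sketched with some simple arithmetic. This is purely illustrative: every number below (the hourly rate, server price, VM density, and opex figure) is a hypothetical placeholder, not actual cloud or hardware pricing, and real TCO models include many more factors (staff, power, utilization, discounts).

```python
# Illustrative cost sketch only -- all figures are hypothetical assumptions,
# not real cloud or data-center pricing.

def monthly_cloud_vm_cost(hourly_rate: float, hours: float = 730) -> float:
    """On-demand cloud VM cost for one month of continuous (24x7) operation."""
    return hourly_rate * hours

def monthly_dc_vm_cost(server_capex: float, vms_per_server: int,
                       amortization_months: int = 48,
                       monthly_opex_per_server: float = 0.0) -> float:
    """Amortized per-VM cost of a data-center server hosting several VMs."""
    capex_per_month = server_capex / amortization_months
    return (capex_per_month + monthly_opex_per_server) / vms_per_server

cloud = monthly_cloud_vm_cost(hourly_rate=0.20)                # hypothetical rate
onprem = monthly_dc_vm_cost(server_capex=12000, vms_per_server=10,
                            monthly_opex_per_server=150)       # hypothetical figures
print(f"cloud: ${cloud:.2f}/mo per VM, data center: ${onprem:.2f}/mo per VM")
```

Under these made-up but directionally plausible assumptions, the steady-state cloud VM comes out several times more expensive than the amortized in-house VM, which is the point: for always-on workloads in a reasonably utilized data center, the cloud’s advantage has to come from something other than raw unit cost.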
All of the Big Three cloud providers have long provided a mechanism for hosting a piece of the cloud in the data center, largely to accommodate the fact that “the edge” today is mostly confined to the customer locations near the applications that are sourcing the events to be processed. Giving enterprises a piece of the cloud at their edge locations is smart not only because it facilitates the movement of traffic to the cloud, but also because if “the cloud” does evolve to include “the edge”, cloud providers will be able to absorb premises-edge applications easily, if those applications really do have public hosting potential.
Oracle knows this too, but I think they’re taking a different approach to the relationship between cloud and premises infrastructure. Instead of targeting some isolated and dispersed “edge devices” near manufacturing or other facilities, they’re actually targeting the data center. Compute Cloud@Customer is a fully managed piece of Oracle’s cloud, subscription-priced and complete with all the OCI features, but running in your own data center. The new service lets an enterprise adopt OCI but keep their data and processing within their own facilities, where latency is controllable, where hybrid connections are direct and in-house, and where security is handled in a way that enterprises have long accepted as comfortable.
Oracle is in a good position to do this because their database tools are already a fixture in many enterprises, which means that a lot of the data center part of hybrid cloud applications runs on Oracle database software. With the new service, the database can be run “in the cloud” and “in the data center” at the same time, and the cost (according to a study Oracle cites and two enterprises told me they’ve validated) is far less than running Oracle’s databases in another provider’s cloud.
The databases a company uses for its mission-critical applications are a major anchor point for the data-center piece of the hybrid cloud. Most enterprises don’t want to move these databases, period, and so they tend to keep the transaction processing piece of applications in the data center, moving the “front-end” piece to the cloud. This also lets current analytics tools running in the data center avoid the latency and cost associated with accessing data that’s been stored in the cloud instead of the data center. On the downside, it means that cloud analytics and AI incur those very costs and latencies, and if we assume that we’re actually heading toward what I’ve called a “metaverse-of-things” application framework down the road, the separation of front- and back-end elements can make it difficult to keep the “digital twin” foundation of MoT synchronized with the real world.
I think it’s very likely that the Big Three in public cloud will follow Oracle’s lead here, but none of them have the advantage that Oracle’s database offers. It’s a literal camel’s nose in the data center tent, waiting to pull in their hybrid compute offering. IBM, who I’ve noted is Oracle’s real rival in terms of cloud service innovation, is of course also likely to follow suit, but since both IBM’s and Oracle’s databases are entrenched in their customers’ data centers, that competition isn’t likely to keep Oracle from exploiting its own database incumbency. It might have an impact on uncommitted customers, though, or on enterprises who make a major change in their application architecture and thus could consider a different database technology.
It’s also a potential answer to the current pushback against public cloud hosting due to cost overruns. The data center can be not only the data center of old, but the cloud of the future. There doesn’t have to be any hybrid connection at all. That could be true for some very governance/security-sensitive applications or even for all applications. In effect, the Oracle service could become a form of data center outsourcing, something that could appeal to a lot of enterprises, depending on how their own particular situation prices out with Oracle’s model.
All this is competitively interesting, but that’s not the only thing that could come out of Oracle’s announcement. It probably isn’t the most important thing either, at least for people not in Oracle’s sales food chain. That honor goes to the fact that the Oracle service creates a candidate model for a totally agile compute platform, something that could live in the cloud, the edge, the data center, multiple clouds, and pretty much everywhere. It could redefine the old notion of the “private cloud” as a virtual element that can be “private” or “public” interchangeably.
I’ve noted in past blogs that the enterprises most satisfied with public cloud service costs and capabilities were those that used the same platform software (Red Hat’s OpenShift or VMware’s Tanzu) in both data center and cloud. What Oracle has done is to create a data center model that transports the full cloud feature set into the data center, letting enterprises “write for the cloud” and run anywhere. That’s a pretty significant step, and it could be Oracle’s secret weapon or a feet-of-clay problem.
The potential benefit to Oracle here is obvious. Any enterprise that uses Oracle database technology can quickly unify the pieces of their hybrid application, and even get a platform that’s expensed and managed for them. Any enterprise without a firm commitment to a competitive database could be induced to consider Oracle based on these obvious benefits.
The potential problem is that this vision elevates what’s a database-and-data-center-centric evolution story to a universal virtual platform story. That’s a story that not only Amazon, Google, and Microsoft can tell, but that Red Hat and VMware can tell, too. The wider competitive field might not impact Oracle’s success with its own database users, but it could let those competitors snare uncommitted users.
Another interesting example of the broad impact of the Oracle service is its potential for eliminating traffic charges. I can’t readily determine how Oracle plans to assess charges for traffic between its premises service and its cloud; one enterprise tells me there is no charge and the other says there still is one. Whatever the current situation is, it would clearly be possible and even likely that Oracle would discount or dispense with traffic charges between its elements. If that’s the case, it removes a major barrier to where application components are placed, moved, or scaled.
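The placement argument can be made concrete with a minimal cost model. This is a sketch under stated assumptions: the compute cost, traffic volume, and per-GB rate below are hypothetical, and `placement_cost` is an illustrative helper, not any provider’s pricing formula.

```python
# Hypothetical placement-cost model -- all numbers are illustrative
# assumptions, not Oracle (or any provider's) actual pricing.

def placement_cost(compute_cost: float, monthly_gb: float,
                   egress_per_gb: float) -> float:
    """Monthly cost of hosting a component in one location: its compute
    cost plus the traffic charges for talking to the other half of the
    hybrid application."""
    return compute_cost + monthly_gb * egress_per_gb

# A chatty back-end component exchanging 5 TB/month with the data center.
with_fees = placement_cost(compute_cost=400, monthly_gb=5000,
                           egress_per_gb=0.09)   # hypothetical per-GB rate
no_fees = placement_cost(compute_cost=400, monthly_gb=5000,
                         egress_per_gb=0.0)      # fees waived between elements
print(f"with traffic charges: ${with_fees:.2f}/mo, without: ${no_fees:.2f}/mo")
```

With these assumed figures, traffic charges more than double the monthly bill for a chatty component, so placement decisions get dominated by data gravity. Set the per-GB rate to zero, as a waived-fee model would, and the decision collapses to comparing compute costs alone: components become free to move and scale wherever hosting is cheapest.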
Overall, I think this new service is another example of how second-tier public cloud providers in general have an opportunity to do disruptive things, transformative things, to the cloud market. Oracle seems to realize this, and seems more determined to take the necessary steps to leverage that opportunity. They also have a unique position in the database and hybrid cloud space that could well let them rise further and faster than other second-tier players, and end up having a major influence on the direction cloud services take in the long term.