It’s likely not a surprise to anyone who has followed my views to hear that I don’t think AI is going to revolutionize the WAN. The data center network, yes. It may surprise some to hear me say that I think AI isn’t the only force acting on the data center, though, and to hear that the other perhaps-major force on the data center would likely revolutionize the WAN too. This other force might also be the major driver of AI, which might kind of close the circle here. Complicated, huh? So let’s get to it.
Technology isn’t revolutionary in itself; it’s what we do with it that matters, and that’s true of network impact too. Networks carry data and support data movement, so if we want to track network change, we have to look at changes in data movement and their causes. Enterprises have already been tracking one such change, and one that AI applications are impacting: “horizontalization.”
In the old days of monolithic applications, everything followed a nice input-process-output flow model. A user at a terminal generated a transaction, which flowed to the computer system running their application, which generated some database activity, and then returned something to that user. You could plot this top-down, which is what led to it being characterized as a “vertical” flow.
In the 1990s, we saw some kinks in this simple model of data flow, emerging from a combination of the trend to componentize applications and the trend to integrate applications by creating paths between them for shared action support. This, of course, was “horizontal” traffic. One enterprise noted that, within their company, they’d seen vertical traffic roughly double in the last ten years, but horizontal traffic had exploded by twelve times.
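To put those two figures on the same footing, it helps to convert them to annual growth rates. This is a minimal sketch of that arithmetic; it assumes the twelve-fold horizontal growth happened over the same ten-year window the enterprise cited for vertical traffic, which the original comment doesn’t state explicitly.

```python
# Implied compound annual growth rates (CAGR) from the enterprise's figures,
# assuming both multiples apply over the same ten-year window.

def cagr(multiple: float, years: float) -> float:
    """Annual growth rate that turns 1x into `multiple` over `years` years."""
    return multiple ** (1.0 / years) - 1.0

vertical = cagr(2.0, 10)     # vertical traffic roughly doubled
horizontal = cagr(12.0, 10)  # horizontal traffic grew twelve-fold

print(f"vertical:   {vertical:.1%} per year")    # ~7.2% per year
print(f"horizontal: {horizontal:.1%} per year")  # ~28.2% per year
```

Even a modest-sounding multiple gap compounds into a dramatic difference in annual growth rate, which is why horizontal traffic came to dominate planning so quickly.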
The force behind all of this was the need to achieve more productivity by improving information integration, quality, and distribution. Early IT was a bunch of application silos, and horizontal traffic was the result of recognizing that businesses didn’t run on silos, but on the whole of their interactions with customers, suppliers, regulators and government agencies, and so forth. There’s strength in numbers, it’s said, and though the saying was talking about a different kind of numbers, it’s true for the business kind as well.
AI continues this trend, because what businesses want from AI is a set of agents that can do things that would otherwise require human action. That doesn’t necessarily mean closed-loop autonomy, but it does mean that the agents have to assimilate what the humans they augment or replace know or can know. That means more horizontalism, and the next and biggest force creates the most horizontal traffic of all.
Real-time services have a couple of faces of their own. One face, the most traditional, is the force of IT proximity. For seventy years, progressive growth in IT commitment and IT spending has been driven by bringing IT closer to workers. We went from mainframe computers in glass rooms, tended by acolytes, to personal computers or even smartphones sitting on nearly everyone’s desk. We used to bring work to the mainframes, but we ended up bringing IT close to the work, making it a part of working.
But what about the no-desk situations? That’s the second of our faces of real-time services. It might seem like an evolution of the current trends, but it’s aiming at a different kind of work and workers. We’ve managed to enhance the productivity of those who could work through IT, but not those who’d need IT to work through them. Real-world systems are ones that make IT a cooperative piece of real-world processes, even ones involving people, and so they have to know about the real world, both conditions and rules. The new information relationships needed are a potential driver for AI, and of course new information relationships mean new information flows, which means new network requirements.
I think that the initial focus of real-time services will be one of facility optimization, working to improve process control within a plant or campus. Most of the information needed for this will surely have been collected by early systems, so this is more a compute or compute-and-AI manipulation of a world model (digital twin).
The orderly expansion of this first step would take things to larger facilities, where fixed conveyance (assembly lines, belts, and the like) inevitably gives way to moving vehicles of some sort, and where it’s more and more likely that human labor has to be integrated. Both these introductions require new data to be collected, and in the case of the worker additions, also likely require new things to be distributed to synchronize human and automated elements.
Running a world model, even for a “local world,” is computationally intensive and generates horizontal data flows within the model, which is likely to be distributed both in terms of hosting elements and in terms of physical location. Thus, the initial impact of our real-time evolution will be sharp growth in horizontal traffic. Yes, there will be new telemetry and control flows, but these will likely stay on private facilities for now.
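The mechanics behind that horizontal growth can be sketched simply: if a world model is partitioned across hosting nodes, each partition has to exchange boundary state with its neighbors on every update tick, and that east-west exchange is pure horizontal traffic. Everything below is hypothetical and purely illustrative; the partition names, mesh topology, and 4 KB boundary-state size are assumptions, not anything from the post.

```python
# Toy model of east-west traffic in a partitioned digital twin: each node
# owns one partition and pushes its boundary state to every neighbor per
# tick. Names, topology, and sizes are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class ModelPartition:
    name: str
    neighbors: list = field(default_factory=list)
    bytes_sent: int = 0

    def sync(self, boundary_state_bytes: int) -> None:
        # Push this partition's boundary state to each neighbor.
        for _peer in self.neighbors:
            self.bytes_sent += boundary_state_bytes

def tick(partitions, boundary_state_bytes=4096):
    """Run one model update tick; return total east-west bytes moved."""
    for p in partitions:
        p.sync(boundary_state_bytes)
    return sum(p.bytes_sent for p in partitions)

# Three partitions of a hypothetical campus twin, fully meshed:
a = ModelPartition("line-1")
b = ModelPartition("line-2")
c = ModelPartition("warehouse")
a.neighbors, b.neighbors, c.neighbors = [b, c], [a, c], [a, b]

total = tick([a, b, c])
print(f"one tick moves {total} bytes east-west")  # 6 peer updates x 4 KB = 24576
```

The point of the toy is the scaling: traffic grows with the number of partition adjacencies times the tick rate, so distributing the model more finely, or running it faster, multiplies horizontal flows without any new user-facing traffic at all.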
What expands us outward is the tightening of relationships between world models and human workers. It’s probably inevitable that, to the extent this trend is even recognized, it gets twisted into robots and humans working side by side, but that’s a development that neither enterprises nor I believe will come along before about 2030. Initially, it’s likely to take the form of guidance, relying first on a phone/tablet display and then on augmented/virtual reality. Eventually it may involve simply timing mechanical movement to synchronize it with human movement, but all of this will mean analysis of video images to determine what the human workers are doing and where they are in the cooperative process sequence.
This process of coordination between humans and machines is what ultimately generates the network traffic, even in the WAN. Some jobs, like almost all of them in the public safety sector and the military, are long on the need to collect “awareness augmentation” information, both to alert a person to conditions they need to handle and to populate a model that enables broader process control, like a “smart city” environment.
What this all means for data centers and networking is what will drive the future of both, and that’s next week’s topic!
