Sometimes you strike gold, and that’s particularly true when you’re talking about freely offered user commentary on a hot technical issue. I had such an experience last week with an enterprise expert in the application of IoT and telemetry to business automation. An email chat was useful in itself, but the expert offered to follow up with a call, and so I spent almost an hour going over viewpoints. I’ll share as much as I can, consistent with my promise not to reveal the company, person, or any confidential activity.
At a high level, the expert identified four technologies that were critical to business automation. First was IoT, then edge computing, then digital twins, and finally AI. The order reflects the expert’s view of how automation has evolved, and I want to start by noting that the expert defines “IoT” as any source of status telemetry or any point of process control (“sensors” and “effectors” in IoT terms).
The expert’s view on “business automation” was interesting, and a good starting point. The contact came about because of my blogs on the empowerment of the roughly 40% of the workforce that doesn’t live at a desk. In the expert’s own company, the actual number is 47%, representing almost 60% of the company’s labor cost. This group of workers is responsible for the actual production, while the office people are almost totally focused on “administration”. I think this is representative of a lot of companies; where actual physical goods or hands-on service is what a company sells, a lot of people have to be providing it.
“Business automation”, to the expert, is the empowerment of worker activity through IT. In roughly a fifth of the cases within this company, the automation actually displaces significant human activity, and in the remaining cases it either reduces the human effort consumed or makes that work better and safer. That ratio also seems pretty consistent across this class of enterprise.
The company got started with “automation” over 30 years ago, using a combination of IoT and edge computing, but in the context of what used to be called “computer numerical control” or CNC. With CNC, the computing and the tools were integrated, supplied by one company. Over the next ten years, they evolved from a pure integrated CNC model to one where the tool provider offered software that ran on an edge platform that used more standard hardware and software. This was almost exclusively what came to be known as “embedded control” platform software, but the expert says it’s slowly evolving to a point where they favor a real-time version of Linux over a specialized OS.
The transition in platform strategy, according to the expert, is at least in part due to a shift from tool-integrated, vendor-defined systems to an approach where the facility being automated picks the optimum plant equipment and then applies custom or customized software to automate its operation. Sensors and effectors, providing the two ends of the control loop, are part of this. I worked on (and developed software for) transportation and facility-management applications in the early days of this transformation; RFID and hand-held devices figured in that early work.
Transitioning to the present, the expert says the problem with all of this is that software has to control the process, and it tends to be specialized to that process. That makes automation expensive, slow to deploy, and brittle, in the sense that it’s easily broken by changes in either the automation’s own tools or the plant elements it’s intended to automate. This is where digital twins come in.
The challenge with “process automation”, says the expert, is that sensor data doesn’t carry its own meaning in terms of the state of the overall process. OK, Sensor A is triggered by the movement of a part. What’s supposed to happen next? In traditional IoT, that’s determined by the software. Just short of 20 years ago, the expert’s company found that they were essentially writing model-driven software to better accommodate changes and improve reusability, making them perhaps a very early user of digital twin technology.
The notion of a process or systemic model of a real-world capability seems to be essential to successful automation. The model can exist implicitly, but it would be best if there were some sort of modeling framework, template, or language to make it explicit. That’s what the expert believes digital twin technology needs to bring.
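To make that concrete, here’s a minimal sketch of what an explicit process model might look like. The states, events, and actions are my own illustration, not anything the expert described, but they show the key point: the meaning of a sensor event lives in the model, not in the control code.

```python
# A minimal sketch of an explicit process model driving a digital twin.
# The states, events, and effector actions here are hypothetical.

PROCESS_MODEL = {
    # (current_state, sensor_event): (next_state, effector_action)
    ("awaiting_part", "part_detected"):   ("clamping",      "close_clamp"),
    ("clamping",      "clamp_confirmed"): ("machining",     "start_spindle"),
    ("machining",     "cycle_complete"):  ("unloading",     "open_clamp"),
    ("unloading",     "part_removed"):    ("awaiting_part", None),
}

class ProcessTwin:
    """Tracks process state and maps raw sensor events to their meaning."""

    def __init__(self, model, initial_state="awaiting_part"):
        self.model = model
        self.state = initial_state

    def on_sensor_event(self, event):
        key = (self.state, event)
        if key not in self.model:
            # The event has no meaning in the current state; flag it rather
            # than guess, so the twin stays a faithful mirror of the process.
            return {"state": self.state, "action": None, "unexpected": True}
        next_state, action = self.model[key]
        self.state = next_state
        return {"state": next_state, "action": action, "unexpected": False}

twin = ProcessTwin(PROCESS_MODEL)
print(twin.on_sensor_event("part_detected"))
# {'state': 'clamping', 'action': 'close_clamp', 'unexpected': False}
```

The payoff is that changing the plant or the tools means changing the model, not rewriting the software, which is exactly the accommodate-change-and-reuse benefit the expert said drove them toward model-driven design in the first place.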
Digital twin technology is what the expert sees as the current focus of IoT and business automation. If we had explicit twinning tools and a digital twin platform available to customize, it would advance the whole task of business automation considerably. The Digital Twin Consortium work in this area is the current focus of the expert’s company, largely because the consortium has both industrial/manufacturing and business projects underway. Their “periodic table” is of particular interest, but work based on it is still only in the planning stage at the expert’s company. For those interested, he recommends reading THIS paper first.
One thing critical to the digital twin concept is sensor coverage. The twin is only a twin to the extent that the properties of the real-world system being twinned are fully represented. However, the expert thinks it’s possible to “over-observe”, gathering information that isn’t actually useful and might even pose a risk, and in doing so increase the complexity of the twin and the processing and connection resources it needs. One example offered was having transportation twins hold the manifests of the goods involved, rather than holding a package code that could then be used to obtain the full manifest if needed. The former approach increases sensor traffic by over an order of magnitude, and if manifest information were intercepted it could be used to plan a hijacking.
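Here’s a rough sketch of the “reference, don’t replicate” choice that example implies. Everything in it, including the lookup_manifest stand-in, is my own illustration rather than the company’s actual design: the twin carries only a small package code, and the full manifest is fetched from a secured back end only when it’s actually needed.

```python
# Sketch: a shipment twin that carries a reference code instead of the
# full manifest. The names and lookup_manifest() are illustrative only.

from dataclasses import dataclass

@dataclass
class ShipmentTwin:
    shipment_id: str
    package_code: str   # small, low-risk reference carried in telemetry
    location: str
    status: str

def lookup_manifest(package_code: str) -> dict:
    """Hypothetical back-end call; returns the full manifest on demand."""
    # A real system would query a secured manifest database here.
    return {"package_code": package_code, "items": [], "declared_value": 0}

twin = ShipmentTwin("SH-1001", "PKG-77A3", "yard-4", "in-transit")

# Routine telemetry updates touch only the twin's small fields...
twin.location = "dock-2"

# ...and the manifest is resolved only for operations that truly need it.
manifest = lookup_manifest(twin.package_code)
```

The design choice keeps routine sensor traffic small and keeps the sensitive manifest data out of the telemetry path entirely, which addresses both of the risks the expert raised.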
AI is the next step in the progression, says the expert, who believes that to use AI effectively you have to package it in a digital-twin framework. Early AI initiatives using both fully open-source and Meta Llama models uncovered some issues that, while surmountable, were unexpected. Almost all related to training, and the core issue was getting the AI model to learn everything needed to perform that essential role of systemic awareness. It was fairly easy, the expert said, to do that with business analytics because the historical data to train on was almost always available. They found it best to first collect training data from composite sources to get all the needed information in one place; early trials found that things like the opening of new facilities, the introduction of new products, changes in the overall economic environment, and even changes in the mode of transportation of parts and finished goods tended to confuse the training if they weren’t recorded.
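A rough sketch of that “composite sources” idea, with entirely made-up dates, fields, and events: merge the operational history with a record of business-context events so that a model trained on the history can attribute shifts to known causes instead of being confused by them.

```python
# Sketch: tag telemetry history with the most recent business-context event
# (facility openings, product launches, transport-mode changes) before
# training. All values here are illustrative assumptions.

import pandas as pd

telemetry = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-05", "2024-02-10", "2024-03-15"]),
    "throughput": [980, 1450, 1430],
})

context_events = pd.DataFrame({
    "date": pd.to_datetime(["2024-02-01", "2024-03-01"]),
    "context": ["new_facility_opened", "parts_switched_to_rail"],
})

# merge_asof attaches the latest prior context event to each telemetry row,
# so regime shifts in the data arrive labeled with their likely cause.
training_set = pd.merge_asof(
    telemetry.sort_values("date"),
    context_events.sort_values("date"),
    on="date",
    direction="backward",
)
print(training_set)
```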
The biggest training problem occurred with process automation, where sensor/effector deployment was already under some form of computer control. Getting historical sensor data was usually difficult (it either wasn’t routinely saved or older data had been discarded), and the systemic relationships among sensor elements, along with the correlation between the events that start a control-loop process and the effector actions at the end of the loop, had to be inferred. This was the primary reason why the expert believed that digital twins, with their inherent ability to contextualize diverse events and actions, were essential for optimum use of AI.
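To illustrate that inference problem (not the expert’s actual method), here’s a toy correlation pass: pair each control-loop trigger with the effector actions that follow it within a short window, and treat the pairings that recur across many cycles as candidate links a digital twin model could then make explicit. The event names and the one-second window are assumptions for illustration.

```python
# Toy sketch of inferring trigger-to-effector links from event timing.
# Event names and the window length are assumptions for illustration.

from collections import Counter

trigger_events = [    # (timestamp_seconds, sensor_event)
    (10.0, "part_detected"), (40.0, "part_detected"), (70.0, "part_detected"),
]
effector_actions = [  # (timestamp_seconds, effector_action)
    (10.4, "close_clamp"), (12.0, "start_spindle"),
    (40.5, "close_clamp"), (70.3, "close_clamp"),
]

WINDOW = 1.0  # seconds: how long after a trigger an action still "belongs" to it

pair_counts = Counter()
for t_time, sensor in trigger_events:
    for a_time, effector in effector_actions:
        if 0 <= a_time - t_time <= WINDOW:
            pair_counts[(sensor, effector)] += 1

# Pairs that recur across cycles are candidate control-loop relationships.
print(pair_counts.most_common())
# [(('part_detected', 'close_clamp'), 3)]
```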
The expert is a believer in business automation at all levels, and in particular a believer in having a “company twin” and perhaps even a “market twin”. How quickly this could all come about will depend on how fast initiatives like the Digital Twin Consortium push things, or how far individual IT providers advance the state of the art. It’s a heavy lift for a single company to undertake, even a large one, and so it’s important that, somehow, collective work is organized and advanced.