You probably know by now that one of my pet peeves is a technology concept that’s essentially an invitation to self-definition. Such a concept cannot fail because you get to define it around your standard of success. It’s universal because everyone sees in it what they want to see. It’s great for generating media attention because it can’t ever get old because it can’t ever get pinned down. Such a concept is “digital transformation”. If you look the term up online you get the granddaddy of all glittering generalities. It’s the integration of tech, the rewiring of an organization. Well, what the heck does that even mean?
Maybe the saddest thing here is that we actually could come up with a good definition for the term, one that’s justified by decades of IT empowerment. The problem with that, besides its potentially killing off a great and durable hype wave, is that the objective definition doesn’t exactly fit the current perceptions.
The age of business IT began in the 1950s with the deployment of computer systems in commercial roles. At the time, the primary goal of IT was the analysis of business activity, so activity was captured after the fact by punching records into Hollerith-coded cards. There was value in this, of course, but it didn’t really do anything to change how business activity was conducted; it just recorded the result.
In the 1960s, computer systems got more powerful, but the big change was in the operating system. When IBM brought out the first true “mainframe” computer in the mid-60s, what made it different was its ability to support applications that were linked to business activity in real time. OS/360 was the beginning of transaction processing, of data communications. Because IT could now be projected outward to the work, rather than requiring a record of work to be captured, this was the real start of the “digital transformation” process.
Fewer than 5% of CIOs see it that way, probably because the term wasn’t used at the time. However, I had a chance to talk in some depth with 39 CIOs about the evolution of the role IT plays in business. When I asked them whether having online, real-time information available to workers was digital empowerment, they all agreed it was. When I asked whether having workers interact directly with core business applications to do their jobs qualified too, all but one agreed it did. When I asked what the “next step” in digital empowerment was, no answer got more than a quarter of the group’s support, and the one with the most support was “AI”. Is that likely because AI is the current hot tech button? I think so.
To me, there’s a trend here, visible if you look at the way IT spending growth and GDP growth have related to each other since the dawn of IT. We’ve had three distinct waves, corresponding to what I believe were the advent of mainframes, the advent of real-time minicomputer applications, and the personal computer. In each of these waves, we transitioned our IT model: first to a retrospective analytics approach, second to a real-time core application target, and third to a personal empowerment target. If there is a real future digital transformation step to be taken, what would it be, based on the steps of the past? To answer that, we have to look at that past sequence a different way.
Step one: exploit a better understanding of what business activity has been. Step two: create a direct link between the applications that support that activity and the workers who undertake it. Step three: give the worker a personalized resource to run the things they find helpful. In all these steps, what we’re doing is bringing workers to the IT resources. Could the next step be to bring IT resources to the worker? We do transaction processing in computer-application terms. Should applications support how workers actually do things, rather than requiring that workers do them the “IT way”?
It seems to me that the notion that digital transformation is “integration” and “rewiring” really has to embrace this evolution. We already use IT for jobs that are information-centric, and we do it by fitting the way we obtain and display information into the work practices. That’s fine where the job is information-centric and the work practices are malleable enough to adapt to IT. What about jobs that aren’t now information-centric and are tied to real-world conditions that have to be accommodated?
A “job” is a structured set of “tasks”. For many workers, those tasks revolve around the worker’s relationship with real-world elements, things like vehicles or warehouses or assembly lines or substations or refineries. We have to project IT into the real world, not expect the real world to conform to it. To do that, we need a way of making the real world visible to IT. We need a digital twin.
A “digital twin” in this context is a computer model of a real-world process, synchronized with that process via IoT sensors, and influencing the process with model-driven controllers, instructions or guidance to the worker, or both. Because the model is synchronized, it can provide relevant information and commands, and the better the synchronization the more relevant the model can be.
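To make that loop concrete, here’s a minimal sketch in Python of how a twin of a single process might work: IoT readings synchronize the model, and the model returns controller commands or guidance for the worker. The class names, sensor names, and thresholds are my own illustrative assumptions, not anything drawn from a particular product or standard.

```python
# Minimal sketch of the digital-twin loop described above: sensor readings
# synchronize a model of a real-world process, and the model in turn emits
# control commands and/or guidance for the worker. All names are hypothetical.
import time
from dataclasses import dataclass, field


@dataclass
class SensorReading:
    sensor_id: str
    value: float
    timestamp: float = field(default_factory=time.time)


class ProcessTwin:
    """A toy twin of one real-world process (say, a pump on a refinery line)."""

    def __init__(self, process_id: str, max_staleness_s: float = 5.0):
        self.process_id = process_id
        self.max_staleness_s = max_staleness_s
        self.state: dict[str, SensorReading] = {}

    def ingest(self, reading: SensorReading) -> None:
        """Synchronize the model with the real world as telemetry arrives."""
        self.state[reading.sensor_id] = reading

    def is_fresh(self) -> bool:
        """The fresher the synchronization, the more relevant the model can be."""
        if not self.state:
            return False
        oldest = min(r.timestamp for r in self.state.values())
        return (time.time() - oldest) <= self.max_staleness_s

    def advise(self) -> dict:
        """Turn the synchronized state into controller commands or worker guidance."""
        if not self.is_fresh():
            return {"guidance": "Telemetry is stale; verify conditions manually."}
        temp = self.state.get("temperature")
        if temp and temp.value > 90.0:  # hypothetical threshold
            return {"command": "throttle_down", "guidance": "Reduce feed rate."}
        return {"guidance": "Process nominal; no action needed."}


# Example: one sensor update drives one advisory cycle.
twin = ProcessTwin("pump-7")
twin.ingest(SensorReading("temperature", 93.5))
print(twin.advise())
```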
It’s interesting to me that all the technology pieces to support this kind of digital transformation are already available. All that’s needed is to integrate them, and that integration has been done in selective industrial spaces already. All we need is to think of a business as a real-world process that integrates a bunch of jobs, each of which integrates a bunch of tasks. Could we define a hierarchical digital twin of an entire business? I think we could, and I think that would achieve most of the various, vague definitions we see for digital transformation.
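Here’s an equally hedged sketch of that hierarchy, again in Python with invented names: task twins roll up into job twins, and job twins roll up into a business twin, so real-world state at the task level becomes visible at the business level. The simple “ok/blocked” status model is just an illustration of the roll-up idea.

```python
# A sketch of the hierarchy suggested above: a business twin composed of job
# twins, each composed of task twins, with status rolled up the tree. The
# names and the "ok/blocked" status model are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class TaskTwin:
    name: str
    status: str = "ok"  # e.g., "ok" or "blocked", fed by the task's sensors


@dataclass
class JobTwin:
    name: str
    tasks: list[TaskTwin] = field(default_factory=list)

    def status(self) -> str:
        """A job is blocked if any of its tasks is blocked."""
        return "blocked" if any(t.status == "blocked" for t in self.tasks) else "ok"


@dataclass
class BusinessTwin:
    name: str
    jobs: list[JobTwin] = field(default_factory=list)

    def report(self) -> dict:
        """Roll task-level state up into a business-level view."""
        return {job.name: job.status() for job in self.jobs}


# Example: a warehouse job with one blocked task surfaces at the business level.
picking = JobTwin("picking", [TaskTwin("scan"), TaskTwin("load", status="blocked")])
shipping = JobTwin("shipping", [TaskTwin("label")])
business = BusinessTwin("regional-warehouse", [picking, shipping])
print(business.report())  # {'picking': 'blocked', 'shipping': 'ok'}
```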
“Could” and “will” are, of course, very different things. As is often the case these days, we seem to be caught in what VCs call “boiling the ocean” traps. The scope of our opportunities is large, so large that while the opportunities are measured in the tens of billions of dollars, the complexity of the solutions means that the job of creating the necessary tools would likely have to be shared across multiple players. How does that happen? Open-source software, some sort of consortium? I think we’ll probably see a bit of both tried over the next couple of years, but for that to happen we may need at least a recognition of a big benefit, and surprisingly that recognition is probably going to be inhibited by the rise of AI.
We live in an age of bubbles, of hype and promotion. It’s not always without foundation, but it’s always at least a bit of an exaggeration. When a hype bubble bursts there’s an opportunity to find another, but while one is going forward successfully nobody wants to step on it. That means we can’t try to do something massive with metaverse-of-things or digital twins until AI runs its course. Unless, of course, somebody links all this to AI. Is that possible? Hey, anything is possible.