Preface
This article assumes some familiarity with Digital Twin concepts.
The Digital Twin & Machines' Journey to the Digital World
I started to focus on Digital Twins and the potential for Systems of Asset about 18 months ago; at the time, the topic felt like rare, secret material. Nowadays, Google sends me a daily alert with four hits on average. I used to curate all these mentions on a Pinterest board, but the volume is now too large to handle and I am starting to give up.
Much is being said on the Internet about the idea of modeling physical things (read: machines) and all the potential outcomes that businesses can derive from it. Fundamentally, it is a fairly simple idea. Machines started their journey to the Digital World with control systems: computers designed to operate the system within required specifications and to guarantee safety. In the Industrial space of complex machinery, safety is paramount. The next step was to use some of the data from these control systems for something else. An interesting example can be found in aviation: the flight data recorder. It captures a select number of parameters and records them to support audit and forensics in case of a catastrophic event. The various control systems of an airplane's subsystems were not designed for that; it was added afterwards. Interestingly enough, in the 1990s, British Airways started to do what most industries are now doing under the broad term "Industrial Internet": they looked at flight recorder data and used some of the derived insights to better maintain their airplanes. As instrumentation grew, with more sensors and better actuators, we can now fully digitalize machines.
So, the Digital Twin idea can be explained simply: because physical systems are now designed to be instrumented, you can collect data along their entire lifecycle: from invention, to design, to manufacturing, to operation and maintenance, until decommissioning. As all this data is persisted, and understood through proper metadata management, you can start to develop intelligence using analytics and machine learning. The accumulated data gives you a picture of past and present condition and performance. The intelligence gives you early warnings and predictions: in short, the future. Learn from the past and present, predict the future. The Digital Twin is the proxy of a physical system in the digital space: it can tell you everything it knows about that system and offer predictions. Over time, the Digital Twin will also be used to control the physical system it is paired to. Every physical system will have its own unique Digital Twin. Note that a complex system might not have a single monolithic Digital Twin, but rather a composite of the Digital Twins of its different parts; that, however, is beyond the scope of this discussion.
This technology serves important use cases in the industrial world: monitoring and diagnostics, maintenance optimization, individual and collective operations optimization, closing the loop from equipment performance back to equipment design, and more.
Now that we have set the context of the idea of the Digital Twin and its potential business value, I would like to reflect on the technological aspects and the possibilities.
The RDBMS Analogy
To better understand the technology vision and how it should materialize to open up new possibilities, there is an interesting parallel we can draw with a technology that dominated the 1980s: Relational Database Management Systems, aka RDBMS.
Digital Twin technology has three pillars: Digital Twin Classes (for a class of machines, say a specific model of compressor), Digital Twin Instances (for individual machines of that type, say Compressor #1057), and the Digital Twin Software Platform (where Digital Twin data is persisted and intelligence is built).
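To make the three pillars concrete, here is a minimal Python sketch of the first two; all names (`CompressorTwinClass`, `record`, the parameter names) are illustrative assumptions, not from any real Digital Twin platform. The class carries the shared definition of a machine model, each instance is the digital proxy of one physical machine, and the persistence of the accumulated history would be the platform's job.

```python
from dataclasses import dataclass, field

@dataclass
class CompressorTwinClass:
    """Digital Twin Class: shared definition for one machine model."""
    model: str
    parameters: list  # names of the signals machines of this model expose

@dataclass
class CompressorTwinInstance:
    """Digital Twin Instance: the digital proxy of one physical machine."""
    twin_class: CompressorTwinClass
    serial_number: str
    history: list = field(default_factory=list)  # accumulated lifecycle data

    def record(self, measurement: dict) -> None:
        # Keep only the parameters declared by the class; a real platform
        # would persist this history and build intelligence on top of it.
        self.history.append(
            {k: v for k, v in measurement.items()
             if k in self.twin_class.parameters}
        )

# One class of machines, one unique twin per physical machine.
model_x = CompressorTwinClass("Model-X", ["pressure_bar", "temp_c"])
unit_1057 = CompressorTwinInstance(model_x, "1057")
unit_1057.record({"pressure_bar": 7.2, "temp_c": 41.0, "noise": "ignored"})
print(unit_1057.history)  # [{'pressure_bar': 7.2, 'temp_c': 41.0}]
```

The split mirrors the analogy that follows: the class plays the role of the database structure, the instance's history plays the role of the records.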
RDBMS technology also has three pillars: Database Structures (metadata), Records (data), and the Database Engine (which, over time, became a platform).
Today's state of the art in Digital Twins is still not as mature as RDBMS technology. There is a language to create relational database structures: DDL, the Data Definition Language. Once you have the structure, records are created and manipulated through standardized statements: INSERT, UPDATE, DELETE and SELECT. Database engines optimize the indexing and storage of records, as well as the performance of those operations. None of these artifacts have been standardized for Digital Twins yet, and I personally think such standardization will be a critical step toward mass adoption of the concept.
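For reference, the RDBMS artifacts mentioned above are so well standardized that they ship in any SQL engine. A quick illustration using Python's built-in sqlite3 module (the table and column names are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL (Data Definition Language): declare the structure, i.e. the metadata.
conn.execute("""
    CREATE TABLE compressor_readings (
        serial_number TEXT,
        pressure_bar  REAL,
        temp_c        REAL
    )
""")

# Records are then created and queried through the standardized statements.
conn.execute(
    "INSERT INTO compressor_readings VALUES (?, ?, ?)", ("1057", 7.2, 41.0)
)
rows = conn.execute(
    "SELECT serial_number, pressure_bar FROM compressor_readings"
).fetchall()
print(rows)  # [('1057', 7.2)]
```

Nothing equivalent exists for Digital Twins today: there is no agreed-upon language to declare a twin's structure, and no standard verbs to create, update or query twin instances across vendors.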
If you drill down into the analogy, there are even more interesting insights, having to do with Master Data and the single source of truth.
Master Data and Single Source of Truth
When we started to deploy RDBMS, an explosion of Apps happened: pretty much every Enterprise started building database structures and loading records about its customers, products, transactions, etc. Long story short, this led to a Master Data management nightmare. Consider two examples. First, the definition of a customer would be deeply embedded in each App: the Ordering system would have its own, Invoicing a second one, and Customer Service yet another. Second, most Banking Systems today are still built on the initial assumption that you are not a customer but an account number. Banks' portals do a good job of hiding that, but not all of their operations do. If you call your bank to inquire about your checking account, you will first be authenticated. At the end of the conversation, the agent will likely ask if you need anything else. If you answer "yes, I would also like to discuss my investment account", the agent will be very happy to transfer you, and guess what the investment account person will do first: authenticate you again.
Vendors made fortunes selling App integration software (who remembers New Era of Networks, acquired by Sybase, now SAP, for close to half a billion dollars?). Until... the Salesforces of the world came in.
Salesforce is well known for its success as a SaaS vendor, riding the Cloud wave like no one else (except maybe Amazon AWS and, nowadays, Microsoft). But, much more importantly in my judgment, Salesforce had the vision of a common, extensible object model for the definition of a customer. It started with CRM, extended to Customer Service and Marketing, and opened it up as a Platform play. The rest is history.
I can clearly see the same path here.
Today, a lot of Industrial companies are focused on delivering outcomes, and they build or buy Apps that serve those outcomes. Pretty much all of these Apps embody a portion of the Digital Twin vision. Yet their Digital Twins are imprisoned inside the Apps: there are as many Digital Twins as there are Apps. This will lead to the same integration nightmares, until someone comes out with a platform play.
It took decades for Master Data Management to become a hot issue. We had that time because, when we started building single-purpose Apps, there was no Internet and no electronic ecosystems immediately forcing us to think in terms of a single source of truth, reuse and sharing. The infrastructure is here now, and Digital Business Platforms are a requirement for pretty much any business to stay competitive.
As we digitalize the physical world, I predict a lot of chaos in the short run because of the lack of a "single source of digital truth" for the physical world that we want to integrate into our digital economies.
The Digital Twin approach will be the solution, but we need vision, platforms and discipline to succeed. RDBMS revolutionized the way we handle transactional data and created the "Systems of Record" category. A lot of business value was created in the race to dominate that category. Digital Twin technology offers the same potential for "Systems of Asset" to emerge as a category.
It's really about the possibilities.
Note: the opinions and analyses expressed here are mine and do not necessarily reflect those of past or present employers.