Back in 2019, Tom Jenks wrote a great article that defined the Digital Twin and outlined its potential benefits at a high level. The technology has since come of age, thanks to advances in the Internet of Things (IoT), machine learning, and Artificial Intelligence. With access to a greater number of data points, coupled with a better ability to predict other variables, it is now possible to achieve far closer calibration between the physical and digital worlds. These developments have significantly increased the value of the Digital Twin.
Research findings validate that the Digital Twin is not just a passing fad (source):
- The Digital Twin will become a standard feature of IoT application enablement by 2021
- Over 90% of software providers now recognize the need for IoT APIs and Platform integration to support Digital Twin functionality
- Nearly 75% of the executives interviewed across a broad spectrum of markets plan to incorporate Digital Twin technology within their operations by 2020
What is Driving Adoption of the Digital Twin?
Clearly the market has embraced the Digital Twin as a viable technology worthy of investment – so what has been driving the interest? I think the better question to ask is “Why now?” The use of technology to model and simulate product or system performance has long been an engineering standard. Anytime it is possible to create a model that simulates real-world activities, the benefits seem to outweigh the costs.
Engineers couldn’t imagine designing new products without the use of CAD or PLM technologies; the Digital Twin is simply an extension of this same concept. What has changed is that some of the complexity has been removed from the process. Today’s technologies make it much easier to extract a greater number of measurements and data points, and the ubiquity of Internet access makes sharing that knowledge instant. The result is a Digital Twin with greater accuracy that delivers a far better virtual representation of products and systems.
Imagine the complexity of trying to simulate all the parts and systems within an automobile, including the engine, tires, seats and interior. Multiple factors contribute to wear and tear, such as heat, friction, and fluid condition. A model that could accurately predict future wear patterns would be extremely helpful for designing in quality and for anticipating when maintenance should be performed. The Digital Twin can now help overcome these challenges.
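To make that concrete, here is a minimal sketch, in Python, of how a twin might project a service date for a single component from live temperature and friction readings. The component names, nominal values, and wear coefficients are invented for illustration; a production twin would use calibrated physics or machine-learning models in their place.

```python
from dataclasses import dataclass


@dataclass
class ComponentState:
    """Snapshot of one monitored component inside the twin (illustrative only)."""
    name: str
    wear: float           # 0.0 = new, 1.0 = end of life
    temperature_c: float  # latest sensor reading
    friction_coeff: float


def projected_hours_to_service(state: ComponentState,
                               base_wear_per_hour: float = 1e-4,
                               service_threshold: float = 0.8) -> float:
    """Estimate operating hours until the wear threshold is crossed.

    The wear rate is scaled up when temperature or friction run above
    assumed nominal values -- a deliberately simple stand-in for the
    calibrated models a real Digital Twin would rely on.
    """
    heat_factor = max(1.0, state.temperature_c / 90.0)        # nominal 90 °C (assumed)
    friction_factor = max(1.0, state.friction_coeff / 0.04)   # nominal 0.04 (assumed)
    rate = base_wear_per_hour * heat_factor * friction_factor
    remaining = max(0.0, service_threshold - state.wear)
    return remaining / rate


if __name__ == "__main__":
    brake_pad = ComponentState("front brake pad", wear=0.55,
                               temperature_c=120.0, friction_coeff=0.05)
    print(f"Schedule service in ~{projected_hours_to_service(brake_pad):,.0f} hours")
```

Even this toy version shows the basic pattern: sensor readings flow into the virtual model, and the model turns them into a forward-looking maintenance decision.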
Here are a few ways this technology is now delivering even greater value to manufacturers.
1. Accelerating New Product Introduction (NPI)
One of the challenges with New Product Introduction is that new variants or process improvements may look fine in isolation, but when operating as a continuous system, circumstances can arise that simply could not have been reasonably anticipated, delaying product launches. With more robust Digital Twin technology, it is now possible to simulate far greater complexity across systems, and to maintain a high level of accuracy despite the growing number of product updates that affect the NPI process.
For example, a networking chip might be able to handle a higher volume of data packets than was previously considered possible, and the Digital Twin could reveal that headroom before the product ships. Add the performance of other complementary systems with an equal level of complexity, and it becomes clear that a new level of simulation is needed, which a Digital Twin can provide. The amount of data has become so large that new simulation tools are required just to keep up.
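As a rough illustration of that kind of stress test, the toy queueing model below drives a hypothetical chip twin at loads above its assumed rated throughput and reports the resulting drop rate. The rating, jitter, and buffer size are made-up parameters for this sketch, not figures from any real device.

```python
import random

RATED_PACKETS_PER_SEC = 1_000_000  # hypothetical datasheet rating


def simulate_chip(load_pps: int, buffer_packets: int = 500_000,
                  seconds: int = 10, seed: int = 42) -> float:
    """Return the observed packet drop rate when the twin is driven at load_pps.

    Toy queueing model: each second the chip services roughly its rated
    capacity (with small random jitter); anything that does not fit in the
    buffer is dropped.
    """
    rng = random.Random(seed)
    queued = dropped = total = 0
    for _ in range(seconds):
        total += load_pps
        serviced = int(RATED_PACKETS_PER_SEC * rng.uniform(0.95, 1.05))
        queued = max(0, queued + load_pps - serviced)
        overflow = max(0, queued - buffer_packets)
        dropped += overflow
        queued -= overflow
    return dropped / total


if __name__ == "__main__":
    for load in (900_000, 1_000_000, 1_100_000, 1_200_000):
        print(f"{load:>9} pps -> drop rate {simulate_chip(load):.2%}")
```

Sweeping the load like this is exactly the kind of "what if we push past spec?" question a twin can answer cheaply before a single prototype is built.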
2. Accommodating Greater Product Mix with a Dynamic Supply Chain
In this case we have two complex systems, the product mix plan and the supply chain, operating separately from each other, affected by different critical factors, yet dependent upon each other for overall throughput. Each is difficult to optimize on its own, and they are even harder to model effectively as a combined system. Here is where the value of a Digital Twin becomes increasingly apparent. By virtually representing and simulating highly complex, integrated systems and running scenario tests across the entire ecosystem, it is possible to be better prepared for a potential supply chain disruption. The same model can also expose ways to reduce the risk of downtime or quality degradation.
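A hedged sketch of what such scenario testing can look like: the Monte Carlo loop below combines an invented product-mix plan with invented supplier reliability figures and reports both the expected weekly output and the worst-case tail, which is the kind of question a combined twin is meant to answer.

```python
import random
import statistics

# Hypothetical data: which supplier provides the critical part for each product,
# planned weekly demand per product, and each supplier's on-time probability.
CRITICAL_SUPPLIER = {"sedan": "supplier_a", "suv": "supplier_b", "ev": "supplier_c"}
DEMAND_MIX = {"sedan": 400, "suv": 550, "ev": 250}
SUPPLIER_RELIABILITY = {"supplier_a": 0.98, "supplier_b": 0.92, "supplier_c": 0.85}


def simulate_week(rng: random.Random) -> int:
    """Units actually shipped in one simulated week.

    If a product's critical supplier slips (probability = 1 - reliability),
    half of that product's planned volume is lost for the week.
    """
    shipped = 0
    for product, planned in DEMAND_MIX.items():
        on_time = rng.random() < SUPPLIER_RELIABILITY[CRITICAL_SUPPLIER[product]]
        shipped += planned if on_time else planned // 2
    return shipped


if __name__ == "__main__":
    rng = random.Random(7)
    runs = [simulate_week(rng) for _ in range(10_000)]
    print(f"expected weekly output : {statistics.mean(runs):,.0f} units")
    print(f"worst 5% of scenarios  : <= {sorted(runs)[len(runs) // 20]:,} units")
```

Swapping in different demand mixes or reliability assumptions turns the same loop into a simple disruption "war game" for the combined system.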
3. Bill of Materials Management
Complex manufacturing today involves work orders with potentially hundreds of thousands of parts, thousands of suppliers, and millions of lines of programming code. The use of a Digital Twin can reveal patterns of performance that yield future productivity improvements. By modeling the production process for complex products in sufficient detail, extending back into the supply chain, it becomes possible to predict when a future product recall might occur and then proactively take steps to avoid it.
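For illustration only, the sketch below walks a toy BOM graph and flags the leaf parts whose hypothetical supplier failure rates exceed a recall-watch threshold; a real twin would work from the live BOM and field-quality data rather than these hard-coded dictionaries.

```python
# Hypothetical, heavily simplified data: the BOM as parent -> child part numbers,
# which supplier provides each leaf part, and per-supplier field-failure rates.
BOM = {
    "vehicle": ["battery_pack", "brake_assembly"],
    "battery_pack": ["cell_module", "bms_board"],
    "brake_assembly": ["caliper", "brake_pad"],
}
PART_SUPPLIER = {
    "cell_module": "supplier_x", "bms_board": "supplier_y",
    "caliper": "supplier_z", "brake_pad": "supplier_z",
}
SUPPLIER_FAILURE_RATE = {"supplier_x": 0.004, "supplier_y": 0.0008, "supplier_z": 0.012}


def flag_recall_risk(root: str, threshold: float = 0.01) -> list[str]:
    """Walk the BOM from the top-level product and return the leaf parts whose
    supplier failure rate exceeds the recall-watch threshold."""
    flagged, stack = [], [root]
    while stack:
        part = stack.pop()
        children = BOM.get(part, [])
        if children:
            stack.extend(children)
        elif SUPPLIER_FAILURE_RATE.get(PART_SUPPLIER.get(part, ""), 0.0) > threshold:
            flagged.append(part)
    return flagged


if __name__ == "__main__":
    print("Parts on recall watch:", flag_recall_risk("vehicle"))
```

The point is the traversal pattern, not the numbers: once the BOM and quality data live in one model, "which shipped products contain a suspect part?" becomes a query rather than an investigation.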
The Final Frontier
What a Digital Twin might be able to do in the future seems almost limitless. Perhaps the next use of the Digital Twin might even be our own brain! As (still) one of the most complex systems known to the human race, a Digital Twin that models how we think could unlock enormous insight into how we operate as humans, and how we might overcome some of our own shortfalls. Of course, this type of modeling would likely require the next generation of supercomputers (perhaps a quantum computer?) and some remarkable collaboration between brain researchers and computer engineers.