Composability as a Catalyst for Scaling Digital Twins
Posted: Aug 15, 2024
Digital twins empower businesses to create virtual models of buildings, products, production lines, facilities, and processes. These digital replicas can help improve performance, surface quality issues quickly, and support more informed decision-making.
In an ideal scenario, engineers could swiftly develop increasingly complex digital twins to represent turbines, wind farms, power grids, and energy companies. However, this process is often complicated by the many components of a digital twin, which extend beyond physical models to include data management, semantic labels, security, and user interfaces (UI).
Innovative approaches to assembling digital components into larger models and assemblies could make this process significantly more efficient.
A New Approach to Modeling
The models used to build digital twins share similarities with those employed in analytics and artificial intelligence (AI), but there are notable differences. All of these models begin with the collection of relevant, timely historical data, which informs the model's design and keeps its outputs aligned with the system's current state.
Unlike in traditional statistical learning, the structure of a digital twin simulation model isn't derived directly from the data. Instead, modelers develop the structure through interviews, research, and design sessions with domain experts, ensuring the model addresses pre-defined strategic or operational challenges.
Involving domain experts in verifying and guiding the model structure is essential, but the time it demands tends to restrict simulation to high-value applications that require ongoing scenario analysis. Creating a digital twin model is a continuous process that demands careful choices about model granularity and system boundaries; this balance is crucial for ensuring the model fits the issues it is designed to address. If companies cannot appropriately limit the detail captured by a simulation model, achieving a satisfactory return on investment (ROI) will be highly challenging.
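As a thought experiment, the Python sketch below shows one way a team might record granularity and boundary decisions as an explicit, reviewable artifact. The class and field names here are hypothetical illustrations, not part of any digital twin standard.

```python
from dataclasses import dataclass, field

@dataclass
class TwinModelSpec:
    """Hypothetical specification for a digital twin simulation model.

    Recording granularity and boundaries as explicit fields forces the
    team to decide, with domain experts, what the model will not capture.
    """
    name: str
    granularity: str  # e.g. "component", "asset", "fleet"
    in_scope: list[str] = field(default_factory=list)
    out_of_scope: list[str] = field(default_factory=list)

# A turbine twin scoped for drivetrain analysis, deliberately
# excluding grid-level behavior to keep the model tractable.
turbine_spec = TwinModelSpec(
    name="wind-turbine-twin",
    granularity="asset",
    in_scope=["rotor", "gearbox", "generator"],
    out_of_scope=["grid frequency response", "electricity market pricing"],
)
print(turbine_spec)
```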
The Rise of Composable Digital Twins
This is where the Composable Process Technology (CPT) framework comes into play. The CPT framework provides diverse teams with a standardized approach for collaborating earlier in the development cycle. It includes a reference framework for addressing six key competency areas: data services, integration, intelligence, user experience (UX), management, and trustworthiness.
This framework can help businesses identify composability challenges that need either internal or external solutions. Additionally, it assists in pinpointing specific integrations at the capability level, enabling companies to build a portfolio of reusable competencies. This approach reduces redundant services and efforts, streamlining the digital twin development process.
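To illustrate, here is a minimal Python sketch of a capability portfolio organized around the six CPT competency areas named above. Only the six area names come from the framework; the individual capability entries are invented examples.

```python
from dataclasses import dataclass
from enum import Enum

class CompetencyArea(Enum):
    """The six CPT competency areas."""
    DATA_SERVICES = "data services"
    INTEGRATION = "integration"
    INTELLIGENCE = "intelligence"
    UX = "user experience"
    MANAGEMENT = "management"
    TRUSTWORTHINESS = "trustworthiness"

@dataclass(frozen=True)
class Capability:
    """A reusable digital twin capability tagged with its area."""
    name: str
    area: CompetencyArea

# A portfolio of reusable capabilities; the entries are illustrative.
portfolio = [
    Capability("time-series ingestion", CompetencyArea.DATA_SERVICES),
    Capability("OPC UA connector", CompetencyArea.INTEGRATION),
    Capability("anomaly detection", CompetencyArea.INTELLIGENCE),
    Capability("operations dashboard", CompetencyArea.UX),
    Capability("twin lifecycle registry", CompetencyArea.MANAGEMENT),
    Capability("audit logging", CompetencyArea.TRUSTWORTHINESS),
]

def capabilities_in(area: CompetencyArea) -> list[Capability]:
    """Check for an existing capability before building a new one,
    which is how redundant services and effort get avoided."""
    return [c for c in portfolio if c.area == area]

print(capabilities_in(CompetencyArea.INTELLIGENCE))
```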
Packaging Digital Twin Capabilities
A composable digital twin consists of six clusters of capabilities that help manage the integrated model and other digital twin instances based on it. It can also integrate IoT and other data services to provide an up-to-date representation of the entity the digital twin simulates. The CPT framework presents these capabilities in a periodic table format, making it independent of any specific technology or design.
By describing a digital twin in terms of its capabilities, it becomes easier to match a particular implementation with the technologies required to deliver those capabilities. This approach aligns with the broader industry trend toward modular business applications, allowing engineers, scientists, and other experts to compose and recompose digital twins for various business needs.
Moreover, it opens up opportunities for new packaged business capabilities that can be applied across different industries.
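The sketch below illustrates this idea: a digital twin is described only by the capabilities it requires, and each capability is bound to a concrete technology at composition time. The blueprint, bindings, and function names are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TwinBlueprint:
    """Describes a digital twin by the capabilities it requires,
    independent of any specific technology (periodic-table style)."""
    name: str
    required: list[str]

# Bind abstract capabilities to concrete technology choices.
# These bindings are invented examples, not recommendations.
bindings = {
    "time-series ingestion": "managed time-series database",
    "physics simulation": "in-house turbine model",
    "3D dashboard": "web-based visualization service",
}

def compose(blueprint: TwinBlueprint) -> dict[str, str]:
    """Resolve each required capability to an implementation,
    flagging gaps that need an internal or external solution."""
    resolved: dict[str, str] = {}
    missing: list[str] = []
    for cap in blueprint.required:
        if cap in bindings:
            resolved[cap] = bindings[cap]
        else:
            missing.append(cap)
    if missing:
        raise LookupError(f"unbound capabilities: {missing}")
    return resolved

# Recomposing the twin for a new business need means swapping
# bindings, not rewriting the blueprint.
farm_twin = TwinBlueprint(
    name="wind-farm-twin",
    required=["time-series ingestion", "physics simulation", "3D dashboard"],
)
print(compose(farm_twin))
```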
Challenges in Achieving Composability
Many digital twin projects are currently in the pilot phase or are limited to a single asset or process. While digital twins hold great promise for improving operational efficiency and reducing costs, challenges with composability are a major barrier to their widespread adoption. Engineers face difficulties in integrating the different ways that equipment and sensors collect, process, and format data. This complexity is further compounded by the lack of common standards and reference frameworks to enable seamless data exchange.
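A small Python sketch makes the integration problem concrete: each vendor (both hypothetical here) formats sensor data differently, so a separate adapter is needed per source to produce a common record. The payload fields and unit conversions shown are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:
    """A common sensor record that downstream twin components rely on."""
    sensor_id: str
    timestamp: datetime
    value: float
    unit: str

def from_vendor_a(payload: dict) -> Reading:
    # Hypothetical vendor A: epoch milliseconds and Fahrenheit.
    return Reading(
        sensor_id=payload["id"],
        timestamp=datetime.fromtimestamp(payload["ts"] / 1000, tz=timezone.utc),
        value=(payload["temp_f"] - 32) * 5 / 9,
        unit="degC",
    )

def from_vendor_b(payload: dict) -> Reading:
    # Hypothetical vendor B: ISO 8601 timestamps and Celsius.
    return Reading(
        sensor_id=payload["sensor"],
        timestamp=datetime.fromisoformat(payload["time"]),
        value=payload["celsius"],
        unit="degC",
    )

# Without shared standards, every new device type needs another adapter.
readings = [
    from_vendor_a({"id": "t-01", "ts": 1723708800000, "temp_f": 98.6}),
    from_vendor_b({"sensor": "t-02", "time": "2024-08-15T08:00:00+00:00",
                   "celsius": 37.0}),
]
print(readings)
```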
Companies looking to scale digital twins encounter several significant obstacles, including an immature data landscape, system complexity, talent shortages, and limited vertical integration in off-the-shelf platforms and solutions.
Weaving the Components Together
The next step involves developing a second-layer composability framework with more detailed definitions of capabilities. An associated effort toward a "digital-twin-capabilities-as-a-service" model will outline how digital twin capabilities can be defined and delivered in a zero-touch manner through a capabilities marketplace.
Ultimately, these efforts could lay the foundation for digital threads that connect processes across different digital twins.
In the near future, a digital thread-centric approach will become central to enabling integration at both the data platform and organizational levels. DataOps-as-a-service for data transformation, harmonization, and integration across platforms will be essential for making digital twin initiatives modular and scalable.
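As a rough illustration of that idea, the Python sketch below chains harmonization and aggregation stages so that output from turbine-level twins feeds a farm-level twin through one shared pipeline. The stage names, record fields, and figures are all hypothetical.

```python
from typing import Callable

# A DataOps-style pipeline: each stage transforms records and hands
# them to the next, so twins on different platforms share one thread.
Stage = Callable[[list[dict]], list[dict]]

def harmonize(records: list[dict]) -> list[dict]:
    """Rename vendor-specific fields to a shared vocabulary (assumed)."""
    return [{"asset": r.get("asset_id", r.get("id")),
             "power_kw": r["power_kw"]} for r in records]

def aggregate(records: list[dict]) -> list[dict]:
    """Roll turbine-level output up to a farm-level figure."""
    total = sum(r["power_kw"] for r in records)
    return [{"asset": "farm-01", "power_kw": total}]

def run(stages: list[Stage], records: list[dict]) -> list[dict]:
    for stage in stages:
        records = stage(records)
    return records

# Two turbine twins feed the wind-farm twin through the same thread.
turbine_output = [{"asset_id": "t-01", "power_kw": 1500.0},
                  {"id": "t-02", "power_kw": 1450.0}]
print(run([harmonize, aggregate], turbine_output))
```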
Peter is the Editor at AiTech365.com and works with his team on the latest technologies in AI.