[SEO Subhead]
This Guidance demonstrates how to use AWS services to create digital twins for industrial Internet of Things (IoT), spatial compute, and simulation use cases. It shows how to securely ingest and manage industrial IoT and spatial assets through comprehensive digital twin dashboards. It also illustrates how to ingest and manage on-premises spatial data with a spatial data plane, facilitating seamless integration of critical spatial information into the digital twin environment. Finally, it shows how the AWS Cloud builds, stores, calibrates, and orchestrates simulations using a diverse collection of storage and compute primitives, empowering you to conduct virtual testing, experimentation, and scenario planning.
Please note: [Disclaimer]
Architecture Diagram
High-Level Overview
-
This architecture diagram consists of three integrated modules that together form the digital twin framework: IoT, spatial compute, and simulation components.
Step 1
Spatial data for the Digital Twin Framework is ingested into a spatial data plane using Visual Asset Management System (VAMS) or similar digital asset managers.
Step 2
Amazon API Gateway manages the storage of file data in Amazon Simple Storage Service (Amazon S3) and metadata in Amazon DynamoDB.
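As a sketch of this step, the API layer stores the binary file in Amazon S3 and records its metadata in DynamoDB. The helper below (hypothetical names) builds the two request payloads that would be passed to the AWS SDK (boto3) calls `s3.generate_presigned_url` and `dynamodb.put_item`; it uses only the standard library so the payload shapes can be inspected without AWS credentials:

```python
import uuid
from datetime import datetime, timezone

def build_asset_records(bucket, table, filename, tags):
    """Build the S3 key and the DynamoDB item for one uploaded spatial asset.

    Illustrative only: in a deployment these payloads would feed boto3's
    s3.generate_presigned_url(...) and dynamodb.put_item(...).
    """
    asset_id = str(uuid.uuid4())
    s3_key = f"assets/{asset_id}/{filename}"
    # DynamoDB item in the low-level attribute-value format.
    item = {
        "assetId": {"S": asset_id},
        "s3Key": {"S": s3_key},
        "fileName": {"S": filename},
        "tags": {"SS": tags} if tags else {"NULL": True},
        "uploadedAt": {"S": datetime.now(timezone.utc).isoformat()},
        "version": {"N": "1"},
    }
    return {"Bucket": bucket, "Key": s3_key}, {"TableName": table, "Item": item}

presign_args, put_item_args = build_asset_records(
    "spatial-assets-bucket", "AssetMetadata", "pump_housing.glb", ["cad", "usd"]
)
```

Keeping file bytes in Amazon S3 and only pointers plus metadata in DynamoDB is what lets API Gateway handle large binaries without hitting item-size limits.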
Step 3
IoT data to fuel the Digital Twin Framework is ingested using a preferred Industrial Data Fabric solution.
Step 4
Amazon S3 provides object storage, while AWS IoT SiteWise and AWS IoT TwinMaker provide structure and semantics to the IoT data.
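To illustrate how AWS IoT SiteWise adds structure and semantics, the sketch below builds an example asset-model definition in roughly the shape accepted by boto3's `iotsitewise.create_asset_model`: one raw measurement plus a transform that computes a KPI from it. All names and the property reference style are illustrative, not a definitive API payload:

```python
def build_pump_asset_model():
    """Illustrative AWS IoT SiteWise asset-model definition (approximating the
    payload for boto3's iotsitewise.create_asset_model). Names are examples."""
    return {
        "assetModelName": "IndustrialPump",
        "assetModelProperties": [
            {   # Raw telemetry pushed from the edge gateway.
                "name": "Temperature",
                "dataType": "DOUBLE",
                "unit": "Celsius",
                "type": {"measurement": {}},
            },
            {   # A KPI computed from the measurement.
                "name": "OverheatAlarm",
                "dataType": "DOUBLE",
                "type": {
                    "transform": {
                        "expression": "gte(temp, 90)",
                        "variables": [
                            # In the real API this references the property's ID.
                            {"name": "temp",
                             "value": {"propertyId": "Temperature"}},
                        ],
                    }
                },
            },
        ],
    }
```

Assets created from this model inherit the hierarchy, units, and computed KPIs, which is the "structure and semantics" the step above refers to.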
Step 5
Simulation and AI/ML models created by engineers are stored in AWS. The simulation runtime environments are containerized and stored in Amazon Elastic Container Registry (Amazon ECR). Configuration files and model weights are stored in Amazon S3, and the simulation source code is stored in Git repositories.
Step 6
Simulation workflows are managed with orchestrators such as TwinFlow, Amazon Managed Workflows for Apache Airflow (Amazon MWAA), and AWS Step Functions. Simulations and AI/ML models are evaluated in AWS Batch or Amazon SageMaker using Amazon EventBridge events.
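As a minimal sketch of the EventBridge side of this orchestration, the helper below builds the payload pair that would be passed to boto3's `events.put_rule` and `events.put_targets` to run a Step Functions simulation workflow on a schedule. The rule name, schedule, and ARNs are placeholders:

```python
def build_simulation_schedule(rule_name, schedule, state_machine_arn, role_arn):
    """Payloads for boto3's events.put_rule and events.put_targets that would
    trigger a Step Functions simulation workflow on a schedule."""
    rule = {
        "Name": rule_name,
        "ScheduleExpression": schedule,   # e.g. "rate(1 day)" or a cron()
        "State": "ENABLED",
    }
    targets = {
        "Rule": rule_name,
        "Targets": [{
            "Id": "sim-workflow",
            "Arn": state_machine_arn,
            "RoleArn": role_arn,          # role EventBridge assumes to start it
            "Input": '{"mode": "evaluate"}',
        }],
    }
    return rule, targets

rule, targets = build_simulation_schedule(
    "nightly-sim", "rate(1 day)",
    "arn:aws:states:us-east-1:123456789012:stateMachine:SimTwin",
    "arn:aws:iam::123456789012:role/EventBridgeInvokeSfn",
)
```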
Step 7
Periodically, EventBridge will initiate AWS Batch processes that re-calibrate the simulation or model with updated data stored in Amazon S3, AWS IoT SiteWise, or other database solutions such as Amazon Timestream.
Step 8
Real-time 3D rendering of the digital twin can be done in the user’s browser using AWS IoT TwinMaker or can be rendered in the cloud and displayed in the user’s browser with WebRTC video streams.
Step 9
Users view digital twins with Amazon Managed Grafana or with custom dashboards built with IoT App Kit and hosted with Amazon CloudFront. Additionally, users can interact with virtual workstations using NICE DCV remote desktop software or using cloud-hosted applications with Amazon AppStream 2.0.
Step 10
Users may opt to view digital twins on mobile devices and AR/VR headsets. Users may also choose to integrate generative AI chatbots into their dashboards.
-
IoT Data
-
This architecture diagram shows how to connect IoT data to the digital twin.
Step 1
The asset connects to an industrial personal computer (PC) at the customer's location through a programmable logic controller (PLC).
Step 2
This PC runs AWS IoT Greengrass edge runtime and AWS IoT SiteWise Edge. AWS IoT SiteWise Edge collects the asset telemetry data and pushes it over a secure connection, such as a virtual private network (VPN), to AWS IoT SiteWise.
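To make the telemetry push concrete, the helper below builds one entry in the shape accepted by boto3's `iotsitewise.batch_put_asset_property_value` (the same API the edge gateway ultimately feeds). The asset and property IDs are placeholders:

```python
import time

def build_telemetry_entry(asset_id, property_id, value):
    """One entry for boto3's iotsitewise.batch_put_asset_property_value
    (the API accepts up to 10 entries per call). IDs are placeholders."""
    now = int(time.time())
    return {
        "entryId": f"{asset_id}-{property_id}-{now}",  # unique per batch
        "assetId": asset_id,
        "propertyId": property_id,
        "propertyValues": [{
            "value": {"doubleValue": value},
            "timestamp": {"timeInSeconds": now},
            "quality": "GOOD",
        }],
    }
```

In practice AWS IoT SiteWise Edge handles this batching and retry logic for you; the sketch just shows the data shape that travels over the secure connection.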
Step 3
AWS IoT SiteWise organizes assets in a hierarchy, computes key performance indicators (KPIs), and stores time-series data in different storage tiers.
Step 4
The data in AWS IoT SiteWise, Amazon S3, and other data sources is connected to AWS IoT TwinMaker to build knowledge graphs and 3D digital twins.
Step 5
Amazon Managed Grafana provides operational dashboards along with 3D digital twins, leveraging the AWS IoT TwinMaker and AWS IoT SiteWise plug-ins for Grafana.
Step 6
Users, authenticated through their company’s single sign-on (SSO) federation, can safely observe the dashboards over the public internet.
-
Spatial Data Plane
-
This architecture diagram shows how to create the spatial component of a digital twin, including the ingestion and processing of data into real-time 3D assets.
Step 1
Spatial assets such as computer-aided design (CAD) files, photos, videos, and light detection and ranging (LIDAR) point clouds accumulate on-premises in engineering workstations and product lifecycle management (PLM) systems.
Step 2
The design user uploads spatial asset content to the cloud through a custom web application hosted with Amazon CloudFront and Amazon S3. The web app uses API Gateway to integrate with the spatial data plane. The design user may also write custom automation scripts that interact with API Gateway directly.
Step 3
API Gateway manages uploads of large binary files to Amazon S3 and stores location, tags, versions, and other metadata in DynamoDB tables.
Step 4
API Gateway uses Amazon Cognito and AWS Lambda authorizers to manage role-based access controls (RBAC) and asset-based access controls (ABAC).
Step 5
API Gateway also manages the pipelines and workflows that convert source assets into real-time 3D assets used in rendering the digital twin.
Step 6
AWS IoT TwinMaker provides an interactive real-time 3D scene component, binding spatial data to industrial IoT data.
Step 7
Amazon Managed Grafana provides dashboards that include 3D context alongside charts and graphs, leveraging the AWS IoT TwinMaker plug-in for Grafana. It alerts the operational technology (OT) user when there is an exception.
-
Building and Orchestrating Simulation Twins
-
This architecture diagram shows how to simulate a digital twin.
Step 1
An engineer or scientist builds the simulation twin from a local workstation.
Step 2
The engineer builds a containerized simulation runtime that is stored in Amazon ECR. Model weights, configuration files, and task definitions are stored in S3 buckets or Git repositories.
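One way to wire the ECR image and the S3-hosted configuration together is an AWS Batch job definition. The sketch below builds the payload that would be passed to boto3's `batch.register_job_definition`; the image URI, S3 URIs, and resource sizes are placeholder assumptions:

```python
def build_sim_job_definition(image_uri, config_s3_uri, weights_s3_uri):
    """Payload for boto3's batch.register_job_definition describing a
    containerized simulation runtime. URIs and sizing are placeholders."""
    return {
        "jobDefinitionName": "simulation-twin",
        "type": "container",
        "containerProperties": {
            "image": image_uri,      # the runtime pushed to Amazon ECR
            "vcpus": 4,
            "memory": 8192,          # MiB
            "environment": [
                # The container fetches its config and weights from S3 at start.
                {"name": "CONFIG_URI", "value": config_s3_uri},
                {"name": "WEIGHTS_URI", "value": weights_s3_uri},
            ],
        },
        "retryStrategy": {"attempts": 2},
    }
```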
Step 3
The simulations run on data ingested from the industrial edge into an industrial data foundation, commonly a mix of AWS IoT SiteWise, Amazon S3, and, if real-time capabilities are needed, Amazon Timestream.
Step 4
Orchestration of static data, IoT data, models, configuration, execution of simulations, and assessment of calibration requirements is performed by one of many offerings, such as EventBridge, Step Functions, Amazon MWAA, and open-source TwinFlow.
Step 5
Periodically, the simulations are re-calibrated to align the predictions with current results and data. Orchestration can invoke AWS Batch workflows that update the simulations stored in Amazon ECR or model weights stored in Amazon S3.
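The re-calibration trigger itself can be a single AWS Batch job submission. The helper below builds the payload that an orchestrator would pass to boto3's `batch.submit_job`, overriding the container environment so the same image runs in calibration mode; the names are placeholders:

```python
def build_recalibration_job(job_queue, job_definition, data_s3_uri):
    """Payload for boto3's batch.submit_job that re-calibrates the simulation
    against fresh data; updated weights would be written back to Amazon S3.
    Queue, definition, and URI names are placeholders."""
    return {
        "jobName": "recalibrate-sim-twin",
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
        "containerOverrides": {
            "environment": [
                {"name": "MODE", "value": "calibrate"},
                {"name": "DATA_URI", "value": data_s3_uri},
            ]
        },
    }
```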
Step 6
The simulations run on demand as scheduled by an orchestrator and feed results back to the industrial data foundation to be presented to users by AWS IoT TwinMaker and an Amazon Managed Grafana dashboard.
-
Product Design Example
-
This architecture diagram shows a specific implementation of the Digital Twin framework used for product design.
Step 1
A design engineer uses a local workstation to author an Omniverse application for the spatial twin of the device.
Step 2
CAD assets are converted to Universal Scene Description (USD) assets. Using a jump box, the CAD assets are installed as an Omniverse application on a GPU instance in a private subnet.
Step 3
The simulation engineer uses a local workstation to author an Ansys Twin Builder reduced order model of the asset.
Step 4
The model is installed using a jump box on a GPU instance in a private subnet.
Step 5
The IoT data from the asset is ingested into AWS IoT SiteWise by the AWS IoT Greengrass edge runtime. The IoT data is consumed by both the simulation twin and the spatial twin.
Step 6
The spatial twin renders into a WebRTC video stream consumed by an iframe within the dashboard of a self-hosted Grafana instance.
Step 7
The industrial designer can control the physical asset from the dashboard. Widgets in the dashboard will initiate changes to an AWS IoT Device Shadow, read by an AWS IoT Greengrass component that sends commands to the asset.
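The shadow interaction above boils down to a small JSON document. The sketch below builds the desired-state payload a dashboard widget would send via the AWS IoT device shadow service (for example through boto3's `iot-data` `update_thing_shadow`); the `setpoint` field name is a hypothetical example:

```python
import json

def build_shadow_update(setpoint):
    """The classic device-shadow desired-state document. A Greengrass
    component subscribed to the shadow delta would read this and command
    the physical asset. The 'setpoint' field name is illustrative."""
    doc = {"state": {"desired": {"setpoint": setpoint}}}
    return json.dumps(doc)
```

The device reports its actual state back under `state.reported`, and the shadow service computes the delta between desired and reported, which is what the Greengrass component acts on.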
Step 8
Similarly, the designer can control the simulation asset. Widgets in the dashboard will initiate the simulation twin, which will send results to AWS IoT SiteWise and render into the Omniverse spatial twin video stream on the Grafana dashboard.
Get Started
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
-
Operational Excellence
The majority of AWS services used in this Guidance are serverless, lowering the operational overhead of maintaining the Guidance. The VAMS and Industrial Data Fabric solutions leverage AWS Cloud Development Kit (AWS CDK) to provide infrastructure as code. Using AWS CDK and AWS CloudFormation, you can apply the same engineering discipline that you use for application code to your entire environment.
Integration with Amazon CloudWatch enables monitoring of incoming data and alerting on potential issues. By understanding service metrics, you can optimize event workflows and ensure scalability. Visualizing and analyzing data and compute components using CloudWatch helps you identify performance bottlenecks and troubleshoot requests.
-
Security
By using AWS Identity and Access Management (IAM), Amazon Cognito, API Gateway, and Lambda authorizers, this Guidance prioritizes data protection, system security, and asset integrity, aligning with best practices and improving your overall security posture. Amazon Route 53, AWS WAF, and AWS VPN are used in this Guidance to provide secure connections for both public and private networking between on-premises facilities and the AWS Cloud.
We recommend enabling encryption at rest for all data destinations in the cloud, a feature supported by both Amazon S3 and AWS IoT SiteWise, to further safeguard sensitive information.
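As a sketch of enabling encryption at rest on the Amazon S3 side, the helper below builds the configuration that would be passed to boto3's `s3.put_bucket_encryption`, defaulting to SSE-S3 (AES-256) and switching to SSE-KMS when a key ARN is supplied:

```python
def build_bucket_encryption(kms_key_arn=None):
    """Configuration for boto3's s3.put_bucket_encryption: SSE-KMS when a
    key ARN is given, otherwise SSE-S3 (AES-256)."""
    if kms_key_arn:
        rule = {"ApplyServerSideEncryptionByDefault": {
            "SSEAlgorithm": "aws:kms", "KMSMasterKeyID": kms_key_arn}}
    else:
        rule = {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
    return {"Rules": [rule]}
```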
-
Reliability
Through multi-Availability Zone (multi-AZ) deployments, throttling limits, and managed services like Amazon Managed Grafana, this Guidance helps to ensure continuous operation and minimal downtime for critical workloads. Specifically, AWS IoT SiteWise and AWS IoT TwinMaker implement throttling limits for data ingress and egress for continued operations, even during periods of high traffic or load.
Furthermore, the Amazon Managed Grafana console provides a reliable workspace for visualizing and analyzing metrics, logs, and traces without the need for hardware or infrastructure management. It automatically provisions, configures, and manages the workspace while handling automatic version upgrades and auto-scaling to meet dynamic usage demands. This auto-scaling capability is crucial for handling peak usage during site operations or shift changes in industrial environments.
-
Performance Efficiency
By utilizing the capabilities of AWS IoT SiteWise to manage throttling, in addition to the automatic scaling of both AWS IoT SiteWise and Amazon S3, this Guidance can ingest, process, and store data efficiently, even during periods of high data influx. This automatic scaling eliminates the need for manual capacity planning and resource provisioning, enabling optimal performance while minimizing operational overhead.
-
Cost Optimization
The majority of AWS services used by this Guidance are serverless, cost-optimized services, providing digital twin capabilities at a low price point. These services offer a pay-as-you-go pricing model, meaning you are only charged for data ingested, stored, and queried.
AWS IoT SiteWise also offers optimized storage settings, enabling data to be moved from a hot tier to a cold tier in Amazon S3, further reducing storage costs.
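To illustrate the tiering setting, the sketch below builds a payload in roughly the shape accepted by boto3's `iotsitewise.put_storage_configuration`, moving cooled-off time-series data into a customer-managed S3 bucket; the ARNs and retention period are placeholder assumptions:

```python
def build_sitewise_storage_config(s3_arn, role_arn):
    """Approximate payload for boto3's iotsitewise.put_storage_configuration:
    keep recent data in the hot tier and age it into a customer-managed S3
    bucket (cold tier). ARNs and retention are placeholders."""
    return {
        "storageType": "MULTI_LAYER_STORAGE",
        "multiLayerStorage": {
            "customerManagedS3Storage": {
                "s3ResourceArn": s3_arn,
                "roleArn": role_arn,  # role SiteWise assumes to write to S3
            }
        },
        "retentionPeriod": {"numberOfDays": 30, "unlimited": False},
    }
```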
-
Sustainability
The services in this Guidance use the elastic and scalable infrastructure of AWS, which scales compute resources up and down based on usage demands. This prevents overprovisioning and minimizes excess compute capacity, reducing unintended carbon emissions. You can monitor your CO2 emissions using the Customer Carbon Footprint Tool.
Additionally, the agility provided by technologies like digital twins (built with AWS IoT TwinMaker), event-based automation, and AI/ML-based insights empowers engineering teams to optimize on-site operations, increasing efficiency and minimizing emissions from industrial processes.
Related Content
[Title]
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.