[SEO Subhead]
This Guidance shows how historical and current operational data, captured with Internet of Things (IoT) devices and camera streams, can be viewed simultaneously by constructing a digital twin. It supports multiple layers of data visualization, each tailored to the specific needs of airline operations. Passenger flow management, baggage handling, predictive maintenance for equipment, movable asset tracking, aircraft turnaround management, and a Building Management System (BMS) can all be integrated for optimal operations.
Please note: [Disclaimer]
Architecture Diagram
[Architecture diagram description]
Step 1
Extract historical and predictive data from your airport data platform, which uses AWS Database Migration Service (AWS DMS), AWS Data Exchange, Amazon Kinesis Data Streams, and Amazon Simple Storage Service (Amazon S3).
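As a minimal sketch of how platform data might reach Kinesis Data Streams, the following builds the parameters for a boto3 `put_record` call without making the call. The stream name, sensor fields, and partition-key choice are illustrative assumptions, not prescribed by this Guidance.

```python
import json

# Hypothetical sensor event from the airport data platform; the field
# names here are illustrative assumptions.
event = {
    "sensor_id": "bhs-belt-07",
    "metric": "belt_speed_mps",
    "value": 1.8,
    "timestamp": "2024-01-15T08:30:00Z",
}

# Parameters shaped for a boto3 kinesis.put_record call. Partitioning by
# sensor ID keeps each sensor's events ordered within a shard.
put_record_params = {
    "StreamName": "airport-ops-ingest",  # assumed stream name
    "Data": json.dumps(event).encode("utf-8"),
    "PartitionKey": event["sensor_id"],
}
```

In a deployed pipeline, these parameters would be passed to `boto3.client("kinesis").put_record(**put_record_params)`.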
Step 2
Query Amazon Redshift from the digital twin view for historical data.
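One way the digital twin view can query Redshift is through the Redshift Data API. The sketch below assembles an `ExecuteStatement` request; the cluster identifier, database, table, and column names are assumptions for illustration.

```python
# Parameters shaped for a boto3 redshift-data execute_statement call.
# Cluster, database, and schema names are illustrative assumptions.
historical_query = {
    "ClusterIdentifier": "airport-dw",
    "Database": "ops",
    "Sql": (
        "SELECT gate, AVG(turnaround_minutes) AS avg_turnaround "
        "FROM aircraft_turnaround "
        "WHERE day >= :start_day "
        "GROUP BY gate"
    ),
    "Parameters": [{"name": "start_day", "value": "2024-01-01"}],
}
```

The digital twin front end would poll for the statement result and overlay the per-gate averages on the corresponding scene elements.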
Step 3
Query Amazon SageMaker from the digital twin view for predictive and “what-if” analysis.
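A "what-if" request to a SageMaker endpoint is typically a JSON payload sent with `invoke_endpoint`. The following sketch builds such a payload; the endpoint name and feature names are hypothetical.

```python
import json

# A "what-if" scenario: what happens to passenger flow if one more
# security lane is opened? Feature names are illustrative assumptions.
scenario = {
    "gate": "B12",
    "scheduled_arrivals_next_hour": 14,
    "security_lanes_open": 6,  # baseline is 5; this is the what-if change
}

# Parameters shaped for a boto3 sagemaker-runtime invoke_endpoint call.
invoke_params = {
    "EndpointName": "passenger-flow-predictor",  # assumed endpoint name
    "ContentType": "application/json",
    "Body": json.dumps(scenario),
}
```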
Step 4
Query data from your airport Ops360 datastore in Amazon DynamoDB by exposing an Amazon API Gateway endpoint through an AWS Lambda integration.
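The Lambda integration behind the API Gateway endpoint can be sketched as a proxy-integration handler that reads one item from DynamoDB and returns an API Gateway-shaped response. The table schema, path parameter, and stub below are assumptions so the handler can be exercised locally.

```python
import json

def handler(event, context, table=None):
    """Lambda proxy integration: look up an asset in the Ops360 datastore
    and return an API Gateway response. `table` is injected for local
    testing; in Lambda it would be a boto3 DynamoDB Table resource."""
    asset_id = event["pathParameters"]["assetId"]
    item = table.get_item(Key={"assetId": asset_id}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(item)}

# Minimal stub standing in for the DynamoDB table so the handler runs
# without AWS access; the returned item is illustrative.
class _StubTable:
    def get_item(self, Key):
        return {"Item": {"assetId": Key["assetId"], "type": "baggage-tug"}}

response = handler({"pathParameters": {"assetId": "tug-42"}}, None,
                   table=_StubTable())
```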
Step 5
Capture IoT sensor data from building management systems, baggage handling, passenger-flow trackers, and more. Aggregate the data and transmit it to the cloud on a schedule using AWS IoT Greengrass.
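The aggregate-then-transmit pattern can be sketched as a pure function of the kind a Greengrass component might run on a schedule: collapse many raw readings into one summary message per sensor before sending. Field names are illustrative assumptions.

```python
from statistics import mean

def aggregate(readings):
    """Roll raw sensor readings up into one summary per sensor, the kind
    of batching a scheduled Greengrass component might do before
    transmitting to the cloud."""
    by_sensor = {}
    for r in readings:
        by_sensor.setdefault(r["sensor_id"], []).append(r["value"])
    return [
        {"sensor_id": sid, "avg": mean(vals), "count": len(vals)}
        for sid, vals in sorted(by_sensor.items())
    ]

batch = aggregate([
    {"sensor_id": "hvac-01", "value": 21.5},
    {"sensor_id": "hvac-01", "value": 22.1},
    {"sensor_id": "belt-03", "value": 1.7},
])
```

Sending summaries instead of every raw reading reduces both bandwidth and downstream ingestion cost.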
Step 6
Run inferences on your camera streams at the edge using AWS Panorama.
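The edge-inference pattern can be sketched independently of the Panorama SDK: each camera frame passes through a local model, and only a compact detection summary is forwarded to the cloud. The stub model and the per-frame fields below are purely illustrative.

```python
def run_inference(frames, model):
    """Edge inference loop in the style of an AWS Panorama application:
    run a local model on each frame and keep only compact summaries
    (which is what would be sent onward, not the video itself)."""
    results = []
    for frame in frames:
        detections = model(frame)
        results.append({"frame_id": frame["id"], "people": len(detections)})
    return results

# Stub model standing in for a Panorama model node: pretends to detect
# one person per 100 pixels of "crowd" in the frame.
stub_model = lambda frame: [{"label": "person"}] * (frame["crowd_pixels"] // 100)

summaries = run_inference(
    [{"id": 1, "crowd_pixels": 250}, {"id": 2, "crowd_pixels": 40}],
    stub_model,
)
```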
Step 7
Centralize all IoT data ingested into the cloud using AWS IoT Core, which enforces authentication and authorization standards. Use AWS IoT SiteWise to build the IoT data model for the digital twin to consume.
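An IoT SiteWise data model starts from an asset model definition. The sketch below builds a `create_asset_model` payload for a hypothetical baggage carousel; the model and property names are assumptions for illustration.

```python
# Payload shaped for a boto3 iotsitewise.create_asset_model call.
# Model and property names are illustrative assumptions.
asset_model = {
    "assetModelName": "BaggageCarousel",
    "assetModelProperties": [
        {
            "name": "BeltSpeed",
            "dataType": "DOUBLE",
            "unit": "m/s",
            "type": {"measurement": {}},
        },
        {
            "name": "Jammed",
            "dataType": "BOOLEAN",
            "type": {"measurement": {}},
        },
    ],
}
```

Assets created from this model receive the live measurements, which AWS IoT TwinMaker can then bind to scene elements.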
Step 8
Make your live video feeds available for consumption through the digital twin by using Amazon Kinesis Video Streams.
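Surfacing a live feed from Kinesis Video Streams usually takes two calls: `get_data_endpoint` on the `kinesisvideo` client, then `get_hls_streaming_session_url` on the `kinesis-video-archived-media` client at that endpoint. The sketch below builds both parameter sets; the stream name is an assumption.

```python
# Step 1: ask Kinesis Video Streams where the HLS API lives for this
# stream (boto3 kinesisvideo.get_data_endpoint).
endpoint_params = {
    "StreamName": "gate-b12-camera",  # assumed stream name
    "APIName": "GET_HLS_STREAMING_SESSION_URL",
}

# Step 2: request a live HLS session URL from that endpoint
# (boto3 kinesis-video-archived-media.get_hls_streaming_session_url).
hls_params = {
    "StreamName": "gate-b12-camera",
    "PlaybackMode": "LIVE",
}
```

The returned URL can be handed to any HLS-capable player embedded in the digital twin view.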
Step 9
Use AWS IoT TwinMaker to build your scenes and overlay data on your 3D models. Publish the scenes on Amazon Managed Grafana or on a custom front end built using AWS IoT Application Kit and hosted on AWS Amplify.
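Scenes in AWS IoT TwinMaker are populated from entities in a workspace. The sketch below builds a `create_entity` payload linking a scene object to a telemetry component; the workspace ID, entity name, and component type ID are assumptions for illustration.

```python
# Payload shaped for a boto3 iottwinmaker.create_entity call. The
# workspace, entity, and component-type names are illustrative.
entity = {
    "workspaceId": "airport-twin",
    "entityName": "CarouselA",
    "components": {
        "telemetry": {
            "componentTypeId": "com.example.carousel.telemetry",
        }
    },
}
```

Once the entity exists, its component properties can be bound to data overlays in the scene, and the scene published to Amazon Managed Grafana or an AWS IoT Application Kit front end.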
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
-
Operational Excellence
This Guidance lets you use an AWS CloudFormation template or AWS Cloud Development Kit (AWS CDK) scripts, so you can quickly and safely deploy changes and updates to your workloads. By using infrastructure-as-code tools, you can automate deployment and security checks for all infrastructure and software updates. For observability, you can use Amazon CloudWatch, which provides metrics, personalized dashboards, and logs. You can then set up dashboards and alarms to notify you when your environment is not operating as expected. You can even set up automatic workflows to remediate certain states.
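An alarm of the kind described can be sketched as a `put_metric_alarm` parameter set: page the operations team when successful IoT connections fall off. The metric choice, thresholds, and SNS topic ARN are assumptions for illustration.

```python
# Parameters shaped for a boto3 cloudwatch.put_metric_alarm call.
# Thresholds, periods, and the SNS topic ARN are illustrative assumptions.
alarm_params = {
    "AlarmName": "iot-ingest-stalled",
    "Namespace": "AWS/IoT",
    "MetricName": "Connect.Success",
    "Statistic": "Sum",
    "Period": 300,                 # evaluate in 5-minute windows
    "EvaluationPeriods": 2,        # two consecutive breaches before alarming
    "Threshold": 1,
    "ComparisonOperator": "LessThanThreshold",
    "AlarmActions": ["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
}
```

The same SNS topic could also trigger an automated remediation workflow rather than only a notification.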
-
Security
This Guidance uses AWS IoT Core to securely connect all IoT devices to AWS. The service encrypts all communication and requires all its clients (connected devices, server applications, mobile applications, or human users) to use strong authentication (including X.509 certificates, AWS Identity and Access Management (IAM) credentials, or third-party authentication through Amazon Cognito). AWS IoT Core also offers fine-grained authorization to isolate and secure communication among authenticated clients.
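The fine-grained authorization mentioned above is expressed as an AWS IoT Core policy document. The sketch below uses thing-name policy variables so each device can connect only as itself and publish only to its own topic; the region, account ID, and topic structure are assumptions for illustration.

```python
import json

# A least-privilege AWS IoT Core policy sketch. The account ID and topic
# layout are illustrative assumptions; ${iot:Connection.Thing.ThingName}
# is a real IoT Core policy variable resolved per connected device.
iot_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "iot:Connect",
            "Resource": "arn:aws:iot:us-east-1:123456789012:client/${iot:Connection.Thing.ThingName}",
        },
        {
            "Effect": "Allow",
            "Action": "iot:Publish",
            "Resource": "arn:aws:iot:us-east-1:123456789012:topic/telemetry/${iot:Connection.Thing.ThingName}",
        },
    ],
}
policy_document = json.dumps(iot_policy)
```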
This Guidance also uses Amazon Managed Grafana, which lets you control and restrict incoming traffic that can reach your workspace. It also encrypts data at rest without special configuration or third-party tools and encrypts data in transit using SSL.
-
Reliability
This Guidance uses AWS Panorama so that devices can run machine learning (ML) models locally while also sending data to the cloud for further processing. This edge ML deployment reduces your dependency on cloud connectivity, improving reliability and reducing downtime risks.
-
Performance Efficiency
This Guidance uses AWS IoT SiteWise, which efficiently processes a large volume of machine data at scale to help you derive insights faster. Additionally, AWS IoT TwinMaker improves efficiency by accelerating digital twin creation through prebuilt components, templates, and automation.
-
Cost Optimization
This Guidance helps you optimize data storage costs by using Amazon S3, which provides S3 Lifecycle policies to move data to more cost-effective storage classes, such as S3 Standard-Infrequent Access (S3 Standard-IA) and S3 Glacier Flexible Retrieval, as well as S3 Intelligent-Tiering to move data between access tiers automatically based on access patterns.
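Such a lifecycle policy can be sketched as a `put_bucket_lifecycle_configuration` payload: transition aging telemetry to colder storage classes, then expire it. The prefix and day counts are assumptions for illustration.

```python
# Configuration shaped for a boto3
# s3.put_bucket_lifecycle_configuration call. The prefix and the
# transition/expiration windows are illustrative assumptions.
lifecycle = {
    "Rules": [
        {
            "ID": "age-out-telemetry",
            "Filter": {"Prefix": "telemetry/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": 365},
        }
    ]
}
```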
-
Sustainability
This Guidance reduces the need to connect to the cloud continuously by using AWS IoT Greengrass, which deploys ML models and logic to devices to facilitate autonomous operations locally. This lets devices perform compute, messaging, data caching, syncing, and ML inferencing at the edge, helping you minimize your power usage and reduce your carbon footprint.
Implementation Resources
A detailed guide is provided to experiment with and use within your AWS account. It walks through each stage of the Guidance, including deployment, usage, and cleanup.
The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Related Content
Guidance for Airport Data Management on AWS
How digital twins can optimize Travel and Hospitality operations
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.