This Guidance demonstrates architectural options for building a supply chain operational data hub. The hub ingests data from thousands of disparate sources, including internal sources about planning and execution and external sources about shipment tracking. The hub then generates a single, harmonized view of data. Visibility into data from various enterprise and execution systems can be used for real-time planning around demand forecasts, inventory, and procurement. The data hub helps supply chain organizations make data-driven decisions that improve delivery times and increase customer satisfaction.
Architecture Diagram
Step 1
Supply chain data is collected from multiple data sources across the enterprise, including enterprise resource planning (ERP) and customer relationship management (CRM) software-as-a-service (SaaS) applications, manufacturing shop-floor edge devices, logs, streaming media, and social media.
Step 2
Based on the type of data source, AWS Database Migration Service (AWS DMS), AWS DataSync, Amazon Kinesis, Amazon Managed Streaming for Apache Kafka (Amazon MSK), AWS IoT Core, and Amazon AppFlow ingest data into the supply chain data lake hosted on AWS.
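As an illustration of the streaming path, the sketch below publishes a shipment-tracking event to a Kinesis data stream that could feed the data lake (a delivery stream or consumer would land the records in Amazon S3). The stream name and record schema are hypothetical.

```python
import datetime
import json

import boto3

kinesis = boto3.client("kinesis")

# Hypothetical shipment-tracking event produced by an edge or logistics system.
event = {
    "shipment_id": "SHIP-1001",
    "status": "IN_TRANSIT",
    "location": {"lat": 47.61, "lon": -122.33},
    "event_time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
}

# Publish the event to a Kinesis data stream feeding the supply chain data lake.
kinesis.put_record(
    StreamName="shipment-tracking-events",  # hypothetical stream name
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["shipment_id"],
)
```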
Step 3
AWS Data Exchange integrates third-party data that can help predict shipment estimated time of arrival (ETA), such as weather data, into the supply chain data lake.
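One way to pull a subscribed third-party data set into the lake is to export a revision to Amazon S3 through the AWS Data Exchange API. A minimal sketch, assuming an existing weather data set subscription; the data set, revision, and bucket identifiers are hypothetical placeholders.

```python
import boto3

dataexchange = boto3.client("dataexchange")

# Export a subscribed weather data set revision into the data lake bucket
# (data set ID, revision ID, and bucket name are hypothetical).
job = dataexchange.create_job(
    Type="EXPORT_REVISIONS_TO_S3",
    Details={
        "ExportRevisionsToS3": {
            "DataSetId": "example-weather-dataset-id",
            "RevisionDestinations": [
                {
                    "RevisionId": "example-revision-id",
                    "Bucket": "my-supply-chain-data-lake",
                    "KeyPattern": "external/weather/${Asset.Name}",
                }
            ],
        }
    },
)
dataexchange.start_job(JobId=job["Id"])
```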
Step 4
AWS Lake Formation helps build the scalable supply chain data lake.
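A minimal sketch of how the data lake location and table permissions might be managed through Lake Formation with boto3; the bucket, role, database, and table names are hypothetical.

```python
import boto3

lakeformation = boto3.client("lakeformation")

# Register the data lake bucket with Lake Formation (bucket name is hypothetical).
lakeformation.register_resource(
    ResourceArn="arn:aws:s3:::my-supply-chain-data-lake",
    UseServiceLinkedRole=True,
)

# Grant an analyst role SELECT on a cataloged shipments table
# (role, database, and table names are hypothetical).
lakeformation.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/SupplyChainAnalyst"},
    Resource={"Table": {"DatabaseName": "supply_chain", "Name": "shipments"}},
    Permissions=["SELECT"],
)
```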
Step 5
Amazon Simple Storage Service (Amazon S3) is the foundation for supply chain data lake storage.
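The bucket underpinning the data lake could be created and hardened with a few boto3 calls, for example enabling default encryption and versioning. The bucket name is hypothetical, and the create_bucket call shown assumes the us-east-1 Region (other Regions need a CreateBucketConfiguration).

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-supply-chain-data-lake"  # hypothetical bucket name

# Create the data lake bucket (us-east-1 form shown).
s3.create_bucket(Bucket=bucket)

# Turn on default server-side encryption with KMS.
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}]
    },
)

# Enable versioning so object history is retained.
s3.put_bucket_versioning(Bucket=bucket, VersioningConfiguration={"Status": "Enabled"})
```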
Step 6
AWS Glue extracts, transforms, catalogs, and ingests data across multiple data stores like ERP, planning, and shipment visibility systems.
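One common pattern is to point an AWS Glue crawler at the raw landing zone so the tables it discovers are registered in the Data Catalog for downstream ETL and queries. A sketch, with hypothetical crawler, role, database, and path names:

```python
import boto3

glue = boto3.client("glue")

# Crawl the raw ERP drop zone and catalog the tables it finds
# (crawler name, role ARN, database, and S3 path are hypothetical).
glue.create_crawler(
    Name="erp-raw-crawler",
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",
    DatabaseName="supply_chain",
    Targets={"S3Targets": [{"Path": "s3://my-supply-chain-data-lake/raw/erp/"}]},
)
glue.start_crawler(Name="erp-raw-crawler")
```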
Step 7
Amazon Athena is a serverless interactive query service that analyzes data in Amazon S3 using standard SQL.
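A query against the cataloged data might be submitted through the Athena API as below; the database, table, column names, and results location are hypothetical.

```python
import time

import boto3

athena = boto3.client("athena")

# Aggregate transit performance by carrier (table and columns are hypothetical).
query = """
SELECT carrier, count(*) AS shipments, avg(transit_days) AS avg_transit_days
FROM shipments
WHERE ship_date >= date '2024-01-01'
GROUP BY carrier
ORDER BY avg_transit_days
"""

execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "supply_chain"},
    ResultConfiguration={"OutputLocation": "s3://my-supply-chain-data-lake/athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then print the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```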
Step 8
Amazon QuickSight provides dashboards that help planners analyze data about supply chain planning, execution, and real-time shipment status to make informed business decisions.
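Dashboards are typically authored in the QuickSight console, but if you embed them in a supply chain portal, an embed URL can be generated programmatically. A hedged sketch, assuming a registered QuickSight user and an existing dashboard; the account ID, user ARN, and dashboard ID are hypothetical.

```python
import boto3

quicksight = boto3.client("quicksight")

# Generate an embed URL for a registered planner user
# (account ID, user ARN, and dashboard ID are hypothetical).
response = quicksight.generate_embed_url_for_registered_user(
    AwsAccountId="123456789012",
    UserArn="arn:aws:quicksight:us-east-1:123456789012:user/default/planner",
    ExperienceConfiguration={"Dashboard": {"InitialDashboardId": "shipment-status-dashboard"}},
    SessionLifetimeInMinutes=60,
)
print(response["EmbedUrl"])
```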
Step 9
Amazon Redshift, a cloud data warehouse, analyzes structured and semi-structured data.
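Analytical SQL can be submitted to the warehouse through the Redshift Data API, for example to aggregate weekly order volumes. A sketch assuming a Redshift Serverless workgroup; the workgroup, database, and table names are hypothetical.

```python
import boto3

redshift_data = boto3.client("redshift-data")

# Run an aggregation against the warehouse via the Redshift Data API
# (workgroup, database, and table names are hypothetical).
statement = redshift_data.execute_statement(
    WorkgroupName="supply-chain-wg",
    Database="analytics",
    Sql="""
        SELECT plant_id, date_trunc('week', order_date) AS week, sum(quantity) AS units
        FROM fact_orders
        GROUP BY 1, 2
        ORDER BY 1, 2
    """,
)

# Poll describe_statement / get_statement_result with this Id once the query finishes.
print(statement["Id"])
```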
Step 10
Amazon EMR provides the cloud big data platform for processing vast amounts of data using open source tools.
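A typical EMR workload here would be a Spark job that aggregates curated shipment data, for example computing average delays by lane. A PySpark sketch with hypothetical S3 paths and column names:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("shipment-lane-delays").getOrCreate()

# Read curated shipment records from the data lake (path and columns are hypothetical).
shipments = spark.read.parquet("s3://my-supply-chain-data-lake/curated/shipments/")

# Compute average delay and shipment counts per origin-destination lane.
lane_delays = (
    shipments
    .withColumn(
        "delay_hours",
        (F.col("actual_arrival").cast("long") - F.col("promised_arrival").cast("long")) / 3600,
    )
    .groupBy("origin", "destination")
    .agg(F.avg("delay_hours").alias("avg_delay_hours"), F.count("*").alias("shipments"))
)

lane_delays.write.mode("overwrite").parquet("s3://my-supply-chain-data-lake/curated/lane_delays/")
spark.stop()
```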
Step 11
Amazon SageMaker builds, trains, and deploys ML models, and AWS AI services add intelligence to supply chain applications.
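As one example of the ML path, the SageMaker Python SDK could train a built-in XGBoost model on historical shipment features to predict ETA. This is a sketch, not the Guidance's prescribed model; the IAM role, S3 paths, and hyperparameters are hypothetical.

```python
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
region = session.boto_region_name

# Built-in XGBoost container for tabular regression (ETA prediction, for illustration).
image_uri = sagemaker.image_uris.retrieve("xgboost", region, version="1.5-1")

estimator = Estimator(
    image_uri=image_uri,
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # hypothetical role
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-supply-chain-data-lake/models/eta/",  # hypothetical path
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="reg:squarederror", num_round=100)

# Training data in CSV with the label (transit hours) in the first column.
estimator.fit({"train": TrainingInput(
    "s3://my-supply-chain-data-lake/training/eta/", content_type="text/csv"
)})
```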
Step 12
Amazon Neptune, a graph database, optimizes supply chain network queries for speed and accuracy.
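Supply network relationships (suppliers, parts, plants, lanes) map naturally onto a graph, and Gremlin traversals can answer multi-hop questions such as finding alternate suppliers for a part. A sketch using gremlinpython against a Neptune endpoint; the endpoint, labels, and property names are hypothetical.

```python
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection
from gremlin_python.process.anonymous_traversal import traversal
from gremlin_python.process.traversal import P

# Hypothetical Neptune cluster endpoint.
conn = DriverRemoteConnection(
    "wss://my-neptune-cluster.cluster-xxxx.us-east-1.neptune.amazonaws.com:8182/gremlin", "g"
)
g = traversal().withRemote(conn)

# Starting from a disrupted supplier, find other suppliers of the same parts
# (vertex label, edge label, and property names are hypothetical).
alternates = (
    g.V().has("supplier", "supplier_id", "SUP-001")
     .out("supplies")                       # parts this supplier provides
     .in_("supplies")                       # other suppliers of the same parts
     .has("supplier_id", P.neq("SUP-001"))
     .dedup()
     .valueMap("supplier_id", "region")
     .toList()
)
print(alternates)
conn.close()
```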
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
This Guidance is deployed with infrastructure as code (IaC), a DevOps principle that helps you maintain infrastructure through repeatable and reliable processes. Supply chain stakeholders, including business, development, and operations teams, should align on an IaC strategy.
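As an illustration of the IaC principle, core data lake resources could be declared with the AWS CDK in Python, for example the data lake bucket and a Glue database. This is a minimal sketch, not the Guidance's deployment template; construct and resource names are hypothetical.

```python
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_glue as glue
from aws_cdk import aws_s3 as s3
from constructs import Construct


class SupplyChainDataHubStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Data lake bucket with encryption and versioning enabled.
        s3.Bucket(
            self, "DataLakeBucket",
            encryption=s3.BucketEncryption.S3_MANAGED,
            versioned=True,
            removal_policy=RemovalPolicy.RETAIN,
        )

        # Glue database that crawlers and ETL jobs register tables into.
        glue.CfnDatabase(
            self, "SupplyChainDatabase",
            catalog_id=self.account,
            database_input=glue.CfnDatabase.DatabaseInputProperty(name="supply_chain"),
        )


app = App()
SupplyChainDataHubStack(app, "SupplyChainDataHub")
app.synth()
```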
Security
DataSync uses cross-account permissions to delegate access to data and resources in other AWS accounts. QuickSight applies fine-grained access control to secure access to dashboards.
Reliability
Services such as Amazon S3, AWS Glue, DataSync, Athena, and QuickSight are highly available and let you scale workloads based on demand.
Performance Efficiency
Serverless technologies in this architecture allow you to provision the exact resources needed at any given time.
Cost Optimization
Services in this architecture can automatically scale to meet demand, so you pay only for the resources consumed, without under- or over-provisioning.
Sustainability
Services in this architecture are serverless and scalable, optimizing backend resource consumption to reduce environmental impact.
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.