This Guidance demonstrates architectural options for building a supply chain operational data hub. The hub ingests data from thousands of disparate sources, including internal sources about planning and execution and external sources about shipment tracking. The hub then generates a single, harmonized view of data. Visibility into data from various enterprise and execution systems can be used for real-time planning around demand forecasts, inventory, and procurement. The data hub helps supply chain organizations make data-driven decisions that improve delivery times and increase customer satisfaction.
Supply chain data is collected from multiple data sources across the enterprise, including enterprise resource planning (ERP) and customer relationship management (CRM) software-as-a-service (SaaS) applications, manufacturing shop-floor edge devices, logs, streaming media, and social media.
Based on the type of data source, AWS Database Migration Service (AWS DMS), AWS DataSync, Amazon Kinesis, Amazon Managed Streaming for Apache Kafka (Amazon MSK), AWS IoT Core, and Amazon AppFlow ingest data into the supply chain data lake hosted on AWS.
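As a concrete sketch of the streaming path, the snippet below builds a Kinesis `put_record` payload for a shipment-tracking event. The stream name and event fields are illustrative assumptions, not part of this Guidance; in a real deployment you would pass the returned arguments to a boto3 Kinesis client, which is omitted here so the example runs offline.

```python
import json

def build_shipment_record(shipment_id: str, status: str, lat: float, lon: float) -> dict:
    """Build the arguments for kinesis_client.put_record(**record).

    Stream and field names are hypothetical; this only constructs the payload.
    """
    event = {
        "shipment_id": shipment_id,
        "status": status,
        "location": {"lat": lat, "lon": lon},
    }
    return {
        "StreamName": "supply-chain-shipment-events",  # hypothetical stream name
        "Data": json.dumps(event).encode("utf-8"),
        # Partitioning by shipment ID keeps each shipment's events in order
        # within a single shard.
        "PartitionKey": shipment_id,
    }

record = build_shipment_record("SHP-1001", "IN_TRANSIT", 47.6, -122.3)
```

Keying the partition on the shipment ID is one common choice: it preserves per-shipment event ordering while still spreading load across shards.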
AWS Data Exchange integrates third-party data that may be useful in predicting a shipment's estimated time of arrival (such as weather data) into the supply chain data lake.
Amazon Simple Storage Service (Amazon S3) is the foundation for supply chain data lake storage.
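One way to organize such a data lake is Hive-style partitioning of object keys by source system and date, which lets downstream query engines prune partitions instead of scanning the whole bucket. The layout below is a sketch under assumed naming conventions, not a prescribed structure:

```python
from datetime import date

def lake_object_key(source: str, dataset: str, d: date, filename: str) -> str:
    """Build a Hive-style partitioned S3 key for the data lake raw zone.

    Partition names (source=, year=, ...) follow the key=value convention
    that Athena and AWS Glue recognize; zone and dataset names are illustrative.
    """
    return (
        f"raw/source={source}/dataset={dataset}/"
        f"year={d.year}/month={d.month:02d}/day={d.day:02d}/{filename}"
    )

key = lake_object_key("erp", "purchase_orders", date(2024, 3, 7), "po_batch_001.parquet")
```

A query filtered to `year=2024 AND month=3` then reads only the matching prefixes rather than every object in the bucket.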
AWS Glue extracts, transforms, catalogs, and ingests data across multiple data stores, such as ERP, planning, and shipment visibility systems.
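The cataloging step can be driven by a Glue crawler that scans the lake's raw zone and registers table definitions in the Glue Data Catalog. The snippet below assembles the arguments for boto3's `glue_client.create_crawler`; the crawler name, role ARN, database name, and schedule are placeholder assumptions, and no AWS call is made:

```python
def glue_crawler_config(bucket: str) -> dict:
    """Arguments for glue_client.create_crawler(**cfg) — a sketch.

    The crawler scans the raw zone and registers tables in the Glue Data
    Catalog so that Athena and Amazon EMR can query them by name.
    """
    return {
        "Name": "supply-chain-raw-crawler",                      # placeholder
        "Role": "arn:aws:iam::123456789012:role/GlueCrawlerRole",  # placeholder ARN
        "DatabaseName": "supply_chain_raw",
        "Targets": {"S3Targets": [{"Path": f"s3://{bucket}/raw/"}]},
        # Run nightly so newly landed ERP/shipment partitions are cataloged.
        "Schedule": "cron(0 2 * * ? *)",
    }

cfg = glue_crawler_config("my-supply-chain-lake")
```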
Amazon Athena is a serverless interactive query service that analyzes data in Amazon S3 using standard SQL.
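For example, once shipment data is cataloged, a planner could ask Athena which carriers deliver late most often. The table and column names below are hypothetical; the point is that harmonized data in Amazon S3 can be analyzed with standard SQL and no cluster to manage:

```python
def late_shipments_query(database: str) -> str:
    """An example Athena SQL query over a (hypothetical) shipments table.

    date_diff is a standard Athena/Trino function; the schema is assumed.
    """
    return f"""
SELECT carrier,
       COUNT(*) AS late_shipments,
       AVG(date_diff('day', promised_date, delivered_date)) AS avg_days_late
FROM {database}.shipments
WHERE delivered_date > promised_date
GROUP BY carrier
ORDER BY avg_days_late DESC""".strip()

sql = late_shipments_query("supply_chain_raw")
```

The query string would be submitted via Athena's `start_query_execution` API or run directly in the Athena console.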
Amazon QuickSight provides dashboards that help planners analyze data about supply chain planning, execution, and real-time shipment status to make informed business decisions.
Amazon EMR provides the cloud big data platform for processing vast amounts of data using open source tools.
Amazon SageMaker builds, trains, and deploys ML models, and AWS AI services add intelligence to supply chain applications.
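As an illustration of how internal and third-party data come together for ML, the sketch below assembles a feature vector for a hypothetical ETA regression model, joining shipment attributes with weather data such as that sourced through AWS Data Exchange. All field names are assumptions for illustration:

```python
def eta_features(shipment: dict, weather: dict) -> list:
    """Assemble a feature vector for a (hypothetical) ETA prediction model.

    In practice, vectors like this would be built at scale (for example in
    an AWS Glue or EMR job) and used to train a model in SageMaker.
    """
    return [
        shipment["distance_km"],
        shipment["carrier_on_time_rate"],         # historical carrier performance
        1.0 if weather["severe_alert"] else 0.0,  # severe weather on the route
        weather["avg_headwind_kmh"],
    ]

features = eta_features(
    {"distance_km": 820.0, "carrier_on_time_rate": 0.93},
    {"severe_alert": True, "avg_headwind_kmh": 12.5},
)
```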
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
This Guidance is deployed with infrastructure as code (IaC), a DevOps principle that helps you maintain infrastructure through repeatable and reliable processes. Supply chain stakeholders, including business, development, and operations teams, should align on an IaC strategy.
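To make the IaC principle concrete, the snippet below expresses a minimal CloudFormation template (as a Python dict) for the data lake bucket, declaring versioning and encryption once so the resource can be deployed repeatably. The logical and bucket names are illustrative, and this is a sketch rather than the template this Guidance actually ships:

```python
def data_lake_bucket_template(bucket_name: str) -> dict:
    """A minimal CloudFormation template for an encrypted, versioned bucket.

    Logical resource names are illustrative; a real deployment would add
    access logging, lifecycle rules, and public-access blocking.
    """
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "SupplyChainLakeBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "BucketName": bucket_name,
                    "VersioningConfiguration": {"Status": "Enabled"},
                    "BucketEncryption": {
                        "ServerSideEncryptionConfiguration": [
                            {"ServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
                        ]
                    },
                },
            }
        },
    }

template = data_lake_bucket_template("my-supply-chain-lake")
```

Because the configuration lives in source control, every environment (dev, test, production) can be stood up from the same reviewed definition.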
DataSync uses cross-account access to delegate access to data and resources in other AWS accounts. QuickSight uses fine-grained access control to secure access to dashboards.
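Cross-account access is typically granted through resource policies. The sketch below builds an S3 bucket policy allowing a reader account (for example, one running DataSync) to list and read the lake bucket; the bucket name and account ID are placeholders:

```python
def cross_account_read_policy(bucket: str, reader_account_id: str) -> dict:
    """An S3 bucket policy granting read access to another AWS account.

    IAM evaluates s3:ListBucket against the bucket ARN and s3:GetObject
    against the object ARN, so both resources appear in the statement.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowCrossAccountRead",
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{reader_account_id}:root"},
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
            }
        ],
    }

policy = cross_account_read_policy("my-supply-chain-lake", "111122223333")
```

Scoping the principal to a specific role ARN in the reader account, rather than the account root, would tighten this further.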
Services such as Amazon S3, AWS Glue, DataSync, Athena, and QuickSight are highly available and allow you to scale workloads based on demand.
Serverless technologies in this architecture allow you to provision the exact resources needed at any given time.
Services in this architecture can automatically scale to meet demand, so you only pay for the resources consumed without under- or over-provisioning.
Services in this architecture are serverless and scalable, optimizing backend resource consumption to reduce environmental impact.
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.