This Guidance demonstrates how a Supply Chain Control Tower built on AWS can improve visibility into business-critical systems. It analyzes a constant stream of data in near real time to produce actionable insights and predictive recommendations for your supply chain.
Supply Chain Control Towers (SCCTs) rely on data inputs from external systems, such as logistics partners. File-based integrations are often used, while more modern approaches push data to a secure API built with Amazon API Gateway (API Gateway) and AWS Lambda.
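A minimal sketch of the push-based pattern: a Lambda handler behind API Gateway that validates an inbound shipment-status payload. The payload schema (shipment_id, status, timestamp) is an assumption for illustration; a real integration would match the partner's data contract and forward valid records to the data platform.

```python
import json

REQUIRED_FIELDS = {"shipment_id", "status", "timestamp"}

def lambda_handler(event, context):
    """Validate a shipment-status payload posted through API Gateway."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}

    missing = REQUIRED_FIELDS - body.keys()
    if missing:
        return {"statusCode": 400,
                "body": json.dumps({"error": f"missing fields: {sorted(missing)}"})}

    # In production, the validated record would be written to a stream or
    # queue here before returning to the caller.
    return {"statusCode": 202, "body": json.dumps({"accepted": body["shipment_id"]})}
```

Returning 202 (Accepted) rather than 200 signals that the record was queued for asynchronous processing rather than fully processed.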
A Consumer Packaged Goods (CPG) organization relies on critical systems that manage everything from raw materials to production capacity. A range of methods can be used to integrate with these systems: Amazon EventBridge to deliver events as they occur, Amazon AppFlow for turnkey integrations with systems such as SAP, and AWS Transfer Family to manage secure file transfers (such as SFTP jobs) for systems limited to file-based integration.
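For the event-driven path, records are shaped into EventBridge entries before being sent. The sketch below shows that shaping as a pure function; the bus name, source, and detail-type values are illustrative assumptions, and the entry would be delivered with boto3 via `events.put_events(Entries=[entry])`.

```python
import json
from datetime import datetime, timezone

def make_event_entry(source, detail_type, detail, bus_name="supply-chain-bus"):
    """Shape a business record into an EventBridge PutEvents entry."""
    return {
        "Time": datetime.now(timezone.utc).isoformat(),
        "Source": source,                 # e.g. the emitting system
        "DetailType": detail_type,        # used by rules for routing
        "EventBusName": bus_name,         # assumed custom bus name
        "Detail": json.dumps(detail),     # Detail must be a JSON string
    }

entry = make_event_entry("erp.orders", "OrderCreated", {"order_id": "O-100", "qty": 12})
```

Keeping `Source` and `DetailType` consistent across producers lets EventBridge rules route each event type to the right ingestion target.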
Equipment across your supply chain also generates data critical to the SCCT. Connected devices can transmit messages to AWS IoT Core using the MQTT protocol.
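A sketch of how a device might structure its telemetry for IoT Core. The topic layout (`supplychain/<site>/<device>/telemetry`) is an assumed convention, not an IoT Core requirement; a device SDK client would then publish with something like `client.publish(topic, payload, qos=1)`.

```python
import json

def telemetry_message(site, device_id, readings):
    """Build an MQTT topic and JSON payload for AWS IoT Core."""
    # Hierarchical topics let IoT rules subscribe with wildcards,
    # e.g. supplychain/+/+/telemetry to capture all sites and devices.
    topic = f"supplychain/{site}/{device_id}/telemetry"
    payload = json.dumps({"device_id": device_id, "readings": readings})
    return topic, payload

topic, payload = telemetry_message("plant-01", "conveyor-7", {"temp_c": 41.5})
```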
Services such as AWS Glue and AWS Glue DataBrew can transform and normalize data before it is pushed to the data platform. Amazon Textract can extract important data from images and paper documents (such as dock slips) for ingestion into the data platform.
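The normalization step often amounts to mapping partner field names and units onto a canonical schema. Field names on both sides of the sketch below are assumptions for illustration; in this Guidance, this kind of mapping would run inside an AWS Glue or Glue DataBrew job before loading the data platform.

```python
def normalize_shipment(raw):
    """Map a partner shipment record to a canonical schema.

    Normalizes identifier casing, defaults missing carriers, and
    converts imperial weight to metric.
    """
    return {
        "shipment_id": raw["ShipmentRef"].strip().upper(),
        "carrier": raw.get("Carrier", "UNKNOWN").upper(),
        "weight_kg": round(float(raw["WeightLbs"]) * 0.45359237, 2),
    }
```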
At the heart of the SCCT is your data platform, which serves as the single source of truth for your data. Many patterns are suitable, such as a data lake built on Amazon S3.
Amazon SageMaker can be used to build, train, and deploy machine learning models focused on specific use cases, such as ETA prediction. Amazon Forecast can be used with time-series data to forecast demand, inventory, production, and required materials.
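Forecast ingests demand history as a target time series with columns item_id, timestamp, and the target value. A sketch of preparing that CSV from raw records; the input field names (`sku`, `date`, `units`) are assumptions for illustration.

```python
import csv
import io

def to_target_time_series(records):
    """Write demand records as a Forecast-style target-time-series CSV.

    Column order matches the default TARGET_TIME_SERIES schema:
    item_id, timestamp, demand.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    for r in records:
        writer.writerow([r["sku"], r["date"], r["units"]])
    return buf.getvalue()
```

The resulting CSV would be staged in Amazon S3 and imported into a Forecast dataset group.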
Amazon QuickSight can be used with Amazon OpenSearch Service for live analytics, Amazon Athena for ad hoc queries on data in your data lake, and Amazon Redshift for complex queries and views.
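An ad hoc Athena query is just SQL over the data lake's Glue catalog tables. The database, table, and column names below are illustrative assumptions; the generated SQL would be submitted with boto3 via `athena.start_query_execution(QueryString=...)`.

```python
def late_shipments_query(database, table, days=7):
    """Build an ad hoc Athena query for recently delayed shipments.

    Quoting the database and table names guards against reserved words;
    int(days) keeps the interpolated value numeric.
    """
    return (
        f'SELECT shipment_id, carrier, delay_hours '
        f'FROM "{database}"."{table}" '
        f"WHERE delay_hours > 0 "
        f"AND ship_date >= date_add('day', -{int(days)}, current_date) "
        f"ORDER BY delay_hours DESC"
    )
```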
QuickSight can also visualize the data analyzed in your data platform to create actionable insights for specific SCCT use cases.
Your SCCT can be extended with microservices for specific use cases. For example, transport location data can be queried from Amazon DynamoDB through AWS Lambda and visualized with Amazon Location Service.
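A sketch of the visualization half of that microservice: converting DynamoDB transport-location items into a GeoJSON FeatureCollection that a map on the front end can render. The attribute names (`vehicle_id`, `lon`, `lat`) are assumptions for illustration.

```python
def to_geojson(items):
    """Convert transport-location items to a GeoJSON FeatureCollection.

    DynamoDB returns numbers as Decimal/strings, so coordinates are
    coerced to float; GeoJSON orders coordinates [longitude, latitude].
    """
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                "geometry": {
                    "type": "Point",
                    "coordinates": [float(i["lon"]), float(i["lat"])],
                },
                "properties": {"vehicle_id": i["vehicle_id"]},
            }
            for i in items
        ],
    }
```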
A scalable, secure, and serverless front end is created using AWS Amplify for streamlined code development, Amazon Cognito for identity management, Amazon CloudFront for content distribution, Amazon Simple Storage Service (Amazon S3) for storage of static assets, and Amazon Route 53 for DNS.
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Because this architecture uses only AWS managed services, each service emits its own set of metrics into Amazon CloudWatch, where customers can monitor for errors.
For public-facing services (such as the UI), Amazon Cognito ensures secure access to the core applications and services, including role-based access controls. Provisioned API endpoints are also secured with appropriate access, authentication, and authorization controls to ensure use only by allowed systems and users. For other AWS services, AWS Identity and Access Management (IAM) role-based access controls provide least-privilege access between services.
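A minimal sketch of what least privilege means in practice: an IAM policy granting a single service read-only access to one DynamoDB table. The table ARN and the choice of actions are illustrative; each service-to-service interaction in this architecture would get a similarly scoped role.

```python
def read_only_table_policy(table_arn):
    """Build a least-privilege IAM policy for read-only table access.

    Only the read actions the consumer needs are listed, and the
    Resource is pinned to one table rather than "*".
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["dynamodb:GetItem", "dynamodb:Query"],
                "Resource": table_arn,
            }
        ],
    }
```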
The AWS managed services used in this architecture support secure communication by encrypting data in transit. Where data is stored (such as Amazon Redshift and Amazon S3), data is also encrypted at rest.
This architecture leverages managed services that are highly available by default and designed to span multiple Availability Zones. Some services, such as Amazon Redshift, can also be configured to deploy across multiple Availability Zones. In the event of an Availability Zone failure, the deployed services continue to operate.
Scalable and highly available services such as Amazon S3, Amazon Kinesis, AWS Glue, and Amazon Redshift are purpose-built for data analytics workloads.
This architecture follows a serverless-first approach. Where possible, serverless services scale with load so that you pay only for what you use. In addition, the AWS managed services used allow for utility billing.
Data transfer is a consideration for any data-oriented architecture. In this solution, the largest data volume is ingested from source systems, and inbound data transfer is generally not charged. Data transfer for AWS Transfer Family SFTP transfers, API Gateway requests, and Amazon AppFlow flows is charged per the documented service pricing. All other data is kept within the Region for processing to minimize transfer charges.
AWS managed services scale up and down with business requirements and traffic, and are inherently more sustainable than on-premises solutions. Additionally, the serverless components used automate infrastructure management, further improving sustainability.
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.