This Guidance demonstrates how to use SAP to integrate data from enterprise resource planning (ERP), manufacturing execution systems (MES), or product lifecycle management (PLM) into engineering and business intelligence dashboards on AWS. Data is collected from multiple sources, transferred to a data lake, prepared for analytics, and then integrated with a business insights dashboard for supply chain stakeholders to report, monitor, and optimize engineering operations.
Architecture Diagram
Step 1
Determine what is needed to integrate engineering, manufacturing execution system (MES), product lifecycle management (PLM), and enterprise resource planning (ERP) data.
Step 2
Transfer existing MES, PLM, and ERP data into AWS through Amazon API Gateway and AWS Lambda.
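The exact integration depends on the source systems, but the following minimal sketch shows one way this step could look: an AWS Lambda function behind Amazon API Gateway that lands posted MES, PLM, or ERP records in the raw zone of the data lake. The bucket name, environment variable, and payload fields are assumptions for illustration only.

```python
import json
import os
import uuid
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
# Hypothetical landing bucket for raw MES/PLM/ERP records, configured as a Lambda environment variable.
LANDING_BUCKET = os.environ.get("LANDING_BUCKET", "example-erp-mes-plm-landing")


def handler(event, context):
    """Handle a record posted through an API Gateway proxy integration."""
    record = json.loads(event.get("body") or "{}")
    source = record.get("source", "unknown")  # for example "mes", "plm", or "erp"
    now = datetime.now(timezone.utc)
    key = f"raw/{source}/{now:%Y/%m/%d}/{uuid.uuid4()}.json"

    # Write the record to the raw zone of the S3 data lake, partitioned by source system and date.
    s3.put_object(Bucket=LANDING_BUCKET, Key=key, Body=json.dumps(record).encode("utf-8"))

    return {"statusCode": 200, "body": json.dumps({"s3_key": key})}
```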
Step 3
Data required for engineering workloads is moved into AWS, where those workloads run on the necessary AWS infrastructure.
Step 4
Engineering data (such as results and failures) is transferred from the engineering virtual private cloud (VPC) to the data lake in Amazon Simple Storage Service (Amazon S3) using AWS Database Migration Service (AWS DMS) and Amazon Relational Database Service (Amazon RDS).
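One way to set up this replication is an AWS DMS task with full load plus change data capture, created here with boto3 as a hedged sketch; the endpoint and replication instance ARNs, schema name, and task identifier are placeholders, and the source and target endpoints are assumed to point at the engineering Amazon RDS database and the S3 data lake, respectively.

```python
import json

import boto3

dms = boto3.client("dms")

# All ARNs below are placeholders for endpoints and a replication instance created separately.
response = dms.create_replication_task(
    ReplicationTaskIdentifier="engineering-results-to-datalake",
    SourceEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:SOURCE-RDS",
    TargetEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:TARGET-S3",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:111122223333:rep:REPL-INSTANCE",
    MigrationType="full-load-and-cdc",  # initial load plus ongoing change data capture
    TableMappings=json.dumps(
        {
            "rules": [
                {
                    "rule-type": "selection",
                    "rule-id": "1",
                    "rule-name": "engineering-tables",
                    "object-locator": {"schema-name": "engineering", "table-name": "%"},
                    "rule-action": "include",
                }
            ]
        }
    ),
)
print(response["ReplicationTask"]["ReplicationTaskArn"])
```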
Step 5
Using AWS IoT Greengrass, AWS IoT Core, and Amazon Kinesis, suppliers and foundries transfer yield and product data into the Amazon S3 data lake.
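On the supplier or foundry side, a gateway running AWS IoT Greengrass (or any device registered with AWS IoT Core) can publish yield records over MQTT. The sketch below uses the AWS IoT Device SDK v2 for Python; the endpoint, certificate paths, client ID, topic, and record fields are placeholders.

```python
import json

from awscrt import mqtt
from awsiot import mqtt_connection_builder

# Endpoint, certificate paths, client ID, and topic are placeholders; real values
# come from the device registration in AWS IoT Core.
connection = mqtt_connection_builder.mtls_from_path(
    endpoint="example-ats.iot.us-east-1.amazonaws.com",
    cert_filepath="device.pem.crt",
    pri_key_filepath="private.pem.key",
    ca_filepath="AmazonRootCA1.pem",
    client_id="foundry-gateway-01",
)
connection.connect().result()

yield_record = {"lot_id": "LOT-1234", "wafer_id": 7, "yield_pct": 96.4}
publish_future, _packet_id = connection.publish(
    topic="supplier/foundry-01/yield",
    payload=json.dumps(yield_record),
    qos=mqtt.QoS.AT_LEAST_ONCE,  # QoS 1: the broker acknowledges receipt
)
publish_future.result()
connection.disconnect().result()
```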
Step 6
Using AWS IoT Greengrass, AWS IoT Core, and Kinesis, manufacturers and Outsourced Semiconductor Assembly and Test (OSAT) providers transfer product data into the Amazon S3 data lake.
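On the AWS side, an AWS IoT topic rule can route the incoming product data into Amazon Kinesis, which in turn is assumed to deliver its records to the S3 data lake. In this sketch the rule name, topic filter, IAM role ARN, and stream name are placeholders.

```python
import boto3

iot = boto3.client("iot")

# Role ARN and stream name are placeholders; the stream is assumed to deliver
# its records to the Amazon S3 data lake.
iot.create_topic_rule(
    ruleName="osat_product_data_to_kinesis",
    topicRulePayload={
        "sql": "SELECT * FROM 'osat/+/product-data'",
        "description": "Route manufacturer and OSAT product data into Kinesis",
        "ruleDisabled": False,
        "actions": [
            {
                "kinesis": {
                    "roleArn": "arn:aws:iam::111122223333:role/iot-to-kinesis",
                    "streamName": "osat-product-data",
                    "partitionKey": "${topic()}",
                }
            }
        ],
    },
)
```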
Step 7
SAP is used to sort and process the data in the Amazon S3 data lake.
Step 8
Data is prepared for analytics by running extract, transform, and load (ETL) jobs using AWS Glue to discover, prepare, move, and integrate data from multiple sources.
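A minimal AWS Glue job script illustrating this step is shown below; the Data Catalog database, table, field names, and output path are assumptions for this example rather than part of the Guidance.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw engineering records cataloged by a Glue crawler (placeholder database and table names).
raw = glue_context.create_dynamic_frame.from_catalog(
    database="semiconductor_datalake", table_name="raw_engineering_results"
)

# Keep only the fields needed downstream (illustrative names) and write curated Parquet back to the lake.
curated = raw.select_fields(["lot_id", "wafer_id", "test_step", "result", "yield_pct", "timestamp"])
glue_context.write_dynamic_frame.from_options(
    frame=curated,
    connection_type="s3",
    connection_options={"path": "s3://example-datalake/curated/engineering_results/"},
    format="parquet",
)
job.commit()
```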
Step 9
Amazon Redshift, Amazon Athena, and Amazon QuickSight are used to build a business insights dashboard.
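As a sketch of how the curated data can be queried for the dashboard, the example below runs an Amazon Athena query through boto3; the database, table, and output location are placeholder names, and QuickSight would typically connect to the same Athena database as a data source.

```python
import boto3

athena = boto3.client("athena")

# Database, table, and output location are placeholders for the curated layer of the data lake.
response = athena.start_query_execution(
    QueryString="""
        SELECT lot_id, AVG(yield_pct) AS avg_yield
        FROM curated_engineering_results
        GROUP BY lot_id
        ORDER BY avg_yield
        LIMIT 20
    """,
    QueryExecutionContext={"Database": "semiconductor_datalake"},
    ResultConfiguration={"OutputLocation": "s3://example-datalake/athena-results/"},
)
print(response["QueryExecutionId"])
```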
Step 10
View data, create complex data models, and take needed actions using combined data from multiple sources.
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
The QuickSight dashboard provides the actionable insights required to quickly react to changes in supply and demand, and enables yield management across the entire semiconductor supply chain.
Security
For secure authentication and authorization, customers can federate AWS access with their on-premises directory services using AWS Directory Service. Authentication and authorization can then be applied across the entire architecture, from the remote desktop to running batch jobs.
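As one possible illustration of federated access, the sketch below exchanges a SAML assertion from an on-premises identity provider for temporary AWS credentials using AWS STS; the role and provider ARNs are placeholders, and the assertion itself comes from the identity provider's sign-in flow.

```python
import boto3

sts = boto3.client("sts")

# ARNs are placeholders; saml_assertion is the base64-encoded SAML response returned
# by the on-premises identity provider after the user authenticates.
saml_assertion = "..."
response = sts.assume_role_with_saml(
    RoleArn="arn:aws:iam::111122223333:role/EngineeringAnalyst",
    PrincipalArn="arn:aws:iam::111122223333:saml-provider/OnPremDirectory",
    SAMLAssertion=saml_assertion,
    DurationSeconds=3600,
)
credentials = response["Credentials"]  # temporary keys scoped to the federated role
```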
Reliability
By using AWS IoT Core, customers get secure connections between devices and the AWS Cloud. AWS IoT Core uses the highly reliable MQTT (Message Queuing Telemetry Transport) protocol for data input from third-party partners, so temporary transfer issues should be minimized or even eliminated.
Performance Efficiency
The services in this Guidance were selected to ensure compatibility for high-tech, electronics, and semiconductor customers. Many of these companies run MES, PLM, and ERP applications, and the selected services enable data integration from each of those applications.
Cost Optimization
Companies can save time and cost when using AWS Glue for data integration. AWS Glue is a fully managed extract, transform, and load (ETL) service that makes it simple and cost-effective to categorize data, clean it, enrich it, and move it reliably between various data stores and data streams.
Sustainability
By taking a microservices approach coupled with AWS managed serverless services, this Guidance allows customers to limit the amount of infrastructure they need to provision.
Implementation Resources
A detailed guide is provided to experiment with and use within your AWS account. Each stage of building the Guidance, including deployment, usage, and cleanup, is examined to prepare it for deployment.
The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Related Content
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.