This Guidance uses SAP data and business logic extraction to build an SAP context-aware data warehouse that complements or replaces existing data warehouse solutions, allowing customers to shift from overnight batch processing to near real-time analytics.
Architecture Diagram

Step 1
Configure (or activate) the data source in your SAP system (for example, activate SAP extractors).
Step 2
Configure operational data provisioning (ODP) for extraction in the SAP Gateway of your SAP system.
Step 3
Create the OData connection from Amazon AppFlow to your SAP source system. For SAP systems running on AWS, connect through AWS PrivateLink; otherwise, connect via VPN or AWS Direct Connect, or over the internet.
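If you script this connection, it can be created with the AWS SDK. The following Python (boto3) sketch registers an SAP OData connector profile in Amazon AppFlow over AWS PrivateLink; the host, service path, client number, endpoint service name, and credentials are placeholders for your own values.

```python
import boto3

appflow = boto3.client('appflow')

# Sketch only: all connection values below are illustrative placeholders.
appflow.create_connector_profile(
    connectorProfileName='sap-odata-profile',
    connectorType='SAPOData',
    connectionMode='Private',  # use 'Public' when connecting over the internet
    connectorProfileConfig={
        'connectorProfileProperties': {
            'SAPOData': {
                'applicationHostUrl': 'https://my-sap-host.example.com',
                'applicationServicePath': '/sap/opu/odata/sap/ZGW_ODP_SRV',
                'portNumber': 443,
                'clientNumber': '100',
                'logonLanguage': 'EN',
                'privateLinkServiceName': 'com.amazonaws.vpce.us-east-1.vpce-svc-EXAMPLE',
            }
        },
        'connectorProfileCredentials': {
            'SAPOData': {
                'basicAuthCredentials': {'username': 'EXTRACT_USER', 'password': '...'}
            }
        },
    },
)
```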
Step 4
In Amazon AppFlow, create the flow using the SAP source created in step 3. Run the flow to extract data from SAP, and save it to an Amazon Simple Storage Service (Amazon S3) bucket.
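A minimal boto3 sketch of creating and running such a flow, assuming the connector profile from step 3; the flow name, OData object path, and bucket name are illustrative.

```python
import boto3

appflow = boto3.client('appflow')

# Define an on-demand flow that maps all fields from the SAP OData object to Amazon S3.
appflow.create_flow(
    flowName='sap-sales-extract',
    triggerConfig={'triggerType': 'OnDemand'},
    sourceFlowConfig={
        'connectorType': 'SAPOData',
        'connectorProfileName': 'sap-odata-profile',
        'sourceConnectorProperties': {
            'SAPOData': {'objectPath': '/sap/opu/odata/sap/ZGW_ODP_SRV/SalesOrders'}
        },
    },
    destinationFlowConfigList=[{
        'connectorType': 'S3',
        'destinationConnectorProperties': {
            'S3': {
                'bucketName': 'my-sap-raw-bucket',
                's3OutputFormatConfig': {'fileType': 'PARQUET'},
            }
        },
    }],
    tasks=[{
        'sourceFields': [],
        'taskType': 'Map_all',
        'connectorOperator': {'SAPOData': 'NO_OP'},
    }],
)

# Run the flow to extract the data from SAP into the S3 bucket.
appflow.start_flow(flowName='sap-sales-extract')
```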
Step 5
Use an AWS Glue crawler to create a data catalog entry with metadata for the extracted SAP data in an Amazon S3 bucket.
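A boto3 sketch of creating and starting the crawler; the crawler name, IAM role, catalog database, and S3 path are placeholders.

```python
import boto3

glue = boto3.client('glue')

# The crawler catalogs the raw SAP extracts written by Amazon AppFlow.
glue.create_crawler(
    Name='sap-raw-crawler',
    Role='arn:aws:iam::111122223333:role/GlueCrawlerRole',  # needs S3 read and Data Catalog access
    DatabaseName='sap_raw',
    Targets={'S3Targets': [{'Path': 's3://my-sap-raw-bucket/sap-sales-extract/'}]},
)
glue.start_crawler(Name='sap-raw-crawler')
```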
Step 6
Use AWS Glue DataBrew or AWS Glue Studio jobs to cleanse the SAP data and optimize its format, then save the transformed data to another Amazon S3 bucket.
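If you use an AWS Glue Studio (Spark) job, a minimal script might look like the following; the catalog database, table, and target bucket are illustrative, and DropNullFields stands in for whatever cleansing your data actually needs.

```python
# Minimal AWS Glue job sketch (PySpark); names are illustrative.
import sys
from awsglue.transforms import DropNullFields
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ['JOB_NAME'])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args['JOB_NAME'], args)

# Read the raw SAP data registered by the crawler.
raw = glue_context.create_dynamic_frame.from_catalog(database='sap_raw', table_name='sap_sales_extract')

# Simple cleansing step; add type casts, deduplication, or joins as needed.
cleansed = DropNullFields.apply(frame=raw)

# Write optimized Parquet files to the curated bucket.
glue_context.write_dynamic_frame.from_options(
    frame=cleansed,
    connection_type='s3',
    connection_options={'path': 's3://my-sap-curated-bucket/sales/'},
    format='parquet',
)
job.commit()
```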
Step 7
Load the data into Amazon Redshift with COPY commands. Model the data together with non-SAP sources in your data warehouse.
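A sketch that issues the COPY through the Amazon Redshift Data API from Python; the cluster, database, user, table, and IAM role are placeholders, and the format clause assumes the curated data was written as Parquet.

```python
import boto3

redshift_data = boto3.client('redshift-data')

# Load the curated Parquet files from Amazon S3 into a Redshift table.
redshift_data.execute_statement(
    ClusterIdentifier='analytics-cluster',
    Database='dev',
    DbUser='awsuser',
    Sql="""
        COPY sap.sales
        FROM 's3://my-sap-curated-bucket/sales/'
        IAM_ROLE 'arn:aws:iam::111122223333:role/RedshiftCopyRole'
        FORMAT AS PARQUET;
    """,
)
```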
Step 8
Build SQL-based machine learning (ML) models to derive insights from your data.
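One way to do this is Amazon Redshift ML, where CREATE MODEL trains a model directly from a SQL query. The sketch below submits such a statement through the Data API; the table, columns, prediction target, and S3 bucket for model artifacts are purely illustrative.

```python
import boto3

redshift_data = boto3.client('redshift-data')

# Train a SQL-based model in Redshift ML; column and object names are assumptions.
redshift_data.execute_statement(
    ClusterIdentifier='analytics-cluster',
    Database='dev',
    DbUser='awsuser',
    Sql="""
        CREATE MODEL sap.late_delivery_risk
        FROM (SELECT shipping_point, route, order_qty, delivered_late FROM sap.sales)
        TARGET delivered_late
        FUNCTION predict_late_delivery
        IAM_ROLE 'arn:aws:iam::111122223333:role/RedshiftMLRole'
        SETTINGS (S3_BUCKET 'my-redshift-ml-artifacts');
    """,
)
```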
Step 9
Create the dataset in Amazon QuickSight with Amazon Redshift as the data source.
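These steps can also be scripted. The boto3 sketch below registers the Redshift cluster as a QuickSight data source and builds a direct-query dataset on it; the account ID, credentials, schema, table, and columns are placeholders.

```python
import boto3

quicksight = boto3.client('quicksight')
account_id = '111122223333'  # your AWS account ID

# Register Amazon Redshift as a QuickSight data source.
ds = quicksight.create_data_source(
    AwsAccountId=account_id,
    DataSourceId='sap-redshift',
    Name='SAP Redshift',
    Type='REDSHIFT',
    DataSourceParameters={'RedshiftParameters': {'Database': 'dev', 'ClusterId': 'analytics-cluster'}},
    Credentials={'CredentialPair': {'Username': 'quicksight_ro', 'Password': '...'}},
)

# Build a direct-query dataset on top of the data source.
quicksight.create_data_set(
    AwsAccountId=account_id,
    DataSetId='sap-sales',
    Name='SAP Sales',
    ImportMode='DIRECT_QUERY',
    PhysicalTableMap={
        'sales': {
            'RelationalTable': {
                'DataSourceArn': ds['Arn'],
                'Schema': 'sap',
                'Name': 'sales',
                'InputColumns': [
                    {'Name': 'salesorder', 'Type': 'STRING'},
                    {'Name': 'netamount', 'Type': 'DECIMAL'},
                ],
            }
        }
    },
)
```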
Step 10
Create a dashboard to visualize the business data according to user requirements. Use the built-in ML and insights features to accelerate time to insight.
Step 11
With Amazon SageMaker, build and train ML models.
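As an illustration, the following sketch trains a built-in XGBoost model on curated SAP data exported to Amazon S3 using the SageMaker Python SDK; the execution role, bucket, training prefix, and hyperparameters are assumptions to adapt to your use case.

```python
# SageMaker sketch: trains a built-in XGBoost model on curated SAP data in Amazon S3.
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
role = 'arn:aws:iam::111122223333:role/SageMakerExecutionRole'  # placeholder execution role

# Resolve the built-in XGBoost container image for the current Region.
image = sagemaker.image_uris.retrieve('xgboost', session.boto_region_name, version='1.7-1')

estimator = Estimator(
    image_uri=image,
    role=role,
    instance_count=1,
    instance_type='ml.m5.xlarge',
    output_path='s3://my-sap-curated-bucket/models/',
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective='binary:logistic', num_round=100)

# Training data exported from the curated S3 bucket (CSV with the label in the first column).
estimator.fit({'train': TrainingInput('s3://my-sap-curated-bucket/ml/train/', content_type='text/csv')})
```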
Well-Architected Pillars

The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
-
Operational Excellence
This Guidance can be fully deployed with code. You can incorporate this automation into your own development pipeline to enable iteration and consistent deployments across your SAP landscape. Observability is provided by the managed services used for processing; process-level metrics, logs, and dashboards are available in Amazon CloudWatch.
-
Security
The serverless components in the architecture are protected with AWS Identity and Access Management (IAM)-based authentication for secure validation of user identity. The managed services only have access to the data that has been specified, and access to the SAP workload is through Amazon AppFlow, which supports AWS PrivateLink. Data is encrypted in transit and at rest. Amazon Redshift can be deployed into a customer's VPC.
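For example, encryption at rest on the extract buckets can be enforced with a default bucket encryption rule; in this boto3 sketch the bucket name and AWS KMS key are placeholders.

```python
import boto3

s3 = boto3.client('s3')

# Enforce KMS encryption at rest for the raw extract bucket.
s3.put_bucket_encryption(
    Bucket='my-sap-raw-bucket',
    ServerSideEncryptionConfiguration={
        'Rules': [{
            'ApplyServerSideEncryptionByDefault': {
                'SSEAlgorithm': 'aws:kms',
                'KMSMasterKeyID': 'arn:aws:kms:us-east-1:111122223333:key/example-key-id',
            },
            'BucketKeyEnabled': True,
        }]
    },
)
```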
-
Reliability
All of the serverless components are highly available, and all non-SAP components scale automatically. Amazon AppFlow can move large volumes of data without breaking them into multiple batches, increasing reliability. Amazon Redshift continuously monitors cluster health, automatically re-replicates data from failed drives, and replaces nodes as necessary for fault tolerance.
-
Performance Efficiency
By using serverless technologies, you only provision the exact resources you use. Using Amazon S3 as the corporate data store optimizes the storage layer of the architecture, while processing of the data is performed in Amazon Redshift. For improved performance and agility, configure multiple flows in Amazon AppFlow for different groups of business data.
-
Cost Optimization
By utilizing serverless technologies, you only pay for the resources you use. To further optimize cost, extract only the business data groups that you need and minimize the number of flows run, based on the granularity of your reporting needs. Amazon S3 lifecycle policies can be applied to transition or expire data that is no longer needed.
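A boto3 sketch of such a lifecycle rule, assuming the raw-extract bucket and prefix from the earlier steps; the transition and expiration periods are arbitrary examples.

```python
import boto3

s3 = boto3.client('s3')

# Tier older raw extracts to lower-cost storage after 30 days and expire them after a year.
s3.put_bucket_lifecycle_configuration(
    Bucket='my-sap-raw-bucket',
    LifecycleConfiguration={
        'Rules': [{
            'ID': 'tier-and-expire-raw-extracts',
            'Filter': {'Prefix': 'sap-sales-extract/'},
            'Status': 'Enabled',
            'Transitions': [{'Days': 30, 'StorageClass': 'STANDARD_IA'}],
            'Expiration': {'Days': 365},
        }]
    },
)
```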
-
Sustainability
By utilizing managed services and dynamic scaling, this Guidance minimizes the environmental impact of the backend services. As new options become available for Amazon AppFlow, adopt them to further optimize the volume and frequency of extraction. Reducing the quantity and frequency of extraction improves sustainability as well as helping to reduce cost and improve performance.
Related Content

Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.