This Guidance demonstrates how to combine and consolidate SAP and non-SAP data from disparate sources using data lakes and machine learning services on AWS, helping customers unlock hidden business insights.

Architecture Diagram


Well-Architected Pillars

The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.

The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.

  • This Guidance can be fully deployed with code. You can incorporate this automation into your own development pipeline to enable iteration and consistent deployments across your SAP landscape. Observability is derived from the managed services used for extraction and transformation, with logs and dashboards available from Amazon CloudWatch. A monitoring sketch follows this item.

    Read the Operational Excellence whitepaper 
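
A minimal observability sketch in Python (boto3), assuming an existing Amazon AppFlow flow; the flow name "sap-sales-orders-flow" is a hypothetical placeholder. It lists recent runs of a flow so that failed extractions can be surfaced alongside the CloudWatch logs and dashboards:

    import boto3

    appflow = boto3.client("appflow")

    def recent_flow_runs(flow_name: str, max_results: int = 10):
        """Return the most recent execution records for an AppFlow flow."""
        response = appflow.describe_flow_execution_records(
            flowName=flow_name, maxResults=max_results
        )
        return response["flowExecutions"]

    if __name__ == "__main__":
        # Hypothetical flow name; substitute the flows deployed by this Guidance.
        for run in recent_flow_runs("sap-sales-orders-flow"):
            print(run["executionId"], run["executionStatus"])
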
  • The serverless components in the architecture are protected with AWS Identity and Access Management (IAM) for secure validation of user identity, and the managed services have access only to the data that is specified. Access to the SAP workload is through Amazon AppFlow. Data is encrypted in transit and at rest. For audit logging, AWS CloudTrail can record the API calls made to the services used for the data lake, as the sketch after this item shows.

    Read the Security whitepaper 
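
A hedged audit-logging sketch: it uses CloudTrail's LookupEvents API to list the Amazon AppFlow management calls recorded over the last day. The 24-hour window is an illustrative choice; "appflow.amazonaws.com" is the CloudTrail event source for AppFlow:

    from datetime import datetime, timedelta, timezone

    import boto3

    cloudtrail = boto3.client("cloudtrail")

    def recent_appflow_events(hours: int = 24):
        """Yield CloudTrail events recorded for Amazon AppFlow."""
        start = datetime.now(timezone.utc) - timedelta(hours=hours)
        paginator = cloudtrail.get_paginator("lookup_events")
        pages = paginator.paginate(
            LookupAttributes=[
                {"AttributeKey": "EventSource",
                 "AttributeValue": "appflow.amazonaws.com"}
            ],
            StartTime=start,
        )
        for page in pages:
            yield from page["Events"]

    if __name__ == "__main__":
        for event in recent_appflow_events():
            print(event["EventTime"], event["EventName"], event.get("Username", "-"))
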
  • All of the serverless components are highly available, and all non-SAP components scale automatically. Amazon AppFlow can move large volumes of data without breaking it down into multiple batches, which increases reliability. Amazon S3 offers industry-leading scalability, data availability, security, and performance for your data lake. A client-side retry sketch follows this item.

    Read the Reliability whitepaper 
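
On the client side, the AWS SDK's built-in retry modes complement that availability. A minimal sketch, assuming Python (boto3); the retry values are illustrative, not prescriptive:

    import boto3
    from botocore.config import Config

    # Adaptive mode throttles the client under pressure and retries
    # transient failures; max_attempts is an illustrative value.
    retry_config = Config(retries={"max_attempts": 10, "mode": "adaptive"})

    appflow = boto3.client("appflow", config=retry_config)
    s3 = boto3.client("s3", config=retry_config)

    # Example call: transient throttling on list_flows is retried automatically.
    print([flow["flowName"] for flow in appflow.list_flows()["flows"]])
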
  • By leveraging serverless technologies, you provision only the exact resources you use. Using Amazon S3 as the data lake optimizes the storage of the architecture, with transformation of the data performed in AWS Glue DataBrew. For improved performance and agility, configure multiple flows in Amazon AppFlow for different groups of business data, as the sketch after this item illustrates.

    Read the Performance Efficiency whitepaper 
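
A sketch of the multiple-flows recommendation, assuming on-demand AppFlow flows already exist; the flow names below are hypothetical groups of SAP business data. Each flow extracts one group, so groups can be refreshed independently and in parallel:

    import boto3

    appflow = boto3.client("appflow")

    # One flow per business data group (names are illustrative assumptions).
    BUSINESS_DATA_FLOWS = [
        "sap-sales-orders-flow",
        "sap-deliveries-flow",
        "sap-billing-documents-flow",
    ]

    for flow_name in BUSINESS_DATA_FLOWS:
        execution = appflow.start_flow(flowName=flow_name)
        print(f"Started {flow_name}: execution {execution.get('executionId')}")
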
  • By utilizing serverless technologies, you pay only for the resources you use. To further optimize cost, extract only the business data groups that you need and minimize the number of flows being executed based on the granularity of your reporting needs. Old or unwanted data can also be managed through Amazon S3 data tiering or deleted outright; a lifecycle sketch follows this item.

    Read the Cost Optimization whitepaper 
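
A hedged data-tiering sketch: an S3 lifecycle configuration that moves extracted SAP data to S3 Intelligent-Tiering after 30 days and deletes it after one year. The bucket name, prefix, and day counts are illustrative assumptions; align them with your reporting needs:

    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_lifecycle_configuration(
        Bucket="example-sap-data-lake-bucket",  # hypothetical bucket name
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "tier-and-expire-sap-extracts",
                    "Filter": {"Prefix": "raw/sap/"},  # hypothetical prefix
                    "Status": "Enabled",
                    "Transitions": [
                        {"Days": 30, "StorageClass": "INTELLIGENT_TIERING"}
                    ],
                    "Expiration": {"Days": 365},
                }
            ]
        },
    )
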
  • By utilizing managed services and dynamic scaling, you minimize the environmental impact of the backend services. As new options become available for Amazon AppFlow, adopt them to further optimize the volume and frequency of extraction. Reducing the quantity and frequency of extraction improves sustainability and also helps reduce cost and improve performance. A trigger-configuration sketch follows this item.

    Read the Sustainability whitepaper 
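
One way to reduce extraction volume and frequency is a scheduled AppFlow trigger in incremental data-pull mode, so only new or changed records are pulled. This is a sketch of the trigger configuration only: the dictionary below would be the triggerConfig argument to appflow.create_flow or update_flow, whose remaining arguments (source, destination, tasks) are omitted here:

    # Daily incremental pull; the rate expression is illustrative.
    trigger_config = {
        "triggerType": "Scheduled",
        "triggerProperties": {
            "Scheduled": {
                "scheduleExpression": "rate(1days)",  # adjust to reporting needs
                "dataPullMode": "Incremental",        # pull only new/changed records
            }
        },
    }
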

Implementation Resources

A detailed guide is provided for you to experiment with and use within your AWS account. It walks through each stage of the Guidance, including deployment, usage, and cleanup.

The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.


Disclaimer

The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.

References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.
