This Guidance provides the essential data foundation for building data and analytics solutions on AWS. It shows how to integrate data from SAP ERP source systems with AWS in real time or in batch mode, with change data capture, using AWS services, SAP products, and AWS Partner Solutions. This Guidance includes an overview reference architecture showing how to ingest data from SAP systems into AWS, along with five detailed architecture patterns that complement SAP-supported mechanisms (such as OData, ODP, SLT, and BTP) using AWS services, SAP products, and AWS Partner Solutions.
Architecture Diagram

Overview of Architecture Patterns
This architecture diagram shows the pattern options for ingesting data from SAP systems into AWS. For detailed architecture patterns, open the other tabs.
Step 1
SAP data hosted on SAP RISE, SAP HANA Cloud, AWS, or on-premises systems can be extracted in real-time or batch mode, and in full or incremental mode, from SAP NetWeaver systems such as SAP ERP Central Component (ECC), SAP S/4HANA, or SAP BW.
A: AWS Managed Services
This architecture diagram shows how to ingest SAP data to AWS using AWS Glue. For the other architecture patterns, open the other tabs.
Step 1
Use AWS Managed Services options, such as AWS Glue or Amazon AppFlow, to extract data from SAP.
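As a minimal illustration of this pattern, the following Python sketch starts a pre-configured Amazon AppFlow flow and an AWS Glue job through the AWS SDK (boto3). The flow and job names are hypothetical placeholders for resources you would create beforehand in your account.

import boto3

appflow = boto3.client("appflow")
glue = boto3.client("glue")

# Run an Amazon AppFlow flow that extracts from SAP (for example,
# through the AppFlow SAP OData connector) and lands data in Amazon S3.
flow = appflow.start_flow(flowName="sap-odata-sales-orders-to-s3")  # hypothetical name
print("AppFlow execution started:", flow["executionId"])

# Alternatively, start an AWS Glue job that reads from SAP and writes to S3.
run = glue.start_job_run(JobName="sap-extract-to-s3")  # hypothetical name
print("Glue job run started:", run["JobRunId"])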
B1: AWS Partner Solution by BryteFlow
This architecture diagram shows how to ingest SAP data to AWS using the Partner Solution: BryteFlow SAP Data Lake Builder. For the other architecture patterns, open the other tabs.
Step 1a
For application-level data extraction, configure SAP OData services based on CDS views, BW extractors, BW InfoProviders, or HANA information views.
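To make the application-level extraction concrete, the following Python sketch reads one entity set from an SAP OData service, such as one generated from a CDS view. The host name, service path, entity set, and credentials are hypothetical placeholders for your own gateway configuration.

import requests

# Hypothetical OData V2 service exposed through SAP Gateway.
BASE = "https://sap-gateway.example.com/sap/opu/odata/sap/ZSALES_CDS_SRV"

resp = requests.get(
    f"{BASE}/SalesOrders",                      # hypothetical entity set
    params={"$format": "json", "$top": "100"},  # standard OData query options
    auth=("EXTRACT_USER", "********"),          # use a dedicated extraction user
    timeout=30,
)
resp.raise_for_status()

# OData V2 responses wrap records in d/results.
for record in resp.json()["d"]["results"]:
    print(record)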
B2: SAP Datasphere and Data Services
This architecture diagram shows how to ingest SAP data to AWS using SAP Datasphere or SAP Data Services. For the other architecture patterns, open the other tabs.
Step 1
Data from SAP ERP hosted on RISE, AWS, or on-premises can be extracted using:
A. SAP Datasphere
B. SAP Data Services
B3: SAP SLT
This architecture diagram shows how to ingest SAP data to AWS using SAP SLT. For the other architecture patterns, open the other tabs.
Step 1
Configure an RFC destination in SAP SLT that points to the source SAP ERP system.
C: SAP NetWeaver Add-On Solution by SNP
This architecture diagram shows how to use the SAP NetWeaver add-on solution SNP Glue to extract data from SAP to AWS. For the other architecture patterns, open the other tabs.
Step 1
Install and configure the SNP Glue ABAP add-on on an SAP ABAP-based source system (such as S/4HANA, ECC, CRM, or BW) running SAP NetWeaver 7.1 SP14 or higher.
Get Started

Well-Architected Pillars

The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
AWS CloudFormation automates the deployment process, while Amazon CloudWatch provides observability, tracking, and tracing capabilities. Deploying the entire solution through CloudFormation helps automate deployments across development, quality assurance, and production accounts, and this automation can be integrated into your development pipeline, enabling iterative development and consistent deployments across your SAP landscape.
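As a sketch of this deployment automation, the following Python snippet creates the same CloudFormation stack for each stage from one template. The template URL, stack names, and Stage parameter are hypothetical; deploying into separate accounts would additionally require per-account credentials or CloudFormation StackSets.

import boto3

cfn = boto3.client("cloudformation")

for stage in ("dev", "qa", "prod"):
    cfn.create_stack(
        StackName=f"sap-ingestion-{stage}",  # hypothetical naming scheme
        TemplateURL="https://example-bucket.s3.amazonaws.com/sap-ingestion.yaml",
        Parameters=[{"ParameterKey": "Stage", "ParameterValue": stage}],
        Capabilities=["CAPABILITY_NAMED_IAM"],  # needed if the template creates IAM roles
    )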
Security
IAM secures AWS Glue and Amazon AppFlow through permission controls and authentication, and these managed services access only the data you specify. Amazon AppFlow facilitates access to SAP workloads. Data is encrypted in transit and at rest, and AWS CloudTrail logs API calls for auditing. Extracted data can be stored in S3 buckets with cross-Region replication. For enhanced security, run Amazon AppFlow over AWS PrivateLink with Elastic Load Balancing and SSL termination through AWS Certificate Manager.
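One way to apply these permission controls is a least-privilege IAM policy that restricts the extraction services to a single bucket. The bucket and policy names below are hypothetical placeholders.

import json
import boto3

# Allow object reads/writes and listing on one data lake bucket only.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::sap-raw-zone",   # hypothetical bucket
            "arn:aws:s3:::sap-raw-zone/*",
        ],
    }],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="sap-extract-s3-access",  # attach to the Glue/AppFlow role
    PolicyDocument=json.dumps(policy_document),
)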
Reliability
Amazon AppFlow and AWS Glue can reliably move large volumes of data without breaking it down into batches. Amazon S3 provides industry-leading scalability, data availability, security, and performance for SAP data export and import. PrivateLink is a Regional service; as part of the Amazon AppFlow setup using PrivateLink, you set up at least 50 percent of the Availability Zones in the Region (a minimum of two Availability Zones per Region), providing an additional level of redundancy for Elastic Load Balancing.
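As a sketch of that PrivateLink setup, the snippet below creates an interface VPC endpoint in two subnets in different Availability Zones, which is what provides the redundancy described above. The VPC, subnet, security group, and endpoint service identifiers are hypothetical placeholders; the service name would be the VPC endpoint service fronting your SAP system's load balancer.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    # Hypothetical endpoint service exposing the SAP load balancer.
    ServiceName="com.amazonaws.vpce.us-east-1.vpce-svc-0123456789abcdef0",
    SubnetIds=["subnet-0aaa111", "subnet-0bbb222"],  # one subnet per AZ
    SecurityGroupIds=["sg-0123456789abcdef0"],
)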
Performance Efficiency
The SAP operational data provisioning (ODP) framework captures changed data. Parallelization features in Amazon AppFlow and in AWS Partner Solutions such as BryteFlow and SNP let you choose the number of parallel background processes, so large data volumes are processed in parallel. Amazon S3 offers improved throughput with multipart uploads through the supported data integration mechanisms. These parallelization capabilities and the seamless integration with Amazon S3 allow for efficient and scalable data ingestion from SAP systems into AWS.
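The multipart-upload behavior can be tuned explicitly from the AWS SDK. The following sketch uploads a large extract file to Amazon S3 in parallel parts; the file, bucket, and key names are hypothetical.

import boto3
from boto3.s3.transfer import TransferConfig

# Split objects above 64 MB into parts and upload up to 8 parts in parallel.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=8,
)

s3 = boto3.client("s3")
s3.upload_file(
    "sap_extract_2024_01.parquet",        # hypothetical local extract file
    "sap-raw-zone",                       # hypothetical bucket
    "sales/sap_extract_2024_01.parquet",  # hypothetical object key
    Config=config,
)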
Cost Optimization
By using serverless technologies such as Amazon AppFlow or AWS Glue, along with Amazon EC2 Auto Scaling, you pay only for the resources you consume. To optimize costs further, extract only the required business data groups by using semantic data models (for example, BW extractors or CDS views). Minimize the number of flows based on your reporting granularity needs, and implement housekeeping by setting up data tiering or deletion in Amazon S3 for old or unwanted data.
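That housekeeping step can be expressed as an S3 lifecycle rule, for example tiering raw extracts to Amazon S3 Glacier after 90 days and deleting them after a year. The bucket name, prefix, and retention periods below are hypothetical and should follow your own retention policy.

import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="sap-raw-zone",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "tier-then-expire-raw-extracts",
            "Filter": {"Prefix": "raw/"},  # hypothetical prefix for raw extracts
            "Status": "Enabled",
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }]
    },
)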
Sustainability
Data extraction workloads can be scheduled or invoked in real time, eliminating the need for the underlying infrastructure to run continuously. Using serverless and auto-scaling services is a sustainable approach for data extraction workloads, as these components activate only when needed. By relying on managed services and dynamic scaling, you minimize the environmental impact of the backend services. Adopt new Amazon AppFlow options as they become available to optimize the volume and frequency of extraction.
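As one way to schedule extraction instead of running it continuously, the following sketch uses Amazon EventBridge Scheduler to start an AWS Glue extraction job nightly. The schedule name, job name, role, and account identifiers are hypothetical placeholders.

import json
import boto3

scheduler = boto3.client("scheduler")

scheduler.create_schedule(
    Name="nightly-sap-extract",
    ScheduleExpression="cron(0 2 * * ? *)",  # 02:00 UTC every day
    FlexibleTimeWindow={"Mode": "OFF"},
    Target={
        # EventBridge Scheduler universal target that calls Glue StartJobRun.
        "Arn": "arn:aws:scheduler:::aws-sdk:glue:startJobRun",
        "RoleArn": "arn:aws:iam::123456789012:role/scheduler-glue-role",  # hypothetical role
        "Input": json.dumps({"JobName": "sap-extract-to-s3"}),  # hypothetical job
    },
)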
Related Content

Replicate SAP to AWS in Real-Time with Business Logic Intact Using BryteFlow
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.