Guidance for Data Federation between SAP and AWS

Overview

This Guidance outlines the process of federating data between SAP and AWS cloud analytics services, enabling you to establish a data mesh architecture. SAP provides enterprise software for running business processes, from enterprise resource planning to customer relationship management. By connecting SAP with AWS, you can easily transform and visualize your data in a scalable, secure, and cost-effective way, helping you inform your decision-making.

How it works

This architecture diagram shows how to federate data between SAP and AWS cloud analytics services, enabling you to establish a data mesh architecture.

Well-Architected Pillars

The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, follow as many of these best practices as possible.

Amazon CloudWatch monitors the AWS Lambda functions for Athena Federated Queries as they pull data from SAP HANA in real time. AWS CloudTrail then logs all the API requests when SAP Datasphere pulls data from Athena. Together, these services provide visibility so that you can review any errors and appropriately respond to incidents.
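As an illustration of that monitoring, the following Python (boto3) sketch creates a CloudWatch alarm on the Errors metric of an Athena connector Lambda function. The function name, SNS topic ARN, and alarm settings are example assumptions, not values defined by this Guidance.

# Minimal sketch: alarm on errors from the Athena federated query connector Lambda.
# The function name and SNS topic ARN are placeholders, not part of this Guidance.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="sap-hana-athena-connector-errors",
    Namespace="AWS/Lambda",
    MetricName="Errors",
    Dimensions=[{"Name": "FunctionName", "Value": "sap-hana-athena-connector"}],
    Statistic="Sum",
    Period=300,                      # evaluate in 5-minute windows
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    TreatMissingData="notBreaching",
    AlarmActions=["arn:aws:sns:us-east-1:111122223333:ops-alerts"],
    AlarmDescription="Alerts when the SAP HANA connector Lambda reports errors.",
)

Pairing an alarm like this with the CloudTrail API logs described above gives you both a real-time signal when connector invocations fail and an audit trail for investigating who or what issued the failing requests.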

Read the Operational Excellence whitepaper

AWS Secrets Manager stores the SAP HANA Cloud and SAP Datasphere access credentials used by Athena. SAP Datasphere uses AWS Identity and Access Management (IAM) permission controls and programmatic access to federate data from Athena, and it uses Java Database Connectivity (JDBC) to access Amazon Redshift. Working together, these services use key rotation, least-privilege policies, and other security guardrails to maintain fine-grained access control over critical business data.
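The sketch below shows one way a client could read SAP HANA Cloud credentials from Secrets Manager using Python (boto3). The secret name and JSON field names are illustrative assumptions, not values defined by this Guidance.

# Minimal sketch: read SAP HANA Cloud credentials from AWS Secrets Manager.
# The secret name and JSON field names are illustrative assumptions.
import json
import boto3

secrets = boto3.client("secretsmanager")

response = secrets.get_secret_value(SecretId="sap-hana-cloud/federation-credentials")
credentials = json.loads(response["SecretString"])

host = credentials["host"]          # SAP HANA Cloud endpoint
username = credentials["username"]
password = credentials["password"]  # never hard-code this in connector configuration

Keeping credentials in Secrets Manager rather than in connector configuration lets you rotate them centrally without redeploying the Lambda functions that query SAP HANA.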

Read the Security whitepaper

This Guidance uses serverless components, which maintain high availability to support your business-critical analytics applications. For example, Athena runs queries using compute resources across multiple facilities and automatically reroutes queries in case of failure. Additionally, Amazon S3 provides 99.999999999 percent (11 nines) data durability, and you can further enhance availability by deploying Amazon Redshift across multiple Availability Zones.
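As a minimal sketch of working with Athena's asynchronous, serverless query execution, the following Python (boto3) snippet submits a query, polls its state, and retries if the attempt does not succeed. The database name, table, and results location are placeholder assumptions.

# Minimal sketch: run an Athena query, poll its state, and retry failed attempts.
# The database, table, and output location are placeholder assumptions.
import time
import boto3

athena = boto3.client("athena")

def run_query(sql, max_attempts=3):
    for attempt in range(1, max_attempts + 1):
        start = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": "sap_federation"},
            ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
        )
        query_id = start["QueryExecutionId"]
        while True:
            status = athena.get_query_execution(QueryExecutionId=query_id)
            state = status["QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                break
            time.sleep(2)  # wait before polling again
        if state == "SUCCEEDED":
            return query_id
        print(f"Attempt {attempt} ended in {state}; retrying.")
    raise RuntimeError("Query did not succeed after retries.")

run_query("SELECT sales_order, amount FROM sales_orders LIMIT 10")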

Read the Reliability whitepaper

Athena provides a number of performance optimization techniques, including query optimization and data partitioning. It also lets you use columnar file formats (such as Apache Parquet or Apache ORC, the Optimized Row Columnar format) for efficient access. Additionally, Amazon Redshift provides performance tuning options such as massively parallel processing, data compression, and query optimization.
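To illustrate partitioning and columnar formats together, the following sketch submits an Athena CREATE TABLE AS SELECT (CTAS) statement that rewrites a table as partitioned, Snappy-compressed Parquet. The table, column, and bucket names are illustrative assumptions.

# Minimal sketch: an Athena CTAS statement that rewrites a table as partitioned Parquet.
# Table, column, and bucket names are illustrative assumptions.
import boto3

athena = boto3.client("athena")

ctas_sql = """
CREATE TABLE sales_orders_parquet
WITH (
    format = 'PARQUET',
    parquet_compression = 'SNAPPY',
    partitioned_by = ARRAY['order_date'],
    external_location = 's3://example-analytics-bucket/sales_orders_parquet/'
) AS
SELECT sales_order, amount, order_date
FROM sales_orders_raw
"""

athena.start_query_execution(
    QueryString=ctas_sql,
    QueryExecutionContext={"Database": "sap_federation"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)

Partitioning on a frequently filtered column such as the order date lets subsequent queries prune partitions, and the columnar Parquet layout reduces the data each query has to read.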

Read the Performance Efficiency whitepaper

This Guidance uses serverless and pay-per-use services such as Athena, Amazon S3, and Amazon Redshift, which bill only for the resources you use. These services automatically scale up and down based on demand, so you can avoid the cost of overprovisioning resources to support peak demand. Additionally, SAP HANA Cloud provides high price performance by using AWS Graviton processors.
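Because Athena bills by the amount of data scanned, one simple way to track query cost is to read the scan statistics of a completed query. The sketch below shows this with Python (boto3); the query execution ID is a placeholder from a previous run.

# Minimal sketch: inspect how much data an Athena query scanned, the main Athena cost driver.
# The query execution ID is a placeholder from a previous query run.
import boto3

athena = boto3.client("athena")

execution = athena.get_query_execution(
    QueryExecutionId="00000000-0000-0000-0000-000000000000"
)
stats = execution["QueryExecution"]["Statistics"]

scanned_mb = stats["DataScannedInBytes"] / (1024 * 1024)
runtime_ms = stats["EngineExecutionTimeInMillis"]
print(f"Scanned {scanned_mb:.1f} MB in {runtime_ms} ms")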

Read the Cost Optimization whitepaper

By using managed services and dynamic scaling through services like Athena and Amazon S3, you can minimize the environmental impact of the backend service. Serverless infrastructure automatically scales up and down to match demand, so you can avoid the energy expenditure of overprovisioning hardware.

Read the Sustainability whitepaper

Disclaimer

The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.