AWS for SAP
Architecture Options for Extracting SAP Data with AWS Services
Gartner found that nearly 97% of data sits unused by organizations, and that more than 87% of organizations have low maturity in business intelligence and analytics capability. This capability deficit can severely restrict a company's growth and even threaten its existence if the company cannot reinvent itself. Every company must move quickly to assess its data analytics capabilities and chart a course to becoming a data-driven enterprise. Doing so is crucial to staying responsive to customers and market opportunities, and to staying agile given the rapidly changing nature of technology and the marketplace.
Here are a few AWS customers that have benefited from being data-driven enterprises:
- Moderna is a biotechnology company pioneering a new class of messenger RNA (mRNA) medicines. Leveraging its mRNA platform and manufacturing facility with its AWS-powered research engine, Moderna delivered the first clinical batch of its vaccine candidate (mRNA-1273) against COVID-19 to the National Institutes of Health (NIH) for the Phase 1 trial just 42 days after the initial sequencing of the virus. By building and scaling its operations on AWS, including SAP S/4HANA, Amazon Redshift, and Amazon Simple Storage Service (Amazon S3), Moderna is able to quickly design research experiments and uncover new insights, automate its laboratory and manufacturing processes to enhance its drug discovery pipeline, and more easily comply with applicable laws and regulations during the production and testing of vaccine and therapeutic candidates.
- Zalando (Europe's largest online fashion platform) started migrating its SAP systems to AWS to increase agility, simplify IT maintenance, and build a future-ready data architecture as part of its digital transformation. With a hybrid data lake on AWS that is tightly integrated with one of the world's largest SAP S/4HANA systems, Zalando has reduced its cost of insight by 30% while improving customer satisfaction. Zalando built its data lake with services such as Amazon Redshift, AWS Glue, and Amazon S3.
The first step to getting more out of your SAP data is getting it into your AWS data lake, which enables you to uncover new opportunities and solve business challenges. In this blog, we discuss architecture options for extracting SAP data to AWS based on your SAP ERP or S/4HANA version.
We will focus on AWS services such as Amazon AppFlow, AWS Glue, AWS Lambda, and Amazon API Gateway, as well as SAP solutions such as SAP Data Services and SAP Data Intelligence, to provide baseline scenarios.
There are a number of AWS Partner solutions that can help with the extraction, processing, and analytics of SAP data, such as Qlik, BryteFlow, HVR, Linke, Boomi, and others. They are not discussed in this blog, but you can visit AWS Marketplace or contact your AWS point of contact to find out more. If you need assistance implementing these AWS services, you can contact AWS Professional Services or the AWS Partners listed in the AWS Partner Discovery Portal.
The key considerations when extracting data from SAP systems fall into two major categories: 1/ commercial, and 2/ technical.
Buy vs Build
To integrate AWS with SAP, developers can write a minimal amount of custom code. While running custom code can be cost effective at first, it typically requires you to maintain that code over time. On the other hand, there are a number of SAP solutions (such as SAP Data Services), AWS managed services (such as Amazon AppFlow), and other commercial off-the-shelf (COTS) solutions that are highly specialized. They come with a large set of pre-built capabilities for ease of use. It is important to consider the full total cost of ownership (TCO) of each option.
Middleware Software vs. Cloud Native
Leveraging middleware software for integration between SAP and AWS means additional administrative effort (installation, patching, and upgrades) as well as runtime costs (software licenses). To address this, AWS introduced a managed service that eliminates the administrative effort and runtime costs of integrating SAP and AWS: Amazon AppFlow provides a no-code, serverless option to extract SAP data, as well as to write data back to SAP.
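As a minimal sketch of the serverless approach, an AppFlow flow that has already been configured (for example, one using the SAP OData connector) can be triggered on demand from Python with boto3. The flow name below is an assumption; substitute your own:

```python
def start_sap_flow(flow_name: str, region: str = "us-east-1") -> str:
    """Trigger an on-demand run of a pre-configured AppFlow flow
    and return the execution id of the run."""
    import boto3  # imported lazily so the sketch loads without the AWS SDK

    client = boto3.client("appflow", region_name=region)
    response = client.start_flow(flowName=flow_name)
    return response["executionId"]


def flow_status(flow_name: str, region: str = "us-east-1") -> str:
    """Check the current status of the flow (e.g. Active, Draft, Errored)."""
    import boto3

    client = boto3.client("appflow", region_name=region)
    return client.describe_flow(flowName=flow_name)["flowStatus"]


# Usage (assumes a flow named "sap-odata-to-s3" exists in your account):
# execution_id = start_sap_flow("sap-odata-to-s3")
```

Scheduled and event-triggered flows need no code at all; this on-demand call is useful when the extraction is orchestrated from a wider pipeline.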
SAP License Impact
When extracting data from SAP or writing data back to SAP, you will need to consider your SAP licensing requirements.

Note: Before implementing data extraction from, or write back to, your SAP systems, please verify your licensing agreement.
Price vs Value
When you buy off-the-shelf software such as SAP Data Services, you can procure a perpetual license, which allows you to use the software for an indefinite period of time for a single fee. With a perpetual license, it can be difficult to determine cost versus business value for a given initiative. When you use cloud-native services such as Amazon AppFlow, you pay per use based on the number of flows and the data volume required. This pay-per-use, or utility, model enables you to understand the true cost versus achieved business value of a given initiative.
Pull vs. Push the Data
At a high level, there are two types of mechanisms to extract SAP data:
- Pull data from SAP and then push it to AWS services such as Amazon S3. This method is usually executed in batches and requires the SAP system to be accessible to the extraction tools. Some customers may have security concerns around this approach, and therefore it may be less preferred for them.
- Push data from SAP to AWS services. This method is well suited to near-real-time extraction using available methods such as SAP Intermediate Documents (IDocs).
For relatively small tables, such as master data tables, repetitive full loads may be acceptable when extracting SAP data. For large tables, such as transactional data tables, transferring deltas may be preferred for performance and cost reasons. With delta extraction, only the data that has changed since the last extraction is identified and transferred. Common SAP delta mechanisms are Application Link Enabling (ALE) change pointers, Operational Data Provisioning (ODP) delta queues, Change Data Capture (CDC), and timestamp fields queried on the last-changed date and time.
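As an illustrative sketch of the timestamp approach, a delta query against an SAP Gateway OData service can be expressed with the standard `$filter` option. The service path, entity set, and timestamp field below are hypothetical placeholders; adjust them to your own gateway service:

```python
import base64
import json
import urllib.parse
import urllib.request


def delta_query_url(service_url: str, entity_set: str,
                    ts_field: str, since_iso: str) -> str:
    """Build an OData query that selects only records changed since the last run."""
    # OData v2 datetime literal, e.g. datetime'2024-01-01T00:00:00'
    filt = urllib.parse.quote(f"{ts_field} gt datetime'{since_iso}'")
    return f"{service_url}/{entity_set}?$filter={filt}&$format=json"


def fetch_delta(url: str, user: str, password: str) -> dict:
    """Fetch the delta payload over HTTPS with basic authentication."""
    request = urllib.request.Request(url)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    request.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(request) as response:
        return json.load(response)


# Usage (hypothetical service and field names):
# url = delta_query_url("https://sapgw.example.com/sap/opu/odata/sap/ZSALES_SRV",
#                       "SalesOrders", "LastChangeDateTime", "2024-01-01T00:00:00")
```

In a real pipeline, the `since_iso` watermark would be persisted (for example, in Amazon DynamoDB or an S3 object) and advanced after each successful extraction.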
SAP Upgrade Impact
For SAP customers running SAP ECC 6.0 or earlier (SAP Business Suite), a key concern is the upgrade impact on whatever SAP data extraction mechanism is established. This challenge may lead to a solution that avoids database-level extraction, because major changes to the database schema can be expected when upgrading to S/4HANA.
Taking into account the considerations above and the practical aspects of SAP systems, we have created a decision tree (below) to help you choose the method that is appropriate for extracting your SAP data.
An important practical consideration is SAP Gateway availability. SAP Gateway allows you to leverage the OData protocol to consume SAP data via RESTful APIs. OData is the Open Data Protocol, an OASIS standard that is ISO/IEC approved and runs over HTTPS. It supports secure connectivity over the internet as well as hybrid multi-cloud constructs, with the ability to scale with data volume. SAP Gateway provides you with a broad range of options for extracting SAP data without restricting you to legacy protocols such as RFC or IDoc.
- If you have SAP Gateway, the next consideration is the SAP ERP version you are currently running:
- If you are running the latest SAP S/4HANA release, you will have many prebuilt OData services that you can leverage for extraction; the latest S/4HANA releases ship with more than 2,000 of them. Most of these OData services are built with the Fiori user interface in mind, so for large data extractions you may still want to leverage the SAP BW extractors through ODP, because they include delta, monitoring, and troubleshooting mechanisms. SAP BW extractors also provide application context, thus reducing transformation work at the target system or data lake.
- If you are running on ECC 6.0 EHP7/8, you will have limited prebuilt OData services, but you can still leverage SAP BW Extractors through ODP for most of the extraction.
- If you do not have SAP Gateway, you are most likely running SAP ECC 6.0 EHP8 or prior. You may be concerned about the impact on your extraction mechanisms after you perform an SAP upgrade. To minimize this impact, we recommend using the standard SAP BW extractors through ODP, standard BAPIs, or standard IDocs.
- Custom BW extractors, BAPIs, IDocs, and database- and file-based extraction methods are viable; however, these may increase your total cost of ownership (TCO) because you will have to build, operate, and maintain the custom code yourself.
- You can still use RFCs/BAPIs and IDocs in S/4HANA; however, because these are legacy protocols built for LAN and WAN environments, your extraction tool choices and network connectivity options may be restricted. They can be challenging to run across the internet and may not perform optimally in a hybrid cloud environment. The recommendation is therefore to consider OData as a first choice, because it is an open protocol that is flexible to implement and supported in hybrid multi-cloud environments.
Figure 1. Guideline Decision Tree for Extracting SAP Data with AWS Services.
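When an OData service is chosen, large entity sets are typically read in pages rather than in a single request. A minimal sketch of generating paged extraction URLs with the standard `$top`/`$skip` query options (the service URL and entity set below are placeholders):

```python
from typing import Iterator


def paged_urls(service_url: str, entity_set: str,
               page_size: int = 1000, max_pages: int = 100) -> Iterator[str]:
    """Yield OData URLs that walk an entity set in fixed-size pages."""
    for page in range(max_pages):
        skip = page * page_size
        yield (f"{service_url}/{entity_set}"
               f"?$top={page_size}&$skip={skip}&$format=json")
```

In practice, you stop requesting pages once a response returns fewer rows than `page_size`, and you tune the page size to what your gateway and network can sustain.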
Architecture Design Pattern Characteristics
Below is a summary of the architecture design patterns and their characteristics, as tagged in the decision tree above. This will help you decide on an extraction method for your SAP data.
| Number | Architecture Pattern | Extraction Method | Delta Handling | Middleware Services | Pros and Cons |
|---|---|---|---|---|---|
| **S/4HANA or ECC 6.0 EHP7/8, OData, with SAP Gateway** | | | | | |
| A1 | S/4HANA or ECC 6.0 EHP7/8 with pre-built OData services | Pre-built standard OData services | Consider timestamp field | SAP Data Intelligence, SAP Data Services | |
| A2 | S/4HANA or ECC 6.0 EHP7/8 with data extractors (BW extractors) through OData | Standard BW extractors (ODP based) | Delta is handled within ODP | | |
| A3 | S/4HANA or ECC 6.0 EHP7/8 with custom OData services | Custom OData (ABAP CDS view) | Consider timestamp field | | |
| **ECC 6.0 EHP8 or prior, RFC, no SAP Gateway** | | | | | |
| A4 | ECC 6.0 EHP7/8 or earlier with data extractors (BW extractors) through RFC | Standard BW extractors (ODP based); custom BW extractors | Delta is handled within ODP; for custom extractors, to be built within the BW extractors | SAP Data Services | |
| A5 | ECC 6.0 EHP7/8 or earlier with BAPI through RFC | Standard BAPI; custom BAPI | Consider timestamp field | | |
| **ECC 6.0 EHP8 or prior, HTTP-XML, no SAP Gateway** | | | | | |
| A6 | Any version of ECC or S/4HANA with IDocs | Standard IDocs; custom IDocs | Delta is handled within IDocs | API Gateway/AWS Lambda | |
| **ECC 6.0 EHP8 or prior, JDBC, no SAP Gateway** | | | | | |
| A7 | Any version of ECC or S/4HANA with database access | Database | Consider timestamp field | AWS Glue/Lambda | |
| **ECC 6.0 EHP8 or prior, Files, no SAP Gateway** | | | | | |
| A8 | ECC 6.0 EHP7/8 or earlier with flat files | Flat files | Consider timestamp field | AWS Glue/Lambda | |
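For the push-based IDoc pattern (A6), SAP can POST IDoc XML over HTTPS to an Amazon API Gateway endpoint backed by an AWS Lambda function that lands the payload in Amazon S3. A hedged sketch of such a handler follows; the bucket name and endpoint wiring are assumptions, while `IDOCTYP` is the IDoc control-record field that identifies the IDoc type:

```python
import datetime
import xml.etree.ElementTree as ET


def idoc_type_from_xml(body: str) -> str:
    """Read the IDoc type (IDOCTYP) from the control record of an IDoc XML payload."""
    root = ET.fromstring(body)
    return root.findtext(".//IDOCTYP") or "UNKNOWN"


def lambda_handler(event, context):
    """API Gateway proxy handler: validate the IDoc XML and store it in S3."""
    import boto3  # lazy import keeps the sketch loadable without the AWS SDK

    body = event.get("body") or ""
    idoc_type = idoc_type_from_xml(body)  # raises on malformed XML
    stamp = datetime.datetime.now(datetime.timezone.utc).strftime("%Y%m%dT%H%M%S%f")
    key = f"sap/idocs/{idoc_type}/{stamp}.xml"
    boto3.client("s3").put_object(
        Bucket="my-sap-data-lake",  # assumption: replace with your bucket
        Key=key,
        Body=body.encode("utf-8"),
    )
    return {"statusCode": 200, "body": key}
```

Partitioning the S3 prefix by IDoc type, as above, is one way to keep downstream AWS Glue crawlers and Athena queries efficient.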
In this blog, we discussed architecture patterns for extracting SAP data to AWS. Each pattern is described along with its pros and cons based on key considerations such as delta handling, licensing, running costs, and upgrade impact. With the decision tree provided, you can assess and decide which pattern is suitable for your scenario.
Here are some further references that you may find useful. They outline more end-to-end scenarios that become possible once your SAP data has been extracted to AWS.
- Extract data from SAP ERP and BW with AppFlow.
- Building data lakes with SAP on AWS.
- SAP on AWS Beyond – Lab Repository.
- How to connect SAP solutions running on AWS with AWS accounts and services.
- Query SAP HANA using Athena Federated Query and join with data in your Amazon S3 data lake.
- Data Extraction from Data Lake and Amazon Redshift Using SAP Data Services.
- Data Federation Between SAP Data Warehouse Cloud and Amazon Redshift.
- Extend your SAP business processes using Amazon AppFlow and AWS native services.
You can find out more about SAP on AWS, Amazon AppFlow, AWS Glue, AWS Lambda, from the AWS product documentation.