This Guidance demonstrates how to use the cloud to build innovative solutions or applications, and experiment with new business models by combining data from SAP and non-SAP applications.
Configure an OData service for extraction in the SAP system (for example, for Sales Orders). Extraction is possible from ODP or OData data entities.
In Amazon AppFlow, create a flow using the OData connection created in step 2 to extract data from SAP and save it to an Amazon Simple Storage Service (Amazon S3) bucket.
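As a sketch of step 3, the extraction flow can also be created programmatically through the Amazon AppFlow CreateFlow API (for example, with boto3's `appflow` client). The helper below only builds the request payload; the connection name, OData object path, and bucket name are placeholders, and the exact connector properties should be verified against the AppFlow API reference.

```python
# Sketch of a CreateFlow request that extracts an SAP OData entity to Amazon S3.
# Names such as "sap-odata-connection" and "sap-raw-bucket" are placeholders.

def build_sap_extract_flow(connection_name: str, object_path: str, bucket: str) -> dict:
    """Build the keyword arguments for appflow.create_flow (SAP OData -> S3)."""
    return {
        "flowName": "sap-sales-order-extract",
        "triggerConfig": {"triggerType": "OnDemand"},
        "sourceFlowConfig": {
            "connectorType": "SAPOData",
            "connectorProfileName": connection_name,
            "sourceConnectorProperties": {
                "SAPOData": {"objectPath": object_path}
            },
        },
        "destinationFlowConfigList": [{
            "connectorType": "S3",
            "destinationConnectorProperties": {
                "S3": {
                    "bucketName": bucket,
                    "s3OutputFormatConfig": {"fileType": "PARQUET"},
                }
            },
        }],
        # Map all source fields through unchanged.
        "tasks": [{
            "sourceFields": [],
            "taskType": "Map_all",
            "taskProperties": {},
        }],
    }

flow_args = build_sap_extract_flow(
    "sap-odata-connection",                                  # connector profile (placeholder)
    "/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder",   # example OData object path
    "sap-raw-bucket",                                        # target S3 bucket (placeholder)
)
# boto3.client("appflow").create_flow(**flow_args)  # actual call; requires AWS credentials
```

Keeping the payload in a builder function like this makes it easy to version the flow definition in your deployment pipeline alongside the rest of the infrastructure code.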
Use AWS Glue to cleanse and transform the data fields, integrate with other data, then save the transformed data into another Amazon S3 bucket.
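To illustrate the kind of field cleansing the AWS Glue step performs, here is a minimal sketch over plain Python dicts; in a real Glue job this logic would run over a DynamicFrame or DataFrame, and the field names (`SalesOrder`, `NetAmount`, `Currency`) are hypothetical.

```python
from typing import Optional

# Illustrative cleansing pass: trim string fields, normalize the amount to a
# numeric type, and drop records that are missing the primary key.

def cleanse_record(record: dict) -> Optional[dict]:
    """Return a cleansed copy of the record, or None if it should be dropped."""
    if not record.get("SalesOrder"):
        return None  # discard records without a primary key
    cleaned = {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}
    if "NetAmount" in cleaned:
        cleaned["NetAmount"] = float(cleaned["NetAmount"])  # numeric for analytics
    return cleaned

raw = [
    {"SalesOrder": " 1001 ", "NetAmount": "250.00", "Currency": "EUR "},
    {"SalesOrder": "", "NetAmount": "0"},  # dropped: missing key
]
transformed = [r for r in (cleanse_record(x) for x in raw) if r is not None]
```

The transformed records would then be written to the curated Amazon S3 bucket for enrichment and reporting.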
Enrich the data with AWS services such as artificial intelligence/machine learning (AI/ML) models, Internet of Things (IoT), analytics, and data lake capabilities.
Create dashboards to visualize the business data as required, including ML-powered visualizations (for example, forecasting or anomaly detection).
In Amazon AppFlow, create a flow using the OData connection created in step 2 to write data from Amazon S3 back to SAP using OData.
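The write-back direction (Amazon S3 to SAP OData) can be sketched the same way. This is a hedged example of a CreateFlow payload with S3 as the source and the SAP OData connector as the destination; the bucket, prefix, object path, and write operation type are placeholders to verify against the AppFlow API reference for your scenario.

```python
# Sketch of a CreateFlow request for the write-back direction (S3 -> SAP OData).

def build_sap_writeback_flow(connection_name: str, bucket: str, prefix: str,
                             object_path: str) -> dict:
    """Build the keyword arguments for appflow.create_flow (S3 -> SAP OData)."""
    return {
        "flowName": "sap-sales-order-writeback",
        "triggerConfig": {"triggerType": "OnDemand"},
        "sourceFlowConfig": {
            "connectorType": "S3",
            "sourceConnectorProperties": {
                "S3": {"bucketName": bucket, "bucketPrefix": prefix}
            },
        },
        "destinationFlowConfigList": [{
            "connectorType": "SAPOData",
            "connectorProfileName": connection_name,
            "destinationConnectorProperties": {
                "SAPOData": {
                    "objectPath": object_path,
                    "writeOperationType": "UPDATE",  # or INSERT, depending on the scenario
                }
            },
        }],
        "tasks": [{"sourceFields": [], "taskType": "Map_all", "taskProperties": {}}],
    }

writeback_args = build_sap_writeback_flow(
    "sap-odata-connection", "sap-curated-bucket", "writeback/",
    "/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder",
)
# boto3.client("appflow").create_flow(**writeback_args)  # requires AWS credentials
```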
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
The solution can be fully deployed with code. You can incorporate this automation into your own development pipeline to enable iteration and consistent deployments across your SAP landscape. Observability is derived from the managed services used for extraction and transformation. Logs and dashboards are available from Amazon CloudWatch.
The serverless components in the architecture are protected with AWS Identity and Access Management (IAM)-based authentication for secure validation of user identity. The managed services have access only to the data that has been specified. Access to the SAP workload is through Amazon AppFlow. Data is encrypted in transit and at rest. For audit logging, AWS CloudTrail can be used to log the API calls made to the various services used for data export and import. Amazon S3 cross-Region data replication can also be applied where needed.
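As an example of scoping access to only the data that has been specified, the sketch below builds a least-privilege S3 bucket policy allowing Amazon AppFlow to write into a single bucket. The bucket name is a placeholder, and the action list is a hedged subset that should be checked against the AppFlow documentation for destination bucket policies.

```python
import json

# Hedged sketch of a least-privilege bucket policy for an AppFlow destination
# bucket. "sap-raw-bucket" is a placeholder name.

def build_appflow_bucket_policy(bucket: str) -> dict:
    """Build a bucket policy granting the AppFlow service principal write access."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowAppFlowDestination",
            "Effect": "Allow",
            "Principal": {"Service": "appflow.amazonaws.com"},
            "Action": ["s3:PutObject", "s3:GetBucketAcl", "s3:PutObjectAcl"],
            "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
        }],
    }

policy_json = json.dumps(build_appflow_bucket_policy("sap-raw-bucket"))
# boto3.client("s3").put_bucket_policy(Bucket="sap-raw-bucket", Policy=policy_json)
```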
To strengthen your security posture, it is recommended to run the Amazon AppFlow connection using AWS PrivateLink. This setup spans Elastic Load Balancing (ELB) using either an Application Load Balancer (ALB) or a Network Load Balancer (NLB), with SSL termination through AWS Certificate Manager (ACM).
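A minimal sketch of the PrivateLink side of this setup follows: the arguments for creating an interface VPC endpoint so that traffic to the SAP endpoint stays on the AWS network. The VPC, subnet, and service names are placeholders; for a privately exposed SAP system, the service name would be the one published by your own VPC endpoint service in front of the load balancer.

```python
# Sketch of the arguments for ec2.create_vpc_endpoint (interface endpoint for
# AWS PrivateLink). All identifiers below are placeholders.

def build_vpc_endpoint_request(vpc_id: str, subnet_ids: list, service_name: str) -> dict:
    """Build the keyword arguments for creating an interface VPC endpoint."""
    return {
        "VpcEndpointType": "Interface",
        "VpcId": vpc_id,
        "ServiceName": service_name,
        "SubnetIds": subnet_ids,  # spread across at least two Availability Zones
        "PrivateDnsEnabled": False,
    }

endpoint_args = build_vpc_endpoint_request(
    "vpc-0123456789abcdef0",
    ["subnet-aaa111", "subnet-bbb222"],  # two AZs for ELB redundancy
    "com.amazonaws.vpce.eu-west-1.vpce-svc-0123456789abcdef0",  # placeholder service
)
# boto3.client("ec2").create_vpc_endpoint(**endpoint_args)  # requires AWS credentials
```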
All the serverless components are highly available. All non-SAP components scale automatically. Amazon AppFlow can move large volumes of data without breaking it down into multiple batches, which increases reliability. Amazon S3 offers industry-leading scalability, data availability, security, and performance for SAP data export and import. PrivateLink is a Regional service. As part of the Amazon AppFlow setup when using PrivateLink, you will set up subnets in at least 50 percent of the Availability Zones in the Region (a minimum of two Availability Zones per Region), which gives you an additional level of redundancy for ELB.
By using serverless technologies, you provision only the exact resources you use. Using Amazon S3 as the target or source for data export or import optimizes the storage of the architecture. Use PrivateLink for improved performance and agility. Configure multiple flows in Amazon AppFlow for different groups of business data. Follow performance efficiency practices for ELB and PrivateLink.
By utilizing serverless technologies, you pay only for the resources you use. To further optimize cost, extract only the business data groups that you need, and minimize the number of flows being run based on the granularity of your reporting needs. Housekeeping of old or unwanted data can be set up using Amazon S3 data tiering or by deleting the data. Follow cost optimization practices for ELB and PrivateLink.
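The housekeeping described above can be expressed as an S3 lifecycle configuration. This is a hedged sketch: the prefix, storage classes, and day counts are placeholders to adjust to your retention and reporting needs.

```python
# Sketch of an S3 lifecycle configuration: tier older SAP extracts to cheaper
# storage classes, then expire them. Prefix and day counts are placeholders.

def build_lifecycle_config(prefix: str) -> dict:
    """Build the LifecycleConfiguration payload for put_bucket_lifecycle_configuration."""
    return {
        "Rules": [{
            "ID": "tier-then-expire-sap-extracts",
            "Status": "Enabled",
            "Filter": {"Prefix": prefix},
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access after 30 days
                {"Days": 90, "StorageClass": "GLACIER"},      # archive after 90 days
            ],
            "Expiration": {"Days": 365},                      # delete after one year
        }],
    }

lifecycle = build_lifecycle_config("raw/sales-order/")
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="sap-raw-bucket", LifecycleConfiguration=lifecycle)
```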
By utilizing managed services and dynamic scaling, you minimize the environmental impact of the backend services. As new options become available in Amazon AppFlow, adopt them to further optimize the volume and frequency of extraction. Reducing the quantity and frequency of extraction improves sustainability, helps reduce cost, and improves performance.
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.