This Guidance helps customers obtain sustainability metrics using the SAP Sustainability Control Tower® software-as-a-service (SaaS) offering. SAP Sustainability Control Tower® is an environmental, social, and governance (ESG) accounting tool powered by the SAP Business Technology Platform (SAP BTP), SAP® Data Warehouse Cloud, and SAP Analytics Cloud, all of which are hosted on AWS. With this Guidance, customers can track their carbon footprint—one of the key metrics for ESG disclosures—using pre-built data models. These data models support established reporting frameworks such as the Global Reporting Initiative, the World Economic Forum, and the Task Force on Climate-Related Financial Disclosures (TCFD), and they can be extended with structured and unstructured data from SAP and third-party sources.
Architecture Diagram
Step 1
Import ESG data and integrate it with SAP data sources to enhance its value.
Step 2
Send ESG data from non-SAP sources, such as SaaS apps, file shares, and Internet of Things (IoT) devices, to the AWS data lake. Depending on the data source, multiple services can ingest the data, including Amazon API Gateway, Amazon Kinesis Data Streams, AWS DataSync, Amazon AppFlow, AWS IoT Core, or AWS IoT Greengrass.
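To make the ingestion path concrete, here is a minimal sketch, assuming a Kinesis data stream named esg-ingest-stream (a hypothetical name); Kinesis Data Streams is just one of the ingestion options listed above.

```python
import json

import boto3

# Push a non-SAP ESG reading (for example, an IoT energy-meter sample)
# into the data lake's ingestion layer via Kinesis Data Streams.
kinesis = boto3.client("kinesis", region_name="us-east-1")

reading = {
    "site_id": "plant-001",
    "metric": "electricity_kwh",
    "value": 1523.4,
    "timestamp": "2023-06-01T08:00:00Z",
}

kinesis.put_record(
    StreamName="esg-ingest-stream",            # hypothetical stream name
    Data=json.dumps(reading).encode("utf-8"),
    PartitionKey=reading["site_id"],           # spreads records across shards
)
```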
Step 3
Process and manage non-SAP data in your data lake using AWS Glue, AWS Step Functions, and AWS Lake Formation, with AWS Glue crawlers populating the AWS Glue Data Catalog.
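As a sketch of how these services fit together (the crawler and job names are hypothetical), the two boto3 calls below register newly ingested files in the AWS Glue Data Catalog and start a curation job:

```python
import boto3

glue = boto3.client("glue")

# Crawl the raw zone of the data lake so newly ingested ESG files are
# registered in the AWS Glue Data Catalog.
glue.start_crawler(Name="esg-raw-zone-crawler")

# Start the ETL job that cleans and converts the raw data. A real pipeline
# would wait for the crawler to finish first, which is the kind of
# sequencing Step Functions handles in this Guidance.
response = glue.start_job_run(JobName="esg-curation-job")
print("Started AWS Glue job run:", response["JobRunId"])
```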
Step 4
Use SAP® Data Warehouse Cloud’s native data integration capabilities and Amazon Athena federated query to virtualize data stored in Amazon Simple Storage Service (Amazon S3) or Amazon Redshift.
An alternative to federating AWS data through Athena is to physically replicate the required non-SAP data from Amazon S3, or directly from the source, into SAP® Data Warehouse Cloud using the native replication capabilities of SAP® Data Warehouse Cloud or SAP HANA Cloud.
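As a sketch of the federated-query path, the boto3 call below runs the kind of query SAP® Data Warehouse Cloud can push through Athena, joining a data lake table with a table exposed by a Redshift data source connector; the catalog, database, table, and bucket names are all hypothetical placeholders.

```python
import boto3

athena = boto3.client("athena")

# Join data-lake records (registered in the Glue Data Catalog) with a
# Redshift table surfaced through an Athena federated data source connector.
query = """
SELECT lake.site_id, lake.electricity_kwh, rs.emission_factor
FROM "AwsDataCatalog"."esg_lake"."energy_usage" AS lake
JOIN "redshift_connector"."public"."emission_factors" AS rs
    ON lake.region = rs.region
"""

response = athena.start_query_execution(
    QueryString=query,
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print("Query execution ID:", response["QueryExecutionId"])
```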
Step 5
Use SAP Analytics Cloud to report regulatory-compliant ESG metrics, gain deeper insights on ESG performance, set targets, and monitor progress.
Step 6
Set up AWS services, such as Amazon CloudWatch and AWS Identity and Access Management (IAM), to monitor data activity and manage who has access to data.
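For example, one way to monitor data activity is to record object-level reads and writes on the data lake bucket with AWS CloudTrail; the trail and bucket names below are hypothetical placeholders.

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Add S3 data-event logging for the data lake bucket to an existing trail
# so every object-level read and write is recorded.
cloudtrail.put_event_selectors(
    TrailName="esg-data-lake-trail",
    EventSelectors=[
        {
            "ReadWriteType": "All",
            "IncludeManagementEvents": True,
            "DataResources": [
                {
                    "Type": "AWS::S3::Object",
                    # The trailing slash scopes logging to all objects
                    # in the bucket.
                    "Values": ["arn:aws:s3:::esg-data-lake/"],
                }
            ],
        }
    ],
)
```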
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
The Guidance uses standard service metrics to monitor the health of individual pipeline components, such as concurrency limits. Step Functions provides visibility into the processing pipeline status. Amazon CloudWatch provides centralized logging with metrics and alarms to raise alerts for operational anomalies.
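As one concrete sketch of this alerting (the state machine and SNS topic ARNs are hypothetical), the alarm below fires whenever an execution of the processing state machine fails:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm on any failed execution of the processing pipeline's state machine,
# notifying an operations SNS topic.
cloudwatch.put_metric_alarm(
    AlarmName="esg-pipeline-failures",
    Namespace="AWS/States",
    MetricName="ExecutionsFailed",
    Dimensions=[
        {
            "Name": "StateMachineArn",
            "Value": "arn:aws:states:us-east-1:111122223333:stateMachine:esg-pipeline",
        }
    ],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=1,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:111122223333:esg-ops-alerts"],
)
```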
Security
Lake Formation provides a single, central location to manage fine-grained access control to data in your data lake. AWS Identity and Access Management (IAM) grants operators permissions to use resources through least-privilege access and role-based access. For example, you can use IAM policies to grant permissions to run Athena queries and to select the IAM roles that SAP® Data Warehouse Cloud uses. We recommend encrypting data in transit and at rest using AWS Key Management Service (AWS KMS) and customer-managed AWS KMS keys. You should routinely rotate these keys.
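The snippets below sketch two of these controls; the key ID, role ARN, and table names are hypothetical placeholders.

```python
import boto3

# Enable automatic rotation for a customer-managed AWS KMS key.
boto3.client("kms").enable_key_rotation(
    KeyId="1234abcd-12ab-34cd-56ef-1234567890ab"
)

# Grant the IAM role used by SAP Data Warehouse Cloud read-only access to a
# single table through Lake Formation fine-grained access control.
boto3.client("lakeformation").grant_permissions(
    Principal={
        "DataLakePrincipalIdentifier": "arn:aws:iam::111122223333:role/sap-dwc-athena-role"
    },
    Resource={"Table": {"DatabaseName": "esg_lake", "Name": "energy_usage"}},
    Permissions=["SELECT"],
)
```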
Reliability
The services in this Guidance have initial service limits that accommodate a large majority of customer workloads. If necessary, you can request that service quotas be expanded. Examples include concurrent executions of AWS Glue jobs or concurrent active Data Manipulation Language (DML) queries in Athena. Additionally, this Guidance uses AWS services such as Amazon S3 and Amazon Redshift for data storage, both of which provide built-in functionality for data backup and recovery.
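If you want to check where a workload stands against those limits, the Service Quotas API exposes them programmatically; this sketch simply lists the current AWS Glue quotas for the account.

```python
import boto3

quotas = boto3.client("service-quotas")

# List the account's current AWS Glue quotas (for example, the limit on
# concurrent job runs) before scaling up the pipeline.
for quota in quotas.list_service_quotas(ServiceCode="glue")["Quotas"]:
    print(f'{quota["QuotaName"]}: {quota["Value"]}')
```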
Performance Efficiency
Rather than using query federation through Athena, you can directly ingest data from your data lake into SAP® Data Warehouse Cloud to optimize performance. This Guidance uses serverless managed services that automatically scale up and down in response to changing demand, reducing operational overhead. Storing data in Amazon S3 allows you to bring various tools and services to your data that are tailored to your needs. For example, you can query data directly in Amazon S3 using Athena, or you can integrate Amazon QuickSight for business intelligence (BI) dashboards.
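A common lever when querying the lake directly is to store the data as partitioned, columnar Parquet, which reduces the data each query scans. The CREATE TABLE AS SELECT (CTAS) sketch below illustrates this; the table, column, and bucket names are hypothetical.

```python
import boto3

athena = boto3.client("athena")

# Rewrite raw usage data as Parquet, partitioned by reporting year, so
# downstream queries scan only the partitions and columns they need.
ctas = """
CREATE TABLE esg_lake.energy_usage_parquet
WITH (
    format = 'PARQUET',
    external_location = 's3://esg-data-lake/curated/energy_usage/',
    partitioned_by = ARRAY['report_year']
) AS
SELECT site_id, electricity_kwh, report_year
FROM esg_lake.energy_usage_raw
"""

athena.start_query_execution(
    QueryString=ctas,
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
```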
Cost Optimization
This Guidance relies on serverless AWS services such as AWS Glue, Step Functions, and Athena. These services are fully managed and scale automatically according to workload demand so that you pay only for the resources you use.
Sustainability
You can limit your data footprint using Athena for query federation from SAP® Data Warehouse Cloud, which reduces the need for additional copies of data. By using SAP Sustainability Control Tower®, you can collect sustainability insights from your data lake without having to provision and manage additional resources within AWS.
Implementation Resources
A detailed guide is provided to experiment with and use within your AWS account. It walks through each stage of the Guidance, including deployment, usage, and cleanup, so you can prepare it for deployment in your environment.
The sample code is a starting point: it is industry validated and prescriptive but not definitive, offering a look under the hood to help you begin.
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.