This Guidance provides a technical foundation to visualize and monitor key performance indicators (KPIs) for reporting on climate-related physical risks to your operations. In particular, it demonstrates how you can enrich physical asset data using large-scale, high-quality climate projection datasets made available through the Amazon Sustainability Data Initiative (ASDI). Physical climate risk assessment provides critical insights into the expected impacts of climate change on businesses, organizations, and communities. Understanding the potential risks and consequences of climate change empowers decision-makers to create and execute effective strategies for adaptation and resilience.
Please note: see the Disclaimer section below.
Architecture Diagram
Step 1
Analysts and operations managers collect geo locations, such as latitudes and longitudes, of sites of interest, along with site metadata such as site type, square footage, and occupancy rate.
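A site record from this step might look like the following sketch. The field names are illustrative assumptions, not a schema prescribed by the Guidance; the validation check simply confirms the geo location is usable before ingestion.

```python
def validate_site(site: dict) -> bool:
    """Check that a site record carries a usable geo location."""
    lat, lon = site.get("latitude"), site.get("longitude")
    return (
        isinstance(lat, (int, float)) and -90.0 <= lat <= 90.0
        and isinstance(lon, (int, float)) and -180.0 <= lon <= 180.0
    )

# Example site record with the metadata described in Step 1.
site = {
    "site_id": "plant-001",
    "site_type": "manufacturing",
    "latitude": 47.6062,
    "longitude": -122.3321,
    "square_footage": 120000,
    "occupancy_rate": 0.85,
}
```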
Step 2
Sustainability subject matter experts (SMEs) generate a config file, specifying climate models, scenarios, and metrics for physical climate risk assessment.
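A config file from this step could be sketched as below. The model and scenario names follow CMIP6 conventions, but the exact schema, metric names, and thresholds are assumptions for illustration.

```python
import json

# Illustrative assessment configuration produced by sustainability SMEs.
config = {
    "models": ["ACCESS-CM2", "MIROC6"],            # climate models
    "scenarios": ["ssp245", "ssp585"],             # emissions scenarios
    "metrics": [
        {"name": "cooling_degree_days", "variable": "tasmax", "threshold_c": 18.0},
        {"name": "days_over_35c", "variable": "tasmax", "threshold_c": 35.0},
    ],
    "years": {"start": 2030, "end": 2060},
}

config_json = json.dumps(config, indent=2)
```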
Step 3
Ingest data into the AWS Cloud using Amazon API Gateway or AWS Tools and SDKs.
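Ingestion through the AWS SDKs could look like this minimal sketch using boto3. The bucket prefix layout and helper names are assumptions of this example, not part of the Guidance; boto3 is imported lazily so the key helper stays usable without the AWS SDK installed.

```python
from datetime import date


def site_object_key(site_id: str, as_of: date) -> str:
    """Build a deterministic S3 key for a site's raw metadata file.
    The prefix layout here is an assumption, not mandated by the Guidance."""
    return f"raw/sites/{as_of.isoformat()}/{site_id}.json"


def upload_site(bucket: str, site_id: str, payload: bytes, as_of: date) -> None:
    """Upload one site record to the raw zone of the data lake (sketch;
    requires AWS credentials when actually run)."""
    import boto3  # lazy import so the module loads without the AWS SDK
    s3 = boto3.client("s3")
    s3.put_object(Bucket=bucket, Key=site_object_key(site_id, as_of), Body=payload)
```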
Step 4
Store static files in a data lake using Amazon Simple Storage Service (Amazon S3). Use Amazon DynamoDB or Amazon Relational Database Service (Amazon RDS) to store site data and geo locations, depending on data consumption patterns. Use DynamoDB to track the state as data moves through the pipeline.
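One way to shape the DynamoDB pipeline-state item described above is sketched below. The partition/sort key design and attribute names are illustrative assumptions.

```python
def make_state_item(site_id: str, stage: str, status: str) -> dict:
    """DynamoDB item tracking a site's position in the data pipeline.
    Key design (pk/sk) and attribute names are assumptions of this sketch."""
    return {
        "pk": {"S": f"SITE#{site_id}"},
        "sk": {"S": f"STAGE#{stage}"},
        "status": {"S": status},
    }


def record_state(table: str, site_id: str, stage: str, status: str) -> None:
    """Persist the state item (sketch; requires AWS credentials when run)."""
    import boto3  # lazy import so the module loads without the AWS SDK
    boto3.client("dynamodb").put_item(
        TableName=table, Item=make_state_item(site_id, stage, status)
    )
```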
Step 5
AWS hosts a broad variety of high-quality climate datasets through the Amazon Sustainability Data Initiative (ASDI), reducing the barriers to accessing and storing large-scale sustainability datasets. These datasets are optimized for the cloud and accessible through public S3 buckets.
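Because ASDI buckets are public, they can be read without AWS credentials using unsigned requests, as in this sketch. The key-pattern helper is hypothetical: verify the actual bucket layout against the dataset's documentation on the Registry of Open Data before relying on it.

```python
def open_public_s3_client():
    """Anonymous (unsigned) S3 client for public ASDI buckets, such as
    the NASA NEX-GDDP-CMIP6 archive. No AWS credentials are required."""
    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config
    return boto3.client("s3", config=Config(signature_version=UNSIGNED))


def gddp_key(model: str, scenario: str, variable: str, year: int) -> str:
    """Hypothetical key pattern for a NEX-GDDP-CMIP6 NetCDF file;
    check the exact layout against the dataset's documentation."""
    return (
        f"NEX-GDDP-CMIP6/{model}/{scenario}/r1i1p1f1/{variable}/"
        f"{variable}_day_{model}_{scenario}_r1i1p1f1_{year}.nc"
    )
```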
Step 6
Enrich site data with downscaled climate models from ASDI. Perform geospatial joins to extract metric statistics from raster climate data files using AWS Lambda. Export processed data to Amazon S3, and use AWS Glue Data Catalog to store the metadata of your datasets. Use AWS Step Functions to orchestrate data processing pipelines.
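The core of the geospatial join is mapping each site's coordinates onto the climate grid. A minimal sketch for a regular 0.25-degree grid (the NEX-GDDP-CMIP6 resolution) follows; the default origin values are assumptions, and in practice you would read the coordinate arrays from the raster file itself.

```python
def grid_index(
    lat: float,
    lon: float,
    lat0: float = -59.875,  # assumed latitude of the first grid-cell center
    lon0: float = 0.125,    # assumed longitude of the first grid-cell center
    res: float = 0.25,      # grid resolution in degrees
) -> tuple:
    """Map a site's coordinates to the nearest (row, col) cell of a
    regular lat/lon grid. Origin defaults are illustrative; read them
    from the climate data file in practice."""
    lon = lon % 360.0  # many climate datasets use 0-360 longitudes
    row = round((lat - lat0) / res)
    col = round((lon - lon0) / res)
    return row, col
```

Given the indices, a Lambda function would read the corresponding cell (or a small window around it) from each raster and compute the metric statistics defined in the config.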
Step 7
Build a dashboard using Amazon QuickSight, or a custom web application using AWS Amplify, to visualize physical climate risks related to your operations. Analysts can also query Amazon Athena directly to gain insights from custom metrics.
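Querying the enriched output through Athena could be sketched as below. The database, table, and column names are assumptions for this example, and the runner function requires AWS credentials when actually invoked.

```python
def heat_risk_sql(database: str, table: str, threshold_c: float) -> str:
    """Illustrative Athena SQL over the enriched output table;
    table and column names are assumptions of this sketch."""
    return (
        f"SELECT site_id, scenario, AVG(days_over_threshold) AS avg_hot_days "
        f'FROM "{database}"."{table}" '
        f"WHERE threshold_c = {threshold_c} "
        f"GROUP BY site_id, scenario"
    )


def run_query(sql: str, output_s3: str) -> str:
    """Start an Athena query and return its execution ID (sketch only)."""
    import boto3  # lazy import so the module loads without the AWS SDK
    athena = boto3.client("athena")
    resp = athena.start_query_execution(
        QueryString=sql,
        ResultConfiguration={"OutputLocation": output_s3},
    )
    return resp["QueryExecutionId"]
```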
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
You can use Athena to query custom metrics for additional insights, which can then be implemented in a QuickSight dashboard for reporting or a custom widget in a web application. These insights can also help you infer which additional climate scenarios and metrics to include as part of the input.
Security
Resources are protected by AWS Identity and Access Management (IAM) policies and roles, following the least-privilege access principle. Data ingested to the AWS Cloud is encrypted and transferred over HTTPS. AWS Key Management Service (AWS KMS) encrypts and protects data-at-rest.
Reliability
This Guidance follows an event-driven architecture with loosely coupled dependencies, making it easy to isolate behaviors and increase resilience and agility. It uses managed serverless services such as Step Functions and Lambda to orchestrate the loosely coupled dependencies.
Performance Efficiency
Services selected for this architecture are purpose-built. For example, Step Functions Distributed Map orchestrates large-scale parallel workloads; in this Guidance, it orchestrates data enrichment using large-scale raster datasets from ASDI. Additionally, QuickSight is a purpose-built business intelligence service that lets you chart geographic locations directly.
Cost Optimization
Step Functions and Lambda only run when invoked by an event and automatically scale based on workload demand. You can also use the Amazon S3 Intelligent-Tiering storage class to automatically move data to the most cost-effective access tier in Amazon S3.
Sustainability
AWS hosts a broad variety of large-scale, high-quality sustainability datasets through public S3 buckets. This Guidance uses cloud-optimized open data directly from these S3 buckets to enrich your location data. This means you don't need to store large datasets in your AWS environment, reducing your maintenance overhead.
Implementation Resources
A detailed guide is provided to experiment with and use within your AWS account. It walks through each stage of the Guidance, including deployment, usage, and cleanup, so you can prepare it for use in your own environment.
The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Related Content
Estimating physical climate heat risk with NASA Global Daily Downscaled Projections on ASDI
This post shows an example architecture to query the downscaled climate projection data for locations, calculate cooling degree day (CDD) metrics, and visualize the output with Amazon QuickSight.
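The cooling degree day (CDD) metric mentioned above can be sketched in a few lines. CDD sums the daily excess of mean temperature over a base temperature; 18 C is a commonly used base, but the appropriate value depends on your region and use case.

```python
def cooling_degree_days(daily_mean_c: list, base_c: float = 18.0) -> float:
    """Sum of daily-mean temperature excess above a base temperature,
    in degree-days. Days at or below the base contribute zero."""
    return sum(max(t - base_c, 0.0) for t in daily_mean_c)
```

For example, daily means of 20, 17, and 25 C against an 18 C base contribute 2 + 0 + 7 = 9 degree-days.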
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.