Guidance for Monitoring and Optimizing Energy Usage on AWS

Overview

This Guidance demonstrates how to monitor and optimize the energy use of industrial and building equipment on customer premises using AI/ML on AWS. From data ingestion and processing to model training and inference, customers can use this Guidance to optimize their energy consumption while driving down energy costs.

How it works

These technical details include an architecture diagram that illustrates how to use this solution effectively. The diagram shows the key components and their interactions, providing a step-by-step overview of the architecture's structure and functionality.

Well-Architected Pillars

The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.

This Guidance ingests real-time and batch telemetry data from IoT sensors and trains a machine learning model that delivers recommendations for reducing energy usage. The trained model is stored as a model artifact in Amazon S3, which triggers AWS CodePipeline to request human approval before deployment. This allows end users to easily question and validate the model's recommendations.
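As a sketch of how this approval gate could be wired up, the following Lambda handler (the pipeline name and event wiring are hypothetical, not part of this Guidance's sample code) starts a CodePipeline execution when a new model artifact lands in Amazon S3; a manual approval action in that pipeline then holds the deployment until a reviewer signs off.

```python
# Hypothetical Lambda handler: invoked by an S3 "object created" event on the
# model artifact prefix, it starts the deployment pipeline. The pipeline itself
# contains a manual approval action before the deploy stage.
import boto3

codepipeline = boto3.client("codepipeline")

PIPELINE_NAME = "energy-model-deploy"  # placeholder pipeline name


def handler(event, context):
    # The S3 event carries the bucket and key of the newly stored model artifact.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    print(f"New model artifact: s3://{bucket}/{key}")

    # Start the pipeline; its manual approval action pauses execution until a
    # reviewer validates the model's recommendations.
    response = codepipeline.start_pipeline_execution(name=PIPELINE_NAME)
    return {"pipelineExecutionId": response["pipelineExecutionId"]}
```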

Read the Operational Excellence whitepaper 

This Guidance encourages the use of role-based access with AWS Identity and Access Management (IAM) so that only authorized users have access to the content. All roles are defined with least-privilege access, and all communication between services stays within the customer account.
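As an illustration of least-privilege access (the bucket names, prefixes, and policy name below are placeholders), a data-processing role can be limited to reading raw telemetry and writing model artifacts, and nothing more:

```python
# Illustrative least-privilege policy for a data-processing role.
import json

import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadRawTelemetry",
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-telemetry-bucket/raw/*",
        },
        {
            "Sid": "WriteModelArtifacts",
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": "arn:aws:s3:::example-model-bucket/artifacts/*",
        },
    ],
}

iam.create_policy(
    PolicyName="energy-guidance-processing-least-privilege",
    PolicyDocument=json.dumps(policy_document),
)
```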

All data is encrypted both in transit and at rest using AWS Key Management Service (AWS KMS). The AWS Glue Data Catalog is also encrypted.
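For example, encryption at rest for the AWS Glue Data Catalog can be enabled with a KMS key roughly as follows (the key alias is a placeholder):

```python
# Enable SSE-KMS encryption for the AWS Glue Data Catalog so that table and
# connection metadata is encrypted at rest with a customer-managed KMS key.
import boto3

glue = boto3.client("glue")

glue.put_data_catalog_encryption_settings(
    DataCatalogEncryptionSettings={
        "EncryptionAtRest": {
            "CatalogEncryptionMode": "SSE-KMS",
            "SseAwsKmsKeyId": "alias/energy-guidance-key",  # placeholder key alias
        },
        "ConnectionPasswordEncryption": {
            "ReturnConnectionPasswordEncrypted": True,
            "AwsKmsKeyId": "alias/energy-guidance-key",
        },
    }
)
```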

Read the Security whitepaper 

AWS Glue, Amazon S3, and Amazon Neptune are all serverless and scale data access performance as data volumes increase. Neptune also adjusts capacity to provide just the right amount of database resources the application needs, removing the need to set up and manage servers or data warehouses.
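As a sketch, a Neptune serverless cluster can be given minimum and maximum capacity bounds so the database scales itself with demand (the cluster identifier and capacity values are illustrative, and a serverless-compatible engine version is assumed):

```python
# Create a Neptune serverless cluster that scales between 1 and 16 Neptune
# Capacity Units (NCUs) based on load, with no servers to manage.
import boto3

neptune = boto3.client("neptune")

neptune.create_db_cluster(
    DBClusterIdentifier="energy-guidance-graph",  # placeholder cluster name
    Engine="neptune",
    ServerlessV2ScalingConfiguration={
        "MinCapacity": 1.0,   # capacity floor while the graph is mostly idle
        "MaxCapacity": 16.0,  # upper bound as query volume grows
    },
)
```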

Data is stored in a data lake built on an Amazon S3 bucket. Amazon S3 objects are stored across a minimum of three Availability Zones, providing 99.999999999% durability of objects, so the Guidance is inherently resilient to failures.

Read the Reliability whitepaper 

By using serverless technologies, you consume only the resources you actually use. AWS Glue and AWS Lambda run only when needed. Additionally, Neptune is a fully managed serverless graph database that scales with demand, so just the right amount of resources is used to complete the job.
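For instance, an AWS Glue job can be started on demand, from a schedule or an event, so compute is consumed only while a telemetry batch is actually being processed (the job name and arguments below are placeholders):

```python
# Start a Glue ETL job run on demand; the job bills only for the time it runs.
import boto3

glue = boto3.client("glue")

run = glue.start_job_run(
    JobName="energy-telemetry-etl",             # placeholder Glue job name
    Arguments={"--input_prefix": "raw/2024/"},  # placeholder job argument
)
print("Started Glue job run:", run["JobRunId"])
```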

Read the Performance Efficiency whitepaper 

This Guidance uses serverless components such as AWS Glue, Amazon S3, and Neptune. These services automatically scale up and down to meet demand, so you pay only for what you use.

Read the Cost Optimization whitepaper 

All resources in this Guidance are serverless and scale with use, which reduces the number of resources that are idle.

Read the Sustainability whitepaper 

Implementation resources

The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Open sample code on GitHub

Disclaimer

The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.