This Guidance demonstrates how to monitor and optimize the energy use of industrial and building equipment on customer premises using AI/ML on AWS. From data ingestion and processing to model training and inference, customers can use this Guidance to optimize their energy consumption while driving down energy costs.
Operations staff and subject matter experts (SMEs) install the required sensors (such as temperature sensors) on the equipment of interest. Sensor and asset metadata, along with other supporting data (such as supplier information), should also be collected as a primary data source for energy use optimization.
AWS IoT SiteWise can act as a real-time data store for the sensor data. Other supporting information can be stored in a data lake (like Amazon S3) with scheduled extract, transform, and load (ETL) jobs built using AWS Glue. Asset and facility metadata can be stored as graph relationships in Amazon Neptune.
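As an illustrative sketch, sensor readings can be pushed into AWS IoT SiteWise with the `BatchPutAssetPropertyValue` API. The asset and property IDs below are hypothetical; the helper only builds the request entry, and the actual boto3 call is shown in comments:

```python
import time

def build_sitewise_entry(entry_id, asset_id, property_id, value):
    """Build one BatchPutAssetPropertyValue entry for a numeric sensor reading."""
    now = time.time()
    return {
        "entryId": entry_id,
        "assetId": asset_id,
        "propertyId": property_id,
        "propertyValues": [{
            "value": {"doubleValue": float(value)},
            "timestamp": {
                "timeInSeconds": int(now),
                "offsetInNanos": int((now % 1) * 1e9),
            },
            "quality": "GOOD",
        }],
    }

# Hypothetical IDs for a temperature sensor on an HVAC asset.
entry = build_sitewise_entry("t-001", "asset-hvac-1", "prop-temp", 21.5)
# With AWS credentials configured, the entry would be sent with:
# import boto3
# boto3.client("iotsitewise").batch_put_asset_property_value(entries=[entry])
```

Batching multiple entries per call keeps ingestion efficient as the number of sensors grows.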
ML-recommended IoT setpoints, or other controllable settings, are deployed through Amazon SageMaker. Optionally, deploy edge models on an AWS IoT Greengrass core device.
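A setpoint recommendation can be retrieved from a deployed SageMaker endpoint at inference time. The endpoint name and input schema below are assumptions; only the request serialization runs locally, with the runtime call sketched in comments:

```python
import json

def build_invocation_body(telemetry):
    """Serialize telemetry features into a JSON request body.
    The {"instances": [...]} schema is an assumption; match it to
    your model's actual input contract."""
    return json.dumps({"instances": [telemetry]})

body = build_invocation_body(
    {"zone_temp_c": 23.4, "outdoor_temp_c": 30.1, "occupancy": 12}
)
# With a deployed endpoint (name is hypothetical):
# import boto3
# resp = boto3.client("sagemaker-runtime").invoke_endpoint(
#     EndpointName="energy-setpoint-recommender",
#     ContentType="application/json",
#     Body=body,
# )
# recommendation = json.loads(resp["Body"].read())
```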
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
This Guidance takes in real-time and batch telemetry data from IoT sensors and trains a machine learning model that delivers recommendations to reduce energy usage. The model is stored as a model artifact in Amazon S3, which triggers AWS CodePipeline to request human approval before deployment. This allows end users to question and validate the model's recommendations with ease.
This Guidance encourages the use of role-based access with AWS Identity and Access Management (IAM). This ensures only the appropriate people have access to the content. All roles are defined with least-privilege access, and all communication between services stays within the customer account.
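A least-privilege role policy grants only the specific actions a principal needs. The bucket, prefix, and role names below are hypothetical examples of scoping read-only access to one data-lake prefix:

```python
import json

# Hypothetical least-privilege policy: read-only access to one S3 prefix.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": "arn:aws:s3:::example-energy-data-lake/telemetry/*",
    }],
}
policy_json = json.dumps(policy)
# Attached via IAM, e.g.:
# import boto3
# boto3.client("iam").put_role_policy(
#     RoleName="EnergyAnalystRole",       # hypothetical role
#     PolicyName="TelemetryReadOnly",
#     PolicyDocument=policy_json,
# )
```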
All data is encrypted both in-transit and at rest using AWS Key Management Service (AWS KMS). The data catalog in AWS Glue is encrypted.
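Encryption at rest with a customer-managed KMS key can be requested per object on upload. The bucket, key, and KMS alias below are hypothetical; only the request arguments are built here, with the upload call sketched in comments:

```python
# Hypothetical names; the KMS key alias must exist in your account.
put_args = {
    "Bucket": "example-energy-data-lake",
    "Key": "telemetry/2024/06/readings.parquet",
    "ServerSideEncryption": "aws:kms",
    "SSEKMSKeyId": "alias/energy-data-key",
}
# With a file to upload and credentials configured:
# import boto3
# with open("readings.parquet", "rb") as f:
#     boto3.client("s3").put_object(Body=f, **put_args)
```

Alternatively, a bucket-level default encryption configuration applies SSE-KMS to every object without per-request arguments.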
AWS Glue, Amazon S3, and Neptune are all serverless, and will scale data access performance as data volume increases. Neptune also adjusts capacity to provide just the right amount of database resources that the application needs, avoiding the need to set up and manage any servers or data warehouses.
Data is stored in a data lake that is built with an Amazon S3 bucket. Amazon S3 objects are stored across a minimum of three Availability Zones. This provides 99.999999999% durability of objects. Therefore, the Guidance is inherently resistant to failures.
By using serverless technologies, you provision only the exact resources you use. AWS Glue and AWS Lambda only run when needed. Additionally, Neptune is a fully managed serverless graph database, which also scales according to demand to ensure just the right number of resources are needed to complete the job.
This Guidance uses serverless components such as AWS Glue, Amazon S3, and Neptune. These services automatically scale up and down to meet demand, so you only pay for what you use.
All resources in this Guidance are serverless and scale with use, which reduces the number of resources that are idle.
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.