Important: This Guidance requires the use of Amazon Forecast, which is no longer available to new customers. Existing customers of Amazon Forecast can continue using and deploying this Guidance as normal.
This Guidance helps utility companies ingest data from Meter Data Management Systems (MDMS) or directly from Head End Systems (HES) and combine it with other data sources, including weather and geographic information system (GIS) data. Utility companies can then detect meter and distribution circuit anomalies, run circuit balancing, thwart energy theft, predict demand, and enhance customer engagement with proactive analytics and artificial intelligence and machine learning (AI/ML)-based forecasts and predictions.
Please note: This Guidance has been updated. The Architecture Diagram is an enhanced version that automatically deploys the following new features: data lake, data ingestion/ML pipelines, visualization components, MDMS/HES simulator, and enhanced load testing. The sample code has also been updated with the new functionalities.
Architecture Diagram
Step 1
Choose from multiple AWS tools to ingest customer and meter data, such as AWS Lambda for custom adapters, AWS Transfer Family (SFTP) and AWS Storage Gateway for batch processing, and Amazon Kinesis, Amazon Simple Notification Service (Amazon SNS), and Amazon Simple Queue Service (Amazon SQS) for streaming data.
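As an illustration of the streaming path, the following minimal sketch publishes a single meter read to a Kinesis data stream with boto3. The stream name, region, and record fields are placeholders for this example, not resources created by the Guidance.

```python
import json
import time

import boto3

# Hypothetical stream name and region; substitute the stream your deployment creates.
kinesis = boto3.client("kinesis", region_name="us-east-1")

meter_read = {
    "meter_id": "MTR-0001",          # placeholder meter identifier
    "reading_kwh": 1.42,             # interval consumption in kWh
    "reading_ts": int(time.time()),  # epoch seconds of the read
}

# The partition key controls shard placement; using the meter ID keeps all
# reads for one meter on the same shard, preserving per-meter ordering.
response = kinesis.put_record(
    StreamName="meter-reads-stream",
    Data=json.dumps(meter_read).encode("utf-8"),
    PartitionKey=meter_read["meter_id"],
)
print("Sequence number:", response["SequenceNumber"])
```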
Step 2
Use Amazon Timestream to store time-series data, AWS Glue and Amazon EMR to process data, and Amazon Simple Storage Service (Amazon S3) for raw data and archival storage. The time-series data can be sent directly to analytic services like Amazon EMR and Amazon Athena for further processing.
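As a sketch of how interval reads could land in Timestream, the snippet below writes one record with boto3. The database, table, dimension names, and values are illustrative assumptions; use whatever your deployment provisions.

```python
import time

import boto3

timestream = boto3.client("timestream-write", region_name="us-east-1")

# Hypothetical database and table names.
timestream.write_records(
    DatabaseName="meter_data",
    TableName="interval_reads",
    Records=[
        {
            # Dimensions describe the series; measures carry the values.
            "Dimensions": [
                {"Name": "meter_id", "Value": "MTR-0001"},
                {"Name": "feeder", "Value": "FDR-12"},
            ],
            "MeasureName": "energy_kwh",
            "MeasureValue": "1.42",
            "MeasureValueType": "DOUBLE",
            "Time": str(int(time.time() * 1000)),
            "TimeUnit": "MILLISECONDS",
        }
    ],
)
```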
Step 3
Get raw data from the S3 bucket and send it to AWS Glue and Amazon EMR to automate extract, transform, and load (ETL) processes or for value-add processing. Use Amazon S3 Glacier for archival copies and retention compliance.
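One way to meet the archival requirement is an S3 lifecycle rule that transitions raw objects to S3 Glacier after a retention window. The bucket name, prefix, and 90-day threshold below are illustrative assumptions, not values prescribed by this Guidance.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and prefix; adjust to your raw-data landing zone.
s3.put_bucket_lifecycle_configuration(
    Bucket="utility-meter-raw-data",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-raw-reads",
                "Filter": {"Prefix": "raw/"},
                "Status": "Enabled",
                # Move raw objects to S3 Glacier after 90 days.
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            }
        ]
    },
)
```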
Step 4
Store your final, clean, and curated datasets in an S3 bucket within the data lake. Create a metadata catalog with AWS Glue Data Catalog to make all data visible and searchable. Keep data secure with AWS Security, Identity, and Compliance services.
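To make the curated datasets discoverable, one option is a Glue crawler that populates the Data Catalog from the curated S3 prefix, as in the sketch below. The crawler name, IAM role, database, and S3 path are illustrative assumptions.

```python
import boto3

glue = boto3.client("glue")

# Hypothetical crawler, role, database, and curated S3 path.
glue.create_crawler(
    Name="curated-meter-data-crawler",
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",
    DatabaseName="meter_data_lake",
    Targets={"S3Targets": [{"Path": "s3://utility-meter-curated/curated/"}]},
)

# Run the crawler once; it creates or updates table definitions in the Data Catalog.
glue.start_crawler(Name="curated-meter-data-crawler")
```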
Step 5
Perform complex analytics with Amazon EMR. Perform ad hoc data discovery and queries against your data lake and data warehouse with Athena.
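An ad hoc query against the data lake might look like the following sketch, which submits SQL to Athena and polls until it completes. The database, table, columns, and results location are assumptions for illustration.

```python
import time

import boto3

athena = boto3.client("athena")

# Hypothetical catalog database, table, and results bucket.
query = athena.start_query_execution(
    QueryString="""
        SELECT meter_id, SUM(energy_kwh) AS daily_kwh
        FROM interval_reads
        WHERE reading_date = DATE '2024-01-15'
        GROUP BY meter_id
        ORDER BY daily_kwh DESC
        LIMIT 10
    """,
    QueryExecutionContext={"Database": "meter_data_lake"},
    ResultConfiguration={"OutputLocation": "s3://utility-meter-athena-results/"},
)

query_id = query["QueryExecutionId"]
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```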
Step 6
Query petabytes of structured, time-series, and semi-structured data using standard SQL with Amazon Redshift.
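With the Redshift Data API, SQL can be submitted without managing database connections, as in the sketch below. The workgroup, database, and table names are placeholders; a provisioned cluster would pass ClusterIdentifier and DbUser instead of WorkgroupName.

```python
import boto3

redshift_data = boto3.client("redshift-data")

# Hypothetical Redshift Serverless workgroup and database.
statement = redshift_data.execute_statement(
    WorkgroupName="meter-analytics",
    Database="meterdw",
    Sql="""
        SELECT feeder,
               DATE_TRUNC('hour', reading_ts) AS reading_hour,
               SUM(energy_kwh) AS load_kwh
        FROM interval_reads
        GROUP BY feeder, reading_hour
        ORDER BY reading_hour
    """,
)
# The statement runs asynchronously; poll describe_statement and fetch
# results with get_statement_result using this ID.
print("Statement ID:", statement["Id"])
```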
Step 7
Create and publish interactive dashboards that include AI/ML insights with Amazon QuickSight or Amazon Managed Grafana.
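If you embed a QuickSight dashboard in a customer or operations portal, a registered-user embed URL can be generated as sketched below. The account ID, user ARN, and dashboard ID are placeholders.

```python
import boto3

quicksight = boto3.client("quicksight")

# Hypothetical account, user, and dashboard identifiers.
response = quicksight.generate_embed_url_for_registered_user(
    AwsAccountId="123456789012",
    UserArn="arn:aws:quicksight:us-east-1:123456789012:user/default/analyst",
    ExperienceConfiguration={
        "Dashboard": {"InitialDashboardId": "meter-insights-dashboard-id"}
    },
    SessionLifetimeInMinutes=60,
)
print(response["EmbedUrl"])
```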
Step 8
Use Amazon SageMaker, Amazon Forecast, and Amazon Personalize to detect grid anomalies, forecast energy usage, and predict equipment failures.
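Once a forecast has been generated (for existing Amazon Forecast customers, per the note above), per-meter predictions can be retrieved with the Forecast query API, as in this sketch. The forecast ARN and item identifier are placeholders; the item_id filter must match the identifier used when training the predictor.

```python
import boto3

forecast_query = boto3.client("forecastquery")

# Hypothetical forecast ARN and item key (for example, the meter ID).
response = forecast_query.query_forecast(
    ForecastArn="arn:aws:forecast:us-east-1:123456789012:forecast/energy-usage-forecast",
    Filters={"item_id": "MTR-0001"},
)

# Predictions are returned per quantile (for example, p10/p50/p90).
for point in response["Forecast"]["Predictions"]["p50"]:
    print(point["Timestamp"], point["Value"])
```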
Amazon Pinpoint lets you communicate with customers and measure customer engagement. Combine analytics and ML outputs with Amazon Pinpoint to create personalized customer target segments and campaigns.
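To turn ML output into a campaign audience, one approach is a Pinpoint segment filtered on an endpoint attribute that your analytics pipeline writes back, such as a high-usage flag. The application ID and attribute name below are illustrative assumptions.

```python
import boto3

pinpoint = boto3.client("pinpoint")

# Hypothetical Pinpoint application ID and custom endpoint attribute.
pinpoint.create_segment(
    ApplicationId="1234567890abcdef1234567890abcdef",
    WriteSegmentRequest={
        "Name": "high-usage-customers",
        "Dimensions": {
            "Attributes": {
                "UsageTier": {
                    "AttributeType": "INCLUSIVE",  # include endpoints matching the values
                    "Values": ["HIGH"],
                }
            }
        },
    },
)
```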
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
AWS Glue and Athena organize data and monitor data access. You can further analyze data with dashboards, reports, and notifications in QuickSight and Amazon Managed Grafana. You can access these dashboards from any device and embed them into your apps and websites. AWS CloudFormation manages the infrastructure and application stack, allowing you to make changes and test the Guidance for different use cases.
Security
The AWS Glue Data Catalog has encryption enabled. All metadata that AWS Glue writes to Amazon S3 is encrypted. We recommend designing AWS Identity and Access Management (IAM) roles using the principle of least privilege, which means granting the minimal amount of access needed for the role to complete a specific action under specific conditions. This will give data access to only the necessary users and resources.
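A least-privilege policy for a consumer of the curated data might look like the sketch below, which grants read-only access to a single S3 prefix. The bucket name, prefix, and policy name are assumptions for illustration.

```python
import json

import boto3

iam = boto3.client("iam")

# Hypothetical read-only policy scoped to the curated prefix of one bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::utility-meter-curated/curated/*",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": "arn:aws:s3:::utility-meter-curated",
            "Condition": {"StringLike": {"s3:prefix": ["curated/*"]}},
        },
    ],
}

iam.create_policy(
    PolicyName="CuratedMeterDataReadOnly",
    PolicyDocument=json.dumps(policy_document),
)
```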
Reliability
AWS Glue, Amazon S3, and Athena are all serverless and will scale data access performance as your data volume increases. AWS Glue provisions, configures, and scales the resources required to run your data integration jobs. Athena queries your data without you having to set up and manage any servers or data warehouses. Amazon SNS and Amazon SQS allow you to increase data ingestion without disruption.
Performance Efficiency
You can use this Guidance with your own meter read data and adapt it to your requirements. Once your data is transformed to the internal data format (as detailed in the associated Implementation Guide), the demo features and dashboard work seamlessly.
Cost Optimization
This Guidance uses serverless services where possible, such as Lambda, Athena, and Kinesis, to help you avoid upfront costs and pay only for the resources you use. Amazon EMR is used on provisioned clusters rather than in a serverless configuration so you can control and optimize the nodes, which can be more cost-effective depending on your use case.
Sustainability
S3 Glacier archives data on magnetic storage rather than solid-state memory, helping to optimize your storage footprint. Amazon SNS and Amazon SQS decouple applications so that resources are not left running while waiting for load.
Implementation Resources
A detailed guide is provided so you can experiment with and use this Guidance within your AWS account. It walks through each stage, including deployment, usage, and cleanup, to prepare the Guidance for use in your environment.
The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.