The Internet of Things on AWS – Official Blog

Build a digital twin of your IoT device and monitor real-time sensor data using AWS IoT TwinMaker (Part 1 of 2)

Introduction

A digital twin is a living digital representation of a physical system that is dynamically updated to mimic the structure, state, and behavior of the physical system to drive business outcomes. Building one is usually not an easy task – to solve this challenge, we launched a new service, AWS IoT TwinMaker. AWS IoT TwinMaker connects data from a variety of sources like equipment sensors, video feeds, and business applications and creates a knowledge graph to model real-world systems and generate real-time insights from the digital twin.

Some customers are using AWS IoT TwinMaker to optimize maintenance schedules by quickly pinpointing and addressing equipment and process anomalies, or to give field workers a consolidated view of all asset and operational data. Another common use case is enhancing the user experience and improving occupant comfort in buildings by monitoring live and historical data about temperature, occupancy or air quality within rooms.

This blog post series addresses this last scenario: monitoring temperature and humidity of a room in real time while being able to control the location and status of sensors.

To simulate this scenario, you will learn how to use AWS IoT TwinMaker to create a digital twin of a Raspberry Pi device connected to a sensor that collects temperature and humidity data. You will integrate it with an Amazon Managed Grafana dashboard to visualize the 3D environment where the digital twin lives, together with the collected data that drives the device status and 3D model representation in real time.

Part 1 (this blog post) covers an overview of the solution, the setup of the time-series database, which will host your data, and the configuration of the IoT Thing. It also covers the wiring of the sensor to the Raspberry Pi device.

In Part 2, you will continue with the setup of the Amazon Managed Grafana dashboard that will be used to visualize data. You will create the AWS Lambda function to read data from the time-series database and, most importantly, you will set up AWS IoT TwinMaker and integrate it with the Amazon Managed Grafana dashboard to display your digital twin together with the real-time data you collect.

Solution Overview

The diagram below shows a high-level architecture overview. Data generated by the sensor attached to the Raspberry Pi device is sent via a Python script to AWS IoT Core, which easily and securely connects devices through the MQTT and HTTPS protocols. From there, an AWS IoT Core rule streams the data to an Amazon Timestream database. On the AWS IoT TwinMaker side, you will create the workspace environment where a virtual entity is defined together with its 3D model representation. You will also create a component that uses a Lambda function to read data from Amazon Timestream, so that your digital twin stays in sync with data arriving from the sensor. For visualization, you will leverage the AWS IoT TwinMaker Grafana dashboard integration to create a dashboard that presents data together with the 3D model. The dashboard is accessed through single sign-on via AWS IAM Identity Center (successor to AWS SSO). Finally, you will create AWS IoT TwinMaker rules so you can easily see changes in the dashboard whenever the temperature or humidity goes below or above the defined thresholds.


Figure 1: The high-level architecture of the solution

Prerequisites

  • An AWS account
  • Raspberry Pi 4 Model B with an SD card with a pre-installed operating system and a power cable
  • DHT11 sensor (or a similar sensor that retrieves temperature and humidity data)
  • Breadboard with male-to-male (M2M) jumper wires and a resistor. You will also need female-to-female (F2F) jumper wires if you’re not going to use an extension board like I did (see the note in section 3 – Raspberry Pi setup)

Implementation

Below are the macro-steps you will perform in this blog post series:

(Part 1)

  1. Setup of the time-series database Amazon Timestream, which will store your temperature and humidity data
  2. Setup of AWS IoT Thing with certificates and rules to make sure that data collected will be sent to the database
  3. Configuration of Raspberry Pi device: wiring of the sensor and creation of the Python file used to send data to AWS

(Part 2)

  1. Setup of the Amazon Managed Grafana dashboard that will be used to visualize data
  2. Creation of AWS Lambda function to read data from Timestream
  3. Setup of AWS IoT TwinMaker role, policy, bucket destination and workspace. Definition of the telemetry component to read from database. Import of the 3D model and definition of the scene
  4. Creation of the dashboard in Amazon Managed Grafana with the AWS IoT TwinMaker plugin.

IMPORTANT: Some of the services used are not yet available in all AWS Regions. Make sure you create all your resources in us-east-1 or eu-west-1 (depending on whether you prefer a US or European Region).

1. Setup of Amazon Timestream Database

Let’s start by configuring the database for the temperature and humidity data. The use case is clearly related to collecting time-series data, so the right “tool” in this case is Amazon Timestream.

To create a database:

  1. Choose Timestream in the AWS Management Console.
  2. Choose Create database.
  3. Enter the following information:
    • Configuration: Standard database
    • Name: TempHumidityDatabase
  4. Confirm by choosing Create database.


Figure 2: Creation of Timestream database

Then, create a table.

  1. Choose the Tables tab, and choose Create table.
  2. Set the following values:
    • Database name: TempHumidityDatabase
    • Table name: TempHumidity
    • Data retention:
      • Memory store retention: 1 Day
      • Magnetic store retention: 1 Month
  3. Choose Create table.

The database is ready to receive data.
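
If you prefer to script this step instead of using the console, the following boto3 sketch creates the same database and table. It assumes your AWS credentials are already configured, boto3 is installed, and you are working in us-east-1 (adjust the Region if needed); the retention values mirror the console settings above (1 day in the memory store, 1 month in the magnetic store).

import boto3

# Timestream write client (adjust the Region to the one you chose)
ts = boto3.client("timestream-write", region_name="us-east-1")

# Create the database
ts.create_database(DatabaseName="TempHumidityDatabase")

# Create the table with the same retention as in the console:
# 1 day in the memory store, 30 days in the magnetic store
ts.create_table(
    DatabaseName="TempHumidityDatabase",
    TableName="TempHumidity",
    RetentionProperties={
        "MemoryStoreRetentionPeriodInHours": 24,
        "MagneticStoreRetentionPeriodInDays": 30,
    },
)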

2. Connecting Raspberry Pi to AWS IoT Core

In this section, you will prepare the “connection” between the Raspberry Pi device and AWS IoT Core by creating a policy and certificates, then registering the device as a “Thing”. Next, you will define the rules to send data to the Amazon Timestream database and route potential errors to logs in Amazon CloudWatch.

Create a policy

AWS IoT Core policies are JSON documents and follow the same conventions as AWS Identity and Access Management (IAM) policies. AWS IoT Core policies allow you to control access to the AWS IoT Core data plane. The AWS IoT Core data plane consists of operations that allow you to connect to the AWS IoT Core message broker, send and receive MQTT messages, and receive or update an AWS IoT thing’s device shadow.

You will now create a policy to allow the publication and subscription of a specific IoT topic (/raspberry/temphumid). This policy will be attached to a certificate used by the AWS IoT thing.

  1. Open the AWS Management Console of your AWS account.
  2. Navigate to the AWS IoT Core service, then from the left menu choose Manage > Security > Policies section.
  3. Choose Create Policy.
  4. Enter the following values:
    • Policy properties → Policy name: TempHumidityPolicy
    • Policy statements → Policy document → select JSON
  5. Paste the following JSON, replacing [AWS_REGION] and [AWS_ACCOUNT] with your AWS Region and account ID:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iot:Publish",
      "Resource": "arn:aws:iot:[AWS_REGION]:[AWS_ACCOUNT]:topic/raspberry/temphumid"
    },
    {
      "Effect": "Allow",
      "Action": "iot:Subscribe",
      "Resource": "arn:aws:iot:[AWS_REGION]:[AWS_ACCOUNT]:topicfilter/raspberry/temphumid"
    },
    {
      "Effect": "Allow",
      "Action": "iot:Connect",
      "Resource": "*"
    }
  ]
}
  6. Choose Create.
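
If you would rather create the policy programmatically, here is a minimal boto3 sketch of the same step; the policy document is the JSON above, with [AWS_REGION] and [AWS_ACCOUNT] still to be replaced.

import json
import boto3

iot = boto3.client("iot", region_name="us-east-1")

# The same policy document as above (replace the placeholders first)
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "iot:Publish",
            "Resource": "arn:aws:iot:[AWS_REGION]:[AWS_ACCOUNT]:topic/raspberry/temphumid"
        },
        {
            "Effect": "Allow",
            "Action": "iot:Subscribe",
            "Resource": "arn:aws:iot:[AWS_REGION]:[AWS_ACCOUNT]:topicfilter/raspberry/temphumid"
        },
        {
            "Effect": "Allow",
            "Action": "iot:Connect",
            "Resource": "*"
        }
    ]
}

iot.create_policy(
    policyName="TempHumidityPolicy",
    policyDocument=json.dumps(policy_document),
)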

Create an AWS IoT Thing and generate a certificate

  1. In the console, choose AWS IoT Core.
  2. From the Manage > All devices > Things section on the left, choose Create Things.
  3. Select Create a single thing.
  4. Enter a name for the thing (for example, RaspberryPi) and leave the Device Shadow section set to No shadow. Choose Next.
  5. In Configure Device Certificate, select Auto-generate a new certificate (recommended) and choose Next.
  6. In Policies, select the policy created before (TempHumidityPolicy) and choose Create Thing.
  7. Download the device certificate (.pem.crt), the public and private keys (.pem.key), and the Amazon Root CA 1 certificate to a folder on your local machine. You will use the certificate and private key later on your Raspberry Pi device. You won’t need the public key or the Amazon Root CA 3 certificate.
  8. When the download is complete, choose Done.

Figure 3: An IoT thing has been created to represent your Raspberry Pi
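
For completeness, the console flow above can also be reproduced with boto3. The sketch below registers the thing, auto-generates a certificate, attaches the policy, and saves the credentials locally; the file names are illustrative and should match whatever you reference later from the Raspberry Pi (the Amazon Root CA 1 certificate is downloaded separately from the Amazon Trust Services repository).

import boto3

iot = boto3.client("iot", region_name="us-east-1")

# Register the thing
iot.create_thing(thingName="RaspberryPi")

# Auto-generate an active certificate with its key pair
cert = iot.create_keys_and_certificate(setAsActive=True)

# Attach the policy to the certificate, and the certificate to the thing
iot.attach_policy(policyName="TempHumidityPolicy", target=cert["certificateArn"])
iot.attach_thing_principal(thingName="RaspberryPi", principal=cert["certificateArn"])

# Save the credentials locally (illustrative file names)
with open("raspberry-certificate.pem.crt", "w") as f:
    f.write(cert["certificatePem"])
with open("raspberry-private.pem.key", "w") as f:
    f.write(cert["keyPair"]["PrivateKey"])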

Create a rule to send data to Amazon Timestream and errors to Amazon CloudWatch

  1. Choose AWS IoT Core in the console.
  2. In the Manage > Message Routing > Rules section, choose Create rule, and then enter the following:
    • Name: TempHumidityRule
    • Description: Rule to handle temperature and humidity messages. Choose Next.
    • Rule query statement:
      SELECT * FROM 'raspberry/temphumid'
    • Choose Next.
  3. In the “Rule actions” panel, choose Timestream table – Write a message into the Timestream table. Next, select the Timestream database TempHumidityDatabase and the table TempHumidity you created before.

Figure 4: The IoT Core rule to write data in Amazon Timestream

  4. Enter the dimension name (a minimum of 1 is required). Define a dimension with the dimension name DeviceName and the dimension value Rpi4.
  5. Next, you need to create an IAM role to allow the AWS IoT Core service to access the database. Choose Create new role, and then enter the following name: TempHumidityDBRole
  6. In the “Error action” panel, choose Add error action, then select CloudWatch logs – Send message data to CloudWatch logs.
  7. Choose Create CloudWatch Log group – you’ll be redirected to CloudWatch in a new tab. Create a log group named TempHumidityRuleErrors. You can access log groups from the left menu under Logs > Log groups, and you can leave the retention setting at its default (Never expire).
  8. Go back to AWS IoT Core, refresh the Log group name list, and select the newly created log group.
  9. Create an IAM role to allow the service to access CloudWatch: choose Create new role, then enter the following name: TempHumidityCloudwatchRole

Figure 5: The IoT Core error rule action will send errors to Amazon CloudWatch

  10. Choose Next.
  11. Review the inputs and confirm the rule creation by choosing Create.

You now have a valid IoT rule that will route temperature and humidity data sent by the sensor to the Timestream database. Errors will be sent to Amazon CloudWatch logs.
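
For reference, the same rule can be expressed with boto3. In this sketch, the role ARNs are placeholders for the TempHumidityDBRole and TempHumidityCloudwatchRole roles created in the steps above.

import boto3

iot = boto3.client("iot", region_name="us-east-1")

iot.create_topic_rule(
    ruleName="TempHumidityRule",
    topicRulePayload={
        "sql": "SELECT * FROM 'raspberry/temphumid'",
        "description": "Rule to handle temperature and humidity messages",
        "actions": [
            {
                "timestream": {
                    "roleArn": "arn:aws:iam::[AWS_ACCOUNT]:role/TempHumidityDBRole",
                    "databaseName": "TempHumidityDatabase",
                    "tableName": "TempHumidity",
                    "dimensions": [{"name": "DeviceName", "value": "Rpi4"}],
                }
            }
        ],
        "errorAction": {
            "cloudwatchLogs": {
                "roleArn": "arn:aws:iam::[AWS_ACCOUNT]:role/TempHumidityCloudwatchRole",
                "logGroupName": "TempHumidityRuleErrors",
            }
        },
    },
)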

3. Raspberry Pi setup

Now that you have defined the database and prepared the AWS IoT thing that represents your Raspberry Pi, it’s time to wire the sensor to the Raspberry Pi and then send some data to AWS IoT Core.

Wire the sensor

In this post we use a DHT11 sensor to collect temperature and humidity data from the environment. The DHT11 is a basic, low-cost digital temperature and humidity sensor. It uses a capacitive humidity sensor and a thermistor to measure the surrounding air and generates a digital signal on the data pin.

Note: This blog was created using a Raspberry Pi 4 Model B mounted on a case box kit. This box kit neatly packages the Pi and prevents damage, but is not required. The same is true for the expansion board, which makes it easier to work on the breadboard rather than using the Raspberry Pi pins directly. You don’t need it if you want to connect wires directly to your device.

The DHT11 sensor is connected to the breadboard as shown in the following pictures.


Figure 6: Raspberry Pi wiring of the DHT11 sensor


Figure 7: Raspberry Pi with extension board and DHT11 sensor wired
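
Before bringing AWS into the picture, you may want to confirm that the sensor is wired correctly. Here is a minimal test sketch, assuming the DHT11 data pin is connected to GPIO17 (as in the full script later in this post) and the adafruit-circuitpython-dht library is installed:

import time
import board
import adafruit_dht

# DHT11 with the data pin on GPIO17 (assumption: matches your wiring)
dht = adafruit_dht.DHT11(board.D17)

while True:
    try:
        print("Temp: {} C    Humidity: {}%".format(dht.temperature, dht.humidity))
    except RuntimeError as error:
        # DHT sensors are timing-sensitive; occasional read errors are normal
        print(error.args[0])
    time.sleep(3)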

Send data from Raspberry Pi to AWS IoT Core

Now that you have the sensor correctly wired to your Raspberry Pi device, you will try to send some temperature and humidity data to AWS. First, you need to copy to the Raspberry Pi the certificates that you downloaded when you configured the AWS IoT thing, so that the device can authenticate and send the generated data to AWS IoT Core. There are several ways to connect to the Raspberry Pi and copy files over; for example, you can use SFTP and save the certificates in a folder on the Raspberry Pi (the script below expects a certs/ folder), where you can reference them from the code later.

Once you have the certificates in place, you can move to the next step: creating a Python script on the Raspberry Pi that collects data from the DHT11 sensor and sends it to AWS.

On your Raspberry Pi device, create a file named temphumid.py and paste in the Python code below.

from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient
import json
import time
import board
import adafruit_dht

# MQTT config (clientID must be unique within the AWS account)
clientID = "XXXX-XXXX-XXXXX-XXXXX"
endpoint = "XXXXXXXX-ats.iot.[AWS_REGION].amazonaws.com" # Use the device data endpoint from the Settings page in the AWS IoT console
port = 8883
topic = "raspberry/temphumid"

# Init MQTT client
mqttc = AWSIoTMQTTClient(clientID)
mqttc.configureEndpoint(endpoint,port)
mqttc.configureCredentials("certs/AmazonRootCA1.pem","certs/raspberry-private.pem.key","certs/raspberry-certificate.pem.crt")

# Send message to the iot topic
def send_data(message):
    mqttc.publish(topic, json.dumps(message), 0)
    print("Message Published")

# Loop until terminated
def loop():
    # Init the DHT device with the data pin connected (GPIO17)
    dhtDevice = adafruit_dht.DHT11(board.D17)

    while True:
        try:
            temperature = dhtDevice.temperature
            humidity = dhtDevice.humidity
            print("Temp: {:.1f} C    Humidity: {}% ".format(temperature, humidity))

            message = {
                'temperature': temperature,
                'humidity': humidity
            }

            # Send data to topic
            send_data(message)

            time.sleep(3)
        except RuntimeError as error:
            # Errors happen fairly often, DHT sensors are hard to read, just keep going
            print(error.args[0])

# Main
if __name__ == '__main__':
    print("Starting program...")
    try:
        # Connect
        mqttc.connect()
        print("Connect OK!")

        # Main loop called
        loop()
    except KeyboardInterrupt:
        mqttc.disconnect()
        exit()

Let’s examine the code above:

  1. Imported some libraries (time, json) plus the libraries specific to the DHT sensor – in this case board and adafruit_dht. You will need to install the dependencies on the Raspberry Pi (for example, with sudo pip3 install AWSIoTPythonSDK adafruit-circuitpython-dht).
  2. Connected the Raspberry Pi to AWS IoT Core by setting up a name for MQTT client, which needs to be unique for the AWS account.
  3. Defined the paths to the certificates – make sure they are present on your Raspberry Pi device, in the certs/ folder referenced by the script.
  4. Created a loop to:
    • Read temperature and humidity data.
    • Create a message composed of the temperature and humidity values.
    • Send data to the MQTT topic.

As you probably noticed, the code needs to be adjusted to include your unique clientID (you can choose any unique string) and, most importantly, your IoT endpoint. To find it, open AWS IoT Core in the console and navigate to Settings > Device data endpoint. Copy the endpoint URL shown on the page and paste it into the Python script.
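
You can also retrieve the device data endpoint programmatically from any machine with AWS credentials configured; a small boto3 sketch:

import boto3

iot = boto3.client("iot", region_name="us-east-1")

# Retrieve the account's device data endpoint (ATS endpoint)
response = iot.describe_endpoint(endpointType="iot:Data-ATS")
print(response["endpointAddress"])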

Once you have made these changes to the temphumid.py file, don’t forget to save it, then run it (for example, with python3 temphumid.py). If everything is set up correctly, you will see a message like the following:


Figure 8: Output example of the temphumid.py python script

To make sure that data sent to AWS IoT Core is correctly routed to the Amazon Timestream database, open Amazon Timestream in the console and check that the table TempHumidity contains data. You can get a preview of your data in the Amazon Timestream query editor by choosing the three dots next to your table name and then choosing Preview data – by default, it retrieves the last 15 minutes of data.
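
If you prefer to check the data from code instead of the console, here is a minimal boto3 sketch using the Timestream query client, with a query similar to the console preview (the last 15 minutes of data):

import boto3

tsq = boto3.client("timestream-query", region_name="us-east-1")

# Retrieve the last 15 minutes of data, newest first
result = tsq.query(
    QueryString='SELECT * FROM "TempHumidityDatabase"."TempHumidity" '
                "WHERE time BETWEEN ago(15m) AND now() ORDER BY time DESC"
)
for row in result["Rows"]:
    print(row["Data"])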


Figure 9: Preview data in Amazon Timestream


Figure 10: Humidity and temperature sent to Amazon Timestream via AWS IoT Core

If the data is not there, check the error logs in the CloudWatch log group you created before.

For now, you can stop the Python script. You will run it again later.

Conclusion

In part 1 of this series, you have laid the foundations for the creation of a digital twin of your device. You have completed the setup of the Amazon Timestream database, and you have created the AWS IoT thing with certificates and rules to make sure that the temperature and humidity data collected by your device is sent to the database. You have also configured your Raspberry Pi device by wiring the sensor and creating a Python script that sends data to AWS.

In part 2, you will continue with the setup of the Amazon Managed Grafana dashboard that will be used to visualize data. You will create the AWS Lambda function to read data from the Timestream database and, most importantly, you will set up AWS IoT TwinMaker and integrate it with the Amazon Managed Grafana dashboard to display your digital twin’s 3D model together with the real-time data you collect.


About the author

Angelo Postiglione is a Senior Solutions Architect at AWS. He’s currently based in Copenhagen, where he helps customers adopt cloud technologies to build scalable and secure solutions using AWS. In his spare time, he likes to discover new places in the world, have long walks in the nature and play guitar and drums.