The Internet of Things on AWS – Official Blog

Build a digital twin of your IoT device and monitor real-time sensor data using AWS IoT TwinMaker (Part 2 of 2)

Introduction

This post is the second in a two-part series on how to use AWS IoT TwinMaker to create a digital twin of a Raspberry Pi device connected to a sensor that collects temperature and humidity data, and integrate it with an Amazon Managed Grafana dashboard. This lets users visualize the 3D environment where the digital twin lives, together with the collected data that drives the device status and the 3D model representation in real time.

In Part 1, we introduced the idea together with the general architecture shown below. If you followed Part 1 already, you have completed the configuration of the Amazon Timestream database that will host your data, the setup of the IoT Thing, and the wiring of the sensor that collects and transmits temperature and humidity to your Raspberry Pi device.


Figure 1: The high-level architecture of the solution

In this second part, you will continue with the setup of the Amazon Managed Grafana dashboard used to visualize data. You will create the AWS Lambda function that reads data from the Amazon Timestream database and, most importantly, you will set up AWS IoT TwinMaker and integrate it with the dashboard in Amazon Managed Grafana to display your 3D model together with the real-time data you collect.

1. Setup of Amazon Managed Grafana Dashboard workspace

From the console, search and select Amazon Managed Grafana from the list of AWS services. Choose Create Workspace. Use TempHumidTwinmaker as name for your Grafana workspace and optionally provide a description. Choose Next.

For Step 2 Configure Settings, from the Authentication access section, select AWS IAM Identity Center (successor to AWS SSO). For the Permission Type choose Service managed. Note that you may need to create a user if this is the first time you have configured SSO.

Choose Next and leave the default (Current account, without selecting any data source) in the next page. Choose Next then Create Workspace.

Note: AWS IoT TwinMaker is not listed as a data source. However, the plugin is already installed on all Amazon Managed Grafana workspaces; you will add it later on.

Wait a few minutes for the workspace to be created. When the Amazon Managed Grafana workspace is ready, an IAM service role will have been created for it. You can see it on the Summary tab once you select your workspace. Take note of this IAM service role, as you will need it later. It should look like arn:aws:iam::[YOUR-AWS-ACCOUNT-ID]:role/service-role/AmazonGrafanaServiceRole-[1234abcd]


Figure 2: The Amazon Managed Grafana workspace is ready. IAM role is displayed in the top right corner

Next, create the user that will access the dashboard.

If you have already performed these actions in your AWS account, you can skip this step. Otherwise, select AWS IAM Identity Center (successor to AWS SSO) from the console search bar, then choose Users → Add user. Enter your username, choose how you would like to get the password, enter your email address along with your first and last name.

Choose Next, skip the Groups section by choosing Next (as you don’t want to assign this user to one or more groups) and confirm by choosing Add user. You will receive an invitation email at the address specified and you will need to accept the invitation in order to access the AWS SSO portal. When the invitation is accepted, you will be asked to change your password depending on the setting you chose when creating the user.

Back in the Amazon Managed Grafana workspace, choose the Authentication section and then Configure users and user groups in the AWS IAM Identity Center (successor to AWS SSO) section. Select the user you just created and choose Assign users and groups.


Figure 3: Assign a user to the Amazon Managed Grafana workspace

You now have a user to access the dashboard you're going to create at the end of this post. You need to make them an admin so they can change settings in Amazon Managed Grafana and use the AWS IoT TwinMaker plugin. To do so, select the user and choose the Make Admin button.


Figure 4: Making your user an admin

2. Creation of Lambda function to read data from Amazon Timestream

Next you will need to create a Lambda function to retrieve data from your Amazon Timestream database. This Lambda function will be used within AWS IoT TwinMaker when you create an AWS IoT TwinMaker component.

First, create the IAM role required for the Lambda function to access Amazon Timestream and Amazon CloudWatch Logs. From the console, open the IAM service, then move to Roles. Choose Create Role. Choose AWS Service as the Trusted Entity Type and select Lambda from the Use Case section. Choose Next and add the AWSLambdaBasicExecutionRole and AmazonTimestreamReadOnlyAccess permissions policies. Choose Next, give the role the name ReadTimestreamFromLambda, review the details, and choose Create role.

Note: For this blog, the AmazonTimestreamReadOnlyAccess policy was used, which allows read operations on all of Timestream. As a best practice, you would restrict read access to only the Timestream database (and even the table) you have created.

Next, create the Lambda function: from the Lambda homepage, choose Create function and select the Author from scratch option. Name the function timestreamReader and select Python 3.7 as the Runtime. In the Permissions section, choose Use an existing role and select the ReadTimestreamFromLambda role created earlier. Choose Create function.


Figure 5: Creating the Lambda function to read data from Amazon Timestream

When the function is created, move to the Configuration section and, under General configuration, change the Memory to 256 MB and the timeout to 15 min. Remember to Save.

Still in the Configuration section, choose Environment variables and add the following four environment variables:

  • Key: TIMESTREAM_DATABASE_NAME, value TempHumidityDatabase
  • Key: TIMESTREAM_TABLE_NAME, value TempHumidity
  • Key: TWINMAKER_COMPONENT_NAME with no value as we will add it later
  • Key: TWINMAKER_ENTITY_ID with no value as we will add it later

Now move to the Code section and copy and paste the following Python code.

import logging
import json
import os
import boto3

from datetime import datetime

LOGGER = logging.getLogger()
LOGGER.setLevel(logging.INFO)

# Get db and table name from Env variables as well as TwinMaker component name and entityId
DATABASE_NAME = os.environ['TIMESTREAM_DATABASE_NAME']
TABLE_NAME = os.environ['TIMESTREAM_TABLE_NAME']
TM_COMPONENT_NAME = os.environ['TWINMAKER_COMPONENT_NAME']
TM_ENTITY_ID = os.environ['TWINMAKER_ENTITY_ID']

# Python boto client for AWS Timestream
QUERY_CLIENT = boto3.client('timestream-query')


# Utility function: parses a timestream row into a python dict for more convenient field access
def parse_row(column_schema, timestream_row):
    """
    Example:
    column=[
        {'Name': 'TelemetryAssetId', 'Type': {'ScalarType': 'VARCHAR'}},
        {'Name': 'measure_name', 'Type': {'ScalarType': 'VARCHAR'}},
        {'Name': 'time', 'Type': {'ScalarType': 'TIMESTAMP'}},
        {'Name': 'measure_value::double', 'Type': {'ScalarType': 'DOUBLE'}},
        {'Name': 'measure_value::varchar', 'Type': {'ScalarType': 'VARCHAR'}}
    ]
    row={'Data': [
        {'ScalarValue': 'Mixer_15_7e3c0bdf-3b1c-46b9-886b-14f9d0b9df4d'},
        {'ScalarValue': 'alarm_status'},
        {'ScalarValue': '2021-10-15 20:45:43.287000000'},
        {'NullValue': True},
        {'ScalarValue': 'ACTIVE'}
    ]}

    ->

    {
        'TelemetryAssetId': 'Mixer_15_7e3c0bdf-3b1c-46b9-886b-14f9d0b9df4d',
        'measure_name': 'alarm_status',
        'time': '2021-10-15 20:45:43.287000000',
        'measure_value::double': None,
        'measure_value::varchar': 'ACTIVE'
    }
    """
    data = timestream_row['Data']
    result = {}
    for i in range(len(data)):
        info = column_schema[i]
        datum = data[i]
        key, val = parse_datum(info, datum)
        result[key] = val
    return result

# Utility function: parses timestream datum entries into (key,value) tuples. Only ScalarTypes currently supported.
def parse_datum(info, datum):
    """
    Example:
    info={'Name': 'time', 'Type': {'ScalarType': 'TIMESTAMP'}}
    datum={'ScalarValue': '2021-10-15 20:45:25.793000000'}

    ->

    ('time', '2021-10-15 20:45:25.793000000')
    """
    if datum.get('NullValue', False):
        return info['Name'], None
    column_type = info['Type']
    if 'ScalarType' in column_type:
        return info['Name'], datum['ScalarValue']
    else:
        raise Exception(f"Unsupported columnType[{column_type}]")

# This function converts a Timestream timestamp string to ISO8601 basic format
def get_iso8601_timestamp(ts):
    #  e.g. '2022-04-06 00:17:45.419000000' -> '2022-04-06T00:17:45.419000000Z'
    return ts.replace(' ', 'T') + 'Z'

# Main logic
def lambda_handler(event, context):
    selected_property = event['selectedProperties'][0]

    LOGGER.info("Selected property is %s", selected_property)

    # 1. EXECUTE THE QUERY TO RETURN VALUES FROM DATABASE
    query_string = f"SELECT measure_name, time, measure_value::bigint" \
        f" FROM {DATABASE_NAME}.{TABLE_NAME} " \
        f" WHERE time > from_iso8601_timestamp('{event['startTime']}')" \
        f" AND time <= from_iso8601_timestamp('{event['endTime']}')" \
        f" AND measure_name = '{selected_property}'" \
        f" ORDER BY time ASC"
            
    try:
        query_page = QUERY_CLIENT.query(
            QueryString = query_string
        )
    except Exception as err:
        LOGGER.error("Exception while running query: %s", err)
        raise err

    # Query result structure: https://docs.aws.amazon.com/timestream/latest/developerguide/API_query_Query.html

    next_token = None
    if query_page.get('NextToken') is not None:
        next_token = query_page['NextToken']
    schema = query_page['ColumnInfo']

    # 2. PARSE TIMESTREAM ROWS
    result_rows = []
    for row in query_page['Rows']:
        row_parsed = parse_row(schema,row)
        #LOGGER.info('row parsed: %s', row_parsed)
        result_rows.append(row_parsed)

    # 3. CONVERT THE QUERY RESULTS TO THE FORMAT TWINMAKER EXPECTS

    # There must be one entityPropertyReference for Humidity OR one for Temperature
    entity_property_reference_temp = {
        'componentName': TM_COMPONENT_NAME,
        'propertyName': 'temperature',
        'entityId': TM_ENTITY_ID
    }

    entity_property_reference_hum = {
        'componentName': TM_COMPONENT_NAME,
        'propertyName': 'humidity',
        'entityId': TM_ENTITY_ID
    }

    values_temp = []
    values_hum = []

    for result_row in result_rows:
        ts = result_row['time']
        measure_name = result_row['measure_name']
        measure_value = result_row['measure_value::bigint']

        time = get_iso8601_timestamp(ts)
        value = { 'doubleValue' : str(measure_value) }

        if measure_name == 'temperature':
            values_temp.append({
                'time': time,
                'value': value
            })
        elif measure_name == 'humidity':
            values_hum.append({
                'time': time,
                'value': value
            })

    # The final structure "propertyValues"
    property_values = []

    # Key off the property TwinMaker requested, so this also works when the
    # query returns no rows (measure_name would be undefined in that case)
    if selected_property == 'temperature':
        property_values.append({
            'entityPropertyReference': entity_property_reference_temp,
            'values': values_temp
        })
    elif selected_property == 'humidity':
        property_values.append({
            'entityPropertyReference': entity_property_reference_hum,
            'values': values_hum
        })
    LOGGER.info("property_values: %s", property_values)

    # marshall propertyValues and nextToken into final response
    return_obj = {
       'propertyValues': property_values,
       'nextToken': next_token
       }

    return return_obj

Note: The code references TM_COMPONENT_NAME and TM_ENTITY_ID, which are, respectively, the name of the TwinMaker component and the ID of the TwinMaker entity representing your sensor. You will create both in the next section and then update the Lambda environment variables.

This code implements an AWS IoT TwinMaker data connector for Timestream. Remember to Deploy your Lambda function.
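Before wiring the connector into AWS IoT TwinMaker, you can sanity-check the response contract locally. The sketch below uses sample values only (not live Timestream data) and placeholder component/entity names; it builds the same propertyValues structure the handler returns and verifies its shape:

```python
import json

# Sample rows standing in for parsed Timestream results (hypothetical data)
rows = [
    {'time': '2022-04-06 00:17:45.419000000', 'measure_name': 'temperature', 'measure_value::bigint': '23'},
    {'time': '2022-04-06 00:18:45.419000000', 'measure_name': 'temperature', 'measure_value::bigint': '24'},
]

def get_iso8601_timestamp(ts):
    # Same conversion the Lambda performs
    return ts.replace(' ', 'T') + 'Z'

values = [{'time': get_iso8601_timestamp(r['time']),
           'value': {'doubleValue': r['measure_value::bigint']}}
          for r in rows]

response = {
    'propertyValues': [{
        'entityPropertyReference': {
            'componentName': 'DHTComponent',      # placeholder: your component name
            'propertyName': 'temperature',
            'entityId': 'ENTITY-ID-PLACEHOLDER'   # placeholder: your entity ID
        },
        'values': values
    }],
    'nextToken': None
}

print(json.dumps(response, indent=2))
```

If the structure your Lambda returns deviates from this shape, the Scene Viewer and panel queries will show no data even when the Timestream query succeeds.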

3. Configuration of AWS IoT TwinMaker

In the previous section you created and configured an Amazon Managed Grafana workspace and a Lambda function to read data from the Amazon Timestream database. You can now move to the configuration of the digital twin.

Configure the IAM policy and roles that will be used by AWS IoT TwinMaker

From the console select the IAM service, then move to Roles. Choose Create Role. Choose Custom trust policy and paste the policy below. AWS IoT TwinMaker requires that you use a service role to allow it to access resources in other services on your behalf. Choose Next.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "iottwinmaker.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

In the next step, choose Create Policy; a new tab opens. Select the JSON tab and paste the following code:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "iottwinmaker:*",
        "s3:*",
        "iotsitewise:*",
        "kinesisvideo:*"
      ],
      "Resource": ["*"],
      "Effect": "Allow"
    },
    {
      "Action": ["lambda:InvokeFunction"],
      "Resource": ["*"],
      "Effect": "Allow"
    },
    {
      "Condition": {
        "StringEquals": {
          "iam:PassedToService": "lambda.amazonaws.com"
        }
      },
      "Action": ["iam:PassRole"],
      "Resource": ["*"],
      "Effect": "Allow"
    }
  ]
}

You are giving AWS IoT TwinMaker the ability to work with Amazon Simple Storage Service (Amazon S3), AWS IoT SiteWise, and Amazon Kinesis Video Streams, as well as the ability to invoke the Lambda function that reads data from the database. You can make this policy more restrictive in a production environment.
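For example, a production variant could pin the Lambda statement to the single connector function instead of "*". The snippet below builds such a scoped statement; the ARN is a placeholder, so substitute your own account, Region, and function name:

```python
import json

# Placeholder ARN: substitute your account ID, Region, and function name
lambda_arn = "arn:aws:lambda:eu-west-1:123456789012:function:timestreamReader"

# Scoped replacement for the broad lambda:InvokeFunction statement above
scoped_statement = {
    "Action": ["lambda:InvokeFunction"],
    "Resource": [lambda_arn],  # only the connector Lambda, not every function
    "Effect": "Allow",
}

print(json.dumps(scoped_statement, indent=2))
```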

Choose Next (Tags), then Next (Review). Give this policy the name TwinMakerWorkspacePolicy and choose Create Policy. When done, go back to the page of the role you were creating and look for your new policy in the list. Choose Refresh if you don't see it immediately. Choose Next, give the role the name TwinMakerWorkspaceRole, review the details, and choose Create role.

Create the AWS IoT TwinMaker workspace

From the console, search and select AWS IoT TwinMaker from the list of AWS services. Choose Create workspace. When creating the workspace, you first need to provide some basic information: type “TempHumidWorkspace” as Workspace Name and insert an optional Description. From the Amazon S3 bucket dropdown, select Create a new S3 bucket. From the Execution Role dropdown, select the TwinMakerWorkspaceRole role you created in the previous step. Choose Next.


Figure 6: Creating the AWS IoT TwinMaker workspace

Now point the workspace at the Amazon Managed Grafana workspace that you created earlier. From the Dashboard management page, select Amazon Managed Grafana. From the Grafana authentication provider dropdown, select the Grafana service role created earlier — the one with a name like AmazonGrafanaServiceRole-[1234abc]. Choose Next.

From the Dashboard role page, leave No video permissions selected. You will create an IAM policy and role to be used by the dashboard to access the AWS IoT TwinMaker workspace’s Amazon S3 bucket and resources. Copy the policy code provided in the page, then click Create Policy in IAM.


Figure 7: Creating the Amazon Managed Grafana dashboard role and policy

In the new page, select the JSON tab and paste the policy code you just copied. Choose Next (Tags), then Next (Review). Give this policy the name TempHumidWorkspaceDashboardPolicy and choose Create Policy.

Go back to the AWS IoT TwinMaker workspace creation page and choose Create dashboard role in IAM. In the new page, select Custom trust policy and paste the following trust policy JSON:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { 
                "AWS": "*"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}

Choose Next and then select the IAM policy you just created (named TempHumidWorkspaceDashboardPolicy). Choose the refresh button if you don't see it immediately. Choose Next, give this role the name TwinMakerDashboardRole, and choose Create Role. You will receive an alert that the trust policy is overly permissive; you will tighten it later. For now, choose Continue.

When done, go back to the AWS IoT TwinMaker workspace creation page and select the dashboard role you just created from the list. Choose the refresh button if you don’t see it immediately.

Next, copy the code provided in the Update dashboard role policy tab. You will apply this policy to the TwinMakerDashboardRole you just created. Choose Update trust policy in IAM and paste the code, replacing what is already present, to apply the trust policy to the role. This replaces the overly permissive principal with a specific AWS Principal: the service role used by Amazon Managed Grafana. Choose Update Policy.
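Conceptually, the update narrows the trust policy's Principal from "*" to the Grafana service role. The sketch below illustrates the resulting shape with a placeholder ARN; use the exact JSON the console provides rather than this example:

```python
import json

# Placeholder: substitute your AmazonGrafanaServiceRole-... ARN from the Summary tab
grafana_role_arn = "arn:aws:iam::123456789012:role/service-role/AmazonGrafanaServiceRole-1234abcd"

# Trust policy after the update: only the Grafana service role may assume the
# dashboard role, instead of any AWS principal ("*")
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": grafana_role_arn},
        "Action": "sts:AssumeRole",
    }],
}

print(json.dumps(trust_policy, indent=2))
```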


Figure 8: Creating the Amazon Managed Grafana dashboard role and policy (continued)

When done, go back to the AWS IoT TwinMaker workspace creation page and choose Next, then Create Workspace.


Figure 9: Review of the AWS IoT TwinMaker workspace creation

You now have the AWS IoT TwinMaker workspace ready with all the permissions required to use it from the Grafana dashboard.

Create AWS IoT TwinMaker Component & Entity

With your AWS IoT TwinMaker workspace selected, move to Component types section to create a component type. AWS IoT TwinMaker components provide context for properties and data for their associated entities. You will now create a component that will access your device’s temperature and humidity data. The component will be the link between the AWS IoT TwinMaker workspace and the Lambda function used to read values from the Timestream database.

Choose Create component type and paste the following code in the Request section.

{
  "workspaceId": "[YOUR_TWINMAKER_WORKSPACE]",
  "isSingleton": false,
  "componentTypeId": "com.blog.DHT11sensor",
  "propertyDefinitions": {
    "humidity": {
      "dataType": {
        "type": "DOUBLE"
      },
      "isTimeSeries": true,
      "isRequiredInEntity": false,
      "isExternalId": false,
      "isStoredExternally": true,
      "isImported": false,
      "isFinal": false,
      "isInherited": false
    },
    "temperature": {
      "dataType": {
        "type": "DOUBLE"
      },
      "isTimeSeries": true,
      "isRequiredInEntity": false,
      "isExternalId": false,
      "isStoredExternally": true,
      "isImported": false,
      "isFinal": false,
      "isInherited": false
    }
  },
  "functions": {
    "dataReader": {
      "implementedBy": {
        "lambda": {
          "arn": "[YOUR_LAMBDA_ARN]"
        },
        "isNative": false
      },
      "isInherited": false
    }
  },
  "creationDateTime": "2022-05-19T14:58:42.140Z",
  "updateDateTime": "2022-05-19T14:58:42.140Z",
  "arn": "arn:aws:iottwinmaker:[YOUR_REGION]:[YOUR_AWS_ACCOUNT]:workspace/[YOUR_TWINMAKER_WORKSPACE]/component-type/com.blog.DHT11sensor",
  "isAbstract": false,
  "isSchemaInitialized": false,
  "status": {
    "state": "ACTIVE",
    "error": {}
  }
}

IMPORTANT: Make sure you replace the values in brackets: [YOUR_TWINMAKER_WORKSPACE], [YOUR_LAMBDA_ARN], [YOUR_REGION], and [YOUR_AWS_ACCOUNT].

When ready, choose Create component type.
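If you keep the request JSON as a template, the bracketed placeholders can be filled in programmatically before pasting. This is an optional convenience sketch (shown here on a trimmed-down template, with sample substitution values):

```python
import json

# Trimmed-down version of the component-type request, kept as a template string
template = '''{
  "workspaceId": "[YOUR_TWINMAKER_WORKSPACE]",
  "componentTypeId": "com.blog.DHT11sensor",
  "functions": {
    "dataReader": {
      "implementedBy": {"lambda": {"arn": "[YOUR_LAMBDA_ARN]"}, "isNative": false},
      "isInherited": false
    }
  }
}'''

# Sample values; replace with your own workspace name and Lambda ARN
substitutions = {
    "[YOUR_TWINMAKER_WORKSPACE]": "TempHumidWorkspace",
    "[YOUR_LAMBDA_ARN]": "arn:aws:lambda:eu-west-1:123456789012:function:timestreamReader",
}
for placeholder, value in substitutions.items():
    template = template.replace(placeholder, value)

request = json.loads(template)  # fails fast if the JSON is malformed
print(request["workspaceId"])
```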

Now it's time to create an Entity for your sensor. You could create a structure or hierarchy of entities representing your environment, but for simplicity you will create a single entity representing the device/sensor. To do so, move to the Entities section and choose Create entity. Give the entity a name (e.g., TempHumiditySensor) and choose Create entity.

IMPORTANT: This entity is the one you need to use in your Lambda function. Copy the Entity ID and set it as the value of the TWINMAKER_ENTITY_ID environment variable you created earlier in your Lambda function.

Now select the entity from the list of entities and choose Add component on the right.


Figure 10: Creating the AWS IoT TwinMaker entity

Select the type com.blog.DHT11sensor and give the component a name.

IMPORTANT: Remember to update the TWINMAKER_COMPONENT_NAME environment variable in your Lambda function with the component name you choose here.

You will see the component's properties (such as temperature and humidity) in the table. When done, choose Add component.


Figure 11: Adding the AWS IoT TwinMaker component to the Entity

Create AWS IoT TwinMaker Resource & Scene

Next, import the 3D model that will represent your device or sensor in the virtual environment. AWS IoT TwinMaker supports a variety of file types, including BIN, GLB, GLTF, PNG, PDF, JPG, JPEG, and MP4. In this case, a Raspberry Pi 4 model was used. You should be able to find free models on websites like CGTrader, Sketchfab, or TurboSquid.

With your workspace selected, go to Resource library, choose Add resources, and upload your file.


Figure 12: Uploading the 3D model

Finally, you will set up your Scene. With your workspace selected, go to Scenes and choose Create scene. Give it an ID (name) and choose Create scene. Once created, you will see a view containing three main sections (see screenshot below). On the left, a panel containing three tabs:

  • Hierarchy: your objects in the scene
  • Rules: how to change items in the scene depending on the data received (we’ll use it in this exercise)
  • Settings

In the center is the 3D world, where you can move around, pan, zoom, tilt the view, and so on. On the right is the Inspector, which shows details of whatever is selected in the scene.


Figure 13: The UI of the AWS IoT TwinMaker scene

You will start by creating the rules for the items in the scene. Choose the Rules tab in the left panel and review the rules that are already present. You will create two new rules, one for temperature data and another for humidity data. Enter temperatureIconRule as the RuleId and choose Add New Rule. Select the rule and choose Add new statement to define expressions that change the target icon from Info to Warning to Error, as shown below.

IMPORTANT: Make sure that the Expression you write uses the exact name of the property coming from the sensor and stored in the database (i.e., "temperature" and "humidity").


Figure 14: Rules that will make your tag change color or icon depending on the data received

When you are done with the temperature rule, repeat the same process to add a new rule for humidity.

Next, add the 3D model. In the center of the screen, choose the + icon and select Add 3D model, then pick from the resource library the 3D object you uploaded earlier.


Figure 15: Adding your 3D model to the scene

Once loaded, you can scale the model using the Transform section in the right panel. The object will most likely appear dark when first added to the scene. To fix that, adjust the lighting by choosing Settings and selecting an Environmental Preset. Another way to add a light is to choose the + icon and select Add light; you can then select the light and move it around with your mouse to light up your scene and the imported 3D model.


Figure 16: Make sure your model has proper lighting

Finally, you will add tags to surface the humidity and temperature data, so that the values received affect what is shown in the scene. Choose the + icon and select Add tag. In the Inspector, name it Temperature and choose a Default Icon. Select your entity as EntityId and your component as ComponentName. Select temperature as PropertyName and temperatureIconRule as RuleId. Repeat the same steps to create a Humidity tag with humidity as PropertyName and humidityIconRule as RuleId.

Note: Place the two tags close to the 3D model, but far enough apart that both remain visible in the scene.


Figure 17: Positioning tags in the scene

4. Create an Amazon Managed Grafana dashboard with AWS IoT TwinMaker plugin

You are finally ready to create a dashboard in Amazon Managed Grafana to visualize the digital twin and your data. From the console, select Amazon Managed Grafana and then your workspace. Access the workspace from the link provided on the page under Amazon Managed Grafana workspace URL. You will be asked to sign in with the credentials you set up when configuring AWS IAM Identity Center (successor to AWS SSO). Since your user was made an admin, you will be able to access the Amazon Managed Grafana settings pages.

First, you need to add the AWS IoT TwinMaker data source. To do so, go to Configuration, choose Add data source, search for TwinMaker, and select AWS IoT TwinMaker.


Figure 18: Configure AWS IoT TwinMaker as datasource for your dashboard

Then, make sure that all the Connection Details are correct in the data source Settings. This includes the authentication provider, the AWS Region, the ARN of the dashboard role (TwinMakerDashboardRole), and the AWS IoT TwinMaker workspace.


Figure 19: Connection details

Choose Save & Test to verify that the connection between Grafana and the AWS IoT TwinMaker workspace is set up correctly.

Then, move on to creating the dashboard. From the left sidebar, choose Create → Dashboard. You'll start with an empty panel: choose Add a new panel.

On the right, select the visualization type: in the search bar, type TwinMaker and select AWS IoT TwinMaker Scene Viewer. Using the controls on the right side of the screen, give the panel a name and select your AWS IoT TwinMaker workspace and scene. Your 3D model should appear in the preview.


Figure 20: Adding AWS IoT TwinMaker Scene Viewer to the dashboard

Now define the connection between what is shown in the dashboard and your data. To do so, create two queries, one for the temperature data and the other for the humidity. These queries use the AWS IoT TwinMaker component you created, which in turn uses the Lambda function to read from the Timestream database.

In the query section, make sure that AWS IoT TwinMaker is selected as the Data source and define a new query of type Get Property Value History by Entity. Select your entity (TempHumiditySensor) and component (DHTComponent), then choose the temperature property. Repeat with a second query of the same type, entity, and component, this time selecting the humidity property. When done, save your panel by choosing Apply, then Save.


Figure 21: The query needed to read data with the component

Aside from the AWS IoT TwinMaker panel, you can also create other panels to present your data in various visualization formats, for example a Gauge or Time series for your temperature and humidity data. You will need to configure the same query mechanism for each so it can retrieve data. The small red corner in the upper left of each panel reports issues with the component reading data; in this case, it simply warns that no data is arriving yet, because you haven't started the Python script on your Raspberry Pi to send data to the cloud.
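The panels stay empty until the Part 1 script on the Raspberry Pi starts publishing. Whatever your exact script looks like, the measure names it produces must match the property names configured above. A payload shaped like the following would work (the timestamp field is illustrative; temperature and humidity are the names that must match exactly):

```python
import json
import time

# Hypothetical MQTT payload shape: 'temperature' and 'humidity' must match the
# measure names stored in Timestream and the TwinMaker property names
payload = {
    "temperature": 23,
    "humidity": 45,
    "timestamp": int(time.time() * 1000),  # illustrative extra field
}
message = json.dumps(payload)
print(message)
```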


Figure 22: Adding panels to the dashboard

5. The final result

If you now start the Python script again on your Raspberry Pi device, you should see temperature and humidity data populating your dashboard's panels. Since you defined rules in your AWS IoT TwinMaker workspace, the tags associated with the entity shown in the dashboard (the two blue dots) will change icon (info, warning, error), or color if you define a color-based rule, whenever the temperature or humidity received goes above or below the thresholds defined in your rules.


Figure 23: The final result. Temperature tag is showing a warning icon as the threshold defined in the rule was 23°

Cleaning up

If you followed along with this solution, complete the following steps to avoid incurring unwanted charges to your AWS account.

AWS IoT Core

  • In Manage → All devices, delete the Thing and Thing type.
  • In Manage → Security, remove the Policy and Certificate.
  • In Manage → Message Routing, delete the Rule.

Amazon Timestream

  • Delete the table and the database.

Amazon Managed Grafana

  • Delete the Amazon Managed Grafana workspace.

AWS IAM

  • Delete the roles created along the way.

AWS IoT TwinMaker

  • Delete the AWS IoT TwinMaker workspace.

Conclusion

In this blog post series, you learned how to set up a simple end-to-end solution that monitors temperature and humidity data through a digital twin, using a Raspberry Pi device and a DHT sensor. You first connected the Raspberry Pi to AWS IoT Core using MQTT, then forwarded messages from the topic stream with AWS IoT rules, putting records into an Amazon Timestream database. You also used AWS IoT TwinMaker to create a digital twin of the device/sensor and Amazon Managed Grafana to build a real-time interactive dashboard.

You can create more complex scenarios, involving a multitude of devices and sensors and recreating your real environment. For a more complex use case, check out the AWS IoT TwinMaker Cookie Factory sample project. Also, visit The Internet of Things on AWS – Official Blog to learn more about AWS IoT TwinMaker or see what our customers built with it.


About the author

Angelo Postiglione is a Senior Solutions Architect at AWS. He's currently based in Copenhagen, where he helps customers adopt cloud technologies to build scalable and secure solutions using AWS. In his spare time, he likes to discover new places in the world, take long walks in nature, and play guitar and drums.