The Internet of Things on AWS – Official Blog

How to use your own data source in AWS IoT TwinMaker

Introduction

AWS IoT TwinMaker makes it easier for developers to create digital twins of real-world systems such as buildings and factories with the ability to use existing data from multiple sources.

AWS IoT TwinMaker uses a connector-based architecture so that you can connect data from your own data source to AWS IoT TwinMaker without needing to re-ingest or move the data to another location. AWS IoT TwinMaker provides built-in data connectors for AWS services such as AWS IoT SiteWise and Amazon Kinesis Video Streams. You can also create custom data connectors to use with other AWS or third-party data sources, such as Amazon Timestream, Amazon DynamoDB, Snowflake, and Siemens Mindsphere.

In this blog, you will learn how to use your own data source in AWS IoT TwinMaker using the AWS IoT TwinMaker data connector interface.

Overview

The connection between a data source and AWS IoT TwinMaker is defined by components. A component accesses an external data source through a Lambda connector, which is a Lambda function that you specify in the component definition.

This post walks you through the steps to create a data connector for Amazon DynamoDB, using a Schema initializer connector to fetch the properties from the underlying data source and a DataReader connector to get the time-series values of these properties. Once the data connector is created, you will learn how to create a component for it and attach the component to an entity.

Amazon DynamoDB is used as the data source in this post, but the concepts described are applicable to any other data source.

Prerequisites

To set up and complete the steps in this blog, you need the following:

  • An AWS account. If you don’t have one, see Set up an AWS account.
  • An AWS IAM Identity Center (successor to AWS Single Sign-On) user with permissions to create the resources described in this blog.
  • Familiarity with the key concepts of AWS IoT TwinMaker. See the What is AWS IoT TwinMaker? section of the documentation.

Walkthrough

In this walkthrough, you will perform six steps to connect your Amazon DynamoDB data source to AWS IoT TwinMaker:

  1. Create a DynamoDB table. This table is only for the purpose of this post. You can easily adapt the instructions to use an existing database.
  2. Create a Lambda function for the Schema initializer connector.
  3. Create a Lambda function for the DataReader connector. You will need to give the function’s execution role permission to read from the table.
  4. Create an AWS IoT TwinMaker workspace. You will need to grant the workspace role permission to invoke both functions.
  5. Create an AWS IoT TwinMaker component type.
  6. Test the component. Before testing it, you will create an AWS IoT TwinMaker entity and attach the component to the entity.

Step 1: Create a DynamoDB table

For the purpose of this post, you will create a DynamoDB table named TwinMakerTable with thingName (type String) as the partition key and timestamp (type Number) as the sort key. See how to create a DynamoDB table for more information.

DynamoDB table creation
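If you prefer to create the table programmatically instead of using the console, here is a minimal sketch using the AWS SDK for JavaScript v2 (assuming your credentials and Region are already configured):

// Creates TwinMakerTable with thingName (String) as the partition key
// and timestamp (Number) as the sort key.
const aws = require('aws-sdk')
const dynamo = new aws.DynamoDB()

const params = {
    TableName: 'TwinMakerTable',
    AttributeDefinitions: [
        { AttributeName: 'thingName', AttributeType: 'S' },
        { AttributeName: 'timestamp', AttributeType: 'N' }
    ],
    KeySchema: [
        { AttributeName: 'thingName', KeyType: 'HASH' },   // partition key
        { AttributeName: 'timestamp', KeyType: 'RANGE' }   // sort key
    ],
    BillingMode: 'PAY_PER_REQUEST'
}

dynamo.createTable(params).promise()
    .then(() => console.log('Table created'))
    .catch(err => console.error(err))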

The table you created will store air quality measurements from sensors. To keep it simple for this post, you will create items in the table corresponding to measurements from a single sensor identified by its name (stored as the partition key thingName). In addition to the sensor name, each measurement has the following properties of type Number: timestamp (stored as the sort key timestamp, the Unix timestamp of the measurement in milliseconds), temperature, humidity, and co2.

Let’s create 5 items in the table, corresponding to 5 measurements from a sensor named airTwin. For the timestamp, you can get the current Unix timestamp in milliseconds from an online epoch converter and then derive 5 timestamps by subtracting 10000 per measurement. You can then enter random values for the temperature, humidity, and co2 properties. See Write data to a table using the console to learn more.

Item creation in DynamoDB
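You can also insert the sample measurements programmatically. Here is a minimal sketch, again using the AWS SDK for JavaScript v2; the random values are placeholders only:

// Inserts 5 sample measurements for the airTwin sensor,
// 10 seconds apart, with random temperature, humidity, and co2 values.
const aws = require('aws-sdk')
const dynamo = new aws.DynamoDB.DocumentClient()

async function seed() {
    const now = Date.now()
    for (let i = 0; i < 5; i++) {
        await dynamo.put({
            TableName: 'TwinMakerTable',
            Item: {
                thingName: 'airTwin',
                timestamp: now - i * 10000,          // Unix timestamp in milliseconds
                temperature: 20 + Math.random() * 5,
                humidity: 40 + Math.random() * 20,
                co2: 400 + Math.random() * 100
            }
        }).promise()
    }
    console.log('Sample items created')
}

seed().catch(console.error)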

Now that you have the table created with data, you will create two Lambda functions: the first for the Schema initializer connector and the second for the DataReader connector.

Step 2: Create a Schema initializer connector

The Schema initializer connector is a Lambda function used in the component type or entity lifecycle to fetch the component type or component properties from the underlying data source. You will create a Lambda function that will return the schema of the TwinMakerTable.

You create a Node.js Lambda function using the Lambda console.

  • Open the Functions page.
  • On the Lambda console, choose Create function.
  • Under Basic information, do the following:
    • For Function name, enter TwinMakerDynamoSchemaInit.
    • For Runtime, confirm that Node.js 16.x is selected.
  • Choose Create function.
  • Under Function code, in the inline code editor, copy/paste the following code and choose Deploy:
// Returns the schema of TwinMakerTable: each property is a
// time-series value of type DOUBLE.
exports.handler = async (event) => {
    const result = {
        properties: {
            temperature: {
                definition: {
                    dataType: { type: "DOUBLE" },
                    isTimeSeries: true
                }
            },
            humidity: {
                definition: {
                    dataType: { type: "DOUBLE" },
                    isTimeSeries: true
                }
            },
            co2: {
                definition: {
                    dataType: { type: "DOUBLE" },
                    isTimeSeries: true
                }
            }
        }
    }

    return result
}

This function returns the definition of each property of the table and specifies its type. In this case, all properties are of type “DOUBLE” and are time-series data. You can check the valid types in the documentation.

Note: here the properties are hard-coded in the function. You could instead design a function that automatically retrieves the properties and their types from an item, as sketched below.
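Here is a minimal sketch of that idea. It reads the most recent item for the airTwin sensor and exposes every numeric attribute (other than the table keys) as a time-series DOUBLE property; the type mapping is a simplifying assumption for this example:

// Builds the schema dynamically from the attributes of the latest item
// stored for the airTwin sensor, instead of hard-coding the properties.
const aws = require('aws-sdk')
const dynamo = new aws.DynamoDB.DocumentClient()

exports.handler = async (event) => {
    const { Items } = await dynamo.query({
        TableName: 'TwinMakerTable',
        KeyConditionExpression: 'thingName = :hashKey',
        ExpressionAttributeValues: { ':hashKey': 'airTwin' },
        ScanIndexForward: false,   // most recent item first
        Limit: 1
    }).promise()

    const properties = {}
    for (const [name, value] of Object.entries(Items[0] || {})) {
        // Skip the table keys; every remaining numeric attribute
        // is exposed as a time-series DOUBLE property.
        if (name === 'thingName' || name === 'timestamp') continue
        if (typeof value === 'number') {
            properties[name] = {
                definition: {
                    dataType: { type: 'DOUBLE' },
                    isTimeSeries: true
                }
            }
        }
    }
    return { properties }
}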

Now let’s create the DataReader connector.

Step 3: Create a DataReader connector

DataReader is a data plane connector that is used to get the time-series values of properties in a single component.

You create a Node.js Lambda function using the Lambda console.

  • Open the Functions page.
  • On the Lambda console, choose Create function.
  • Under Basic information, do the following:
    • For Function name, enter TwinMakerDynamoDataReader.
    • For Runtime, confirm that Node.js 16.x is selected.
  • Choose Create function.
  • Under Function code, in the inline code editor, copy/paste the following code and choose Deploy:
// Queries TwinMakerTable and returns the time-series values of the
// selected properties in the format expected by AWS IoT TwinMaker.
const TABLE = 'TwinMakerTable'
const aws = require('aws-sdk')
const dynamo = new aws.DynamoDB.DocumentClient()

exports.handler = async (event) => {
    try {
        // workspaceId is also part of the request but is not needed here.
        const { entityId, componentName, selectedProperties, startTime, endTime } = event

        // Query the table for the selected properties of this entity
        // within the requested time range.
        const { Items } = await dynamo.query({
            TableName: TABLE,
            ProjectionExpression: `${selectedProperties}, #tmsp`,
            KeyConditionExpression: 'thingName = :hashKey AND #tmsp BETWEEN :startTime AND :endTime',
            ExpressionAttributeNames: {
                '#tmsp': 'timestamp'
            },
            ExpressionAttributeValues: {
                ':hashKey': entityId,
                ':startTime': (new Date(startTime)).getTime(),
                ':endTime': (new Date(endTime)).getTime()
            }
        }).promise()

        // Group the values per property, as expected in the response.
        const res = {}
        Items.forEach(item => {
            selectedProperties.forEach(prop => {
                if (!res[prop]) {
                    res[prop] = {
                        entityPropertyReference: {
                            propertyName: prop,
                            componentName,
                            entityId
                        },
                        values: []
                    }
                }
                res[prop].values.push({
                    time: (new Date(item['timestamp'])).toISOString(),
                    value: { doubleValue: item[prop] }
                })
            })
        })

        const results = { propertyValues: Object.values(res) }
        console.log(results)
        return results
    } catch (e) {
        console.log(e)
        throw e
    }
}

The AWS IoT TwinMaker component will use this DataReader connector to fetch the data from the DynamoDB table. The request sent to the connector includes two properties, startTime and endTime (in ISO-8601 timestamp format), that the connector uses to fetch only the data in this time range. You can check the request and response interfaces in the Data connectors section of the documentation.
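For illustration, the event received by this Lambda connector looks roughly like the following (the values below are placeholders consistent with this walkthrough; see the documentation for the complete request and response interfaces):

{
  "workspaceId": "AirWorkspace",
  "entityId": "airTwin",
  "componentName": "dynamoAirComponent",
  "selectedProperties": ["temperature", "co2"],
  "startTime": "2022-11-01T00:00:00Z",
  "endTime": "2022-11-02T00:00:00Z"
}

The function then returns a propertyValues array in the shape built in the code above: one entityPropertyReference and a list of time/value pairs per selected property.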

Before moving to the next step, you need to grant the function access to the table. See Allows a Lambda function to access an Amazon DynamoDB table to learn more.

Adding permissions to the Lambda function to access the table
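For example, a policy similar to the following, attached to the execution role of the TwinMakerDynamoDataReader function, is sufficient for this walkthrough (replace the Region and account ID placeholders with your own values):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "dynamodb:Query",
            "Resource": "arn:aws:dynamodb:{{AWS_REGION}}:{{ACCOUNT_ID}}:table/TwinMakerTable"
        }
    ]
}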

Now you can move to the step of creating a workspace in AWS IoT TwinMaker.

Step 4: Create a Workspace in AWS IoT TwinMaker

On the AWS IoT TwinMaker console, create a workspace named AirWorkspace. You can follow the instructions in the Create a workspace section of the AWS IoT TwinMaker documentation.

Once the workspace is created, you should have an Amazon Simple Storage Service (Amazon S3) bucket created. AWS IoT TwinMaker will use this bucket to store information and resources related to the workspace.

You should also have an AWS Identity and Access Management (IAM) role created. This role allows the workspace to access resources in other services on your behalf.

Before creating the component type, you must grant the workspace role permissions to invoke both Lambda functions created in the previous steps. See Permissions for a connector to an external data source for an example of giving the service role permission to use a Lambda function.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "lambda:InvokeFunction",
            "Resource": [
                "arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoDataReader",
                "arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoSchemaInit"
            ]
        }
    ]
}

You can now create your component.

Step 5: Create an AWS IoT TwinMaker component

Select the workspace you have created. In the workspace, choose Component types and then choose Create component type.

TwinMaker component types

Copy the following JSON document into the Request section and replace the ARNs of the DataReader and Schema initializer functions with those of the functions you created earlier:

{
  "componentTypeId": "com.dynamodb.airQuality",
  "description": "Connector for DynamoDB – Use case Air Quality",
  "propertyDefinitions": {},
  "functions": {
    "dataReader": {
      "implementedBy": {
        "lambda": {
          "arn": "arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoDataReader"
        }
      }
    },
    "schemaInitializer": {
      "implementedBy": {
        "lambda": {
          "arn": "arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoSchemaInit"
        }
      }
    }
  }
}

Choose Create component type. Now that the component type is created, you can create an entity to test the component.
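Alternatively, if you prefer to script this step, here is a sketch that sends the same request with the AWS SDK for JavaScript v2 (assuming your SDK version includes the IoTTwinMaker client and that your credentials and Region are configured; replace the ARN placeholders as before):

// Creates the component type in the AirWorkspace workspace.
const aws = require('aws-sdk')
const twinmaker = new aws.IoTTwinMaker()

const params = {
    workspaceId: 'AirWorkspace',
    componentTypeId: 'com.dynamodb.airQuality',
    description: 'Connector for DynamoDB – Use case Air Quality',
    functions: {
        dataReader: {
            implementedBy: {
                lambda: { arn: 'arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoDataReader' }
            }
        },
        schemaInitializer: {
            implementedBy: {
                lambda: { arn: 'arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoSchemaInit' }
            }
        }
    }
}

twinmaker.createComponentType(params).promise()
    .then(res => console.log(res))
    .catch(err => console.error(err))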

Step 6: Create an entity and test the component

You will now create an entity and attach the component you created to it.

  • On the Workspaces page, choose your workspace, and then in the left pane choose Entities.
  • On the Entities page, choose Create, and then choose Create entity.
  • In the Create an entity window, enter airTwin for both the entity name and the entity ID.
  • Choose Create entity.

Create entity in TwinMaker
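The entity can also be created programmatically; here is a sketch using the AWS SDK for JavaScript v2, under the same assumptions as for the component type above:

// Creates the airTwin entity in the AirWorkspace workspace.
const aws = require('aws-sdk')
const twinmaker = new aws.IoTTwinMaker()

twinmaker.createEntity({
    workspaceId: 'AirWorkspace',
    entityId: 'airTwin',
    entityName: 'airTwin'
}).promise()
    .then(res => console.log(res))
    .catch(err => console.error(err))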

  • On the Entities page, choose the entity you just created, and then choose Add component.
  • Enter a name for the component. You can call it dynamoAirComponent.
  • In Type, select the component type com.dynamodb.airQuality that you created previously.
  • Choose Add component.

Add a component in TwinMaker

The component is attached to the entity with the ID airTwin. Now the only step that remains is to test the component. When you test the component (or call the GetPropertyValueHistory API action), the component sends the DataReader Lambda connector a request that includes the entity ID. The Lambda connector uses this ID to query the measurements of the sensor whose name matches the ID, in this case the airTwin sensor.

  • On the Entities page, choose the entity airTwin, and then select the component com.dynamodb.airQuality.
  • Choose Actions, and then View component details.
  • On the Test tab, select the properties you want to retrieve and a time range. Make sure that the selected time range includes the timestamps of the measurements.
  • Finally, choose Run test to test your component.

You should see the measurements of your sensors in the Time-series result section.

Time-series result in TwinMaker

You can now call the GetPropertyValueHistory API action to retrieve the measurements from your sensors stored in your DynamoDB table.
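For example, here is a sketch of that call with the AWS SDK for JavaScript v2 (assuming a recent SDK version that supports the IoTTwinMaker client and the startTime/endTime string parameters; adjust the time range so it covers your measurements):

// Retrieves the temperature history of the airTwin entity
// through the DynamoDB data connector.
const aws = require('aws-sdk')
const twinmaker = new aws.IoTTwinMaker()

twinmaker.getPropertyValueHistory({
    workspaceId: 'AirWorkspace',
    entityId: 'airTwin',
    componentName: 'dynamoAirComponent',
    selectedProperties: ['temperature'],
    startTime: '2022-11-01T00:00:00Z',   // ISO-8601, adjust to your data
    endTime: '2022-11-02T00:00:00Z'
}).promise()
    .then(res => console.log(JSON.stringify(res.propertyValues, null, 2)))
    .catch(err => console.error(err))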

Cleaning up

To avoid incurring future charges, delete the resources created during this walkthrough.

Conclusion

AWS IoT TwinMaker provides a unified data access API to read from and write to your digital twin’s source data. You can use your existing data sources without the need to move your data.

In this blog, you learned how to connect an Amazon DynamoDB table to AWS IoT TwinMaker. The concepts described are applicable to your other data sources. You can also combine multiple data sources to enrich your digital twin applications.

If you want to see an example of a solution using AWS IoT TwinMaker and Amazon S3 as the data source, watch the video Build a Digital Twin using the Smart Territory Framework and AWS IoT TwinMaker on YouTube. You can also visit the related GitHub repository to check the code.

About the Author

Ali is a Technology Evangelist for IoT and Smart Cities at Amazon Web Services. With over 12 years of experience in IoT and Smart Cities, Ali brings his technical expertise to enable and help AWS partners and customers to accelerate their IoT and Smart Cities projects. Ali also holds an executive MBA, giving him the ability to zoom out and help customers and partners at a strategic level.