The Internet of Things on AWS – Official Blog

How to integrate NVIDIA DeepStream on Jetson Modules with AWS IoT Core and AWS IoT Greengrass

AWS continually evolves our edge computing offerings to provide customers with the technology they need to extend AWS services to edge devices, such as consumer products or manufacturing equipment, and enable them to act intelligently. This helps customers avoid unnecessary cost and latency, and empowers them to manage edge devices securely and efficiently.

You can use AWS IoT Greengrass to extend a wide range of AWS cloud technologies to your edge devices so they can act locally on the data they generate, while still using the cloud for real-time data analytics, data storage and visualization, and training and fine-tuning machine learning models. In addition to providing technology solutions from the edge to the cloud, AWS works with a variety of device providers across the globe to provide customers with the right hardware to choose from for their particular use case.

NVIDIA DeepStream SDK is an accelerated framework for building managed intelligent video analytics apps and services. The NVIDIA Jetson product family enables customers to extend server-class compute performance to devices operating at the edge. By using DeepStream in combination with TensorRT and CUDA on the Jetson platform, customers can build and deploy high-throughput, low-latency solutions.

In this post, we will demonstrate how you can integrate NVIDIA DeepStream on Jetson Modules with AWS IoT Services, so that you can start building innovative solutions with the AWS technologies and infrastructures to best meet your unique business requirements.

Solution Overview

The objective of this post is to provide an overview on how to enable NVIDIA DeepStream Applications to publish MQTT messages to AWS IoT Core and AWS IoT Greengrass. The following diagram presents the architecture of the solution demonstrated in this post.

The following sections walk through the procedures to install and set up the DeepStream SDK’s message broker API to publish MQTT messages to AWS IoT Core. (If you want to use your Jetson module as an AWS IoT Greengrass aware device, see the final section, “Compatibility with AWS IoT Greengrass,” for more details.)

  • Procedure 1: Download AWS DeepStream adaptor
  • [Optional] Procedure 2: Manually build the shared library
  • Procedure 3: Provision DeepStream App with AWS IoT credentials
  • Procedure 4: Transfer certificates to Jetson modules
  • Procedure 5: Run DeepStream App

After the solution walkthrough, we explain how to process IoT messages with AWS IoT Rules, and how to connect the AWS DeepStream adaptor to AWS IoT Greengrass.

Pre-requisites

  • AWS account admin console access
  • A Jetson module with DeepStream SDK installed and internet access
  • GStreamer installation as described in the NVIDIA documentation

For the convenience of this solution walkthrough, we create an environment variable that points to the location where your DeepStream SDK is installed. Replace <DeepStream SDK PATH> with the path of the DeepStream SDK on your Jetson module:

$ export DEEPSTREAM_SDK_PATH=<DeepStream SDK PATH>

To verify your DeepStream installation, navigate to the ${DEEPSTREAM_SDK_PATH}/sources/apps/sample_apps/deepstream-app directory on your Jetson module, and enter this command in your terminal to run the reference application:

$ deepstream-app -c <path_to_config_file>

Here, <path_to_config_file> is the path to one of the reference application’s configuration files. We recommend using ${DEEPSTREAM_SDK_PATH}/samples/configs/deepstream-app/source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt.
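
For example, with the environment variable defined earlier, the full command would look like the following (verify the exact file name against the samples included with your DeepStream SDK version):

$ deepstream-app -c ${DEEPSTREAM_SDK_PATH}/samples/configs/deepstream-app/source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt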

If the sample app runs successfully, you can proceed to the deployment section of this article. If you have trouble running the sample app, refer to the DeepStream SDK Development Guide for details on how to troubleshoot this step.

Solution Deployment

Procedure 1: Download AWS DeepStream adaptor

To download the AWS DeepStream adaptor

  • On your Jetson module, navigate to the Downloads folder.
  • Download or clone the AWS-managed GitHub repo.
  • Copy the aws_protocol_adaptor sub-folder to ${DEEPSTREAM_SDK_PATH}/sources/libs.

$ cd ~/Downloads
$ git clone git@github.com:awslabs/aws-iot-core-integration-with-nvidia-deepstream.git
$ cd aws-iot-core-integration-with-nvidia-deepstream
$ cp -r aws_protocol_adaptor ${DEEPSTREAM_SDK_PATH}/sources/libs

[Optional] Procedure 2: Manually build the shared library

The shared library (.so file) is pre-compiled and committed to the GitHub repo you cloned in Procedure 1, in the aws_protocol_adaptor/device_client directory. If you want to customize the shared library, you can follow Procedure 2 to recompile it with custom settings, such as the buffer sizes for incoming or outgoing MQTT messages, or the TLS connection timeout values.

To manually build the shared library

  • First, create an empty directory for the AWS IoT Device SDK within the aws_protocol_adaptor directory we copied in Procedure 1.
  • Next, clone version 3 of the AWS IoT Device SDK for Embedded C into the empty directory we just created.

$ mkdir ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/aws-iot-sdk
$ cd ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/aws-iot-sdk
$ git clone https://github.com/aws/aws-iot-device-sdk-embedded-C.git .

  • This AWS IoT Device SDK has an external dependency on Mbed TLS, so navigate to the aws-iot-sdk/external_libs/mbedTLS folder and clone the Mbed TLS repo there:

$ cd ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/aws-iot-sdk/external_libs/mbedTLS
$ git clone https://github.com/ARMmbed/mbedtls.git

  • Navigate to the device_client folder, and compile the shared library:

$ cd ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client
$ make clean
$ make

  • If you inspect the current folder, you should see that the libnvds_aws_proto.so file has just been updated.
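
You can confirm this by checking the file’s modification time, for example:

$ ls -l ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client/libnvds_aws_proto.so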

Procedure 3: Provision DeepStream App with AWS IoT Core credentials

To provision DeepStream App with AWS IoT Core credentials

  • Navigate to the AWS Management Console, and select the AWS IoT Core service.
  • On the left-side menu, choose Secure → Policies.
  • In the upper-right corner, select Create.
  • A window will appear to help you create a policy.
    • WARNING: For this demonstration, we create a policy that allows a thing to perform any action on every resource in AWS IoT Core. In production, as a best practice, you should specify the resources and allowed actions to ensure you grant least-privilege permissions (a hedged example policy appears at the end of this procedure).

  • Give the policy a name (this demo uses ds_app_policy), fill in the required fields, and select Create.
  • On the left side of the AWS IoT console, choose Manage → Things. In the upper-right corner, select Create to start the process of creating a thing in AWS IoT.

  • Select “Create a single thing” in the next dialog box.

  • In the window that appears, enter ds_app as the name for the purposes of this demo. Leave the rest unchanged and select Next.
  • Select Create certificate.

  • On the page that appears, use each of the download links to save the certificates and keys just generated for this thing.
  • For the root certificate, a link redirects you to a root certificate download page; download Amazon Root CA 1 from that page.
  • Select the Activate button on this page to activate the set of certificates that you just downloaded.

  • Finally, select Attach Policy, and choose the policy named ds_app_policy you just created.
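
As mentioned in the warning above, the wide-open demo policy should not be used in production. The following is a minimal sketch of what a more restrictive policy could look like for this walkthrough; the Region, account ID, and topic name are placeholders (the topic assumes the test topic used later in this post), so adapt them to your environment.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "iot:Connect",
      "Resource": "arn:aws:iot:us-east-1:123456789012:client/ds_app"
    },
    {
      "Effect": "Allow",
      "Action": "iot:Publish",
      "Resource": "arn:aws:iot:us-east-1:123456789012:topic/test"
    }
  ]
}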

Procedure 4: Transfer certificates to a Jetson module

To transfer certificates to a Jetson module

  • Navigate to the path of your downloaded certs.
  • You should see four files in the following naming format.

XXX-certificate.pem.crt
XXX-private.pem.key
XXX-public.pem.key
AmazonRootCA1.pem

  • You can rename them as follows.

certificatePem.cert.pem
privateKey.private.key
publicKey.public.key
root.ca.pem

  • Now, create a certs folder on your Jetson module to hold these files. We use the following directory for this demonstration, but you can use a custom directory as long as you specify it in your cfg_aws.txt file:

$ mkdir ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client/certs

  • Next, transfer the downloaded certificates and keys to your Jetson module. In this case, we are on the same network as our Jetson module, so we can use the scp command to transfer the certificates:

$ scp certificatePem.cert.pem privateKey.private.key publicKey.public.key root.ca.pem <YOUR_JETSON_IP>:${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client/certs/

    • Alternatively, you can upload the files to secure storage in the cloud, download them onto your Jetson module, and then move them into the certs directory:

$ mv <4 CERTS FILES> ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client/certs

  • On your Jetson module, navigate to

$ cd ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client/ 

  • Edit cfg_aws.txt:
    • Replace <YOUR IOT HOST ADDRESS> with your AWS IoT endpoint URL, which is located in the AWS IoT console → Settings, in the box labeled Endpoint.

  • Replace <DEEPSTREAM SDK PATH> with the absolute path of your DeepStream SDK installation.
    • WARNING: Using a relative path rather than an absolute path causes a certificate parsing failure. If this happens, return to this file, correct the certificate paths, and restart your DeepStream application.
  • Replace the values of both ThingName and ClientID with ds_app (to match the name of the thing we created above). If you prefer to make these edits from the command line, see the sketch that follows this list.
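
A minimal command-line sketch of the cfg_aws.txt edits is shown below. It assumes the placeholder strings appear verbatim in the file, and the endpoint shown is only an example; substitute the endpoint from your own AWS IoT console → Settings page.

$ cd ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client
$ sed -i "s|<YOUR IOT HOST ADDRESS>|a1b2c3d4e5f6g7-ats.iot.us-east-1.amazonaws.com|" cfg_aws.txt
$ sed -i "s|<DEEPSTREAM SDK PATH>|${DEEPSTREAM_SDK_PATH}|g" cfg_aws.txt

After running these commands, open cfg_aws.txt once more to confirm that no placeholders remain and that both ThingName and ClientID are set to ds_app.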

Procedure 5: Run DeepStream App

Finally, we use the test apps developed by NVIDIA to verify our adaptor setup, running both test4 and test5 from the NVIDIA DeepStream SDK sample app folder (${DEEPSTREAM_SDK_PATH}/sources/apps/sample_apps). The deepstream-test4 app demonstrates attaching custom objects to buffers as NVDS_EVENT_MSG_META user metadata to generate a custom payload that is published to AWS IoT Core. The deepstream-test5 app demonstrates how to use the “nvmsgconv” and “nvmsgbroker” plugins in the pipeline, create NVDS_META_EVENT_MSG metadata, and upload it to AWS IoT Core. Both apps help verify the installation and functionality of this message broker.

To use Test App 4

  • First, navigate to test4 in your DeepStream SDK at ${DEEPSTREAM_SDK_PATH}/sources/apps/sample_apps/deepstream-test4 on your Jetson module, and build the test4 app using the make command:

$ cd ${DEEPSTREAM_SDK_PATH}/sources/apps/sample_apps/deepstream-test4
$ make

  • Then, use the following command to run test4:

$ ./deepstream-test4-app -i ../../../../samples/streams/sample_720p.h264 -p ../../../libs/aws_protocol_adaptor/device_client/libnvds_aws_proto.so --conn-str=hello -c ../../../libs/aws_protocol_adaptor/device_client/cfg_aws.txt -t test --no-display

  • Now, navigate to the AWS IoT console, and on the menu on the left, select Test.
  • Enter test (or # to receive messages on all topics) in the subscription topic box, and select Subscribe to topic. You should see MQTT messages start to show up in the console after the app runs successfully.

To use Test App 5

First, navigate to test5 in your DeepStream SDK at ${DEEPSTREAM_SDK_PATH}/sources/apps/sample_apps/deepstream-test5 on your Jetson module, and build the test5 app using the make command:

$ cd ${DEEPSTREAM_SDK_PATH}/sources/apps/sample_apps/deepstream-test5
$ make

  • To run deepstream-test5, first copy the sample configuration file and open the copy in an editor of your choice:

$ cp configs/test5_config_file_src_infer.txt configs/test5_config_file_src_infer_aws.txt
$ vim configs/test5_config_file_src_infer_aws.txt

  • Then, modify msg-broker-proto-lib under your message broker sink (sink1), to point to:

${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client/libnvds_aws_proto.so

  • Also modify msg-broker-config under the same sink to point to:

${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client/cfg_aws.txt

  • Next, modify the topic to a topic name of your choice, and modify the first sink (sink0) to a fake sink. (A hedged sketch of these config entries appears at the end of this procedure.)
  • Then, you can use the following command to run test5:

$ ./deepstream-test5-app -c configs/test5_config_file_src_infer_aws.txt 

  • Now, navigate to the AWS IoT console, and on the menu on the left, select Test.
  • Enter test (or # to receive messages on all topics) in the subscription topic box.
  • Choose Subscribe to topic. You should see MQTT messages start to show up in the console after the app runs successfully.
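
For reference, a minimal sketch of the sink entries described above is shown below. The key names follow the standard DeepStream test5 sample configuration, where sink type 1 is a fake sink and type 6 is the message converter and broker sink; double-check them against the sample file that ships with your SDK version. The paths are written with the environment variable for brevity, but in the actual file use the expanded absolute path, and leave other keys from the sample (such as msg-conv-config) as they are.

[sink0]
enable=1
# Type 1 renders nothing (fake sink)
type=1

[sink1]
enable=1
# Type 6 is the message converter and broker sink
type=6
msg-broker-proto-lib=${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client/libnvds_aws_proto.so
msg-broker-config=${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client/cfg_aws.txt
topic=test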

This concludes the procedure for installing and setting up the DeepStream SDK’s message broker API to publish MQTT messages to AWS IoT Core or AWS IoT Greengrass. Next, we will explain how you can process IoT messages with AWS IoT Rules, and how to connect the AWS DeepStream adaptor to AWS IoT Greengrass.

Processing IoT messages with AWS IoT Rules

Once you see messages coming into AWS IoT Core, there are many options for further processing or storing them in the AWS Cloud. One simple example is to use AWS IoT Rules to push these messages to a custom AWS Lambda function, which parses the messages and puts them in Amazon DynamoDB. You may find the following documents helpful in setting up this IoT rule to storage pipeline:
  • Creating a Rule with an AWS Lambda Action
  • Reading and Writing A Single Item in DynamoDB
  • Implementing a Serverless AWS IoT Backend with AWS Lambda and Amazon DynamoDB
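
As a simple illustration of that first option, the AWS CLI sketch below creates a rule that forwards every message published on the test topic to a Lambda function. The rule name and function ARN are placeholders for this example; the Lambda function itself, and the resource-based permission that allows AWS IoT to invoke it, would need to exist in your account.

$ cat > ds_rule.json << 'EOF'
{
  "sql": "SELECT * FROM 'test'",
  "awsIotSqlVersion": "2016-03-23",
  "actions": [
    {
      "lambda": {
        "functionArn": "arn:aws:lambda:us-east-1:123456789012:function:ParseDeepStreamEvents"
      }
    }
  ]
}
EOF
$ aws iot create-topic-rule --rule-name DeepStreamToLambda --topic-rule-payload file://ds_rule.json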

The following documents may further assist you in building a production-ready data pipeline on AWS:
  • AWS IoT Analytics Console Quick Start Guide
  • Integrating IoT data with your data lake with new AWS IoT Analytics features
  • Real-Time IoT Device Monitoring with Kinesis Data Analytics
  • Writing to Kinesis Data Firehose Using AWS IoT

Compatibility with AWS IoT Greengrass

This AWS DeepStream adaptor also supports connections to AWS IoT Greengrass. To use it, change <YOUR IOT HOST ADDRESS> in cfg_aws.txt to the endpoint or IP address of your AWS IoT Greengrass core.

There are several ways to find the Greengrass core’s endpoint or IP address. If you know the IP address of your Greengrass core device (or can find it by running ifconfig on that device), you can use it directly as <YOUR IOT HOST ADDRESS>. AWS IoT Greengrass also provides a Discovery API, which enables devices to retrieve the information required to connect to the AWS IoT Greengrass core that is in the same Greengrass group as the device.
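
For example, once the ds_app certificate created earlier has been associated with a device in a Greengrass group, you could call the Discovery API with that certificate to look up the core’s connectivity information. This is a sketch under those assumptions, using the us-east-1 Region and the certs directory from Procedure 4; adjust both for your setup.

$ cd ${DEEPSTREAM_SDK_PATH}/sources/libs/aws_protocol_adaptor/device_client
$ curl --cacert certs/root.ca.pem \
    --cert certs/certificatePem.cert.pem --key certs/privateKey.private.key \
    https://greengrass-ats.iot.us-east-1.amazonaws.com:8443/greengrass/discover/thing/ds_app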

For further information on enabling your device to connect to AWS IoT Greengrass, follow module 4 in the AWS IoT Greengrass Developer Guide.

Conclusion

In this post, we explained how to use the DeepStream SDK’s message broker API to publish MQTT messages to AWS IoT Core or AWS IoT Greengrass. We walked through installing the libraries, setting up the configuration files, and running the test4 and test5 sample apps in the DeepStream SDK with the libraries installed. Finally, we discussed options for further processing and storing the published MQTT messages in the AWS Cloud. We’d love to hear how you’re using this integration. Let us know in the comments below.

To learn more about how you can scale up or automate the deployment of NVIDIA DeepStream using AWS IoT services, please refer to “AWS IoT Greengrass Deploying NVIDIA DeepStream on Edge”.

Acknowledgement(s)

The authors of this blog appreciate the NVIDIA Jetson team’s collaboration, testing, and debugging support throughout the development of the method presented in this post.