The Internet of Things on AWS – Official Blog

Using AWS IoT Greengrass Version 2 with Amazon SageMaker Neo and NVIDIA DeepStream Applications

AWS IoT Greengrass Version 2 was released for general availability during re:Invent 2020. AWS IoT Greengrass is an Internet of Things (IoT) open source edge runtime and cloud service that helps you build, deploy, and manage device software. Customers use AWS IoT Greengrass for their IoT applications on millions of devices in homes, factories, vehicles, and businesses. AWS IoT Greengrass V2 now offers an open source edge runtime, improved modularity, new local development tools, and improved fleet deployment features. This new version provides a component framework that manages dependencies and lets you reduce the size of deployments, since you only deploy the components your application requires. A component is defined with a YAML or JSON formatted recipe. Additionally, applications no longer have to be based on AWS Lambda; you can package a command-line application directly in your recipe in whatever language you choose. Of course, AWS IoT Greengrass V2 also provides components that enable you to run Lambda applications. You can access the AWS IoT Greengrass V2 open source project on GitHub and learn more about what’s new in AWS IoT Greengrass Version 2 in the AWS IoT Greengrass Developer Guide.
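As an illustration, a minimal JSON recipe for a command-line component might look like the following sketch; the component name and the command it runs are hypothetical placeholders, not part of the samples used in this post:

{
  "RecipeFormatVersion": "2020-01-25",
  "ComponentName": "com.example.HelloWorld",
  "ComponentVersion": "1.0.0",
  "ComponentDescription": "Minimal example component that runs a shell command.",
  "ComponentPublisher": "Example",
  "Manifests": [
    {
      "Platform": { "os": "linux" },
      "Lifecycle": {
        "Run": "echo 'Hello from AWS IoT Greengrass V2'"
      }
    }
  ]
}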

This post will walk you through the following two use cases of the integration between AWS IoT Greengrass V2 and NVIDIA Jetson modules:

  1. How to deploy and use GPU-accelerated image classification on NVIDIA Jetson modules (Nano, TX2, Xavier NX, and AGX Xavier supported) with Amazon SageMaker Neo, and
  2. How to deploy a Video Analytics Pipeline with NVIDIA DeepStream on Jetson modules.

This image displays the architecture of the integration between AWS IoT Greengrass V2 and NVIDIA Jetson modules.

The two use case walkthrough sections are not dependent on each other. If you want to only deploy the NVIDIA DeepStream application sample, you can skip Section 1 and go directly to Section 2.

Pre-requisites

Section 0: AWS IoT Greengrass Version 2 Installation

This post is not an introduction to AWS IoT Greengrass V2. For detailed steps for installing and running AWS IoT Greengrass V2 on edge devices, refer to the getting started section of the developer guide.

If you just want to install AWS IoT Greengrass Version 2 on a Jetson module and get started quickly, you can use the installation script we prepared on GitHub and run the bash script on your Jetson module. Once it installs successfully, you can move on to the next sections.
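If you would rather see what the installation involves, the quick installation flow from the AWS IoT Greengrass V2 developer guide looks roughly like the following sketch. The Region, thing name, and thing group name are placeholders to replace, and it assumes Java and AWS credentials are already set up on the Jetson module:

# Download and unpack the AWS IoT Greengrass Core v2 installer (nucleus)
curl -s https://d2s8p88vqu9w66.cloudfront.net/releases/greengrass-nucleus-latest.zip -o greengrass-nucleus-latest.zip
unzip greengrass-nucleus-latest.zip -d GreengrassInstaller

# Install, provision, and register the core device (replace the placeholders)
sudo -E java -Droot="/greengrass/v2" -Dlog.store=FILE \
  -jar ./GreengrassInstaller/lib/Greengrass.jar \
  --aws-region us-east-1 \
  --thing-name MyJetsonGreengrassCore \
  --thing-group-name MyJetsonGreengrassGroup \
  --component-default-user ggc_user:ggc_group \
  --provision true \
  --setup-system-service true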

Section 1: Image classification with Amazon SageMaker Neo compiled models

In this section, we are going to walk you through how to run an Amazon SageMaker Neo-compiled and optimized image classification neural network model. This is common in use cases such as animal image classification.

This example will take a pre-made JPEG image of a dog converted to a NPY file, perform inference (classification) on it, and send the results as a message to AWS IoT Core via MQTT.


Figure 1: Picture of a dog

In the context of AWS IoT Greengrass V2, this section will deploy three AWS IoT Greengrass components on your Jetson module:

  1. variant.Jetson.DLR – installs the appropriate Amazon SageMaker Neo DLR on your device. Learn more about AWS IoT Greengrass V2 DLR Installer in the Developer Guide.
  2. variant.Jetson.ImageClassification.ModelStore – installs ResNet18 image classification models optimized for Jetson modules
  3. aws.greengrass.JetsonDLRImageClassification – contains the Python example that performs image classification and sends a message to AWS IoT Core using the MQTT protocol.

PLEASE NOTE: This example deployment installs some Python packages outside of a virtual environment. In particular, python-opencv is installed as a Debian package as part of JetPack 4.4, so the installation may run for an extended period of time. NumPy can also take a long time to install.

Check out the components for deployment

In this section, we will clone the sample repository from GitHub and prepare the components for deployment. You will need Git installed to proceed.

To prepare the samples for deployment:

1. From your Jetson module, check out the GitHub repository with the following command:

git clone https://github.com/aws-samples/aws-iot-greengrass-v2-deploy-nvidia-deepstream.git

2. In the GitHub repository, copy the recipes in jetson_inference/recipes into your local GreengrassCore recipes directory (i.e., ~/GreengrassCore/recipes). See the directory trees below, which show the source paths in GitHub versus what they should look like in your GreengrassCore home directory after you copy them.


Directory structure for deployment

 

3. Copy the directory contents of jetson_inference/artifacts to your GreengrassCore/artifacts directory so that the folder structure looks like the following (example copy commands are shown after the figure).

GitHub Source
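For reference, if you cloned the repository into your home directory, the copy steps above could look like this; the clone location is an assumption, so adjust the paths to match your environment:

# Create the local GreengrassCore working directories if they don't exist yet
mkdir -p ~/GreengrassCore/recipes ~/GreengrassCore/artifacts

# Copy the sample recipes and artifacts from the cloned repository
cp ~/aws-iot-greengrass-v2-deploy-nvidia-deepstream/jetson_inference/recipes/* ~/GreengrassCore/recipes/
cp -r ~/aws-iot-greengrass-v2-deploy-nvidia-deepstream/jetson_inference/artifacts/* ~/GreengrassCore/artifacts/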

4. Next, we will upload the component versions to the AWS IoT Greengrass V2 cloud service. Run the following commands on your Jetson device from the GreengrassCore home directory (~/GreengrassCore) that you copied the recipes and artifacts into:

aws greengrassv2 create-component-version --inline-recipe fileb://recipes/aws.greengrass.JetsonDLRImageClassification-1.0.0.json
aws greengrassv2 create-component-version --inline-recipe fileb://recipes/variant.Jetson.DLR-1.0.0.json
aws greengrassv2 create-component-version --inline-recipe fileb://recipes/variant.Jetson.ImageClassification.ModelStore-1.0.0.json
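To confirm that the three component versions were registered with the AWS IoT Greengrass V2 cloud service, you can list the components in your account:

aws greengrassv2 list-components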

Deploy the sample components to your Jetson module through AWS IoT Greengrass V2

Before starting this section, verify that you have a valid installation of the AWS IoT Greengrass Core software v2 on your Jetson device (refer to the Pre-requisites section for help). Uploading the components to the AWS IoT Greengrass V2 cloud service allows you to deploy the examples to the AWS IoT Greengrass Core software running on your Jetson device.

To deploy the components to your Jetson module:

  1. Navigate to the AWS IoT Core Console (https://console.aws.amazon.com/iot/home).
  2. Choose Greengrass.
  3. Choose Components – You should see the three components you created via the AWS CLI.
  4. Choose any one of the three components you created.
  5. Choose Deploy.
  6. Choose Create new deployment.
  7. Choose Next.
  8. For Name give the deployment a name.
  9. For Target type, choose Thing Group and enter the name of the thing group that contains your Greengrass core device. (You can find your core device on the AWS IoT Greengrass Core devices page of the AWS Management Console: https://console.aws.amazon.com/iot/home?region=us-east-1#/greengrass/v2/cores.)
  10. Choose Next.
  11. On the Select Components screen, make sure to select all three of the components you created and choose Next.
  12. On the Configure Components screen, choose Next.
  13. On the Configure advanced settings screen, choose Next.
  14. On the Review screen choose Deploy.
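If you prefer the AWS CLI over the console, the same deployment can be created with create-deployment. Here is a minimal sketch, assuming a thing group named MyJetsonGreengrassGroup and the 1.0.0 component versions uploaded earlier; replace the Region, account ID, and group name with your own:

aws greengrassv2 create-deployment \
  --deployment-name jetson-dlr-image-classification \
  --target-arn "arn:aws:iot:us-east-1:<account-id>:thinggroup/MyJetsonGreengrassGroup" \
  --components '{
    "variant.Jetson.DLR": {"componentVersion": "1.0.0"},
    "variant.Jetson.ImageClassification.ModelStore": {"componentVersion": "1.0.0"},
    "aws.greengrass.JetsonDLRImageClassification": {"componentVersion": "1.0.0"}
  }'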

Verify Inference Results

If you have successfully deployed the three components, inference should start immediately.

The MQTT test client (described below) will show inference data coming from your Jetson device after a successful deployment. If you do not see any data, verify successful completion of each step in the deployment procedure above, or consult the AWS IoT Greengrass V2 troubleshooting guide.

To view results with the MQTT Test Client:

  1. On the AWS Management Console, choose AWS IoT Core.
  2. Choose Test.
  3. Choose MQTT test client.
  4. For the subscription topic, enter demo/topic.
  5. Choose Subscribe to topic. You should see the inference/classification messages arriving from your Jetson device.
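In addition to the MQTT test client, you can check from the AWS CLI that the deployment reached your core device; the thing name below is a placeholder for your core device name:

aws greengrassv2 list-effective-deployments --core-device-thing-name MyJetsonGreengrassCore
aws greengrassv2 list-installed-components --core-device-thing-name MyJetsonGreengrassCore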

Section 2: Deploy NVIDIA DeepStream Application with AWS IoT Greengrass V2

NVIDIA’s Jetson product family enables customers to extend server-class compute to devices operating at the edge. NVIDIA has developed a streaming analytics toolkit called DeepStream to leverage TensorRT and CUDA to optimize AI performance at the edge. The DeepStream SDK provides an end-to-end video processing and ML inferencing analytics solution for transforming pixels and sensor data into actionable insights.

In this section, we will present how AWS can help deploy DeepStream apps and new ML models run by DeepStream apps on NVIDIA Jetson modules at scale with AWS IoT Greengrass V2.

For this demonstration, we will use the sample model and sample DeepStream application developed by NVIDIA in their DeepStream SDK as an example. You are also welcome to use your customized models and DeepStream apps.

Before starting the deployment process, verify your DeepStream installation on your Jetson module:

1. Enter this command in your terminal to start the reference application:

$ deepstream-app -c <path_to_config_file>

Note: <path_to_config_file> is the pathname of one of the reference application’s configuration files.

2. Verify that the DeepStream application runs successfully in your terminal.

Note: If you are using a Jetson Nano, we recommend using /opt/nvidia/deepstream/<your deepstream version>/samples/configs/deepstream-app/source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_nano.txt as your configuration file.

If the sample app starts without reporting an error, proceed to the deployment steps in this article. If you have trouble running the sample app, refer to the troubleshooting section of the DeepStream documentation for more details.

To deploy the NVIDIA DeepStream Application with AWS IoT Greengrass V2:

Step 0: Create a DeepStream application package

Before completing the steps in this section, first determine if you’d like to use a sample DeepStream application package or use your own customized deployment package. We have created a sample DeepStream application package that consists of three components:

  • An ML model (directly sourced from the DeepStream SDK provided by NVIDIA: https://developer.nvidia.com/deepstream-download)
  • A modified version of the sample DeepStream application configuration file
  • A modified version of the sample DeepStream primary GIE configuration file

You can use this as an example to follow along the deployment steps, or you can use your own customized version of DeepStream configuration files and your own trained ML models.

Step 1: Prepare local environment

  1. If you have not yet cloned the GitHub repository, run the following command to clone it locally:
    git clone https://github.com/aws-samples/aws-iot-greengrass-v2-deploy-nvidia-deepstream.git
  2. Export the path to the GitHub repository as an environment variable by running:
    cd aws-iot-greengrass-v2-deploy-nvidia-deepstream
    export DEMO_PATH=${PWD}

Step 2: Upload your package into an Amazon S3 bucket

  1. Create an S3 bucket by running the following command (if you already have an S3 bucket, skip this step and use your existing bucket in the steps below):
    aws s3 mb s3://[YOUR_S3_BUCKET_NAME]
  2. Change into the nvidia_deepstream_integration folder in your GitHub repository by running:
    cd $DEMO_PATH/nvidia_deepstream_integration
  3. Upload the prepared sample deployment package to your S3 bucket:
    aws s3 cp jetson_deployment.zip s3://[YOUR_S3_BUCKET_NAME]/jetson_deployment.zip
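To confirm that the package landed in your bucket before moving on, you can list the bucket contents:

    aws s3 ls s3://[YOUR_S3_BUCKET_NAME]/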

Step 3: Create an AWS IoT Greengrass V2 component

  1. Rename the greengrass_component.json file by adding a suffix that AWS IoT Greengrass V2 uses as the version number. For example:
    mv greengrass_component.json greengrass_component-1.0.0.json
  2. Open the greengrass_component-1.0.0.json file with your text editor, and replace the placeholder [YOUR_S3_BUCKET_NAME] with the actual bucket name that you used in Step 2.
  3. Upload the AWS IoT Greengrass V2 component to the AWS IoT Greengrass cloud service:
    aws greengrassv2 create-component-version --inline-recipe fileb://greengrass_component-1.0.0.json

You will see a message like the following returned by the AWS CLI:

{
  "arn": "arn:aws:greengrass:us-west-2:XXXXXXXXXXXX:components:deepstream-deployment:versions:1.0.0",
  "componentName": "deepstream-deployment",
  "componentVersion": "1.0.0",
  "creationTimestamp": "2021-03-19T14:13:30.126000-07:00",
  "status": {
    "componentState": "REQUESTED",
    "message": "NONE",
    "errors": {}
  }
}

Note: The default AWS IoT Greengrass V2 device role does not have Amazon S3 access. If you have not already done so, add S3 access to your AWS IoT Greengrass V2 role by running the following AWS CLI command, or add it manually in the AWS Management Console.

aws iam attach-role-policy --role-name [Your_Greengrass_V2_role_name] --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess

If you encounter Invalid choice: 'greengrassv2', you need to update your AWS CLI to the latest version.
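You can check the installed version first; the greengrassv2 commands require a recent AWS CLI release:

aws --version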

Step 4: Deploy the AWS IoT Greengrass V2 component

  1. Navigate to the component’s page in the AWS IoT console (shown below) and choose Deploy.


    AWS Management Console, AWS IoT Greengrass V2 Components Deployment

  2. Verify that the component ran successfully by checking the AWS IoT Greengrass Core software v2 runtime logs located at /greengrass/v2/logs (see the example commands below).
    • In that folder, there is a greengrass.log file (for the nucleus) and a <componentName>.log file for each component.
    • You can also verify by checking whether you are receiving inference results on your configured DeepStream pipeline sink.
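For example, you can tail the nucleus log and the component log directly on the device. The component name below matches the componentName returned when you created the component version; sudo is typically required because the log files are owned by the Greengrass system user:

sudo tail -f /greengrass/v2/logs/greengrass.log
sudo tail -f /greengrass/v2/logs/deepstream-deployment.log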

Section 3: Additional Resources

Local deployment of AWS IoT Greengrass V2 components

You can also deploy locally without an internet connection as outlined in the AWS IoT Greengrass V2 Getting Started Guide (Create your first component section).
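As a rough sketch, a local deployment with the Greengrass CLI looks like the following. It assumes the aws.greengrass.Cli component is already installed on the device and reuses the recipe and artifact folders and the component version from Section 1; adjust the paths and component name to your own setup:

sudo /greengrass/v2/bin/greengrass-cli deployment create \
  --recipeDir ~/GreengrassCore/recipes \
  --artifactDir ~/GreengrassCore/artifacts \
  --merge "aws.greengrass.JetsonDLRImageClassification=1.0.0"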

Change camera source

You can replace the static image inference with a camera interface. Because most Jetson modules do not come with cameras, the method for interfacing with the camera will vary depending on your camera type. Refer to inference.py for more details.

DeepStream IoT Test Applications (test 4 or test 5 in DeepStream application)

DeepStream applications include a plugin called Gst-nvmsgbroker. This plugin can send payload messages to AWS IoT Core using the MQTT protocol. It accepts any buffer with NvDsPayload metadata attached and uses the nvds_msgapi_* interface to send the messages to the server. If you need to use AWS IoT Core or AWS IoT Greengrass as a message broker sink for your DeepStream application, you need the shared library from this GitHub repository: AWS IoT Core Integration with NVIDIA DeepStream.

NVIDIA DeepStream integration with AWS IoT Greengrass V1 (legacy)

To review the integration between DeepStream and AWS IoT Greengrass V1, refer to the following GitHub repository: https://github.com/aws-samples/aws-iot-greengrass-deploy-nvidia-deepstream-on-edge

Summary: Start building!

In this post, we’ve shown two ways to use AWS IoT Greengrass V2 on NVIDIA Jetson devices: classifying images with an Amazon SageMaker Neo-compiled model and deploying an NVIDIA DeepStream video analytics pipeline. To help you evaluate, test, and develop with this new release of AWS IoT Greengrass, the first 1,000 devices in your account will not incur any AWS IoT Greengrass charges until December 31, 2021. You will still incur charges for other AWS services you use with your applications running on AWS IoT Greengrass, such as AWS IoT Core. We can’t wait to see what you build!

References

The Chihuahua picture (Figure 1) is part of the Stanford ImageNet resource collection located at http://vision.stanford.edu/aditya86/ImageNetDogs.

NVIDIA DeepStream Developer Guide: https://developer.nvidia.com/deepstream-getting-started

AWS IoT Greengrass V2 Developer Guide: https://docs.aws.amazon.com/greengrass/index.html

About The Authors

Ryan Vanderwerf is a Partner Solutions Architect focusing on IoT partnerships.


Yuxin Yang is an IoT Consultant in AWS Professional Services.