Posted On: Nov 28, 2018
AWS IoT Greengrass now supports Amazon SageMaker Neo. Neo enables you to train machine learning models once and run them anywhere, in the cloud and at the edge. Neo automatically optimizes TensorFlow, MXNet, PyTorch, ONNX, and XGBoost models for deployment on ARM, Intel, and Nvidia processors. Optimized models run up to twice as fast and consume less than one-tenth of the memory footprint of the original model. Neo will also soon be released as open source under the Apache Software License, enabling hardware vendors to customize it for their processors and devices. Using Neo with AWS IoT Greengrass, you can retrain models in Amazon SageMaker and quickly push the optimized versions to your edge devices to improve their intelligence. You can use a broad range of devices based on the Nvidia Jetson TX2, Arm v7 (Raspberry Pi), or Intel Atom platforms.
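As a rough illustration, the following sketch shows what compiling a trained model for an edge target with Neo might look like using the boto3 SageMaker API. The bucket paths, IAM role ARN, and job name are placeholders; the supported Framework and TargetDevice values are listed in the SageMaker Neo documentation.

```python
# Minimal sketch: compile a trained model with SageMaker Neo for an edge device.
# All names, ARNs, and S3 paths below are placeholders.
import boto3

sm = boto3.client("sagemaker", region_name="us-west-2")

sm.create_compilation_job(
    CompilationJobName="image-classifier-jetson-tx2",
    RoleArn="arn:aws:iam::123456789012:role/SageMakerNeoRole",  # placeholder role
    InputConfig={
        "S3Uri": "s3://my-bucket/models/image-classifier/model.tar.gz",  # trained model artifact
        "DataInputConfig": '{"data": [1, 3, 224, 224]}',  # input tensor name and shape
        "Framework": "MXNET",
    },
    OutputConfig={
        "S3OutputLocation": "s3://my-bucket/models/image-classifier/compiled/",
        "TargetDevice": "jetson_tx2",  # e.g. rasp3b for Arm v7 (Raspberry Pi)
    },
    StoppingCondition={"MaxRuntimeInSeconds": 900},
)
```

The compiled artifact written to the output location can then be attached to a Greengrass group as a machine learning resource and deployed to the device.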
Additionally, AWS IoT Greengrass provides new connectors for image classification models trained with Amazon SageMaker's Image Classification algorithm. These connectors package the AWS Lambda code and ML dependencies required to run image classification inference on a device such as a camera. A connector is available for each of the supported hardware platforms: Nvidia Jetson TX2, Arm v7 (Raspberry Pi), and Intel Atom.
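A hedged sketch of registering one of these connectors with a Greengrass group via the boto3 Greengrass API is shown below. The connector ARN and parameter names are illustrative only; take the exact values for your platform and region from the AWS IoT Greengrass connector documentation.

```python
# Illustrative sketch: define an Image Classification connector for a Greengrass group.
# The connector ARN, resource IDs, and parameter names are placeholders.
import boto3

gg = boto3.client("greengrass", region_name="us-west-2")

connector_def = gg.create_connector_definition(
    Name="ImageClassificationConnector",
    InitialVersion={
        "Connectors": [
            {
                "Id": "image-classification-armv7",
                # Illustrative ARN for the Arm v7 (Raspberry Pi) connector
                "ConnectorArn": (
                    "arn:aws:greengrass:us-west-2::"
                    "/connectors/ImageClassificationARMv7/versions/1"
                ),
                "Parameters": {
                    # Point the connector at the compiled model resource
                    # attached to the group (placeholder values)
                    "MLModelDestinationPath": "/ml/model",
                    "MLModelResourceId": "my-model-resource-id",
                    "LocalInferenceServiceName": "imageClassification",
                },
            }
        ]
    },
)
print(connector_def["LatestVersionArn"])
```

The resulting connector definition version is then referenced from the group version and deployed to the core device along with the model resource.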
To get started with these improvements to AWS IoT Greengrass ML Inference, visit the service page.