Posted On: Nov 20, 2018
Amazon SageMaker has added several enhancements to the built-in TensorFlow and Chainer containers. These enhancements make it easier to run TensorFlow and Chainer scripts, while taking advantage of the capabilities Amazon SageMaker offers, including a library of high-performance algorithms, managed and distributed training with automatic model tuning, one-click deployment, and managed hosting.
The built-in TensorFlow 1.11 container on SageMaker now supports Python 3 in addition to Python 2. Python 3 brings many improvements, including function annotations, better Unicode handling, and other language enhancements. Additionally, the script format for training with the built-in TensorFlow 1.11 container now closely matches how you would write TensorFlow scripts outside SageMaker, enabling seamless movement of workloads between SageMaker and your own infrastructure. Finally, starting with TensorFlow 1.11 on SageMaker, you can choose to deploy your models to dedicated TensorFlow Serving containers for inference. These containers offer a code-free model hosting option that accepts standard TensorFlow Serving REST API inputs and outputs, as well as simplified JSON or CSV input. Compared to the standard TensorFlow containers that support both training and inference, the dedicated serving containers provide faster startup times and higher throughput.
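To illustrate the script format, here is a minimal sketch of a training entry point written the way you would write an ordinary TensorFlow script. The `SM_MODEL_DIR` and `SM_CHANNEL_TRAINING` environment variables are part of SageMaker's script-mode contract; the hyperparameter names and the omitted model code are hypothetical placeholders.

```python
import argparse
import os


def parse_args(argv=None):
    """Parse hyperparameters and SageMaker-provided paths.

    SageMaker passes hyperparameters as command-line arguments and
    exposes input/output locations through SM_* environment variables.
    """
    parser = argparse.ArgumentParser()
    # Hypothetical hyperparameters, for illustration only.
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--batch-size", type=int, default=64)
    # Locations injected by SageMaker; the defaults allow local runs.
    parser.add_argument(
        "--model-dir", type=str,
        default=os.environ.get("SM_MODEL_DIR", "/opt/ml/model"))
    parser.add_argument(
        "--train", type=str,
        default=os.environ.get("SM_CHANNEL_TRAINING",
                               "/opt/ml/input/data/training"))
    return parser.parse_args(argv)


if __name__ == "__main__":
    args = parse_args()
    # Ordinary TensorFlow training code would go here, reading data
    # from args.train and saving the model under args.model_dir.
    print("training for {} epochs, saving to {}".format(
        args.epochs, args.model_dir))
```

Because the script reads its configuration from arguments and environment variables rather than SageMaker-specific APIs, the same file can be run locally or handed to a SageMaker training job unchanged.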
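For the dedicated serving containers, the standard TensorFlow Serving REST API wraps inputs in an `instances` list. A sketch of building such a request body (the feature values are made up):

```python
import json


def build_predict_request(instances):
    """Build a TensorFlow Serving REST API predict request body.

    "instances" is the row format accepted by the TensorFlow Serving
    REST API; each entry in the list is one input example.
    """
    return json.dumps({"instances": instances})


# Hypothetical inputs: two examples with three features each.
body = build_predict_request([[1.0, 2.0, 5.0], [1.0, 3.0, 4.0]])

# The same two examples expressed as simplified CSV input,
# one example per line:
csv_body = "1.0,2.0,5.0\n1.0,3.0,4.0"
```

Either representation would be posted to the hosted model's invocation endpoint; the JSON form mirrors TensorFlow Serving's own REST format, while the CSV form is the simplified alternative mentioned above.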
The SageMaker built-in containers for Chainer now support Chainer 5.0. This version includes multiple enhancements, among them iDeep 2.0, the latest version of Chainer's backend for Intel architectures, which delivers improved performance.