
AWS Deep Learning AMIs now come with TensorFlow 1.5 and new Model Serving capabilities

The AWS Deep Learning AMIs help you quickly and easily get started with machine learning. The AMIs include a range of prebuilt options that cater to the diverse needs of machine learning practitioners. For those who want the latest stock versions of deep learning frameworks, the Deep Learning AMIs provide prebuilt pip binaries installed in separate Conda-based virtual environments. For those looking to test advanced framework features or tweak framework source code, the Deep Learning AMIs with Source Code provide custom installations of frameworks from source. These are often built with advanced optimizations not available in stock binaries.

Faster training with TensorFlow on Volta GPUs

The AMIs with Source Code now come with TensorFlow 1.5.0-rc1. This prerelease version of TensorFlow supports the NVIDIA CUDA 9 and cuDNN 7 drivers, which take advantage of the V100 Volta GPUs that power Amazon EC2 P3 instances. In our tests, training the ResNet-50 benchmark with synthetic ImageNet data in FP16 mode on a p3.8xlarge instance was 1.8 times faster than training with TensorFlow 1.4.1. Because this is a prerelease version, test it thoroughly before using it in production.
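
As a quick sanity check after launching an instance, a short Python snippet like the following confirms the installed TensorFlow build and that the Volta GPUs are visible. This is a minimal sketch; it assumes the TensorFlow 1.5.0-rc1 environment from the AMI with Source Code is active.

```python
# Minimal sketch: verify the TensorFlow build and GPU visibility on a P3 instance.
# Assumes the TensorFlow 1.5.0-rc1 environment on the AMI is active.
import tensorflow as tf
from tensorflow.python.client import device_lib

print(tf.__version__)  # expect a 1.5.0-rc1 build

# Each V100 GPU on the instance should appear as a separate GPU device.
gpus = [d.name for d in device_lib.list_local_devices() if d.device_type == 'GPU']
print(gpus)
```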

Get the prebuilt Conda-based virtual environment for the stock version of TensorFlow 1.4.1 with CUDA 8 and cuDNN 6 on our Deep Learning AMIs for Ubuntu and Amazon Linux. The Conda-based AMIs now also come with TensorBoard, a visualization tool for monitoring and debugging TensorFlow model training. To get started with TensorBoard, see our step-by-step guide.
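
For illustration, here is a minimal sketch of logging scalar summaries that TensorBoard can display. The log directory and the toy loss value are assumptions made for this example, not part of the step-by-step guide.

```python
# Minimal sketch: write scalar summaries that TensorBoard can visualize.
# The log directory (/tmp/tensorboard_logs) is an arbitrary choice for this example.
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None], name='x')
loss = tf.reduce_mean(tf.square(x))
tf.summary.scalar('loss', loss)       # appears as a scalar curve in TensorBoard
merged = tf.summary.merge_all()

with tf.Session() as sess:
    writer = tf.summary.FileWriter('/tmp/tensorboard_logs', sess.graph)
    for step in range(10):
        summary, _ = sess.run([merged, loss], feed_dict={x: [float(step)]})
        writer.add_summary(summary, step)
    writer.close()
```

Pointing TensorBoard at the same log directory (tensorboard --logdir /tmp/tensorboard_logs) then shows the loss curve and the graph in the browser.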

Latest in deep learning frameworks

The Conda-based Deep Learning AMIs now come with the latest versions of popular deep learning frameworks. They support the Caffe deep learning framework and include Keras 2.1.3, Microsoft Cognitive Toolkit 2.3.1, and Theano 1.0.
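
As a quick way to confirm one of these environments works end to end, the sketch below builds and trains a tiny Keras model on random data. The layer sizes and data are arbitrary assumptions for illustration.

```python
# Minimal sketch: a tiny Keras 2.1.3 model trained on random data, just to confirm
# the Conda environment works end to end. Layer sizes and data are arbitrary.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

model = Sequential([
    Dense(32, activation='relu', input_shape=(16,)),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

X = np.random.rand(128, 16)
y = np.random.randint(0, 2, size=(128, 1))
model.fit(X, y, epochs=1, batch_size=32, verbose=1)
```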

Quickly deploy and test your deep learning models

The new AMIs include updates to help you quickly create an inference endpoint for trained models. You can now test, validate, and integrate models with business applications faster, reducing the time it takes to develop a working prototype.

The Conda-based virtual environment for TensorFlow now comes preinstalled with TensorFlow Serving. TensorFlow Serving takes an exported TensorFlow model and creates a server running a gRPC service to host it. In addition, Apache MXNet users can use the MXNet Model Server to quickly deploy an HTTP-based inference API for their models. To quickly export a model, host a server, and test the inference API, use our tutorials for getting started with TensorFlow Serving and MXNet Model Server.
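
To make the TensorFlow Serving workflow concrete, here is a minimal sketch of exporting a model in the SavedModel format that TensorFlow Serving hosts. The export path, toy graph, and signature names are illustrative assumptions rather than the tutorial's exact code.

```python
# Minimal sketch: export a toy TensorFlow model as a SavedModel for TensorFlow Serving.
# The export path and signature names are illustrative; Serving expects a numeric
# version subdirectory (here, /tmp/my_model/1).
import tensorflow as tf

export_dir = '/tmp/my_model/1'

x = tf.placeholder(tf.float32, shape=[None, 4], name='x')
w = tf.Variable(tf.zeros([4, 1]), name='w')
y = tf.matmul(x, w, name='y')

signature = tf.saved_model.signature_def_utils.predict_signature_def(
    inputs={'x': x}, outputs={'y': y})

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
        })
builder.save()
```

The tensorflow_model_server binary bundled with TensorFlow Serving can then load the exported model and expose it over gRPC.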

Latest in deep learning platforms

All of the AMIs for both the Ubuntu and Amazon Linux operating systems have been updated with the latest NVIDIA GPU drivers and operating system versions. They now include security patches that address the Spectre and Meltdown vulnerabilities.

Getting started with the AWS Deep Learning AMIs

It’s easy to get started with the AWS Deep Learning AMIs. Our AMI selection topic helps you pick the right AMI for your deep learning project. We’ve also provided many tutorials and developer resources to help you quickly deploy your first deep learning model on AWS.

The latest releases of the AMIs are available in the AWS Marketplace.

Deep Learning AMIs with prebuilt pip binaries installed in separate virtual environments:

AMIs with frameworks built from source code:


About the Author


Sumit Thakur is a Senior Product Manager for AWS Deep Learning. He works on products that make it easy for customers to get started with deep learning in the cloud, with a specific focus on making it easy to use deep learning engines on the Deep Learning AMIs. In his spare time, he likes connecting with nature and watching sci-fi TV series.