New AWS Deep Learning AMIs with Updated Framework Support: TensorFlow 1.15 & 2.0, PyTorch 1.3.1, and MXNet 1.6.0-rc0

Posted on: Dec 3, 2019

The AWS Deep Learning AMIs are available on Ubuntu 18.04, Ubuntu 16.04, Amazon Linux 2, and Amazon Linux with TensorFlow 1.15, TensorFlow 2.0, PyTorch 1.3.1, and MXNet 1.6.0-rc0. Also new in this release is support for AWS Neuron, an SDK for running inference using AWS Inferentia chips. It consists of a compiler, runtime, and profiling tools that enable developers to run high-performance, low-latency inference on Inferentia-based EC2 Inf1 instances. Neuron is pre-integrated into popular machine learning frameworks, including TensorFlow, PyTorch, and MXNet, to deliver optimal performance on EC2 Inf1 instances. Customers using Amazon EC2 Inf1 instances receive the highest performance and lowest cost for machine learning inference in the cloud, and no longer need to make a sub-optimal tradeoff between optimizing for latency and optimizing for throughput when running large machine learning models in production.
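
As an illustration, the sketch below shows one way to compile an existing TensorFlow SavedModel for Inferentia using the Neuron SDK's TensorFlow integration. The model and output paths are placeholders, and it assumes a Neuron-enabled TensorFlow 1.15 environment with the tensorflow-neuron package installed; consult the Neuron documentation for the exact workflow on your AMI release.

```python
# Hypothetical example: compile a trained TensorFlow SavedModel for Inferentia.
# Paths are placeholders; run inside a Neuron-enabled TensorFlow environment.
import tensorflow.neuron as tfn

tfn.saved_model.compile(
    "resnet50_saved_model/",         # directory of the existing SavedModel (placeholder)
    "resnet50_neuron_saved_model/",  # output directory for the Neuron-compiled model
)
```

The compiled artifact can then be loaded and served on an Inf1 instance in the same way as a regular SavedModel.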

AWS Deep Learning AMIs also support other interfaces such as Keras, Chainer, and Gluon, all pre-installed and fully configured so you can start developing your deep learning models in minutes while taking advantage of the computation power and flexibility of Amazon EC2 instances. When you activate a Conda environment, the Deep Learning AMIs automatically deploy higher-performance builds of frameworks, optimized for the EC2 instance of your choice, as illustrated in the sketch below. For a complete list of frameworks and versions supported by the AWS Deep Learning AMIs, see the release notes.
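
For example, after activating one of the framework-specific Conda environments on the AMI (environment names vary by AMI release; tensorflow2_p36 is used here purely as an illustration), a quick check confirms which build is active and whether GPUs are visible:

```python
# Run after activating a framework environment on the AMI, e.g.
#   source activate tensorflow2_p36   (environment name may differ by AMI release)
import tensorflow as tf

print(tf.__version__)                                        # expect a 2.0.x build
print(tf.config.experimental.list_physical_devices("GPU"))   # non-empty on GPU instances
```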

Get started quickly with the AWS Deep Learning AMIs using the getting-started guides and the beginner-to-advanced tutorials in our developer guide. You can also subscribe to our discussion forum for launch announcements and to post your questions.