AWS Deep Learning AMIs

Quickly build scalable, secure deep learning applications in preconfigured environments

Scale distributed machine learning (ML) training to thousands of accelerated instances and seamlessly deploy models for inference in production.

Develop on accelerators—including AWS Trainium, AWS Inferentia, and NVIDIA GPUs—with the newest drivers, frameworks, libraries, and tools.

Reduce risk with customized, stable machine images regularly patched to address security vulnerabilities.

How it works

AWS Deep Learning AMIs (DLAMI) provides ML practitioners and researchers with a curated and secure set of frameworks, dependencies, and tools to accelerate deep learning on Amazon EC2. Built for Amazon Linux and Ubuntu, the Amazon Machine Images (AMIs) come preconfigured with TensorFlow, PyTorch, NVIDIA CUDA drivers and libraries, Intel MKL, Elastic Fabric Adapter (EFA), and the AWS OFI NCCL plugin, allowing you to quickly deploy and run these frameworks and tools at scale.

Diagram showing how DLAMI can be launched using the AWS Management Console, AWS Command Line Interface (CLI), AWS SDK, AWS API, or your local terminal or application scripts
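For example, here is a minimal sketch using the AWS SDK for Python (boto3) that locates a recent DLAMI and launches an instance from it; the name filter, Region, instance type, and key pair name are illustrative assumptions, not prescribed values:

# Minimal sketch: locate a recent Deep Learning AMI and launch it with boto3.
# The name filter, Region, instance type, and key pair below are illustrative
# assumptions -- substitute values that match your account and project.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Find Amazon-owned images whose names match a Deep Learning AMI pattern,
# then pick the most recently created one.
images = ec2.describe_images(
    Owners=["amazon"],
    Filters=[{"Name": "name", "Values": ["Deep Learning AMI GPU PyTorch*"]}],
)["Images"]
latest = max(images, key=lambda img: img["CreationDate"])

# Launch a single accelerated instance from that AMI.
ec2.run_instances(
    ImageId=latest["ImageId"],
    InstanceType="g5.xlarge",   # assumption: any supported accelerated type works
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",      # assumption: an existing EC2 key pair name
)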

Use cases

Autonomous vehicle development

Develop advanced ML models at scale to build autonomous vehicle (AV) technology safely, validating models with millions of supported virtual tests.

Natural language processing

Accelerate the installation and configuration of AWS instances, and speed up experimentation and evaluation with up-to-date frameworks and libraries, including Hugging Face Transformers.
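As a sketch of that workflow, the snippet below assumes the Hugging Face Transformers library is available in the AMI's preconfigured PyTorch environment and runs a quick sentiment-analysis experiment on the first GPU:

# Minimal sketch: quick NLP experimentation on a DLAMI, assuming the
# Hugging Face Transformers library is installed in the preconfigured
# PyTorch environment. The model is the pipeline's default checkpoint.
from transformers import pipeline

# Build a sentiment-analysis pipeline on the first GPU (device=0);
# omit the device argument to run on CPU instead.
classifier = pipeline("sentiment-analysis", device=0)

print(classifier("Deep Learning AMIs made our environment setup painless."))
# Expected output shape: [{'label': 'POSITIVE', 'score': ...}]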

Healthcare data analysis

Use advanced analytics, ML, and deep learning capabilities to identify trends and make predictions from raw, disparate health data.

Accelerated model training

DLAMI includes the latest NVIDIA GPU acceleration through preconfigured drivers, the Intel Math Kernel Library (MKL), Python packages, and the Anaconda Platform.
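As an illustration, a minimal PyTorch sketch like the one below can confirm that the preconfigured CUDA stack is detected and run a single training step on the GPU; the toy model and random batch are assumptions for demonstration only:

# Minimal sketch: verify GPU acceleration on a DLAMI and run one training
# step with the preinstalled PyTorch stack. The toy model and data below
# are illustrative assumptions, not part of the AMI itself.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
print(f"Training on: {device}")  # expect "cuda" on a GPU instance

# A tiny linear model and random batch, just to exercise the accelerator.
model = nn.Linear(1024, 10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(64, 1024, device=device)
targets = torch.randint(0, 10, (64,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
print(f"One step done, loss = {loss.item():.4f}")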

Customer success

Cimpress

Cimpress invests in and builds customer-focused, entrepreneurial, print mass customization businesses for the long term. Cimpress makes it easy and affordable for customers to make an impression – for their customers, organization, or loved ones. Whether it's promotional material amplifying a business's brand or an announcement celebrating a birth, Cimpress combines the individual personalization customers desire with the tangible impact of physical products.

"Cimpress uses AWS Deep Learning AMIs to rapidly set up and deploy our machine learning environments. The DLAMIs reduce our operational overhead and we can get our products to market faster by focusing on the core work of training and deploying our deep learning models for computer vision and generative AI.”

Ajay Joshi, Principal Software Engineer – Cimpress

Flip AI

Flip AI is the first GenAI-native observability platform that is data- and platform-agnostic, understands all observability modalities (including metrics, events, logs, and traces), and generates predictive and incident Root Cause Analyses in seconds.

"At Flip AI, We have trained our own LLMs for DevOps to debug production incidents to help enterprises reach the highest level of customer experience. This training requires a high performance setup that is easily customizable. With DLAMI, we don't need to fight battles with CUDA drivers or Pytorch related optimizations. It just works. Improving percentages on GPU utilization means we're able to train our models more efficiently, and shave off 10s of milliseconds on inference.”

Sunil Mallya, CTO – Flip AI

How to get started

See how you can accelerate your model training

Learn how DLAMI can expedite your development and model training.

Explore the AMIs

Select the right AMI and instance type for your project.

Take the hands-on training

Start building with 10-minute tutorials.
