AWS Deep Learning AMIs
Quickly build scalable, secure deep learning applications in preconfigured environments
Scale distributed machine learning (ML) training to thousands of accelerated instances, and seamlessly deploy models for production.
Develop on accelerators, including AWS custom silicon, NVIDIA GPUs, and Intel Habana Gaudi, with the newest drivers, frameworks, libraries, and tools.
Reduce risk with customized, stable machine images regularly patched to address security vulnerabilities.
How it works
AWS Deep Learning AMIs (DLAMI) provides ML practitioners and researchers with a curated and secure set of frameworks, dependencies, and tools to accelerate deep learning in the cloud. Built for Amazon Linux and Ubuntu, the Amazon Machine Images (AMIs) come preconfigured with TensorFlow, PyTorch, Apache MXNet, Chainer, Microsoft Cognitive Toolkit (CNTK), Gluon, Horovod, and Keras, allowing you to quickly deploy and run these frameworks and tools at scale.
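For example, a minimal boto3 sketch for locating a recent GPU DLAMI and launching an instance from it might look like the following; the AMI name filter, Region, instance type, and key pair name are assumptions to adapt to your own account and the DLAMI variant you choose.

```python
# Minimal sketch (boto3): locate a recent GPU DLAMI and launch an instance from it.
# The name filter, Region, instance type, and key pair name are assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed Region

# Find Amazon-owned images whose names match a Deep Learning AMI pattern (assumed filter).
images = ec2.describe_images(
    Owners=["amazon"],
    Filters=[{"Name": "name", "Values": ["Deep Learning AMI GPU PyTorch*"]}],
)["Images"]

# Pick the most recently created image.
latest = max(images, key=lambda img: img["CreationDate"])
print("Using AMI:", latest["ImageId"], latest["Name"])

# Launch a single GPU instance from the AMI (instance type and key name are placeholders).
ec2.run_instances(
    ImageId=latest["ImageId"],
    InstanceType="g5.xlarge",
    KeyName="my-key-pair",  # hypothetical key pair
    MinCount=1,
    MaxCount=1,
)
```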

Use cases
Autonomous vehicle development
Build advanced ML models at scale to develop autonomous vehicle (AV) technology safely, validating models with millions of virtual tests.
Natural language processing
Accelerate the installation and configuration of AWS instances, and speed up experimentation and evaluation with up-to-date frameworks and libraries, including Hugging Face Transformers.
Healthcare data analysis
Use advanced analytics, ML, and deep learning capabilities to identify trends and make predictions from raw, disparate health data.
Accelerated model training
DLAMI includes the latest NVIDIA GPU acceleration through preconfigured drivers, the Intel Math Kernel Library (MKL), Python packages, and the Anaconda Platform.
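As a quick sanity check after connecting to a DLAMI instance, a sketch like the following confirms that the preinstalled PyTorch build sees the GPU and then runs a small Hugging Face Transformers pipeline like the one mentioned under natural language processing above. It assumes a PyTorch DLAMI variant; exactly which packages are bundled varies by DLAMI release, so Transformers may need to be installed separately, and the pipeline's default model is an assumption of this sketch.

```python
# Sanity-check sketch to run on a DLAMI instance (assumes a PyTorch DLAMI variant;
# package versions and availability differ between DLAMI releases).
import torch

# Confirm the preconfigured NVIDIA driver/CUDA stack is visible to PyTorch.
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))

# Quick NLP check with Hugging Face Transformers
# (may require `pip install transformers`; downloads a small default model).
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Deep Learning AMIs made setup fast."))
```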
How to get started
See how you can accelerate your model training
Learn how DLAMI can expedite your development and model training.
Explore the AMIs
Select the right AMI and instance type for your project (a sample instance-type lookup is sketched after this list).
Take the hands-on training
Start building with 10-minute tutorials.
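To compare accelerated instance types before choosing one for your project, a boto3 sketch like the following lists the GPU-backed types available in a Region; the Region is an assumption, and the field names it reads (GpuInfo, VCpuInfo, MemoryInfo) follow the EC2 DescribeInstanceTypes response.

```python
# Sketch (boto3): list GPU-backed instance types to help pick one for a DLAMI.
# Region is an assumption; field names follow the EC2 DescribeInstanceTypes response.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed Region

gpu_types = []
for page in ec2.get_paginator("describe_instance_types").paginate():
    for itype in page["InstanceTypes"]:
        if "GpuInfo" in itype:  # keep only GPU-accelerated instance types
            gpu_types.append(itype)

# Print name, GPU count/model, vCPUs, and memory for each candidate.
for itype in sorted(gpu_types, key=lambda t: t["InstanceType"]):
    gpu = itype["GpuInfo"]["Gpus"][0]
    print(
        f'{itype["InstanceType"]}: {gpu["Count"]}x {gpu["Manufacturer"]} {gpu["Name"]}, '
        f'{itype["VCpuInfo"]["DefaultVCpus"]} vCPUs, '
        f'{itype["MemoryInfo"]["SizeInMiB"] // 1024} GiB RAM'
    )
```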