Hugging Face Neuron Deep Learning AMI (Ubuntu 20.04)
Linux/Unix
Product Overview
Hugging Face Neuron Deep Learning AMI (DLAMI) makes it easy to use Amazon EC2 Inferentia & Trainium instances for efficient training and inference of Hugging Face Transformers and Diffusers models.
With the Hugging Face Neuron DLAMI, you can scale your Transformers and Diffusers workloads quickly on Amazon EC2 while reducing costs, with up to 50% cost-to-train savings over comparable GPU-based DLAMIs.
This DLAMI is the officially supported and recommended Hugging Face solution for running training and inference on Trainium and Inferentia EC2 instances. It supports most Hugging Face use cases, including:
- Fine-tuning and pre-training Transformers models like BERT, GPT, or T5
- Running inference with Transformers models like BERT, GPT, or T5
- Fine-tuning and deploying Diffusers models like Stable Diffusion
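As a minimal sketch of the inference use case above: on a Trainium instance launched from this AMI, a Transformers model can be compiled for NeuronCores with `torch_neuronx.trace` and then called like a regular PyTorch module. The checkpoint name is an example; this assumes it runs on a Trn1 instance where `torch-neuronx` and `transformers` are preinstalled, and is not runnable on non-Neuron hardware.

```python
# Sketch: ahead-of-time compilation of a BERT model for NeuronCores.
# Assumes a Trn1 instance launched from this AMI (torch-neuronx preinstalled).
import torch
import torch_neuronx
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "bert-base-uncased"  # example checkpoint, swap in your own
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(
    model_id, torchscript=True
)
model.eval()

inputs = tokenizer("Hello, Trainium!", return_tensors="pt")
example = (inputs["input_ids"], inputs["attention_mask"])

# torch_neuronx.trace compiles the model for Neuron devices;
# the returned module is then invoked like any traced PyTorch model.
neuron_model = torch_neuronx.trace(model, example)
logits = neuron_model(*example)
```

The same traced module can be saved with `torch.jit.save` and reloaded later, avoiding recompilation at serving time.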
This DLAMI is provided at no additional charge to Amazon EC2 users.
For more information and documentation, see the AWS Neuron documentation (for example, the BERT fine-tuning tutorial): https://awsdocs-neuron.readthedocs-hosted.com/en/latest/frameworks/torch/torch-neuronx/tutorials/training/bert.html
AMI Name format: Hugging Face Neuron Deep Learning AMI (Ubuntu 20.04) ${YYYY-MM-DD}
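Given the name format above, the latest image ID can be looked up with the AWS CLI. This is a sketch assuming AWS CLI v2 is configured with credentials and a default region; the `aws-marketplace` owner filter is an assumption about how the AMI is published.

```shell
# Find the most recent Hugging Face Neuron DLAMI by name pattern
# (requires configured AWS credentials and region).
aws ec2 describe-images \
  --owners aws-marketplace \
  --filters "Name=name,Values=Hugging Face Neuron Deep Learning AMI (Ubuntu 20.04) *" \
  --query "sort_by(Images, &CreationDate)[-1].ImageId" \
  --output text
```

The returned image ID can then be passed to `aws ec2 run-instances --image-id ...` when launching a Trn1 instance.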
The AMI includes the following:
- Supported AWS Service: EC2
- Operating System: Ubuntu 20.04
- Compute Architecture: x86
- EBS volume type: gp2
- Python version: 3.8
- Supported EC2 Instances: Trn1
- PyTorch: 1.12 (AWS Neuron SDK build)
- Hugging Face Libraries: transformers, datasets, accelerate, evaluate, diffusers