Posted On: Aug 14, 2020
Amazon Elastic Container Service (ECS) today launched the Amazon ECS Optimized Inferentia Amazon Machine Image (AMI), an Amazon Linux 2 based AMI for Amazon EC2 Inf1 instances on ECS. The AMI comes pre-baked with all the necessary AWS Neuron packages, making it easy for customers to run Inferentia based containers on ECS.
Amazon EC2 Inf1 instances deliver high performance and the lowest cost machine learning inference in the cloud. Inf1 instances feature up to 16 AWS Inferentia chips, high-performance machine learning inference chips designed and built by AWS. Using Inf1 instances, customers can run large scale machine learning inference applications such as image recognition, speech recognition, natural language processing, personalization, and fraud detection. Once your machine learning model is trained to meet your requirements, you can deploy it using AWS Neuron, a specialized software development kit (SDK) consisting of a compiler, runtime, and profiling tools that optimizes the machine learning inference performance of Inferentia chips and supports popular machine learning frameworks such as TensorFlow, PyTorch, and MXNet.
Customers can launch and add Inf1 instances to their ECS clusters with the new ECS optimized Inferentia AMI using the AWS CLI and the ECS console. The ECS optimized Inferentia AMI version 20200623 includes ECS container agent version 1.41.0 and Docker version 19.03.6-ce.
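As a sketch of the CLI workflow above: the AMI ID can be looked up in AWS Systems Manager Parameter Store and passed to an EC2 launch call. The SSM parameter path below is an assumption based on the naming convention of the other ECS optimized AMI parameters, and the cluster name, instance profile, and instance type are placeholders; verify all of them against the AWS documentation for your account.

```shell
# Look up the latest ECS optimized Inferentia AMI ID from SSM Parameter Store.
# NOTE: this parameter path is assumed from the convention used by the other
# ECS optimized AMI parameters; confirm it before relying on it.
AMI_ID=$(aws ssm get-parameters \
  --names /aws/service/ecs/optimized-ami/amazon-linux-2/inf/recommended/image_id \
  --region us-west-2 \
  --query "Parameters[0].Value" \
  --output text)

# Launch an Inf1 instance into an existing ECS cluster.
# "my-cluster" and "ecsInstanceRole" are placeholder names; the user data
# registers the instance with the cluster via /etc/ecs/ecs.config.
aws ec2 run-instances \
  --image-id "$AMI_ID" \
  --instance-type inf1.xlarge \
  --region us-west-2 \
  --iam-instance-profile Name=ecsInstanceRole \
  --user-data '#!/bin/bash
echo ECS_CLUSTER=my-cluster >> /etc/ecs/ecs.config'
```

Once the instance registers with the cluster, it appears as container capacity and can run task definitions that target Inferentia.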
The Amazon ECS Optimized Inferentia AMI is available in the US East (N. Virginia) and US West (Oregon) Regions. To learn more, please see the AWS documentation and news blog.