Posted On: Jun 23, 2022
Starting today, Amazon Elastic Compute Cloud (Amazon EC2) G5 instances powered by NVIDIA A10G Tensor Core GPUs are available in Asia Pacific (Mumbai, Tokyo), Europe (Frankfurt, London), and Canada (Central). G5 instances can be used for a wide range of graphics-intensive and machine learning use cases. They deliver up to 3x higher performance for graphics-intensive applications and machine learning inference, and up to 3.3x higher performance for training simple to moderately complex machine learning models when compared to Amazon EC2 G4dn instances.
G5 instances feature up to 8 NVIDIA A10G Tensor Core GPUs and 2nd generation AMD EPYC processors. They also support up to 192 vCPUs, up to 100 Gbps of network bandwidth, and up to 7.6 TB of local NVMe SSD storage. With eight G5 instance sizes that offer access to single or multiple GPUs, customers have the flexibility to pick the right instance size for their applications.
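For reference, here is a minimal sketch using the AWS SDK for Python (boto3) that lists the G5 sizes offered in a Region along with their GPU, vCPU, and memory counts. The Region name is only an example, and the exact output depends on what the account and Region expose.

```python
# Sketch: list G5 instance sizes in a Region with their GPU, vCPU, and memory counts.
# The Region name below is an example (Asia Pacific (Mumbai)).
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")

paginator = ec2.get_paginator("describe_instance_types")
for page in paginator.paginate():
    for itype in page["InstanceTypes"]:
        name = itype["InstanceType"]
        if not name.startswith("g5."):
            continue  # keep only the G5 family
        gpus = sum(g["Count"] for g in itype.get("GpuInfo", {}).get("Gpus", []))
        vcpus = itype["VCpuInfo"]["DefaultVCpus"]
        mem_gib = itype["MemoryInfo"]["SizeInMiB"] // 1024
        print(f"{name}: {gpus} GPU(s), {vcpus} vCPUs, {mem_gib} GiB memory")
```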
Customers can use G5 instances for graphics-intensive applications such as remote workstations, video rendering, and cloud gaming to produce high-fidelity graphics in real time. Machine learning customers can use G5 instances for high-performance, cost-efficient training and inference for natural language processing, computer vision, and recommender engine use cases.
With access to NVIDIA’s Tesla drivers for compute workloads, GRID drivers for provisioning RTX Virtual Workstations, and Gaming drivers, all at no additional cost, customers can easily optimize G5 instances for their workloads.
With these additional Regions, Amazon EC2 G5 instances are available in the following 8 Regions: US East (N. Virginia), US West (Oregon), Europe (Ireland, Frankfurt, London), Asia Pacific (Tokyo, Mumbai), and Canada (Central). Customers can purchase G5 instances as On-Demand Instances, Reserved Instances, Spot Instances, or as part of Savings Plans.
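As a hedged example, one way to confirm that a given G5 size is offered in one of these Regions is to query instance type offerings with the AWS SDK for Python (boto3); the Region (Europe (London)) and the g5.xlarge size below are illustrative choices, not requirements.

```python
# Sketch: check which Availability Zones in a Region offer a given G5 size.
# The Region (eu-west-2, Europe (London)) and the size are example values.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-2")

resp = ec2.describe_instance_type_offerings(
    LocationType="availability-zone",
    Filters=[{"Name": "instance-type", "Values": ["g5.xlarge"]}],
)
for offering in resp["InstanceTypeOfferings"]:
    print(f'{offering["InstanceType"]} is offered in {offering["Location"]}')
```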
To get started, use the AWS Management Console, AWS Command Line Interface (CLI), or AWS SDKs. To learn more, visit the G5 instance page.
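As an illustration of the SDK path, the sketch below launches a single g5.xlarge On-Demand instance with the AWS SDK for Python (boto3). The AMI ID, key pair name, and Region are placeholders to replace with your own values.

```python
# Sketch: launch one g5.xlarge On-Demand instance in Canada (Central).
# The AMI ID and key pair name are placeholders, not real values.
import boto3

ec2 = boto3.client("ec2", region_name="ca-central-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: e.g. a GPU-capable AMI in this Region
    InstanceType="g5.xlarge",
    KeyName="my-key-pair",            # placeholder key pair name
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")
```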