New T2.Xlarge and T2.2Xlarge Instances
AWS customers love the cost-effective, burst-based model that they get when they use T2 instances. These customers use T2 instances to run general purpose workloads such as web servers, development environments, continuous integration servers, test environments, and small databases. These instances provide a generous amount of baseline performance and the ability to automatically and transparently scale up to full-core processing power on an as-needed basis (refer back to New Low Cost EC2 Instances with Burstable Performance if this is news to you).
Today we are adding two new larger T2 instance sizes – t2.xlarge with 16 GiB of memory and t2.2xlarge with 32 GiB of memory. These new sizes allow customers to benefit from the price and performance of the T2 burst model for applications with larger resource requirements (this is the third time that we have expanded the range of T2 instances; we added t2.large instances last June and t2.nano instances last December).
Here are the specs for all of the sizes of T2 instances (the prices reflect the most recent EC2 Price Reduction):
|Name|vCPUs|Baseline Performance|Platform|Memory (GiB)|CPU Credits / Hour|Price / Hour|
|---|---|---|---|---|---|---|
|t2.nano|1|5%|32-bit or 64-bit|0.5|3|$0.0059|
|t2.micro|1|10%|32-bit or 64-bit|1|6|$0.012|
|t2.small|1|20%|32-bit or 64-bit|2|12|$0.023|
|t2.medium|2|40%|32-bit or 64-bit|4|24|$0.047|
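To make the relationship between the columns concrete, here is a small sketch of the credit arithmetic, using only the figures from the table above. It is a simplified illustration, not the exact EC2 accounting: one CPU credit lets one vCPU run at 100% for one minute, and a size's baseline percentage is simply its hourly credit earn rate spread over the hour.

```python
# Simplified sketch of the T2 CPU-credit model (illustrative only, not the
# exact EC2 accounting). One CPU credit = one vCPU at 100% for one minute.
# An instance earns a fixed number of credits per hour and spends them
# whenever it runs above its baseline.

# Figures taken from the table above.
T2_SIZES = {
    # name: (vCPUs, credits earned per hour)
    "t2.nano":   (1, 3),
    "t2.micro":  (1, 6),
    "t2.small":  (1, 12),
    "t2.medium": (2, 24),
}

def baseline_percent(name):
    """Baseline performance (percent of one core) implied by the earn rate."""
    vcpus, credits_per_hour = T2_SIZES[name]
    # credits/hour is vCPU-minutes earned per hour; divide by 60 minutes.
    return credits_per_hour / 60 * 100

def sustainable(name, avg_core_utilization_pct):
    """True if the average load can run indefinitely without draining credits."""
    return avg_core_utilization_pct <= baseline_percent(name)

print(baseline_percent("t2.medium"))  # → 40.0, matching the table
print(sustainable("t2.micro", 25))    # → False: 25% of a core exceeds the 10% baseline
```

Note how the Baseline Performance column in the table is exactly the earn rate divided by 60: a t2.medium earns 24 credits per hour, enough to sustain 40% of one core indefinitely; anything above that draws down the credit balance.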
Here are a couple of ways that you might be able to move existing workloads to the new instances:
- t2.large workloads can scale up to t2.xlarge or t2.2xlarge in order to gain access to more memory.
- Intermittent c4.2xlarge workloads can move to t2.xlarge at a significant cost reduction, with similar burst performance.
- Intermittent m4.xlarge workloads can move to t2.xlarge at a slight cost reduction, with higher burst performance.
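If you want to try one of these moves, the change is a stop/modify/start cycle with the AWS CLI. Here is a sketch; the instance ID is a placeholder, and `AWS_CMD` defaults to printing the commands rather than running them (set `AWS_CMD=aws` to execute for real):

```shell
# Resize an EC2 instance by stopping it, changing its type, and restarting it.
# AWS_CMD defaults to "echo aws" so this prints the calls it would make;
# set AWS_CMD=aws to actually run them against your account.
AWS_CMD="${AWS_CMD:-echo aws}"

resize_instance() {
    instance_id="$1"
    new_type="$2"
    $AWS_CMD ec2 stop-instances --instance-ids "$instance_id"
    $AWS_CMD ec2 wait instance-stopped --instance-ids "$instance_id"
    $AWS_CMD ec2 modify-instance-attribute --instance-id "$instance_id" \
        --instance-type "{\"Value\": \"$new_type\"}"
    $AWS_CMD ec2 start-instances --instance-ids "$instance_id"
}

# Example with a placeholder instance ID:
resize_instance i-0123456789abcdef0 t2.xlarge
```

Remember that the instance must be EBS-backed and stopped before its type can be changed, and that stopping it will release any instance-store data.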
The new instances are available today as On-Demand & Reserved Instances in all AWS regions.
Update: We have a webinar coming up on January 19th, where you can learn more. Sign up here.