On-premises storage can be costly and complex, with expensive hardware refresh cycles and data migrations driven by system upgrades. It is also difficult to gain insights when your data sits in silos across multiple storage systems.
With cloud storage, you can adjust capacity on the fly, use only the storage you need now, and avoid being locked into another hardware refresh. Moving to Amazon S3 keeps you agile, reduces costs by eliminating over-provisioning, provides virtually unlimited scale, and tears down data silos so you can gain insights from your data.
Lower your storage costs without sacrificing performance. Amazon S3 lets you take control of costs and continuously optimize your spend while building modern, scalable applications. Amazon S3 Storage Classes offer the flexibility to manage costs yourself or have optimization automated for you, by providing different data access levels at corresponding price points, including the lowest-cost cloud storage.
Amazon S3 Storage Classes
You can store your data across six distinct storage classes that are designed to accommodate different access requirements at corresponding costs.
These include S3 Standard for general-purpose storage of frequently accessed data; S3 Intelligent-Tiering for data with unknown or changing access patterns; S3 Standard-Infrequent Access (S3 Standard-IA) and S3 One Zone-Infrequent Access (S3 One Zone-IA) for long-lived, but less frequently accessed data; and Amazon S3 Glacier (S3 Glacier) and Amazon S3 Glacier Deep Archive (S3 Glacier Deep Archive) for long-term archive and digital preservation.
Amazon S3 Intelligent-Tiering
The S3 Intelligent-Tiering storage class is designed to optimize costs by automatically moving data to the most cost-effective access tier, without operational overhead. It works by storing objects in four access tiers: two low latency access tiers optimized for frequent and infrequent access, and two optional archive access tiers designed for asynchronous access that are optimized for rare access.
- The Frequent and Infrequent Access tiers deliver the same low latency and high-throughput performance as S3 Standard, saving up to 40% on storage costs
- Activate optional automatic archive capabilities for objects that become rarely accessed
- The Archive Access and Deep Archive Access tiers offer the same performance as S3 Glacier and S3 Glacier Deep Archive, saving up to 95% for rarely accessed objects
- Designed for durability of 99.999999999% of objects across multiple Availability Zones and for 99.9% availability over a given year
- Small monthly monitoring and auto-tiering charge
- No operational overhead, no lifecycle charges, no retrieval charges
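The optional archive tiers above are enabled per bucket with an Intelligent-Tiering configuration. A minimal sketch of that configuration as a plain Python dict, so it can be inspected without AWS credentials (the bucket name and configuration Id are hypothetical placeholders):

```python
# Sketch of an S3 Intelligent-Tiering archive configuration.
# The Id and bucket name below are illustrative placeholders.
intelligent_tiering_config = {
    "Id": "archive-after-90-days",
    "Status": "Enabled",
    "Tierings": [
        # Move objects not accessed for 90 days to the Archive Access tier...
        {"Days": 90, "AccessTier": "ARCHIVE_ACCESS"},
        # ...and after 180 days of no access, to the Deep Archive Access tier.
        {"Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS"},
    ],
}

# With boto3 this would be applied roughly as:
#   s3 = boto3.client("s3")
#   s3.put_bucket_intelligent_tiering_configuration(
#       Bucket="my-example-bucket",
#       Id=intelligent_tiering_config["Id"],
#       IntelligentTieringConfiguration=intelligent_tiering_config,
#   )
```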
How it works — S3 Intelligent-Tiering
The S3 Intelligent-Tiering storage class automatically stores objects in two access tiers: one tier that is optimized for frequent access and another lower-cost tier that is optimized for infrequent access. For a low monthly object monitoring and automation charge, S3 Intelligent-Tiering monitors access patterns and automatically moves objects that have not been accessed for 30 consecutive days to the Infrequent Access tier, without performance impact or operational overhead. For data that can be accessed asynchronously, you can choose to activate automatic archiving capabilities within the S3 Intelligent-Tiering storage class. Once you enable automatic archiving, S3 Intelligent-Tiering will move objects that have not been accessed for 90 days to the Archive Access tier and after 180 days of no access to the Deep Archive Access tier. There are no retrieval charges in S3 Intelligent-Tiering. If an object in the Infrequent Access tier is accessed later, it is automatically moved back to the Frequent Access tier. S3 Intelligent-Tiering is the ideal storage class for data with unknown, changing, or unpredictable access patterns, independent of object size or retention period, such as data lakes, data analytics, and new applications.
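The tiering behavior described above can be sketched as a small function. The 30/90/180-day thresholds and tier names come from the description; the function itself is only an illustration, not an AWS API:

```python
def intelligent_tiering_tier(days_since_last_access, archive_enabled=False):
    """Return the S3 Intelligent-Tiering access tier an object would occupy,
    per the 30/90/180-day rules described above (illustrative only)."""
    if archive_enabled and days_since_last_access >= 180:
        return "Deep Archive Access"   # optional, asynchronous access
    if archive_enabled and days_since_last_access >= 90:
        return "Archive Access"        # optional, asynchronous access
    if days_since_last_access >= 30:
        return "Infrequent Access"     # lower-cost, same latency as S3 Standard
    # Default tier; objects accessed again are moved back here automatically.
    return "Frequent Access"
```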
Amazon S3 Glacier Deep Archive
S3 Glacier Deep Archive is Amazon S3's lowest-cost storage class and supports long-term retention and digital preservation for data that may be accessed once or twice a year. It is designed for customers, particularly those in highly regulated industries such as financial services, healthcare, and the public sector, that retain data sets for 7-10 years or longer to meet regulatory compliance requirements.
- Designed for durability of 99.999999999% of objects across multiple Availability Zones
- Lowest-cost storage class, designed for long-term retention of data kept for 7-10 years or longer
- Ideal alternative to magnetic tape libraries
- Retrieval time within 12 hours
- S3 PUT API for direct uploads to S3 Glacier Deep Archive, and S3 Lifecycle management for automatic migration of objects
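Direct uploads simply set the storage class on the PUT request. A minimal sketch of the request parameters as a plain dict so the example runs without AWS credentials; the bucket name and key are hypothetical placeholders:

```python
# Parameters for a direct upload to S3 Glacier Deep Archive.
# Bucket and key are illustrative placeholders.
put_kwargs = {
    "Bucket": "my-archive-bucket",
    "Key": "compliance/2023/records.tar.gz",
    "Body": b"...archive bytes...",
    # Store the object directly in Glacier Deep Archive, skipping S3 Standard.
    "StorageClass": "DEEP_ARCHIVE",
}

# With boto3: boto3.client("s3").put_object(**put_kwargs)
```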
Performance across the S3 Storage Classes
† Because S3 One Zone-IA stores data in a single AWS Availability Zone, data stored in this storage class will be lost in the event of Availability Zone destruction.
* S3 Intelligent-Tiering charges a small monitoring and automation fee and has a minimum eligible object size of 128 KB for auto-tiering. Smaller objects may be stored, but they are always charged at the Frequent Access tier rates and are not charged the monitoring and automation fee. See Amazon S3 pricing for more information.
** Standard retrievals in the S3 Intelligent-Tiering Archive Access and Deep Archive Access tiers are free. If you need faster access to your objects in the Archive Access tiers, you can pay for expedited retrieval using the S3 console.
*** First-byte latency in the S3 Intelligent-Tiering Frequent and Infrequent Access tiers is milliseconds; in the Archive Access and Deep Archive Access tiers it is minutes or hours.
Data lifecycle management tools to optimize costs
Amazon S3 offers several features for organizing and managing your data so you can optimize your storage for your access patterns and costs.
S3 Lifecycle management
To store your objects cost effectively throughout their lifecycle, use Amazon S3 Lifecycle. An S3 Lifecycle configuration is a set of rules that define actions Amazon S3 applies to a group of objects: you can transition objects to another storage class or expire (delete) objects that are no longer needed. Learn more.
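A minimal sketch of a lifecycle configuration, built as a plain dict so it runs without AWS credentials. The prefix, timings, and target classes are illustrative choices, not recommendations:

```python
# Sketch of an S3 Lifecycle configuration: transition objects under the
# hypothetical "logs/" prefix to S3 Standard-IA after 30 days, to S3
# Glacier after 90 days, and delete them after 365 days.
lifecycle_config = {
    "Rules": [
        {
            "ID": "tier-then-expire-logs",
            "Status": "Enabled",
            "Filter": {"Prefix": "logs/"},
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": 365},
        }
    ]
}

# With boto3: boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-example-bucket", LifecycleConfiguration=lifecycle_config)
```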
S3 Storage Class Analysis
Use S3 Storage Class Analysis to monitor data access patterns and decide when to transition data to the right storage class. You can then use this information to configure S3 Lifecycle policies that move data to the appropriate storage class. Learn more.
S3 Pricing Calculator
Configure a cost estimate that fits your unique business needs with Amazon S3 by using the pricing calculator.
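A back-of-the-envelope version of such an estimate can be sketched in a few lines. The per-GB-month prices below are illustrative placeholders only, not current AWS pricing; use the pricing calculator for real figures:

```python
# Illustrative per-GB-month prices (NOT current AWS pricing -- use the
# pricing calculator for real numbers).
PRICE_PER_GB_MONTH = {
    "STANDARD": 0.023,
    "STANDARD_IA": 0.0125,
    "GLACIER": 0.004,
    "DEEP_ARCHIVE": 0.00099,
}

def monthly_storage_cost(gb_by_class):
    """Sum monthly storage cost across classes, given GB stored per class."""
    return sum(gb * PRICE_PER_GB_MONTH[cls] for cls, gb in gb_by_class.items())

# e.g. 1 TB of hot data plus 10 TB of deep archive:
estimate = monthly_storage_cost({"STANDARD": 1024, "DEEP_ARCHIVE": 10240})
```

Note how shifting the bulk of the data into the archive class dominates the savings: the 10 TB in deep archive costs less than half as much as the single TB kept hot, under these illustrative prices.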
"With S3, we were able to reduce storage costs by over 40%."
Founded in 2008, Zalando is Europe's leading online platform for fashion and lifestyle, with over 32 million active customers. Amazon S3 is the cornerstone of Zalando's data infrastructure, and the company has used S3 Storage Classes to optimize storage costs.
"We are saving 37% annually in storage costs by using Amazon S3 Intelligent-Tiering to automatically move objects that have not been touched within 30 days to the infrequent-access tier."
Max Schultze, Lead Data Engineer - Zalando
Teespring, an online platform that lets creators turn unique ideas into custom merchandise, experienced rapid business growth, and the company’s data also grew exponentially—to a petabyte—and continued to increase. Like many cloud native companies, Teespring addressed the problem by using AWS, specifically storing data on Amazon S3.
By using Amazon S3 Glacier and S3 Intelligent-Tiering, Teespring now saves more than 30 percent on its monthly storage costs.
Photobox is an online, personalized photo-products company that serves millions of customers each year in over ten markets. It wanted to get out of the business of owning and maintaining its own IT infrastructure so it could redeploy resources toward innovation in artificial intelligence and other areas to create a better customer experience.
By migrating from its EMC Isilon and IBM Cleversafe on-premises storage arrays to Amazon S3 using AWS Snowball Edge, Photobox significantly reduced storage costs for its 10 PB of photos.
To save on storage costs, customers choose Amazon S3.
Start saving today - migrate your storage to Amazon S3
The AWS Migration Acceleration Program for Storage combines AWS services, best practices, tools, and incentives to help customers reduce costs and accelerate migrations of storage workloads to AWS. Workloads well suited for storage migration include on-premises data lakes, large unstructured data repositories, file shares, home directories, backups, and archives.
AWS offers more ways to help you reduce storage costs and more options to migrate your data. That is why more customers choose AWS storage to build the foundation for their cloud IT environment. Learn more about the AWS Migration Acceleration Program for Storage.
Cost optimization resources
Webinar: Optimize cost with Storage Classes
re:Invent 2019: Optimizing cost in S3
Webinar: S3 Glacier Deep Archive
Durability and Global Resiliency
Amazon S3 is designed to deliver 99.999999999% (11 9s) data durability. S3 automatically creates copies of all uploaded objects and stores them across at least three Availability Zones (AZs), with the exception of S3 One Zone-IA. This multi-AZ resiliency model protects your data against site-level failures. Watch the video to learn more about what 11 9s of durability means for your data and global resiliency.
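As a rough illustration of what 11 9s of designed durability implies, here is a simplified expected-value calculation (an illustration of the design target, not an AWS guarantee):

```python
# 99.999999999% durability implies a designed average annual loss rate of
# 0.000000001% (about 1e-11) per object per year.
annual_loss_rate = 1 - 0.99999999999   # ~1e-11
objects_stored = 10_000_000            # ten million objects

# Expected objects lost per year at this scale:
expected_losses_per_year = objects_stored * annual_loss_rate  # ~0.0001

# Equivalently, about one object lost every ~10,000 years on average.
years_per_single_loss = 1 / expected_losses_per_year
```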