S3 Intelligent-Tiering is the only cloud storage class that delivers automatic storage cost savings when data access patterns change, without performance impact or operational overhead.
S3 Intelligent-Tiering is the ideal storage class for data with unknown, changing, or unpredictable access patterns, independent of object size or retention period. You can use S3 Intelligent-Tiering as the default storage class for data lakes, analytics, and new applications.
Amazon S3 Intelligent-Tiering
The Amazon S3 Intelligent-Tiering storage class is designed to optimize storage costs by automatically moving data to the most cost-effective access tier when access patterns change. For a small monthly object monitoring and automation charge, S3 Intelligent-Tiering monitors access patterns and automatically moves objects that have not been accessed to lower cost access tiers. S3 Intelligent-Tiering delivers automatic storage cost savings in two low latency and high throughput access tiers. For data that can be accessed asynchronously, customers can choose to activate automatic archiving capabilities within the S3 Intelligent-Tiering storage class. There are no retrieval charges in S3 Intelligent-Tiering. If an object in the infrequent access tier is accessed later, it is automatically moved back to the frequent access tier. No additional tiering charges apply when objects are moved between access tiers within the S3 Intelligent-Tiering storage class.
- Frequent and Infrequent Access tiers have the same low latency and high throughput performance as S3 Standard, and the Infrequent Access tier saves up to 40% on storage costs
- Opt-in, asynchronous archive capabilities for objects that become rarely accessed
- Archive Access and Deep Archive Access tiers have the same performance as S3 Glacier and S3 Glacier Deep Archive and save up to 95% on storage costs for rarely accessed objects
- Designed for durability of 99.999999999% of objects across multiple Availability Zones and for 99.9% availability over a given year
- No operational overhead, no lifecycle charges, no retrieval charges, and no minimum storage duration
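Because tiering is handled by the storage class itself, no lifecycle rules are needed: an object can land in S3 Intelligent-Tiering at upload time simply by setting its storage class. A minimal boto3-style sketch (the bucket and key names are hypothetical):

```python
# Sketch: uploading an object directly into the S3 Intelligent-Tiering
# storage class. Any client exposing put_object(**kwargs) works here;
# with boto3 this maps to the real S3 PutObject API. Bucket/key names
# are illustrative, and real use requires AWS credentials.

def upload_intelligent_tiering(s3_client, bucket: str, key: str, body: bytes):
    """Store an object in S3 Intelligent-Tiering at upload time."""
    return s3_client.put_object(
        Bucket=bucket,
        Key=key,
        Body=body,
        StorageClass="INTELLIGENT_TIERING",  # no lifecycle rule required
    )

# Example (not executed here):
# import boto3
# s3 = boto3.client("s3")
# upload_intelligent_tiering(s3, "example-data-lake", "logs/2024/01/events.json", b"{}")
```

From that point on, S3 monitors the object's access pattern and moves it between tiers automatically.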
How it works — S3 Intelligent-Tiering
The S3 Intelligent-Tiering storage class automatically stores objects in two access tiers: one tier that is optimized for frequent access and another lower-cost tier that is optimized for infrequent access. For a low monthly object monitoring and automation charge, S3 Intelligent-Tiering monitors access patterns and automatically moves objects that have not been accessed for 30 consecutive days to the Infrequent Access tier, without performance impact or operational overhead. For data that can be accessed asynchronously, you can choose to activate automatic archiving capabilities within the S3 Intelligent-Tiering storage class. Once you enable automatic archiving, S3 Intelligent-Tiering will move objects that have not been accessed for 90 days to the Archive Access tier and after 180 days of no access to the Deep Archive Access tier. There are no retrieval charges in S3 Intelligent-Tiering. If an object in the Infrequent Access tier is accessed later, it is automatically moved back to the Frequent Access tier. S3 Intelligent-Tiering is the ideal storage class for data with unknown, changing, or unpredictable access patterns, independent of object size or retention period, such as data lakes, data analytics, and new applications.
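The transition behavior described above can be sketched as a small model. The function below is purely illustrative (it is not an AWS API); the thresholds mirror the paragraph: 30 days of no access moves an object to Infrequent Access, and with opt-in archiving enabled, 90 and 180 days move it to Archive Access and Deep Archive Access. Any access returns the object to the Frequent Access tier:

```python
# Illustrative model of S3 Intelligent-Tiering tier placement based on
# days since an object was last accessed. Not an AWS API; thresholds
# follow the documented behavior of the storage class.

def access_tier(days_since_access: int, archiving_enabled: bool = False) -> str:
    """Return the access tier an object would occupy."""
    if archiving_enabled and days_since_access >= 180:
        return "Deep Archive Access"
    if archiving_enabled and days_since_access >= 90:
        return "Archive Access"
    if days_since_access >= 30:
        return "Infrequent Access"
    # Any object accessed within 30 days sits in (or returns to) this tier.
    return "Frequent Access"

print(access_tier(10))          # Frequent Access
print(access_tier(45))          # Infrequent Access
print(access_tier(200))         # Infrequent Access (archiving not enabled)
print(access_tier(200, True))   # Deep Archive Access
```

Note that without opt-in archiving, an object never moves colder than the Infrequent Access tier, no matter how long it goes unaccessed.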
S3 Intelligent-Tiering has a small monthly per-object monitoring and automation charge and a minimum eligible object size of 128 KB for auto-tiering. Smaller objects can be stored, but they are always charged at the Frequent Access tier rates and do not incur the monitoring and automation charge. See the Amazon S3 Pricing page for more information.
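The 128 KB cutoff reduces to a simple size check; the sketch below is illustrative only (it assumes 1 KB = 1,024 bytes, and the helper names are ours, not AWS's):

```python
# Illustrative check of the 128 KB auto-tiering eligibility rule:
# smaller objects are stored but always billed at Frequent Access tier
# rates, and they incur no monitoring/automation charge.
# Assumption: 1 KB = 1,024 bytes.

MIN_AUTO_TIER_BYTES = 128 * 1024  # 128 KB

def is_auto_tier_eligible(size_bytes: int) -> bool:
    """Objects at or above 128 KB participate in automatic tiering."""
    return size_bytes >= MIN_AUTO_TIER_BYTES

def incurs_monitoring_charge(size_bytes: int) -> bool:
    # The monitoring/automation charge applies only to objects that
    # are eligible for auto-tiering.
    return is_auto_tier_eligible(size_bytes)

print(is_auto_tier_eligible(64 * 1024))    # False
print(is_auto_tier_eligible(256 * 1024))   # True
```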
To learn more, visit the S3 Intelligent-Tiering user guide.
Customers saving on storage with S3 Intelligent-Tiering
Electronic Arts (EA) is a global leader in digital interactive entertainment. EA makes games that reach more than 450 million players across console, PC, and mobile, including top franchises such as FIFA, Madden, and Battlefield. EA has transformed from a Hadoop-dominant environment to one centered on an AWS cloud storage based data lake on Amazon S3, including S3 Glacier for data archiving and long-term backup. To support its top games, EA's core telemetry systems routinely handle tens of petabytes, tens of thousands of tables, and more than 2 billion objects. EA used S3 Intelligent-Tiering to optimize storage costs for its data lake with changing access patterns.
"With minimal to no changes to our existing tools, we were able to reduce storage costs by 30% with S3 Intelligent-Tiering for data with unpredictable access patterns. This has helped our data infrastructure team concentrate on our core competencies related to game launches. Our collaboration with AWS allows us the ability to focus even more on growing and delighting our customers to continue inspiring the world to play."
Sundeep Narravula, principal technical director, EA
German live streaming service Joyn GmbH, a ProSiebenSat.1 and Discovery joint venture, is tapping into its deep content vault to bring subscribers exclusive, hyperlocal series and films from the past. To make this possible, Joyn recently transferred over 3 petabytes (PB) of media archives from an on-premises facility into Amazon S3 in under three months using 40 AWS Snowball appliances.
By using Amazon S3 Intelligent-Tiering, Joyn can keep all of its content online while storage is optimized automatically as access patterns change – without any impact on performance or operational overhead. Content from the archive that sees a lot of interest is kept in the Frequent Access tier, while content that draws less attention is moved to the Infrequent Access tier.
“It used to be that we’d have to be selective about which content we’d retrieve from our deep archive, or in some cases, what we’d keep on the archive, but now it’s a no brainer. We were able to grow our storage volume by a factor of 3x for the same total cost of ownership (TCO) by using S3 Intelligent-Tiering. It’s great to no longer have to think about deleting content to make space, and if inactive, the content sits in an infrequent access or archive tier.”
Stefan Haufe, Media Engineer, Joyn
Founded in 2008, Zalando is Europe’s leading online platform for fashion and lifestyle with over 32 million active customers. Amazon S3 is the cornerstone of the data infrastructure of Zalando, and they have utilized S3 Storage Classes to optimize storage costs.
"We are saving 37% annually in storage costs by using Amazon S3 Intelligent-Tiering to automatically move objects that have not been touched within 30 days to the infrequent-access tier."
Max Schultze, Lead Data Engineer, Zalando
Teespring, an online platform that lets creators turn unique ideas into custom merchandise, experienced rapid business growth, and the company’s data also grew exponentially—to a petabyte—and continued to increase. Like many cloud native companies, Teespring addressed the problem by using AWS, specifically storing data on Amazon S3.
By using S3 Intelligent-Tiering, Teespring now saves more than 30 percent on its monthly storage costs.
Using AWS, SimilarWeb manages large volumes of data, with which its data scientists build algorithms to improve its market-intelligence platform. By using S3 Intelligent-Tiering, SimilarWeb is able to democratize that data for its employees and save 20 percent on storage costs.
AppsFlyer is a leading mobile advertising attribution and marketing analytics platform. AppsFlyer stores data from its 100 billion events per day in a petabyte-scale data lake on Amazon S3. However, AppsFlyer had little insight into whether objects older than 365 days would be accessed frequently again in the future, and moving them to colder storage risked unexpected retrieval charges. AppsFlyer needed a different solution and found it in S3 Intelligent-Tiering.
AppsFlyer was able to make an informed decision to transition data to S3 Intelligent-Tiering, which yielded a cost reduction of 18% per GB stored. Reducing cost is important for AppsFlyer because it helps increase revenue and allows AppsFlyer to invest in new workloads.
“S3 Intelligent-Tiering allows us to make better use and be more cost efficient whenever we have to go to historical data and make changes on top of it.”
Reshef Mann, AppsFlyer’s CTO and Co-Founder
Embark is building self-driving truck technology to make roads safer and improve the efficiency of transportation. When the COVID-19 pandemic hit, Embark decided to pause its truck operations to fulfill its social responsibility to public health and ensure the safety of its workforce. Embark turned to its petabytes of historical data on Amazon S3 and developed systems that allowed it to leverage that data more deeply.
Engineers began pulling from years of historical data, poring over thousands of hours of driving data to find scenarios of interest, and using this data to build stronger simulations against which they could test their system. With all of Embark’s data stored using the S3 Intelligent-Tiering storage class, Embark didn’t have to spend time thinking about which data should be available and how to move data between different storage tiers to optimize costs while still enabling this sudden pattern of random access into its data lake.
S3 Intelligent-Tiering did all of the work of optimizing costs for them so that their team could focus all of their engineering efforts on building better data pipelines and simulation systems. With the help of AWS, Embark’s team was able to quickly adapt to the challenges of the pandemic and when the pause was lifted, they were able to continue their focus on delivering the safety and efficiency benefits of self-driving trucks.
S3 Intelligent-Tiering resources
Overview: S3 Intelligent-Tiering
Demo: Using S3 Intelligent-Tiering
re:Invent 2020: Optimizing cost in S3
S3 Intelligent-Tiering blog posts
S3 cost optimization for predictable and dynamic access patterns
Customers of all sizes and across industries tell us they are growing at an unprecedented scale – both their business and their data. This post helps you understand how to control your storage costs for workloads that have predictable and changing access patterns, and how to take action to implement changes to realize storage cost savings.
5 Ways to reduce data storage costs using Amazon S3 Storage Lens
If you have an increasing number of S3 buckets spread across tens or hundreds of accounts, you might be looking for a tool that makes it easier to manage your growing storage footprint and improve cost efficiency. This post gives you a basic understanding of how to use S3 Storage Lens to identify typical cost savings opportunities, and how to take action to realize those cost savings.
S3 Intelligent-Tiering Adds Opt-in Archive Access Tiers
We launched S3 Intelligent-Tiering in 2018, making it possible to take advantage of S3 cost savings without needing a deep understanding of your data access patterns. In 2020, we launched opt-in archiving capabilities that archive objects that are rarely accessed. These optimizations reduce the manual work you need to do to archive objects that have unpredictable access patterns and are not accessed for months at a time.
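The opt-in archive tiers are enabled per bucket through the S3 PutBucketIntelligentTieringConfiguration API. A hedged sketch of what that configuration looks like with boto3-style calls (the configuration Id and bucket name are hypothetical; the 90- and 180-day values shown are the minimums described above):

```python
# Sketch: opting a bucket into the Archive Access (90 days) and
# Deep Archive Access (180 days) tiers of S3 Intelligent-Tiering.
# The configuration Id is arbitrary; with boto3 the client call maps
# to the real PutBucketIntelligentTieringConfiguration API.

def build_archive_config(config_id: str = "archive-opt-in") -> dict:
    """Build an IntelligentTieringConfiguration enabling both archive tiers."""
    return {
        "Id": config_id,
        "Status": "Enabled",
        "Tierings": [
            {"Days": 90, "AccessTier": "ARCHIVE_ACCESS"},
            {"Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS"},
        ],
    }

def enable_archiving(s3_client, bucket: str) -> None:
    cfg = build_archive_config()
    s3_client.put_bucket_intelligent_tiering_configuration(
        Bucket=bucket,
        Id=cfg["Id"],
        IntelligentTieringConfiguration=cfg,
    )

# Example (not executed here):
# import boto3
# enable_archiving(boto3.client("s3"), "example-data-lake")
```

A filter can also be added to the configuration to scope archiving to a prefix or tags, so hot and cold datasets in the same bucket can be treated differently.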
Start saving today - migrate your storage to Amazon S3
The AWS Migration Acceleration Program for Storage consists of AWS services, best practices, tools, and incentives to help customers save costs and accelerate migrations of storage workloads to AWS. Workloads that are well suited for storage migration include on-premises data lakes, large unstructured data repositories, file shares, home directories, backups, and archives.
AWS offers more ways to help you reduce storage costs, and more options to migrate your data. That is why more customers choose AWS storage to build the foundation for their cloud IT environment. Learn more about the AWS Migration Acceleration Program for Storage »