Optimizing storage costs using Amazon S3

Save with cost-efficient storage classes and lifecycle management tools

On-premises storage can be costly and complex, with expensive hardware refresh cycles and data migrations forced by system upgrades. It is also difficult to gain insights when your data sits in silos across multiple storage systems.

With cloud storage, you can adjust on the fly, pay only for the storage you need now, and avoid being locked into another hardware refresh. Moving to Amazon S3 keeps you agile, reduces costs by eliminating over-provisioning, and provides virtually unlimited scale, while also tearing down data silos so you can gain insights from your data.

Lower your storage costs without sacrificing performance with Amazon S3. Amazon S3 lets you take control of costs and continuously optimize your spend while building modern, scalable applications. S3 Storage Classes give you the flexibility to manage your costs yourself, or to have cost optimization automated for you, by providing different levels of data access at corresponding costs, including the lowest-cost cloud storage.

S3 Storage Classes

Automatic savings

with S3 Intelligent-Tiering, which optimizes storage costs for you.

Every workload

S3 Storage Classes optimize costs and performance for all workloads.

99.999999999%

11 9's of durability across all storage classes.

Lowest cost

storage in the cloud with S3 Glacier Deep Archive.

Amazon S3 Storage Classes

You can store your data across 6 distinct storage classes that are designed to accommodate different access requirements at corresponding costs.

These include S3 Standard for general-purpose storage of frequently accessed data; S3 Intelligent-Tiering for data with unknown or changing access patterns; S3 Standard-Infrequent Access (S3 Standard-IA) and S3 One Zone-Infrequent Access (S3 One Zone-IA) for long-lived, but less frequently accessed data; and Amazon S3 Glacier (S3 Glacier) and Amazon S3 Glacier Deep Archive (S3 Glacier Deep Archive) for long-term archive and digital preservation. 

S3 Storage Classes feature page »

Introduction to S3 Storage Classes
Introduction to S3 Intelligent-Tiering

Amazon S3 Intelligent-Tiering

The S3 Intelligent-Tiering storage class is designed to optimize costs by automatically moving data to the most cost-effective access tier, without operational overhead. It works by storing objects in four access tiers: two low latency access tiers optimized for frequent and infrequent access, and two optional archive access tiers designed for asynchronous access that are optimized for rare access. 

  • The Frequent and Infrequent Access tiers have the same low latency and high throughput performance as S3 Standard, and the Infrequent Access tier saves up to 40% on storage costs
  • Activate the optional archive capabilities to automatically archive objects that become rarely accessed
  • The Archive Access and Deep Archive Access tiers have the same performance as S3 Glacier and S3 Glacier Deep Archive and save up to 95% for rarely accessed objects
  • Designed for durability of 99.999999999% of objects across multiple Availability Zones and for 99.9% availability over a given year
  • Small monthly monitoring and auto-tiering fee
  • No operational overhead, no lifecycle fees, and no retrieval fees
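As an illustration, a new object can be placed directly into S3 Intelligent-Tiering at upload time by setting the storage class on the PUT request. The sketch below builds the request parameters in the shape that boto3's put_object accepts; the bucket and key names are placeholders, and actually performing the upload requires AWS credentials and the boto3 package.

```python
# Sketch (not an official AWS sample): uploading an object directly into
# S3 Intelligent-Tiering by setting StorageClass on the PUT request.
# "example-bucket" and the object key are hypothetical placeholder names.

put_object_params = {
    "Bucket": "example-bucket",              # placeholder bucket name
    "Key": "logs/2023/06/app.log",           # placeholder object key
    "Body": b"example payload",
    "StorageClass": "INTELLIGENT_TIERING",   # S3 storage class identifier
}

# With boto3 and valid credentials, the upload itself would be:
#   import boto3
#   boto3.client("s3").put_object(**put_object_params)
```

Objects uploaded this way start in the Frequent Access tier and are tiered down automatically as their access patterns change.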

How it works — S3 Intelligent-Tiering

Objects uploaded or transitioned to S3 Intelligent-Tiering are automatically stored in the Frequent Access tier.

S3 Intelligent-Tiering works by monitoring access patterns and then moving the objects that have not been accessed in 30 consecutive days to the Infrequent Access tier. Once you have activated one or both of the archive access tiers, S3 Intelligent-Tiering will automatically move objects that haven’t been accessed for 90 consecutive days to the Archive Access tier and then after 180 consecutive days of no access to the Deep Archive Access tier. If the objects are accessed later, S3 Intelligent-Tiering moves the objects back to the Frequent Access tier. There are no retrieval fees, so you won’t see unexpected increases in storage bills when access patterns change.
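The two archive access tiers are opt-in and are activated per bucket. A minimal sketch of that configuration, in the shape accepted by boto3's put_bucket_intelligent_tiering_configuration, might look like the following; the configuration Id, bucket name, and prefix filter are placeholder assumptions.

```python
# Sketch: opting a bucket into the Archive Access and Deep Archive Access
# tiers of S3 Intelligent-Tiering. Id, bucket, and prefix are placeholders.

archive_config = {
    "Id": "archive-after-90-days",        # hypothetical configuration Id
    "Filter": {"Prefix": "raw-data/"},    # hypothetical prefix; omit to apply bucket-wide
    "Status": "Enabled",
    "Tierings": [
        {"Days": 90,  "AccessTier": "ARCHIVE_ACCESS"},       # 90 days of no access
        {"Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS"},  # 180 days of no access
    ],
}

# Applied with boto3 (requires credentials):
#   import boto3
#   boto3.client("s3").put_bucket_intelligent_tiering_configuration(
#       Bucket="example-bucket",
#       Id=archive_config["Id"],
#       IntelligentTieringConfiguration=archive_config,
#   )
```

The 90- and 180-day values mirror the thresholds described above, at which S3 Intelligent-Tiering moves untouched objects to the Archive Access and Deep Archive Access tiers.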

Amazon S3 Glacier Deep Archive

S3 Glacier Deep Archive is Amazon S3's lowest-cost storage class and supports long-term retention and digital preservation for data that may be accessed once or twice a year. It is designed for customers, particularly those in highly regulated industries such as financial services, healthcare, and the public sector, that retain data sets for 7-10 years or longer to meet regulatory compliance requirements.

  • Designed for durability of 99.999999999% of objects across multiple Availability Zones
  • Lowest cost storage class designed for long-term retention of data that will be retained for 7-10 years
  • Ideal alternative to magnetic tape libraries
  • Retrieval time within 12 hours
  • S3 PUT API for direct uploads to S3 Glacier Deep Archive, and S3 Lifecycle management for automatic migration of objects

Webinar: Amazon S3 Glacier Deep Archive
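To read an archived object again, you first issue a restore request; the 12-hour figure above corresponds to the standard retrieval tier for S3 Glacier Deep Archive. Below is a hedged sketch of the restore parameters, in the shape boto3's restore_object accepts; the bucket name, key, and retention period for the restored copy are placeholder choices.

```python
# Sketch: requesting a temporary restored copy of an object stored in
# S3 Glacier Deep Archive. Bucket/key names and Days are placeholders.

restore_request = {
    "Days": 7,  # how long the restored copy remains available
    "GlacierJobParameters": {
        "Tier": "Standard",  # standard retrieval, typically within 12 hours
    },
}

# With boto3 (requires credentials):
#   import boto3
#   boto3.client("s3").restore_object(
#       Bucket="example-bucket",
#       Key="archives/2014/records.tar",
#       RestoreRequest=restore_request,
#   )
```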

Performance across the S3 Storage Classes

|  | S3 Standard | S3 Intelligent-Tiering* | S3 Standard-IA | S3 One Zone-IA† | S3 Glacier | S3 Glacier Deep Archive |
| --- | --- | --- | --- | --- | --- | --- |
| Designed for durability | 99.999999999% (11 9's) | 99.999999999% (11 9's) | 99.999999999% (11 9's) | 99.999999999% (11 9's) | 99.999999999% (11 9's) | 99.999999999% (11 9's) |
| Designed for availability | 99.99% | 99.9% | 99.9% | 99.5% | 99.99% | 99.99% |
| Availability SLA | 99.9% | 99% | 99% | 99% | 99.9% | 99.9% |
| Availability Zones | ≥3 | ≥3 | ≥3 | 1 | ≥3 | ≥3 |
| Minimum capacity charge per object | N/A | N/A | 128KB | 128KB | 40KB | 40KB |
| Minimum storage duration charge | N/A | 30 days | 30 days | 30 days | 90 days | 180 days |
| Retrieval fee | N/A | N/A** | per GB retrieved | per GB retrieved | per GB retrieved | per GB retrieved |
| First byte latency | milliseconds | milliseconds*** | milliseconds | milliseconds | select minutes or hours | select hours |

† Because S3 One Zone-IA stores data in a single AWS Availability Zone, data stored in this storage class will be lost in the event of Availability Zone destruction.

* S3 Intelligent-Tiering charges a small tiering fee and has a minimum eligible object size of 128KB for auto-tiering. Smaller objects may be stored but will always be charged at the Frequent Access tier rates. See the Amazon S3 pricing page for more information.

** Standard retrievals from the Archive Access and Deep Archive Access tiers are free. If you need faster access to objects in the archive access tiers, you can pay for expedited retrieval using the S3 console.

*** S3 Intelligent-Tiering first byte latency is milliseconds for the Frequent and Infrequent Access tiers, and minutes or hours for the Archive Access and Deep Archive Access tiers.

Data lifecycle management tools to optimize costs

Amazon S3 provides features that help you organize and manage your data for cost efficiency, so you can match your storage to your access patterns and cost requirements.

S3 Lifecycle management

S3 Lifecycle

To store your objects cost effectively throughout their lifecycle, use Amazon S3 Lifecycle. An S3 Lifecycle configuration is a set of rules that define actions Amazon S3 applies to a group of objects: rules can transition objects to another storage class or expire (delete) objects. Learn more.
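For illustration, a lifecycle configuration that tiers objects down through the storage classes over time and eventually expires them could be sketched as below, in the shape accepted by boto3's put_bucket_lifecycle_configuration. The prefix and the day thresholds are placeholder choices for this example, not recommendations.

```python
# Sketch: a lifecycle rule that transitions objects to colder storage
# classes over time, then deletes them. Prefix/thresholds are placeholders.

lifecycle_rules = {
    "Rules": [
        {
            "ID": "tier-down-logs",            # hypothetical rule name
            "Filter": {"Prefix": "logs/"},     # hypothetical prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30,  "StorageClass": "STANDARD_IA"},
                {"Days": 90,  "StorageClass": "GLACIER"},
                {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
            ],
            "Expiration": {"Days": 2555},      # roughly 7 years, then delete
        }
    ]
}

# Applied with boto3 (requires credentials):
#   import boto3
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="example-bucket",
#       LifecycleConfiguration=lifecycle_rules,
#   )
```

Each transition moves matching objects to a cheaper class once they reach the given age; the expiration action then removes them entirely.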

S3 Storage Class Analysis

S3 Storage Class Analysis

Use S3 Storage Class Analysis to analyze data access patterns and help you decide when to transition the right data to the right storage class. After using S3 Storage Class Analysis to monitor access patterns, you can use this information to configure S3 Lifecycle policies that transition the data to the appropriate storage class. Learn more.
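Storage Class Analysis is also enabled per bucket. Below is a sketch of a configuration that analyzes one prefix and exports daily results as CSV, in the shape boto3's put_bucket_analytics_configuration accepts; all names, prefixes, and the destination ARN are placeholders.

```python
# Sketch: enabling Storage Class Analysis on a prefix and exporting the
# analysis results as CSV to another bucket. Names/ARNs are placeholders.

analytics_config = {
    "Id": "analyze-media-prefix",          # hypothetical configuration Id
    "Filter": {"Prefix": "media/"},        # hypothetical prefix to analyze
    "StorageClassAnalysis": {
        "DataExport": {
            "OutputSchemaVersion": "V_1",
            "Destination": {
                "S3BucketDestination": {
                    "Format": "CSV",
                    "Bucket": "arn:aws:s3:::example-report-bucket",  # placeholder ARN
                    "Prefix": "storage-class-analysis/",
                }
            },
        }
    },
}

# Applied with boto3 (requires credentials):
#   import boto3
#   boto3.client("s3").put_bucket_analytics_configuration(
#       Bucket="example-bucket",
#       Id=analytics_config["Id"],
#       AnalyticsConfiguration=analytics_config,
#   )
```

The exported reports show how much data in the prefix is frequently versus infrequently accessed, which is the input you need when writing the lifecycle rules described above.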

S3 Pricing Calculator

S3 pricing tools

Configure a cost estimate that fits your unique business needs for Amazon S3 by using the AWS Pricing Calculator.

S3 Data Lifecycle Management Tools

You can use S3 Storage Class Analysis to learn data access patterns and S3 Lifecycle rules to transition less frequently accessed objects to lower-cost storage classes.

Developer guide: S3 Storage Class Analysis »

Developer guide: S3 Lifecycle policies »

Creating cost efficiencies across your storage resources

Customers

Sysco saves with S3

"With S3, we were able to reduce storage costs by over 40%."

Zalando

Founded in 2008, Zalando is Europe's leading online platform for fashion and lifestyle, with over 32 million active customers. Amazon S3 is the cornerstone of Zalando's data infrastructure, and the company uses S3 Storage Classes to optimize storage costs.

"We are saving 37% annually in storage costs by using Amazon S3 Intelligent-Tiering to automatically move objects that have not been touched within 30 days to the infrequent-access tier."

Max Schultze, Lead Data Engineer - Zalando

Read the customer blog post >>

Teespring

Teespring, an online platform that lets creators turn unique ideas into custom merchandise, experienced rapid business growth, and the company’s data also grew exponentially—to a petabyte—and continued to increase. Like many cloud native companies, Teespring addressed the problem by using AWS, specifically storing data on Amazon S3. 

By using Amazon S3 Glacier and S3 Intelligent-Tiering, Teespring now saves more than 30 percent on its monthly storage costs.

Read the customer blog post >>

Photobox

Photobox wanted to get out of the business of owning and maintaining its own IT infrastructure so it could redeploy resources toward innovation in artificial intelligence and other areas to create a better customer experience. Photobox is an online, personalized photo-products company that serves millions of customers each year in over ten markets.

By migrating from its EMC Isilon and IBM Cleversafe on-premises storage arrays to Amazon S3 using AWS Snowball Edge, Photobox saved a significant amount on costs on storage for its 10 PB of photo storage.

Watch the case study video >>

To save on storage costs, customers choose Amazon S3.

Start saving today - migrate your storage to Amazon S3

The AWS Migration Acceleration Program for Storage consists of AWS services, best practices, tools, and incentives that help customers save costs and accelerate migrations of storage workloads to AWS. Workloads well suited for storage migration include on-premises data lakes, large unstructured data repositories, file shares, home directories, backups, and archives.

AWS offers more ways to help you reduce storage costs and more options to migrate your data. That is why more customers choose AWS storage to build the foundation of their cloud IT environment. Learn more about the AWS Migration Acceleration Program for Storage.

Cost optimization resources

Webinar: Optimize cost with Storage Classes

Cost optimization guidelines for Amazon S3

re:Invent 2019: Optimizing cost in S3

Guidelines and design patterns to optimize cost in S3

Webinar: S3 Glacier Deep Archive

The cheapest storage in the cloud
Data Durability and Global Resiliency

Durability and Global Resiliency

Amazon S3 is designed to deliver 99.999999999% data durability. S3 automatically creates copies of all uploaded objects and stores them across at least three Availability Zones (AZs); the exception is S3 One Zone-IA, which stores data in a single AZ. This means your data is protected by a multi-AZ resiliency model against site-level failures. Watch the video to learn more about what the 11 9's of durability mean for your data and global resiliency.

AWS global infrastructure »

Amazon S3 durability & data protection FAQs »

Protecting data in Amazon S3 »

Start saving on storage costs by using Amazon S3 

Learn more at the S3 resources page
Ready to build?
Get started with Amazon S3
Have more questions?
Contact us