AWS Storage Blog

What’s new with Amazon S3? A recap of 2020 launches

At AWS, we have a culture of innovation, and that is apparent for Amazon S3 when you look back at the number of features delivered this year to help you easily manage and secure your data, optimize costs, or use S3 in new ways. Meaningful innovation thrives on AWS. So, whether you’re looking to invent new services continuously, extract value from data efficiently, or modernize IT rapidly, you can do it on AWS and Amazon S3.

The rapid pace of innovation brings many benefits, but it can be a challenge to keep track of every launch and how it can help you manage your data. In this post, I provide a recap of what’s new with Amazon S3, with links to all of the 2020 launches that help make your data more secure, consistent, and visible, and help optimize and lower your storage costs. You can also view the on-demand re:Invent session, What’s new with Amazon S3, to get up to speed on the majority of the launches covered below. For a list of Amazon S3 re:Invent sessions that are now on-demand, you can read this post, and for the upcoming re:Invent sessions (Jan 12-14), you can bookmark this post.

Amazon S3 now delivers strong read-after-write consistency

At re:Invent 2020, we announced a major update for Amazon S3, with the strong consistency launch. Amazon S3 pioneered object storage in the cloud with high availability, performance, and virtually unlimited scalability, with eventual consistency. Increasingly, customers are using big data analytics applications that often require access to an object immediately after a write. Without strong consistency, you would have to insert custom code into these applications, or provision databases to keep objects consistent with any changes in Amazon S3 across millions or billions of objects. Amazon S3 now delivers strong read-after-write consistency automatically for all applications. Unlike other cloud providers, Amazon S3 delivers strong read-after-write consistency for any storage request, without changes to performance or availability, without sacrificing regional isolation for applications, and at no additional cost.

S3 Intelligent-Tiering adds Archive Access Tiers – further optimizes costs

Amazon S3 Intelligent-Tiering now supports automatic data archiving to further reduce storage costs by up to 95% when objects become rarely accessed over long periods of time. The S3 Intelligent-Tiering storage class is the first and only cloud storage that automatically optimizes customers’ storage costs. S3 Intelligent-Tiering delivers millisecond latency and high throughput performance for frequently and infrequently accessed data in the Frequent and Infrequent Access Tiers, and now the lowest storage costs in the cloud when data is rarely accessed in the Deep Archive Access Tier.

S3 Intelligent-Tiering delivers automatic cost savings by moving objects between four access tiers when access patterns change. S3 Intelligent-Tiering stores objects in four access tiers: two low latency access tiers optimized for frequent and infrequent access, and now two opt-in archive access tiers designed for asynchronous access and optimized for rare access at low costs. To see how S3 Intelligent-Tiering with automatic archiving works, read the blog post, or visit the S3 cost optimization page.
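To make the opt-in archive tiers concrete, here is a minimal sketch of an S3 Intelligent-Tiering archive configuration, expressed as the request payload shape that boto3’s `put_bucket_intelligent_tiering_configuration` call accepts (the configuration ID and bucket name are placeholders, and the 90/180-day thresholds are just example values):

```python
# A sketch of the opt-in archive tiers for S3 Intelligent-Tiering.
# The dict follows the shape accepted by boto3's
# put_bucket_intelligent_tiering_configuration; the Id is a placeholder.
archive_config = {
    "Id": "archive-after-90-days",   # placeholder configuration name
    "Status": "Enabled",
    # Objects not accessed for 90 days move to the Archive Access tier;
    # after 180 days without access, to the Deep Archive Access tier.
    "Tierings": [
        {"Days": 90, "AccessTier": "ARCHIVE_ACCESS"},
        {"Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS"},
    ],
}

# With AWS credentials configured, this could be applied with:
# boto3.client("s3").put_bucket_intelligent_tiering_configuration(
#     Bucket="my-example-bucket", Id=archive_config["Id"],
#     IntelligentTieringConfiguration=archive_config)
```

The configuration is per bucket (optionally scoped by filter), so you can opt in archiving only for the prefixes where asynchronous retrieval is acceptable.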

You can learn more about optimizing your cost and performance on Amazon S3 in the re:Invent session on Jan 12 – Better, faster, and lower-cost storage: Optimizing Amazon S3.

https://youtu.be/dZOQb7TWT4c

S3 Storage Lens delivers organization-wide visibility into object storage usage and activity trends

Customers use Amazon S3 to store large shared data sets across tens to hundreds of accounts and buckets, multiple regions, and thousands of prefixes. Now, S3 Storage Lens delivers storage metrics across all accounts, regions, and buckets to help you understand, analyze, and optimize your storage. S3 Storage Lens delivers organization-wide visibility into your object storage usage and activity trends, and makes actionable recommendations to improve cost-efficiency and apply data protection best practices. S3 Storage Lens is the first cloud storage analytics solution to provide a single view of object storage usage and activity across tens to hundreds of accounts in an AWS organization, with drill-downs to generate insights at the account, bucket, or even prefix level. Drawing from more than 14 years of experience helping customers optimize storage, S3 Storage Lens analyzes organization-wide metrics to deliver contextual recommendations to find ways to reduce your storage costs and apply best practices on data protection. To learn more about Amazon S3 Storage Lens, visit the webpage, read the blog post, or read the documentation.
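Beyond the default dashboard, Storage Lens configurations can be managed programmatically. Below is a minimal sketch of a Storage Lens configuration payload in the shape accepted by the S3 Control `PutStorageLensConfiguration` API via boto3 (the dashboard ID and account ID are placeholders):

```python
# A minimal sketch of an S3 Storage Lens configuration; the Id and the
# account ID in the commented call are placeholders.
storage_lens_config = {
    "Id": "org-wide-dashboard",   # placeholder dashboard name
    "IsEnabled": True,
    # Collect account-level usage metrics, with bucket-level drill-down.
    "AccountLevel": {
        "BucketLevel": {},        # enable bucket-level metrics
    },
}

# With AWS credentials configured, this could be applied with:
# boto3.client("s3control").put_storage_lens_configuration(
#     ConfigId=storage_lens_config["Id"], AccountId="111122223333",
#     StorageLensConfiguration=storage_lens_config)
```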

You can learn more about S3 Storage Lens in the upcoming re:Invent session on Jan 12 – Bytes to insights: Analyze and take action on your storage usage.

S3 Replication launches to simplify and enhance your replication strategy

S3 Replication is an elastic, fully managed, low-cost feature that replicates objects between buckets. This feature had some major launches throughout the year that will help simplify your replication strategy and give you the controls and flexibility to meet your data backup, data sovereignty, and other business needs.

With Amazon S3 Replication, you can configure Amazon S3 to automatically replicate S3 objects across different AWS Regions by using Amazon S3 Cross-Region Replication (CRR) or between buckets in the same AWS Region by using Amazon S3 Same-Region Replication (SRR). S3 Replication launched the flexibility of replicating to multiple destination buckets in the same or different AWS Regions (Dec 2020). S3 Replication also now supports two-way replication between two or more buckets in the same or different AWS Regions (Dec 2020). Customers needing a predictable replication time backed by a Service Level Agreement (SLA) can use Replication Time Control (RTC) to replicate objects in less than 15 minutes (Nov 2019). Amazon S3 Replication now provides detailed metrics and notifications to monitor the status of object replication between buckets (Nov 2020). You can monitor replication progress by tracking bytes pending, operations pending, and replication latency between your source and destination buckets. And finally, S3 Replication is now able to replicate delete markers from one S3 bucket to another, to protect data from accidental deletions.
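To show how several of these launches combine, here is a hedged sketch of a replication configuration with two destination buckets, Replication Time Control, replication metrics, and delete marker replication, in the payload shape accepted by boto3’s `put_bucket_replication` call. All bucket ARNs, the IAM role, and rule IDs are placeholders:

```python
# A sketch combining multi-destination replication, RTC, metrics, and
# delete marker replication. All ARNs and names below are placeholders.
replication_config = {
    "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
    "Rules": [
        {
            "ID": "to-backup-bucket",
            "Priority": 1,
            "Status": "Enabled",
            "Filter": {},  # empty filter: apply to all objects
            # Replicate delete markers to keep the backup in sync.
            "DeleteMarkerReplication": {"Status": "Enabled"},
            "Destination": {
                "Bucket": "arn:aws:s3:::example-backup-bucket",
                # RTC: SLA-backed replication in under 15 minutes.
                "ReplicationTime": {"Status": "Enabled",
                                    "Time": {"Minutes": 15}},
                "Metrics": {"Status": "Enabled",
                            "EventThreshold": {"Minutes": 15}},
            },
        },
        {
            "ID": "to-analytics-bucket",  # second destination (Dec 2020)
            "Priority": 2,
            "Status": "Enabled",
            "Filter": {},
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {
                "Bucket": "arn:aws:s3:::example-analytics-bucket"},
        },
    ],
}

# With AWS credentials configured, this could be applied with:
# boto3.client("s3").put_bucket_replication(
#     Bucket="example-source-bucket",
#     ReplicationConfiguration=replication_config)
```

Each rule carries its own destination and settings, which is how one source bucket fans out to multiple destination buckets.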

To summarize, here are the launches for S3 Replication:

- Replication to multiple destination buckets in the same or different AWS Regions (Dec 2020)
- Two-way replication between two or more buckets in the same or different AWS Regions (Dec 2020)
- Replication metrics and notifications to monitor the status of object replication (Nov 2020)
- Delete marker replication, to protect data from accidental deletions
- Replication Time Control (RTC), with an SLA to replicate objects in less than 15 minutes (Nov 2019)

Security and access control launches

We added a number of important security features, conditions, and enhancements in 2020 that will make it easier for you to secure your data in Amazon S3. At re:Invent, we launched S3 Bucket Keys, which reduce the costs of Server-Side Encryption with AWS KMS (SSE-KMS) by up to 99%, and we also pre-announced AWS PrivateLink support for Amazon S3, which is coming soon. In October, we launched three new security and access control features, including S3 Object Ownership, which enables bucket owners to automatically assume ownership of objects uploaded to their buckets, and S3 Bucket Owner Condition, which helps validate correct bucket ownership. You can read about these in one place in the Jeff Barr blog post, or visit the what’s new posts for each launch below to get the specific details. Two other major updates include the enhancements and 80%+ price reduction for Amazon Macie, which automatically and continually evaluates every bucket to alert on any publicly accessible buckets, unencrypted buckets, or buckets shared or replicated with AWS accounts outside of a customer’s organization, and the expansion of Amazon GuardDuty threat detection coverage for S3 to monitor your data for highly suspicious data access and anomalies. I would also be remiss not to mention two major launches from re:Invent 2019: S3 Access Points, which make it simple to manage access at scale for applications using shared data sets, and Access Analyzer for S3, which helps you review and remediate unintended access to S3.
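As an illustration of the S3 Bucket Keys launch, here is a minimal sketch of a bucket default encryption configuration that enables a bucket key, in the payload shape accepted by boto3’s `put_bucket_encryption` call (the KMS key ARN and bucket name are placeholders):

```python
# A sketch of SSE-KMS default encryption with S3 Bucket Keys enabled.
# The KMS key ARN below is a placeholder.
encryption_config = {
    "Rules": [
        {
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID":
                    "arn:aws:kms:us-east-1:111122223333:key/example-key-id",
            },
            # One bucket-level key is used instead of a KMS request per
            # object, which is what cuts SSE-KMS request costs.
            "BucketKeyEnabled": True,
        }
    ]
}

# With AWS credentials configured, this could be applied with:
# boto3.client("s3").put_bucket_encryption(
#     Bucket="my-example-bucket",
#     ServerSideEncryptionConfiguration=encryption_config)
```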

You can learn about a lot of these security updates in the upcoming re:Invent session (Jan 14) – A defense-in-depth approach to Amazon S3 security and access.

Complete list of Amazon S3 security launches for 2020

S3 on Outposts is generally available, expanding object storage to on-premises environments

Amazon S3 on Outposts delivers object storage to your on-premises AWS Outposts environment to meet local data processing and data residency needs. Using the S3 APIs and features, S3 on Outposts makes it easy to store, secure, tag, retrieve, report on, and control access to the data on your Outpost. AWS Outposts is a fully managed service that extends AWS infrastructure, services, and tools to virtually any data center, co-location space, or on-premises facility for a truly consistent hybrid experience. To learn more, visit the S3 on Outposts page, read the documentation, or read the blog post.

To hear more about S3 on Outposts, don’t miss the upcoming re:Invent session on Jan 13 – Extend Amazon S3 to on-premises environments with AWS Outposts.

S3 console updates

That covers it from an S3 feature launch perspective, but we have two more updates you may have noticed that make the way you use and manage S3 easier. In November, we launched a new Amazon S3 console that improves upload speed, simplifies common tasks, and makes it even easier to manage storage. In addition to a refreshed look and feel, the updated Amazon S3 console simplifies common tasks by presenting contextual information about your storage resources and other S3 features throughout the console. Some key changes include streamlining the work to copy bucket settings when creating new buckets, indicating the bucket-level settings you have permissions to change, improving the performance of uploads, and adding a new page that gives more visibility into upload progress. And finally, you can now discover a curated collection of third-party software built for Amazon S3 from within the S3 Management Console, called AWS Marketplace for S3.

That is a wrap on 2020, what do you want to see in 2021?

That covers it for Amazon S3 launches in 2020. To say the least, it has been an extremely challenging and unique year, unlike any we have ever experienced. We are looking forward to 2021, and to continuing the pace of innovation at AWS to help you deliver more for your customers. Let us know what you want to see from Amazon S3 in the year(s) to come in the comments section, and we look forward to delivering more innovation.

Thanks for your readership this year, and we are always looking for ideas on content!

Sean White

Sean White is the Amazon S3 product marketing manager at AWS. He enjoys writing about the innovative ways customers use Amazon S3, launching new features and innovations that delight customers, and simplifying the complex. He is based in Boston, loves spending time with his wife and two kids, watching sports, and exploring craft breweries.