AWS Storage Blog
What’s new with Amazon S3? A recap of 2020 launches
At AWS, we have a culture of innovation, and that is apparent for Amazon S3 when you look back at the number of features delivered this year to help you manage and secure your data, optimize costs, and use S3 in new ways. Meaningful innovation thrives on AWS. So, whether you’re looking to invent new services continuously, extract value from data efficiently, or modernize IT rapidly, you can do it on AWS and Amazon S3.
The rapid pace of innovation brings many benefits, but it can be a challenge to keep track of every launch and how it can help you manage your data. In this post, I provide a recap of what’s new with Amazon S3, and links to all of the 2020 launches that help make your data more secure, consistent, and visible, and help optimize and lower your storage costs. You can also view the on-demand re:Invent session, What’s new with Amazon S3, to get up to speed on the majority of the launches covered below. For a list of Amazon S3 re:Invent sessions that are now on-demand, you can read this post, and for the upcoming re:Invent sessions (Jan 12-14), you can bookmark this post.
Amazon S3 now delivers strong read-after-write consistency
At re:Invent 2020, we announced a major update for Amazon S3 with the strong consistency launch. Amazon S3 pioneered object storage in the cloud with high availability, performance, and virtually unlimited scalability, with eventual consistency. Increasingly, customers are using big data analytics applications that often require access to an object immediately after a write. Without strong consistency, you would have to insert custom code into these applications, or provision databases, to keep objects consistent with any changes in Amazon S3 across millions or billions of objects. Amazon S3 now delivers strong read-after-write consistency automatically for all applications. Unlike other cloud providers, Amazon S3 delivers strong read-after-write consistency for any storage request, without changes to performance or availability, without sacrificing regional isolation for applications, and at no additional cost.
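To make the behavior concrete, here is a minimal sketch using the AWS SDK for Python (Boto3) that writes an object and immediately reads and lists it; the bucket name and key are illustrative placeholders. With strong read-after-write consistency, the GET and LIST that follow the successful PUT always reflect the write, with no extra consistency layer in your application.

```python
import boto3

# Placeholder bucket and key names for illustration
BUCKET = "my-analytics-bucket"
KEY = "events/2020/12/records.json"

s3 = boto3.client("s3")

# Write (or overwrite) an object
s3.put_object(Bucket=BUCKET, Key=KEY, Body=b'{"event": "signup", "count": 42}')

# A read issued immediately after the successful write returns the data
# that was just written -- no custom consistency code or external
# metadata store is needed.
response = s3.get_object(Bucket=BUCKET, Key=KEY)
print(response["Body"].read())

# The same applies to list operations: the newly written object is
# visible in a LIST issued after the successful PUT.
listing = s3.list_objects_v2(Bucket=BUCKET, Prefix="events/2020/12/")
print([obj["Key"] for obj in listing.get("Contents", [])])
```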
S3 Intelligent-Tiering adds Archive Access Tiers – further optimizes costs
Amazon S3 Intelligent-Tiering now supports automatic data archiving to further reduce storage costs by up to 95% when objects become rarely accessed over long periods of time. The S3 Intelligent-Tiering storage class is the first and only cloud storage that automatically optimizes customers’ storage costs. S3 Intelligent-Tiering delivers millisecond latency and high throughput performance for frequently and infrequently accessed data in the Frequent and Infrequent Access Tiers, and now the lowest storage costs in the cloud when data is rarely accessed in the Deep Archive Access Tier.
S3 Intelligent-Tiering delivers automatic cost savings by moving objects between four access tiers when access patterns change. S3 Intelligent-Tiering stores objects in four access tiers: two low latency access tiers optimized for frequent and infrequent access, and now two opt-in archive access tiers designed for asynchronous access and optimized for rare access at low costs. To see how S3 Intelligent-Tiering with automatic archiving works, read the blog post, or visit the S3 cost optimization page.
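The archive access tiers are opt-in and are enabled per bucket with an S3 Intelligent-Tiering configuration. As a hedged illustration, the Boto3 sketch below is one way to opt objects under a given prefix into the Archive Access and Deep Archive Access tiers after 90 and 180 consecutive days without access; the bucket name, configuration ID, and prefix are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Opt-in archive configuration for a bucket using S3 Intelligent-Tiering.
# The bucket name, configuration ID, and prefix are illustrative placeholders.
s3.put_bucket_intelligent_tiering_configuration(
    Bucket="my-data-lake-bucket",
    Id="archive-rarely-accessed-logs",
    IntelligentTieringConfiguration={
        "Id": "archive-rarely-accessed-logs",
        "Status": "Enabled",
        # Only objects under this prefix are covered by the configuration
        "Filter": {"Prefix": "logs/"},
        "Tierings": [
            # Objects not accessed for 90 consecutive days move to the Archive Access tier
            {"Days": 90, "AccessTier": "ARCHIVE_ACCESS"},
            # Objects not accessed for 180 consecutive days move to the Deep Archive Access tier
            {"Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS"},
        ],
    },
)
```

Note that objects must be stored in the S3 Intelligent-Tiering storage class (for example, by uploading with `StorageClass="INTELLIGENT_TIERING"`) for the configuration to apply to them.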
You can learn more about optimizing your cost and performance on Amazon S3 in the re:Invent session on Jan 12 – Better, faster, and lower-cost storage: Optimizing Amazon S3.
S3 Storage Lens delivers organization-wide visibility into object storage usage and activity trends
Customers use Amazon S3 to store large shared data sets across tens to hundreds of accounts and buckets, multiple regions, and thousands of prefixes. Now, S3 Storage Lens delivers storage metrics across all accounts, regions, and buckets to help you understand, analyze, and optimize your storage. S3 Storage Lens delivers organization-wide visibility into your object storage usage and activity trends, and makes actionable recommendations to improve cost-efficiency and apply data protection best practices. S3 Storage Lens is the first cloud storage analytics solution to provide a single view of object storage usage and activity across tens to hundreds of accounts in an AWS organization, with drill-downs to generate insights at the account, bucket, or even prefix level. Drawing from more than 14 years of experience helping customers optimize storage, S3 Storage Lens analyzes organization-wide metrics to deliver contextual recommendations to find ways to reduce your storage costs and apply best practices on data protection. To learn more about Amazon S3 Storage Lens, visit the webpage, read the blog post, or read the documentation.
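The default S3 Storage Lens dashboard is enabled for your account automatically, and additional dashboards with advanced metrics can also be configured programmatically. As a hedged illustration, the Boto3 sketch below creates a minimal Storage Lens configuration with activity metrics enabled at the account and bucket level; the account ID and configuration ID are placeholders, and options such as prefix-level metrics, organization scope, and metrics export are omitted for brevity.

```python
import boto3

s3control = boto3.client("s3control")

ACCOUNT_ID = "111122223333"   # placeholder AWS account ID
CONFIG_ID = "org-wide-usage"  # placeholder Storage Lens configuration ID

# Minimal Storage Lens configuration with activity metrics enabled
# at the account level and at the bucket level.
s3control.put_storage_lens_configuration(
    AccountId=ACCOUNT_ID,
    ConfigId=CONFIG_ID,
    StorageLensConfiguration={
        "Id": CONFIG_ID,
        "IsEnabled": True,
        "AccountLevel": {
            "ActivityMetrics": {"IsEnabled": True},
            "BucketLevel": {
                "ActivityMetrics": {"IsEnabled": True},
            },
        },
    },
)
```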
You can learn more about S3 Storage Lens in the upcoming re:Invent session on Jan 12 – Bytes to insights: Analyze and take action on your storage usage.
S3 Replication launches to simplify and enhance your replication strategy
With Amazon S3 Replication, you can configure Amazon S3 to automatically replicate S3 objects across different AWS Regions by using Amazon S3 Cross-Region Replication (CRR), or between buckets in the same AWS Region by using Amazon S3 Same-Region Replication (SRR). S3 Replication now supports replicating to multiple destination buckets in the same or different AWS Regions (Dec 2020), as well as two-way replication between two or more buckets in the same or different AWS Regions (Dec 2020). Customers needing a predictable replication time backed by a Service Level Agreement (SLA) can use Replication Time Control (RTC) to replicate objects in less than 15 minutes (Nov 2019). Amazon S3 Replication also now provides detailed metrics and notifications to monitor the status of object replication between buckets (Nov 2020); you can monitor replication progress by tracking bytes pending, operations pending, and replication latency between your source and destination buckets. And finally, S3 Replication can now replicate delete markers from one S3 bucket to another, to protect data from accidental deletions. A configuration sketch follows the summary list below.
To summarize, here are the launches for S3 Replication:
- Support for two-way replication: 12/01/2020
- Support for multiple destinations in the same or different AWS Regions: 12/01/2020
- Adds support for replicating delete markers: 11/09/2020
- Support for metrics and notifications: 11/09/2020
- Amazon S3 Replication Time Control for predictable replication time, backed by an SLA: 11/20/2019
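Putting a few of these launches together, here is a minimal Boto3 sketch of a replication rule on a versioned source bucket with delete marker replication and replication metrics enabled; the IAM role ARN and bucket names are placeholders, and the replication role must already have the permissions described in the S3 Replication documentation.

```python
import boto3

s3 = boto3.client("s3")

# Placeholders: an existing IAM role that S3 can assume for replication,
# plus versioned source and destination buckets.
REPLICATION_ROLE_ARN = "arn:aws:iam::111122223333:role/s3-replication-role"
SOURCE_BUCKET = "my-source-bucket"
DESTINATION_BUCKET_ARN = "arn:aws:s3:::my-destination-bucket"

s3.put_bucket_replication(
    Bucket=SOURCE_BUCKET,
    ReplicationConfiguration={
        "Role": REPLICATION_ROLE_ARN,
        "Rules": [
            {
                "ID": "replicate-everything",
                "Priority": 1,
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # an empty prefix matches every object
                # Replicate delete markers so deletes propagate to the destination
                "DeleteMarkerReplication": {"Status": "Enabled"},
                "Destination": {
                    "Bucket": DESTINATION_BUCKET_ARN,
                    # Replication metrics: bytes pending, operations pending, latency
                    "Metrics": {"Status": "Enabled"},
                    # For the SLA-backed 15-minute replication target, a rule can also
                    # opt in to Replication Time Control, for example:
                    # "ReplicationTime": {"Status": "Enabled", "Time": {"Minutes": 15}},
                    # "Metrics": {"Status": "Enabled", "EventThreshold": {"Minutes": 15}},
                },
            }
        ],
    },
)
```

Applying a similar rule on the destination bucket, pointing back at the source, is how two-way replication between the two buckets is set up.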
Security and access control launches
You can learn more about many of these security updates in the upcoming re:Invent session (Jan 14) – A defense-in-depth approach to Amazon S3 security and access.
Complete list of Amazon S3 security launches for 2020
- New IAM condition keys for Amazon S3 limit requests to buckets owned by specific AWS accounts, and to specific TLS versions: 12/21/2020
- Amazon S3 Bucket Keys reduce the costs of Server-Side Encryption with AWS Key Management Service (SSE-KMS): 12/01/2020 (see the configuration sketch after this list)
- AWS X-Ray now supports trace context propagation for Amazon S3: 11/16/2020
- Amazon S3 Object Ownership is now generally available with AWS CloudFormation support: 11/09/2020
- Amazon S3 bucket owner condition helps to validate correct bucket ownership: 09/11/2020
- Amazon GuardDuty expands threat detection coverage to help you better protect your data stored in Amazon S3: 07/31/2020
- Tighten S3 permissions for your IAM users and roles using access history of S3 actions: 06/03/2020
- Announcing major enhancements to Amazon Macie, an 80%+ price reduction, and global region expansion: 05/13/2020
- Amazon S3 Batch Operations adds support for S3 Object Lock: 05/04/2020
- Discover, review, and remediate unintended access to S3 buckets shared through S3 Access Points: 04/27/2020
- Amazon S3 adds tagging support for S3 Batch Operations jobs: 03/16/2020
- Amazon S3 Access Points makes it simple to manage access at scale for applications using shared data sets on S3: 12/03/2019
- Introducing Access Analyzer for Amazon S3 to review access policies: 12/02/2019
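As one example from the list above, S3 Bucket Keys are turned on through a bucket’s default encryption configuration. The hedged Boto3 sketch below enables default SSE-KMS encryption with a Bucket Key; the bucket name and KMS key ARN are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Placeholder bucket name and KMS key ARN
BUCKET = "my-secure-bucket"
KMS_KEY_ARN = "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID"

# Default encryption with SSE-KMS, using an S3 Bucket Key to reduce
# the number of requests made to AWS KMS (and therefore SSE-KMS costs).
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": KMS_KEY_ARN,
                },
                "BucketKeyEnabled": True,
            }
        ]
    },
)
```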
S3 on Outposts is generally available, expanding object storage to on-premises environments
Amazon S3 on Outposts delivers object storage to your on-premises AWS Outposts environment to meet local data processing and data residency needs. Using the S3 APIs and features, S3 on Outposts makes it easy to store, secure, tag, retrieve, report on, and control access to the data on your Outpost. AWS Outposts is a fully managed service that extends AWS infrastructure, services, and tools to virtually any data center, co-location space, or on-premises facility for a truly consistent hybrid experience. To learn more, visit the S3 on Outposts page, documentation, or read the blog post.
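Because S3 on Outposts uses the same S3 APIs, applications address data on an Outpost through an Outposts access point. The sketch below shows one way to write and read an object with Boto3, assuming a placeholder access point ARN; the Region, account ID, Outpost ID, access point name, and object key are all illustrative.

```python
import boto3

# Placeholder S3 on Outposts access point ARN; the Region, account ID,
# Outpost ID, and access point name must match your own Outpost.
OUTPOSTS_ACCESS_POINT_ARN = (
    "arn:aws:s3-outposts:us-west-2:111122223333:"
    "outpost/op-01234567890123456/accesspoint/my-outpost-ap"
)

# Recent SDK versions route the request to the S3 on Outposts endpoint
# when an Outposts access point ARN is passed as the bucket.
s3 = boto3.client("s3", region_name="us-west-2")

s3.put_object(
    Bucket=OUTPOSTS_ACCESS_POINT_ARN,
    Key="sensor-data/reading-001.json",
    Body=b'{"temperature": 21.5}',
)

obj = s3.get_object(Bucket=OUTPOSTS_ACCESS_POINT_ARN, Key="sensor-data/reading-001.json")
print(obj["Body"].read())
```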
To hear more about S3 on Outposts, don’t miss the upcoming re:Invent session on Jan 13 – Extend Amazon S3 to on-premises environments with AWS Outposts.
S3 console updates
That covers it from an S3 feature launch perspective, but there are two more updates to the Amazon S3 experience that make it easier to use and manage S3. In November, we launched a new Amazon S3 console that improves upload speed, simplifies common tasks, and makes it even easier to manage storage. In addition to a refreshed look and feel, the updated Amazon S3 console simplifies common tasks by presenting contextual information about your storage resources and other S3 features throughout the console. Some key changes include streamlining the work to copy bucket settings when creating new buckets, indicating the bucket-level settings you have permissions to change, improving the performance of uploads, and adding a new page that gives more visibility into upload progress. And finally, you can now discover a curated collection of third-party software built for Amazon S3, called AWS Marketplace for S3, from within the S3 Management Console.
That is a wrap on 2020, what do you want to see in 2021?
That covers it for Amazon S3 launches in 2020. To say the least, it has been an extremely challenging and unique year, unlike any we have ever experienced. We are looking forward to 2021, and to continuing the pace of innovation at AWS, to help you deliver more for your customers. Let us know what you want to see from Amazon S3 in the years to come in the comments section, and we look forward to delivering more innovation.
Thanks for your readership this year, and we are always looking for ideas on content!