S3 Batch Operations is an Amazon S3 data management feature that lets you manage billions of objects at scale with just a few clicks in the Amazon S3 Management Console or a single API request. With this feature, you can change object metadata and properties, or perform other storage management tasks such as copying objects between buckets, replacing object tag sets, modifying access controls, and restoring archived objects from S3 Glacier, instead of spending months developing custom applications to perform these tasks.
Introduction to S3 Batch Operations
Watch this re:Invent session for an overview of this S3 feature and to learn how it can save up to 90% of the time spent managing your S3 objects at scale.
For more technical details, go to "The Basics: Amazon S3 Batch Operations" Developer Guide »
How to Create a Job
Learn how to create an S3 Batch Operations job that appends metadata tags to objects in an S3 bucket. The demo starts by selecting a manifest (a list of objects) and specifying the parameters for the job.
For more information, go to the "Creating a Batch Operations Job" Developer Guide »
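The job described above can also be created programmatically through the S3 Control API. The sketch below builds the request parameters for a tagging job as they would be passed to the AWS SDK for Python (boto3); the account ID, bucket names, role name, and manifest ETag are placeholders, not values from this page.

```python
import json

# Hypothetical identifiers -- replace with your own values.
ACCOUNT_ID = "111122223333"
MANIFEST_ARN = "arn:aws:s3:::example-manifest-bucket/manifest.csv"
MANIFEST_ETAG = "60e460c9d1046e73f7dde5043ac3ae85"  # placeholder ETag

def build_tagging_job_params(account_id, manifest_arn, manifest_etag):
    """Build parameters for a job that replaces each object's tag set."""
    return {
        "AccountId": account_id,
        "ConfirmationRequired": True,  # job waits for confirmation before running
        "Operation": {
            # Replace the tag set on every object listed in the manifest.
            "S3PutObjectTagging": {
                "TagSet": [{"Key": "project", "Value": "blue"}]
            }
        },
        "Manifest": {
            "Spec": {
                # CSV manifest: one "bucket,key" pair per line.
                "Format": "S3BatchOperations_CSV_20180820",
                "Fields": ["Bucket", "Key"],
            },
            "Location": {"ObjectArn": manifest_arn, "ETag": manifest_etag},
        },
        "Report": {
            # Completion report written after the job finishes.
            "Bucket": "arn:aws:s3:::example-report-bucket",
            "Format": "Report_CSV_20180820",
            "Enabled": True,
            "Prefix": "batch-op-reports",
            "ReportScope": "FailedTasksOnly",
        },
        "Priority": 10,
        "RoleArn": f"arn:aws:iam::{account_id}:role/batch-operations-role",
    }

params = build_tagging_job_params(ACCOUNT_ID, MANIFEST_ARN, MANIFEST_ETAG)
print(json.dumps(params, indent=2))

# With boto3 installed and AWS credentials configured, the job would be
# submitted like this (not executed here):
#   import boto3
#   s3control = boto3.client("s3control", region_name="us-east-1")
#   response = s3control.create_job(**params)
#   print(response["JobId"])
```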
Manage and Track a Job
Watch this guided demo to learn what actions you can take once an S3 Batch Operations job is created, including prioritizing jobs, cloning jobs, and cancelling jobs.
For more information, go to the "Managing Batch Operations Jobs" Developer Guide »
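The console actions in the demo map to S3 Control API calls, sketched below as (method, kwargs) pairs for the boto3 S3 Control client. The account ID and job ID are placeholders; cloning a job is a console convenience that re-submits a copy of an existing job rather than a separate API call.

```python
ACCOUNT_ID = "111122223333"  # hypothetical AWS account ID
JOB_ID = "00e123a4-5678-90bc-de1f-example"  # placeholder ID from create_job

def job_management_calls(account_id, job_id):
    """Return (boto3 s3control method, kwargs) pairs for common job actions."""
    return [
        # Raise the job's priority so it runs ahead of other queued jobs.
        ("update_job_priority",
         {"AccountId": account_id, "JobId": job_id, "Priority": 100}),
        # Confirm a job created with ConfirmationRequired=True so it can run.
        ("update_job_status",
         {"AccountId": account_id, "JobId": job_id,
          "RequestedJobStatus": "Ready"}),
        # Cancel the job; cancelled jobs cannot be resumed.
        ("update_job_status",
         {"AccountId": account_id, "JobId": job_id,
          "RequestedJobStatus": "Cancelled"}),
        # Inspect progress: total, succeeded, and failed task counts.
        ("describe_job",
         {"AccountId": account_id, "JobId": job_id}),
    ]

calls = job_management_calls(ACCOUNT_ID, JOB_ID)
for method, kwargs in calls:
    print(method, kwargs)

# With boto3 configured, each pair would be invoked as (not executed here):
#   s3control = boto3.client("s3control")
#   getattr(s3control, method)(**kwargs)
```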
Grant Permissions for a Job
In this console demo, learn how to use AWS Identity and Access Management (IAM) to define permissions for your S3 Batch Operations jobs. This is a required step: without the right permissions, your job cannot access its target objects.
For more information, go to the "Granting Permissions for Batch Operations" Developer Guide »
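To make the permissions step concrete, here is a minimal sketch of the IAM role a tagging job assumes: a trust policy naming the S3 Batch Operations service principal, and a permissions policy covering the target objects, the manifest, and the completion report. The bucket names and prefix are placeholders.

```python
import json

# Trust policy: lets the S3 Batch Operations service assume the role.
TRUST_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "batchoperations.s3.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy for a PutObjectTagging job (placeholder bucket names).
PERMISSIONS_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {   # Tag the target objects.
            "Effect": "Allow",
            "Action": ["s3:PutObjectTagging", "s3:PutObjectVersionTagging"],
            "Resource": "arn:aws:s3:::example-target-bucket/*",
        },
        {   # Read the manifest listing the objects to process.
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:GetObjectVersion"],
            "Resource": "arn:aws:s3:::example-manifest-bucket/manifest.csv",
        },
        {   # Write the completion report.
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::example-report-bucket/batch-op-reports/*",
        },
    ],
}

print(json.dumps(TRUST_POLICY, indent=2))
print(json.dumps(PERMISSIONS_POLICY, indent=2))
```

The role's ARN is what you pass as the job's RoleArn when creating it; scoping each statement to the specific buckets involved keeps the role limited to this one job's resources.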
Ready to get started?
Pay only for what you use. There is no minimum fee.
Instantly get access to the AWS Free Tier and start experimenting with Amazon S3.
Get started building with Amazon S3 in the AWS Console.