AWS Storage Blog
Tag: Amazon S3 Batch Operations
How Delhivery migrated 500 TB of data across AWS Regions using Amazon S3 Replication
Delhivery is one of the largest third-party logistics providers in India. It fulfills millions of packages every day, servicing over 18,000 pin codes in India, powered by more than 20 automated sort centers, 90 warehouses, and over 2,800 delivery centers. Data is at the core of Delhivery’s business. In anticipation of potential regulatory […]
Transition data to cheaper storage based on custom filtering criteria with Amazon S3 Lifecycle
As your organization’s data grows, effective management of storage costs is crucial to operating an efficient, cost-effective data infrastructure. One of the most effective strategies for reducing storage costs is transitioning objects to less expensive cold storage classes. To optimize storage costs according to their specific needs and requirements, organizations need the flexibility to […]
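The post walks through the full workflow; as a rough, minimal sketch (the bucket name, prefix, tag, and thresholds below are hypothetical), a lifecycle rule that combines a prefix, an object tag, and a minimum object size in its filter could be applied with boto3 like this:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket name for illustration.
BUCKET = "example-logs-bucket"

# Lifecycle rule: transition objects under the "logs/" prefix that carry the
# tag archive=true and are at least 128 KB to S3 Glacier Flexible Retrieval
# after 90 days.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-tagged-logs",
                "Status": "Enabled",
                "Filter": {
                    "And": {
                        "Prefix": "logs/",
                        "Tags": [{"Key": "archive", "Value": "true"}],
                        "ObjectSizeGreaterThan": 131072,
                    }
                },
                "Transitions": [
                    {"Days": 90, "StorageClass": "GLACIER"}
                ],
            }
        ]
    },
)
```

S3 evaluates lifecycle rules asynchronously, typically once a day, so matching objects transition shortly after they cross the age threshold.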
Faster restores on Veeam using Amazon S3 Glacier Flexible Retrieval and S3 Batch Operations
Storing multiple copies of data is often an enterprise data protection best practice and a critical part of backup and recovery solutions. The ability to quickly recover or restore data – often from backup copies in cost-effective archive storage – is critical to minimizing potential downtime or operational disruptions in disaster recovery (DR) scenarios such […]
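At scale the post pairs restores with S3 Batch Operations, but the underlying per-object call is a restore request; a minimal boto3 sketch (bucket, key, retention days, and tier choice are hypothetical) looks like this:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and key for illustration.
response = s3.restore_object(
    Bucket="example-backup-bucket",
    Key="veeam/backups/backup-001.vbk",
    RestoreRequest={
        # Keep the temporary restored copy available for 7 days.
        "Days": 7,
        # Standard retrievals from S3 Glacier Flexible Retrieval typically
        # complete within hours; Expedited is faster for smaller objects.
        "GlacierJobParameters": {"Tier": "Standard"},
    },
)
print(response["ResponseMetadata"]["HTTPStatusCode"])
```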
Maintaining object immutability by automatically extending Amazon S3 Object Lock retention periods
Protecting against accidental or malicious deletion is a key element of data protection. Immutability protects data in place, preventing unintended changes or deletions. However, sometimes it isn’t clear for how long data should be made immutable. Users in this situation are looking for a solution that maintains short-term immutability indefinitely. They want to make sure their […]
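As a minimal sketch of the underlying mechanic (the bucket and key below are hypothetical), extending an object's retention simply means writing a later RetainUntilDate; extending is always permitted, while shortening is what Object Lock restricts:

```python
import boto3
from datetime import datetime, timedelta, timezone

s3 = boto3.client("s3")

# Hypothetical bucket and key for illustration.
BUCKET = "example-locked-bucket"
KEY = "records/statement-2024-01.pdf"

# Read the current retention setting, then push RetainUntilDate forward,
# keeping the object immutable for another 30 days.
current = s3.get_object_retention(Bucket=BUCKET, Key=KEY)["Retention"]
new_retain_until = max(
    current["RetainUntilDate"],
    datetime.now(timezone.utc),
) + timedelta(days=30)

s3.put_object_retention(
    Bucket=BUCKET,
    Key=KEY,
    Retention={
        "Mode": current["Mode"],  # COMPLIANCE or GOVERNANCE
        "RetainUntilDate": new_retain_until,
    },
)
```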
Streamline data management at scale by automating the creation of Amazon S3 Batch Operations jobs
Over time, enterprises may need to undertake operations or make modifications to their data as part of general data management, to address changing business needs, or to comply with evolving data-management regulations and best practices. As datasets being generated, stored, and analyzed continue to grow exponentially, the need for simplified, scalable, and reproducible data management […]
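The automation described in the post builds on the S3 Batch Operations CreateJob API; a minimal boto3 sketch of a single job that copies every object listed in a CSV manifest might look like the following (the account ID, ARNs, ETag, and manifest details are placeholders):

```python
import boto3

s3control = boto3.client("s3control")

# All names and IDs below are hypothetical placeholders.
ACCOUNT_ID = "111122223333"

response = s3control.create_job(
    AccountId=ACCOUNT_ID,
    ConfirmationRequired=False,
    Priority=10,
    RoleArn="arn:aws:iam::111122223333:role/batch-operations-role",
    # The manifest is a CSV of bucket,key pairs previously uploaded to S3.
    Manifest={
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::example-manifests/manifest.csv",
            "ETag": "example-manifest-etag",
        },
    },
    # Copy every listed object to a destination bucket.
    Operation={
        "S3PutObjectCopy": {
            "TargetResource": "arn:aws:s3:::example-destination-bucket"
        }
    },
    Report={
        "Bucket": "arn:aws:s3:::example-reports-bucket",
        "Prefix": "batch-reports",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "ReportScope": "AllTasks",
    },
)
print("Created job:", response["JobId"])
```

Wrapping a call like this in a scheduled or event-driven function is what turns one-off jobs into a repeatable data management workflow.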
Automate object processing in Amazon S3 directory buckets with S3 Batch Operations and AWS Lambda
Data, the lifeblood of any modern organization, is rarely static. For high-performance applications and workloads, enterprises need the ability to run operations on massive amounts of data, modifying it as necessary for each use case to further accelerate processing. This could include adding a watermark to uploaded images, changing the bitrate of […]
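The heart of such a solution is a Lambda function that receives batches of tasks from S3 Batch Operations and returns a per-task result. A minimal handler sketch is below; it assumes the version 2.0 invocation schema that directory buckets require, and process_object is a hypothetical placeholder for the real per-object work:

```python
import urllib.parse


def process_object(bucket: str, key: str) -> None:
    # Hypothetical per-object work; replace with real processing
    # (e.g. read the object, add a watermark, write it back).
    pass


def lambda_handler(event, context):
    """Minimal S3 Batch Operations handler sketch."""
    results = []
    for task in event["tasks"]:
        task_id = task["taskId"]
        # Schema 2.0 passes the bucket name directly; schema 1.0 passes an ARN.
        bucket = task.get("s3Bucket") or task["s3BucketArn"].split(":::")[-1]
        key = urllib.parse.unquote_plus(task["s3Key"])
        try:
            process_object(bucket, key)
            result_code, result_string = "Succeeded", f"Processed {key}"
        except Exception as exc:  # report failures back to the Batch Operations job
            result_code, result_string = "PermanentFailure", str(exc)
        results.append(
            {
                "taskId": task_id,
                "resultCode": result_code,
                "resultString": result_string,
            }
        )
    return {
        "invocationSchemaVersion": event["invocationSchemaVersion"],
        "treatMissingKeysAs": "PermanentFailure",
        "invocationId": event["invocationId"],
        "results": results,
    }
```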
Managing duplicate objects in Amazon S3
When managing a large volume of data in a storage system, data duplication is common. Data duplication refers to the presence of multiple copies of the same data within your system, leading to additional storage usage as well as extra overhead when handling multiple copies of the same […]
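One common way to surface candidates for deduplication, sketched below with boto3, is to group objects by ETag and size; for single-part uploads without SSE-KMS the ETag is an MD5 of the content, so matching pairs are a strong signal. The bucket name is hypothetical, and the post may take a different approach, such as using S3 Inventory with checksums:

```python
from collections import defaultdict

import boto3

s3 = boto3.client("s3")

# Hypothetical bucket name for illustration.
BUCKET = "example-data-bucket"

# Group keys by (ETag, Size). Multipart uploads and KMS-encrypted objects
# do not carry an MD5 ETag, so those need a content checksum instead.
by_etag = defaultdict(list)
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        by_etag[(obj["ETag"], obj["Size"])].append(obj["Key"])

for (etag, size), keys in by_etag.items():
    if len(keys) > 1:
        print(f"Possible duplicates ({size} bytes):", keys)
```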
Simplify querying your archive data in Amazon S3 with Amazon Athena
Today, customers increasingly choose to store data for longer because they recognize its future value potential. Storing data longer, coupled with exponential data growth, has led to customers placing a greater emphasis on storage cost optimization and using cost-effective storage classes. However, a modern data archiving strategy not only calls for optimizing storage costs, but […]
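As a rough illustration of the querying side (the database, table, and result location are hypothetical, and objects in archive storage classes must be restored before Athena can read them), a query can be driven from boto3 like this:

```python
import time

import boto3

athena = boto3.client("athena")

# Hypothetical database, table, and output location for illustration.
query = """
    SELECT order_id, order_date, total
    FROM archive_db.orders_archive
    WHERE order_date BETWEEN DATE '2018-01-01' AND DATE '2018-12-31'
    LIMIT 10
"""

execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "archive_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then print the rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```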
Reduce recovery time and optimize storage costs with faster restores from Amazon S3 Glacier storage classes and Commvault
Data is the lifeblood of any modern business. Organizations are storing more copies of their application data than ever before to recover from data loss, repair data corruption or ransomware damage, respond to compliance requests, and become more data-driven. Storing more data at reduced cost enables businesses to extract more value and insights to […]
Reducing AWS Key Management Service costs by up to 99% with Amazon S3 Bucket Keys
Customers across many industries face increasingly stringent audit and compliance requirements on data security and privacy. Certain compliance frameworks, such as FISMA, FedRAMP, PCI DSS, and SOC 2, have specific regulatory standards for validating the security of systems. A common requirement for these compliance frameworks is more rigorous encryption standards for data at rest, where organizations must […]
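Enabling Bucket Keys is a small change to a bucket's default encryption configuration; a minimal boto3 sketch (the bucket name and KMS key ARN are placeholders) looks like this:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and KMS key for illustration.
s3.put_bucket_encryption(
    Bucket="example-encrypted-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID",
                },
                # Bucket Keys let S3 reuse a bucket-level data key instead of
                # calling AWS KMS for every object, cutting KMS request costs.
                "BucketKeyEnabled": True,
            }
        ]
    },
)
```

Note that the setting applies to newly written objects; existing objects only benefit after being rewritten, for example with an S3 Batch Operations copy job.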