AWS Storage Blog
Efficiently verify Amazon S3 data at scale with the compute checksum operation
Organizations across industries must regularly verify the integrity of their stored datasets to protect valuable information, satisfy compliance requirements, and preserve trust. Media and entertainment customers validate assets to make sure that content remains intact, financial institutions run integrity checks to meet regulatory obligations, and research institutions confirm the reproducibility of scientific results. These verifications […]
Architecting secure and compliant managed file transfers with AWS Transfer Family SFTP connectors and PGP encryption
Users in industries such as financial services and healthcare regularly exchange files containing sensitive and regulated datasets, such as Personally Identifiable Information (PII) and financial records, with their external business partners. These file transfers often happen over the Secure File Transfer Protocol (SFTP), and encrypting files using Pretty Good Privacy (PGP) before transfer is often […]
Encrypt and decrypt files with PGP and AWS Transfer Family
1/11/2024: Updated to reflect the CloudShell migration to Amazon Linux 2023 (AL2023). Protecting sensitive data is not a novel idea. Customers in industries like financial services and healthcare regularly exchange files containing sensitive data, including Personally Identifiable Information (PII) and financial records, with their users. Pretty Good Privacy (PGP) encryption of these files is often […]
Restoring archived objects at scale from the Amazon S3 Glacier storage classes
Update (7/26/2024): You no longer need to optimize the S3 Inventory report using Amazon Athena. Amazon S3 now automatically optimizes your S3 Batch Operations restore job to achieve the fastest retrieval throughput. For more guidance on using Batch Operations, see the Amazon S3 User Guide. Every organization around the world has archival data. There is […]


