Now you can export your Amazon DynamoDB table data to your data lake in Amazon S3 to perform analytics at any scale

Posted on: Nov 9, 2020

Now you can export your Amazon DynamoDB table data to your data lake in Amazon S3, and use other AWS services such as Amazon Athena, Amazon SageMaker, and AWS Lake Formation to analyze your data and extract actionable insights. No code writing is required.

All DynamoDB data added to your Amazon S3 data lake is easily discoverable, encrypted at rest and in transit, and retained in your Amazon S3 bucket until you delete it. This new feature does not consume table capacity and has zero impact on the performance and availability of your production applications. You can also use this feature to export data to Amazon S3 across AWS Regions and accounts to help you comply with regulatory requirements, and to develop a disaster recovery and business continuity plan.

You can export DynamoDB tables ranging from a few megabytes to hundreds of terabytes of data with a few clicks in the AWS Management Console, a single API call, or a single AWS Command Line Interface (AWS CLI) command. Choose a DynamoDB table that has point-in-time recovery (continuous backups) enabled, specify any point in time in the last 35 days, and choose the target Amazon S3 bucket. The output data formats supported are DynamoDB JSON and Amazon Ion.
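As a minimal sketch of the API call, the example below requests an export with boto3, the AWS SDK for Python, and then checks its status. The table ARN, bucket name, and prefix are placeholders, and the chosen export time and format are illustrative; point-in-time recovery must already be enabled on the table.

```python
"""Sketch: export a DynamoDB table to S3 with boto3 (placeholder names throughout)."""
from datetime import datetime, timedelta, timezone

import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Export the table as it existed one hour ago; any point in the last 35 days works.
export_time = datetime.now(timezone.utc) - timedelta(hours=1)

response = dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/MyTable",  # placeholder ARN
    S3Bucket="my-data-lake-bucket",        # placeholder target bucket
    S3Prefix="dynamodb-exports/MyTable",   # optional key prefix within the bucket
    ExportTime=export_time,
    ExportFormat="DYNAMODB_JSON",          # or "ION" for Amazon Ion output
)

export_arn = response["ExportDescription"]["ExportArn"]

# The export runs asynchronously; poll until it moves from IN_PROGRESS to COMPLETED (or FAILED).
status = dynamodb.describe_export(ExportArn=export_arn)["ExportDescription"]["ExportStatus"]
print(export_arn, status)
```

The same request can be made from the AWS CLI or the DynamoDB console; because the export reads from point-in-time recovery data rather than the table itself, it does not consume read capacity.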

To learn more about this feature, see Data Export and Export Amazon DynamoDB Table Data to Your Data Lake in Amazon S3. For information about pricing and regional availability, see Amazon DynamoDB pricing.