AWS Database Blog

Use Amazon DynamoDB incremental exports to drive continuous data retention

Amazon DynamoDB supports incremental exports to Amazon Simple Storage Service (Amazon S3), which enables a variety of use cases for downstream data retention and consumption. In this post, we show you how to maintain a continuously updating export of your table data by performing a one-time bootstrap full export followed by an ongoing series of incremental exports.
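The core of this pattern is the ExportTableToPointInTime API. The sketch below is not taken from the post; it shows the boto3 calls for a bootstrap full export followed by one incremental export whose window begins at the bootstrap export time. The table ARN, bucket, and one-hour window are illustrative placeholders.

```python
# Minimal sketch: bootstrap full export, then an incremental export whose
# window starts where the bootstrap ended. Names and window size are placeholders.
from datetime import timedelta

import boto3

dynamodb = boto3.client("dynamodb")

TABLE_ARN = "arn:aws:dynamodb:us-east-1:123456789012:table/Orders"  # placeholder
BUCKET = "my-export-bucket"                                          # placeholder

# 1. Bootstrap: one full export of the table's point-in-time state.
full = dynamodb.export_table_to_point_in_time(
    TableArn=TABLE_ARN,
    S3Bucket=BUCKET,
    S3Prefix="orders/full/",
    ExportFormat="DYNAMODB_JSON",
)
bootstrap_time = full["ExportDescription"]["ExportTime"]

# 2. Ongoing: incremental exports whose windows chain onto the previous export.
window_start = bootstrap_time
window_end = window_start + timedelta(hours=1)

incremental = dynamodb.export_table_to_point_in_time(
    TableArn=TABLE_ARN,
    S3Bucket=BUCKET,
    S3Prefix="orders/incremental/",
    ExportFormat="DYNAMODB_JSON",
    ExportType="INCREMENTAL_EXPORT",
    IncrementalExportSpecification={
        "ExportFromTime": window_start,
        "ExportToTime": window_end,
        "ExportViewType": "NEW_AND_OLD_IMAGES",
    },
)
```

Each subsequent incremental export would start its window at the previous window's end time, so the archive in Amazon S3 stays contiguous.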

Optimize costs with scheduled scaling of Amazon DocumentDB for read workloads

In this post, we show you two ways to schedule the scaling of your Amazon DocumentDB instance-based clusters to address anticipated read traffic patterns. Aligning your cluster's scaling operations with those patterns helps you achieve optimal performance during peak loads and reduces the cost of overprovisioning your cluster.
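As one hedged illustration of schedule-driven scaling (the post covers two approaches; this sketch assumes an Amazon EventBridge Scheduler schedule invoking an AWS Lambda function), the handler below adds a reader instance before an anticipated peak and removes it afterward. The cluster identifier, instance class, and event shape are placeholders.

```python
# Illustrative Lambda handler: add a read replica to a DocumentDB cluster before
# a known read peak and remove it afterward. Identifiers and instance class are
# placeholders; the "action" field is assumed to come from the schedule payload.
import boto3

docdb = boto3.client("docdb")

CLUSTER_ID = "my-docdb-cluster"   # placeholder
INSTANCE_CLASS = "db.r6g.large"   # placeholder


def handler(event, context):
    action = event.get("action", "scale_out")

    if action == "scale_out":
        # Add one replica instance to absorb the expected read traffic.
        docdb.create_db_instance(
            DBInstanceIdentifier=f"{CLUSTER_ID}-replica-peak",
            DBInstanceClass=INSTANCE_CLASS,
            Engine="docdb",
            DBClusterIdentifier=CLUSTER_ID,
        )
    else:
        # Remove the extra replica once the peak window has passed.
        docdb.delete_db_instance(
            DBInstanceIdentifier=f"{CLUSTER_ID}-replica-peak",
        )

    return {"status": "ok", "action": action}
```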

Introducing the Advanced Python Wrapper Driver for Amazon Aurora

Building upon our work with the Advanced JDBC (Java Database Connectivity) Wrapper Driver, we are continuing to enhance the scalability and resiliency of modern applications built with Python. The Advanced Python Wrapper Driver has been released as an open-source project under the Apache 2.0 License, and you can find the project on GitHub. In this post, we provide details on how to use some of the features of the Advanced Python Wrapper Driver.
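As a hedged example of the connection pattern described in the project's documentation, the snippet below wraps a psycopg connection to an Aurora PostgreSQL cluster with the failover plugin enabled; the endpoint, credentials, and plugin list are placeholders, so check the GitHub repository for the authoritative usage.

```python
# Sketch based on the project's documented connect pattern; verify parameter
# names against the aws-advanced-python-wrapper repository. Endpoint and
# credentials are placeholders.
from aws_advanced_python_wrapper import AwsWrapperConnection
from psycopg import Connection

with AwsWrapperConnection.connect(
    Connection.connect,                                      # underlying target driver
    host="database.cluster-xyz.us-east-1.rds.amazonaws.com", # placeholder endpoint
    dbname="postgres",
    user="app_user",
    password="example-password",
    plugins="failover",                                      # enable the failover plugin
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchone())
```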

Upgrade Amazon RDS for SQL Server 2014 to a newer supported version using the AWS CLI

As SQL Server 2014 approaches its end of support on July 9, 2024, it's crucial to understand your options and take a proactive approach to planning and upgrading your SQL Server databases to a newer version. In this post, we show you how to use AWS Command Line Interface (AWS CLI) automation to upgrade your RDS for SQL Server 2014 instance to a more recent supported version.
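The post drives the upgrade through the AWS CLI; the boto3 sketch below is an equivalent sequence of API calls for checking valid upgrade targets and starting a major version upgrade. The instance identifier and target engine version are placeholders, and you should take a snapshot before upgrading a production instance.

```python
# Sketch of the core upgrade calls; identifiers and versions are placeholders.
import boto3

rds = boto3.client("rds")

INSTANCE_ID = "my-sqlserver-2014-instance"   # placeholder
TARGET_VERSION = "15.00.4365.2.v1"           # placeholder target engine version

# List valid upgrade targets for the instance's current engine version.
current = rds.describe_db_instances(DBInstanceIdentifier=INSTANCE_ID)["DBInstances"][0]
targets = rds.describe_db_engine_versions(
    Engine=current["Engine"],
    EngineVersion=current["EngineVersion"],
)["DBEngineVersions"][0]["ValidUpgradeTarget"]
print([t["EngineVersion"] for t in targets])

# Kick off the major version upgrade (snapshot the instance first in production).
rds.modify_db_instance(
    DBInstanceIdentifier=INSTANCE_ID,
    EngineVersion=TARGET_VERSION,
    AllowMajorVersionUpgrade=True,
    ApplyImmediately=True,
)
```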

Near zero-downtime migrations from self-managed Db2 on AIX or Windows to Amazon RDS for Db2 using IBM Q Replication

When you’re migrating your mission-critical Db2 database from on premises or Amazon Elastic Compute Cloud (Amazon EC2) to Amazon RDS for Db2, one of the key requirements is to have near-zero downtime. This post demonstrates how to use IBM InfoSphere Data Replication (IIDR) Q Replication to migrate data with minimal downtime.

Build a FedRAMP compliant generative AI-powered chatbot using Amazon Aurora Machine Learning and Amazon Bedrock

In this post, we explore how to use Amazon Aurora PostgreSQL and Amazon Bedrock to build Federal Risk and Authorization Management Program (FedRAMP) compliant generative artificial intelligence (AI) applications using Retrieval Augmented Generation (RAG).
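The post builds the RAG workflow around Aurora Machine Learning and Amazon Bedrock; as a hedged, application-side sketch of the same pattern, the snippet below embeds a question with Bedrock, retrieves similar chunks from an Aurora PostgreSQL table that uses the pgvector extension, and asks a Bedrock model to answer with that context. The connection string, table, and model IDs are placeholders.

```python
# Illustrative RAG sketch (not the post's exact implementation). Connection
# details, table name, and model IDs are placeholders.
import json

import boto3
import psycopg

bedrock = boto3.client("bedrock-runtime")


def embed(text: str) -> list[float]:
    # Embed the text with a Titan embeddings model (placeholder model ID).
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(resp["body"].read())["embedding"]


question = "What is our data retention policy?"
vector_literal = "[" + ",".join(str(x) for x in embed(question)) + "]"

# Retrieve the closest chunks by cosine distance (pgvector's <=> operator).
with psycopg.connect("host=aurora-cluster.example dbname=kb user=app") as conn:
    rows = conn.execute(
        "SELECT chunk_text FROM documents ORDER BY embedding <=> %s::vector LIMIT 3",
        (vector_literal,),
    ).fetchall()

context = "\n".join(r[0] for r in rows)

# Ask a Bedrock model to answer using the retrieved context (placeholder model ID).
answer = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user",
                      "content": f"Use this context:\n{context}\n\nQuestion: {question}"}],
    }),
)
print(json.loads(answer["body"].read())["content"][0]["text"])
```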

Exploring new features of Apache TinkerPop 3.7.x in Amazon Neptune

Amazon Neptune 1.3.2.0 now supports the Apache TinkerPop 3.7.x release line, which introduces major new features and improvements. In this post, we highlight the features that have the greatest impact on Gremlin developers using Neptune, to help you understand the implications of upgrading to these versions of Neptune and TinkerPop.
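As a hedged example of what the upgrade unlocks, the sketch below assumes the TinkerPop 3.7 string-manipulation steps (exposed in snake_case by the gremlinpython client, for example to_upper) so formatting can happen server-side in the traversal; the Neptune endpoint and property names are placeholders.

```python
# Sketch assuming the TinkerPop 3.7 string steps in gremlinpython; endpoint and
# property names are placeholders.
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection
from gremlin_python.process.anonymous_traversal import traversal

ENDPOINT = "wss://my-neptune-cluster.cluster-xyz.us-east-1.neptune.amazonaws.com:8182/gremlin"

conn = DriverRemoteConnection(ENDPOINT, "g")
g = traversal().with_remote(conn)

# Upper-case names inside the traversal instead of post-processing in the app.
names = (
    g.V().has_label("person")
     .values("first_name")
     .to_upper()       # assumed 3.7 string step
     .limit(5)
     .to_list()
)
print(names)

conn.close()
```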

Turn petabytes of relational database records into a cost-efficient audit trail using Amazon Athena, AWS DMS, Amazon RDS, and Amazon S3

In this post, we show how you can use AWS Database Migration Service (AWS DMS) to migrate relational data from Amazon RDS into compressed archives on Amazon S3. We discuss partitioning strategies for the resulting archive objects and how to use S3 Object Lock to protect them from modification. Lastly, we demonstrate how to query the archive objects with SQL through Amazon Athena, returning results in seconds even on large datasets.
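A minimal sketch of that query path from Python is shown below, assuming the Athena database and table that the archive pipeline registers; the database, table, partition columns, and results location are placeholders.

```python
# Sketch: run an Athena query against the archived data and print the rows.
# Database, table, partitions, and the S3 output location are placeholders.
import time

import boto3

athena = boto3.client("athena")

query = """
SELECT order_id, changed_at, operation
FROM audit_archive.orders_history          -- placeholder database.table
WHERE year = '2024' AND month = '06'       -- partition pruning keeps scans small
  AND order_id = '12345'
"""

execution = athena.start_query_execution(
    QueryString=query,
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},  # placeholder
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes (typically seconds for well-partitioned data).
while True:
    state = athena.get_query_execution(
        QueryExecutionId=query_id
    )["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```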