AWS Database Blog

Category: Amazon DynamoDB

Introducing the Amazon DynamoDB data modeling MCP tool

To help you move faster with greater confidence, we’re introducing a new DynamoDB data modeling tool, available as part of our DynamoDB Model Context Protocol (MCP) server. The DynamoDB MCP data modeling tool integrates with AI assistants that support MCP, providing a structured, natural-language-driven workflow to translate application requirements into DynamoDB data models. In this post, we show you how to generate a data model in minutes using this new data modeling tool.

How to evaluate throughput utilization for Amazon DynamoDB tables in provisioned mode

In this post, we demonstrate how to evaluate throughput utilization for DynamoDB tables in provisioned mode. Understanding these metrics helps you determine whether switching to on-demand mode is the right choice. Moving to on-demand mode, where you pay per request for throughput, can optimize costs, eliminate capacity planning, minimize operational overhead, and enhance overall user experience for your applications.
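
As a rough sketch of the kind of evaluation the post walks through, the following boto3 script pulls a week of CloudWatch data for one table and computes average read capacity utilization. The table name, Region, and time window are placeholder assumptions, and a complete evaluation would also consider write capacity, traffic peaks, and any global secondary indexes.

import boto3
from datetime import datetime, timedelta, timezone

TABLE_NAME = "example-table"   # placeholder table name
REGION = "us-east-1"           # placeholder Region
PERIOD = 3600                  # seconds per CloudWatch datapoint

cloudwatch = boto3.client("cloudwatch", region_name=REGION)
end = datetime.now(timezone.utc)
start = end - timedelta(days=7)

def datapoints(metric_name, statistic):
    resp = cloudwatch.get_metric_statistics(
        Namespace="AWS/DynamoDB",
        MetricName=metric_name,
        Dimensions=[{"Name": "TableName", "Value": TABLE_NAME}],
        StartTime=start,
        EndTime=end,
        Period=PERIOD,
        Statistics=[statistic],
    )
    return [dp[statistic] for dp in resp["Datapoints"]]

# Consumed capacity is published as a sum per period; dividing by the period
# length gives average consumed read capacity units per second.
consumed = datapoints("ConsumedReadCapacityUnits", "Sum")
consumed_rate = sum(consumed) / (len(consumed) * PERIOD) if consumed else 0.0

# Provisioned capacity is already a per-second rate, so average it directly.
provisioned = datapoints("ProvisionedReadCapacityUnits", "Average")
provisioned_rate = sum(provisioned) / len(provisioned) if provisioned else 0.0

if provisioned_rate:
    print(f"Average read utilization over 7 days: {consumed_rate / provisioned_rate:.1%}")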

SQL to NoSQL: Modernizing data access layer with Amazon DynamoDB

The transition from SQL-based access patterns to a DynamoDB API-driven approach presents opportunities to optimize how your application interacts with its data layer. This final part of our series focuses on implementing an effective abstraction layer and handling various data access patterns in DynamoDB.
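
To give a feel for what such an abstraction layer can look like, here is a minimal, hypothetical repository class built on boto3. The table name, PK/SK key format, and entity types are illustrative assumptions, not the design used in the series.

import boto3
from boto3.dynamodb.conditions import Key

class CustomerRepository:
    """Hypothetical data access layer that hides DynamoDB details from callers."""

    def __init__(self, table_name="example-table"):
        self._table = boto3.resource("dynamodb").Table(table_name)

    def get_customer(self, customer_id):
        # Assumes a single-table design with a composite PK/SK primary key.
        response = self._table.get_item(
            Key={"PK": f"CUSTOMER#{customer_id}", "SK": "PROFILE"}
        )
        return response.get("Item")

    def list_orders(self, customer_id):
        # Fetch all order items stored under the customer's partition key.
        response = self._table.query(
            KeyConditionExpression=Key("PK").eq(f"CUSTOMER#{customer_id}")
            & Key("SK").begins_with("ORDER#")
        )
        return response["Items"]

    def save_customer(self, customer_id, profile):
        self._table.put_item(
            Item={"PK": f"CUSTOMER#{customer_id}", "SK": "PROFILE", **profile}
        )

Callers work with customers and orders rather than keys and expressions, which keeps the rest of the application insulated from table design changes.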

SQL to NoSQL: Modeling data in Amazon DynamoDB

In this post, we explore strategies for designing DynamoDB data models, including entity identification, table design decisions, and relationship modeling approaches. We examine practical scenarios comparing different modeling strategies, helping you make informed decisions for your specific use case.
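
As a small illustration of one common modeling choice, the hypothetical snippet below stores a customer and its orders in a single table under a shared partition key, so a single Query on that key returns the whole item collection. The attribute names and key format are assumptions made for the example.

import boto3

table = boto3.resource("dynamodb").Table("example-table")  # placeholder table

# Single-table design: the customer profile and its orders share a partition
# key; the sort key distinguishes the profile item from individual orders.
with table.batch_writer() as batch:
    batch.put_item(Item={
        "PK": "CUSTOMER#123", "SK": "PROFILE",
        "name": "Ana", "email": "ana@example.com",
    })
    batch.put_item(Item={
        "PK": "CUSTOMER#123", "SK": "ORDER#2024-06-01#0001",
        "total": 42, "status": "SHIPPED",
    })
    batch.put_item(Item={
        "PK": "CUSTOMER#123", "SK": "ORDER#2024-06-15#0002",
        "total": 17, "status": "PENDING",
    })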

SQL to NoSQL: Planning your application migration to Amazon DynamoDB

This is the first part of a series exploring how to effectively migrate from SQL to DynamoDB. We will examine how to analyze existing database structures and access patterns to prepare for migration, focusing on schema analysis, query patterns, and usage metrics that inform DynamoDB data model design.
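
One possible starting point for the schema analysis step is to enumerate foreign-key relationships in the source database, since these map to the entity relationships you later model in DynamoDB. The sketch below assumes a MySQL source queried with pymysql; the connection details and schema name are placeholders for illustration.

import pymysql

connection = pymysql.connect(
    host="source-db.example.com", user="analyst",
    password="***", database="appdb",
)

# List every foreign-key reference in the application schema.
FK_QUERY = """
    SELECT table_name, column_name, referenced_table_name, referenced_column_name
    FROM information_schema.key_column_usage
    WHERE referenced_table_name IS NOT NULL
      AND table_schema = %s
"""

with connection.cursor() as cursor:
    cursor.execute(FK_QUERY, ("appdb",))
    for table, column, ref_table, ref_column in cursor.fetchall():
        print(f"{table}.{column} -> {ref_table}.{ref_column}")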

Supercharging AWS database development with AWS MCP servers

Amazon Aurora, Amazon DynamoDB, and Amazon ElastiCache are popular choices for developers powering critical workloads, including global commerce platforms, financial systems, and real-time analytics applications. To enhance productivity, developers are supplementing everyday tasks with AI-assisted tools that understand context, suggest improvements, and help reason through system configurations. Model Context Protocol (MCP) is at the helm of this revolution, rapidly transforming how developers integrate AI assistants into their development pipelines. In this post, we explore the core concepts behind MCP and demonstrate how new AWS MCP servers can accelerate your database development through natural language prompts.

Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse – Part 2

Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse allows you to run analytics workloads on your DynamoDB data without having to set up and manage extract, transform, and load (ETL) pipelines. In this post, we cover setting up Amazon SageMaker Unified Studio, followed by running data analysis to showcase its capabilities. We illustrate our solution walkthrough with an example of a credit card company that wants to analyze its customer behavior and spending trends.

Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse – Part 1

Amazon DynamoDB zero-ETL integration with Amazon SageMaker Lakehouse allows you to run analytics workloads on your DynamoDB data without having to set up and manage extract, transform, and load (ETL) pipelines. In this two-part series, we first walk through the prerequisites and initial setup for the zero-ETL integration. In Part 2, we cover setting up Amazon SageMaker Unified Studio, followed by running data analysis to showcase its capabilities. We illustrate our solution walkthrough with an example of a credit card company that wants to analyze its customer behavior and spending trends.

Upgrade your Amazon DynamoDB global tables to the current version

Amazon DynamoDB is a fully managed, serverless NoSQL database that delivers single-digit millisecond performance for applications at any scale. DynamoDB global tables is a multi-active database feature that replicates data across AWS Regions, enabling local reads and writes. In this post, we explain why we strongly recommend all customers use the Current version for all global tables.