AWS Database Blog

Category: Technical How-to

SQL to NoSQL: Modernizing data access layer with Amazon DynamoDB

The transition from SQL-based access patterns to a DynamoDB API-driven approach presents opportunities to optimize how your application interacts with its data layer. This final part of our series focuses on implementing an effective abstraction layer and handling various data access patterns in DynamoDB.
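To illustrate the kind of abstraction layer the post discusses, here is a minimal sketch of a repository-style wrapper over DynamoDB using boto3. The table name, key schema, and OrderRepository interface are assumptions chosen for this example, not code from the series.

```python
# A minimal sketch of a data access abstraction over DynamoDB using boto3.
# Table name, key schema, and the OrderRepository interface are illustrative
# assumptions, not the code from the series.
import boto3
from boto3.dynamodb.conditions import Key


class OrderRepository:
    """Hides DynamoDB specifics behind intent-revealing methods."""

    def __init__(self, table_name: str = "Orders"):
        self._table = boto3.resource("dynamodb").Table(table_name)

    def get_order(self, customer_id: str, order_id: str) -> dict | None:
        resp = self._table.get_item(
            Key={"PK": f"CUSTOMER#{customer_id}", "SK": f"ORDER#{order_id}"}
        )
        return resp.get("Item")

    def list_orders(self, customer_id: str) -> list[dict]:
        resp = self._table.query(
            KeyConditionExpression=Key("PK").eq(f"CUSTOMER#{customer_id}")
            & Key("SK").begins_with("ORDER#")
        )
        return resp["Items"]
```

Callers work with orders, not with keys and expressions, so the data model can evolve without rippling through application code.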

SQL to NoSQL: Modeling data in Amazon DynamoDB

In this post, we explore strategies for designing DynamoDB data models, including entity identification, table design decisions, and relationship modeling approaches. We examine practical scenarios comparing different modeling strategies, helping you make informed decisions for your specific use case.
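As a taste of the single-table approach the post compares, the following sketch stores a customer and its orders under one partition key, so a one-to-many relationship that would need a SQL join becomes a single Query call. The entity names and attributes are illustrative assumptions.

```python
# Illustrative single-table item layout for a one-to-many relationship
# (customer -> orders). Entity names and attributes are assumptions chosen
# for the example, not the models discussed in the post.
import boto3

table = boto3.resource("dynamodb").Table("AppTable")

# Parent entity: the customer item shares a partition key with its orders.
table.put_item(Item={
    "PK": "CUSTOMER#C1",
    "SK": "PROFILE",
    "name": "Ann Example",
})

# Child entity: orders sort under the same partition, so one Query with
# begins_with("ORDER#") returns every order for the customer.
table.put_item(Item={
    "PK": "CUSTOMER#C1",
    "SK": "ORDER#2025-01-15#O-100",
    "total": 42,
})
```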

SQL to NoSQL: Planning your application migration to Amazon DynamoDB

This is the first part of a series exploring how to effectively migrate from SQL to DynamoDB. We will examine how to analyze existing database structures and access patterns to prepare for migration, focusing on schema analysis, query patterns, and usage metrics that inform DynamoDB data model design.
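As one example of the usage metrics this analysis draws on, the sketch below pulls the most frequently executed statements from a PostgreSQL source via the pg_stat_statements extension; other engines expose similar data through their own catalogs (SQL Server DMVs, MySQL performance_schema). The connection string is a placeholder.

```python
# A hedged illustration of gathering query-pattern metrics before a migration,
# assuming a PostgreSQL source with the pg_stat_statements extension enabled.
# This is an example, not the series' tooling.
import psycopg2

SQL = """
SELECT query, calls, mean_exec_time
FROM pg_stat_statements
ORDER BY calls DESC
LIMIT 20;
"""

with psycopg2.connect("dbname=app host=localhost") as conn:
    with conn.cursor() as cur:
        cur.execute(SQL)
        for query, calls, mean_ms in cur.fetchall():
            print(f"{calls:>10} calls  {mean_ms:8.2f} ms  {query[:60]}")
```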

AWS DMS validation: A custom serverless architecture

AWS DMS customers might choose not to use the data validation feature built into AWS DMS because of the time it takes to complete after a full load, a large dataset transfer, or a data reload, particularly when the business requires rapid availability of data in the target environment. As a result, you might opt to validate manually or run a single-pass, full-load-only validation, both of which require additional effort and time. In this post, we walk you through how to build a custom AWS DMS data validation solution with AWS serverless services.
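As a hedged illustration of one check such a solution can run, the sketch below compares row counts between source and target tables. The post's actual architecture builds this into serverless services; the connection handling and table list here are placeholders.

```python
# A minimal sketch of one validation step a custom solution might perform:
# comparing row counts between source and target for each migrated table.
# Connections and the table list are placeholders supplied by the caller.
def count_rows(conn, table: str) -> int:
    with conn.cursor() as cur:
        # Table names come from a trusted config list, not user input.
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]

def validate_tables(source_conn, target_conn, tables: list[str]) -> dict:
    """Return per-table pass/fail based on row-count parity."""
    results = {}
    for table in tables:
        src = count_rows(source_conn, table)
        tgt = count_rows(target_conn, table)
        results[table] = {"source": src, "target": tgt, "match": src == tgt}
    return results
```

Row counts are a fast first signal; a production solution would typically add checksums or sampled column comparisons to catch content drift that counts alone miss.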

Accelerate SQL Server to Amazon Aurora migrations with a customizable solution

Migrating from SQL Server to Amazon Aurora can significantly reduce database licensing costs and modernize your data infrastructure. To accelerate your migration journey, we developed a migration accelerator that is straightforward to use yet flexible: you can achieve fast data migration with minimal downtime while customizing the solution to meet your specific business requirements. In this post, we showcase the core features of the migration accelerator, demonstrated through a complex use case of consolidating 32 SQL Server databases into a single Amazon Aurora instance with near-zero downtime, while addressing technical debt through refactoring.

Restore an Amazon RDS Custom for SQL Server instance using a backup from AWS Backup

AWS Backup supports creating on-demand backups of RDS Custom for SQL Server instances. However, restoring RDS Custom for SQL Server instances through AWS Backup is not natively supported at the time of writing. This post presents a workaround that enables you to restore RDS Custom for SQL Server instances from backups created by AWS Backup.
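The gist of such a workaround is to restore through the RDS API rather than AWS Backup itself, since AWS Backup recovery points for RDS are DB snapshots. The identifiers and instance profile below are placeholders; follow the post for the complete, supported procedure.

```python
# A hedged sketch of the general idea: take the DB snapshot that AWS Backup
# created and restore it with the RDS API. All identifiers, the instance
# class, and the IAM instance profile name are placeholders.
import boto3

rds = boto3.client("rds")

rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="restored-custom-sqlserver",
    DBSnapshotIdentifier="arn:aws:rds:us-east-1:123456789012:snapshot:awsbackup-example",
    DBInstanceClass="db.m5.xlarge",
    # RDS Custom instances require the instance profile used for automation.
    CustomIamInstanceProfile="AWSRDSCustomSQLServerInstanceProfile",
)
```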

Better together: Amazon RDS for SQL Server and Amazon SageMaker Lakehouse, a generative AI data integration use case

Generative AI solutions are transforming how businesses operate worldwide, and integrating generative AI capabilities into customer-facing services and applications has become paramount. The challenge businesses often face is the need to use massive amounts of relational data hosted in SQL Server databases to contextualize these new generative AI solutions. In this post, we demonstrate how you can address this challenge by combining Amazon RDS for SQL Server and Amazon SageMaker Lakehouse.

Supercharging AWS database development with AWS MCP servers

Amazon Aurora, Amazon DynamoDB, and Amazon ElastiCache are popular choices for developers powering critical workloads, including global commerce platforms, financial systems, and real-time analytics applications. To enhance productivity, developers are supplementing everyday tasks with AI-assisted tools that understand context, suggest improvements, and help reason through system configurations. Model Context Protocol (MCP) is at the helm of this revolution, rapidly transforming how developers integrate AI assistants into their development pipelines. In this post, we explore the core concepts behind MCP and demonstrate how new AWS MCP servers can accelerate your database development through natural language prompts.
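For a sense of what sits under the hood, the following sketch connects to an MCP server from Python using the MCP SDK's stdio client and lists the tools it exposes. The server launch command and package name are assumptions for illustration; consult the AWS MCP server documentation for the actual commands.

```python
# A minimal sketch of talking to an MCP server with the MCP Python SDK.
# The launch command and package name below are illustrative assumptions.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch a (hypothetical) AWS database MCP server as a subprocess.
    params = StdioServerParameters(command="uvx", args=["example-aws-db-mcp-server"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # Each tool is a capability the AI assistant can invoke.
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```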

Leveling up Amazon RDS with AWS Graviton4: Benchmarks

In November 2024, AWS introduced the latest evolution of its custom-designed Arm-based processors with Graviton4, delivering significant performance and efficiency improvements for Amazon RDS for PostgreSQL, MySQL, and MariaDB, as well as Amazon Aurora. In this post, we focus on Amazon RDS for PostgreSQL and compare the performance of the new Graviton4 instances to both Graviton3 and Graviton2. Using benchmarks, we evaluate throughput, latency, and price-performance, showcasing the advantages of Graviton4 for modern database workloads.
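For readers who want to reproduce such a comparison, a common approach is pgbench, PostgreSQL's standard benchmarking tool; the sketch below runs it against two endpoints and extracts throughput. The hostnames, client counts, and durations are placeholders, not the post's exact benchmark parameters.

```python
# A hedged sketch of a pgbench-based comparison harness. Hosts, client
# counts, and durations are placeholders; real runs need a seeded pgbench
# database (pgbench -i) and matched instance configurations.
import re
import subprocess

def run_pgbench(host: str, clients: int = 32, seconds: int = 60) -> float:
    """Run pgbench against a host and return transactions per second."""
    out = subprocess.run(
        ["pgbench", "-h", host, "-c", str(clients), "-j", "4",
         "-T", str(seconds), "postgres"],
        capture_output=True, text=True, check=True,
    ).stdout
    return float(re.search(r"tps = ([\d.]+)", out).group(1))

for label, host in [("graviton3", "g3-host.example"), ("graviton4", "g4-host.example")]:
    print(label, run_pgbench(host), "tps")
```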

Building a job search engine with PostgreSQL’s advanced search features

In today’s employment landscape, job search platforms play a crucial role in connecting employers with potential candidates. Behind these platforms lie complex search engines that must process and analyze vast amounts of structured and unstructured data to deliver relevant results. This post explores how to use PostgreSQL’s search features to build an effective job search engine. We examine each search capability in detail, discuss how they can be combined in PostgreSQL, and offer strategies for optimizing performance as your search engine scales.
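As a preview of the core capability involved, the following sketch sets up PostgreSQL full-text search with a weighted, generated tsvector column, a GIN index, and ts_rank ordering. The jobs schema and connection string are assumptions for the example.

```python
# An illustrative slice of the approach: full-text search over a jobs table
# with a generated tsvector column (title weighted above description), a GIN
# index, and rank-ordered results. The schema is an assumption.
import psycopg2

DDL = """
ALTER TABLE jobs ADD COLUMN IF NOT EXISTS search_vec tsvector
    GENERATED ALWAYS AS (
        setweight(to_tsvector('english', coalesce(title, '')), 'A') ||
        setweight(to_tsvector('english', coalesce(description, '')), 'B')
    ) STORED;
CREATE INDEX IF NOT EXISTS jobs_search_idx ON jobs USING GIN (search_vec);
"""

QUERY = """
SELECT title, ts_rank(search_vec, q) AS rank
FROM jobs, websearch_to_tsquery('english', %s) AS q
WHERE search_vec @@ q
ORDER BY rank DESC
LIMIT 10;
"""

with psycopg2.connect("dbname=jobs host=localhost") as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
        cur.execute(QUERY, ("senior postgres engineer",))
        for title, rank in cur.fetchall():
            print(f"{rank:.3f}  {title}")
```

Generating the tsvector in the table keeps it consistent with the row automatically, and the GIN index lets the `@@` match use an index scan rather than recomputing vectors per query.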