AWS Database Blog

Category: Database

Implement active/active replication between Amazon Aurora clusters using Oracle GoldenGate

Enterprises both large and small, across diverse industries and with varying levels of cloud maturity, recognize the importance and value of deploying active/active database configurations. An active/active system is a network of independent processing nodes, each having access to a common replicated database so all nodes can participate in a common application. Some enterprises are […]

Read More

Replicate and transform data in Amazon Aurora PostgreSQL across multiple Regions using AWS DMS

Global organizations that operate and do business in many countries need to comply with data sovereignty requirements and other regulations such as GDPR. For example, you may want to replicate data to other Regions while at the same time removing certain columns to adhere to privacy laws within a country. In this post, we demonstrate […]
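The kind of column removal described above is typically expressed through AWS DMS table mappings. The sketch below is not the post's exact configuration; the endpoint ARNs, schema, table, and column names are placeholders. It shows a selection rule plus a remove-column transformation rule passed to a replication task with boto3:

```python
import json
import boto3

# Hypothetical example: a DMS table-mapping document that replicates the
# public schema but removes a sensitive column before it reaches the target.
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-public-schema",
            "object-locator": {"schema-name": "public", "table-name": "%"},
            "rule-action": "include",
        },
        {
            "rule-type": "transformation",
            "rule-id": "2",
            "rule-name": "drop-ssn-column",
            "rule-target": "column",
            "object-locator": {
                "schema-name": "public",
                "table-name": "customers",   # hypothetical table
                "column-name": "ssn",        # hypothetical sensitive column
            },
            "rule-action": "remove-column",
        },
    ]
}

dms = boto3.client("dms")
dms.create_replication_task(
    ReplicationTaskIdentifier="aurora-pg-cross-region-task",  # hypothetical name
    SourceEndpointArn="arn:aws:dms:...:endpoint:source",      # placeholder ARNs
    TargetEndpointArn="arn:aws:dms:...:endpoint:target",
    ReplicationInstanceArn="arn:aws:dms:...:rep:instance",
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps(table_mappings),
)
```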

Read More

Automate Amazon Aurora Global Database endpoint management for planned and unplanned failover

This blog post was last reviewed or updated in April 2022 to include the unplanned failover feature. Amazon Aurora is a MySQL and PostgreSQL-compatible relational database built for the cloud. Aurora combines the performance and availability of traditional enterprise databases with the simplicity and cost-effectiveness of open-source databases. Aurora Global Database lets you span your relational database […]
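One possible shape for the endpoint automation the post describes (not necessarily its exact implementation) is a Lambda function that looks up the current writer endpoint of a regional Aurora cluster and repoints an application-facing Route 53 CNAME. The cluster identifier, hosted zone, and record name below are hypothetical:

```python
import boto3

REGION = "us-east-1"                      # assumed Region
CLUSTER_ID = "aurora-global-secondary-1"  # hypothetical cluster identifier
HOSTED_ZONE_ID = "Z0123456789ABC"         # hypothetical hosted zone
APP_RECORD = "writer.example.internal"    # hypothetical CNAME used by applications

def point_apps_at_current_writer():
    rds = boto3.client("rds", region_name=REGION)
    route53 = boto3.client("route53")

    # Look up the cluster's current writer endpoint.
    cluster = rds.describe_db_clusters(DBClusterIdentifier=CLUSTER_ID)["DBClusters"][0]
    writer_endpoint = cluster["Endpoint"]

    # Repoint the application CNAME at the writer endpoint.
    route53.change_resource_record_sets(
        HostedZoneId=HOSTED_ZONE_ID,
        ChangeBatch={
            "Changes": [{
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": APP_RECORD,
                    "Type": "CNAME",
                    "TTL": 30,
                    "ResourceRecords": [{"Value": writer_endpoint}],
                },
            }]
        },
    )
```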

Read More

Trigger notifications on time series data with Amazon Timestream

In recent years, large-scale internet of things (IoT) applications have been generating data at high rates, and many IoT implementations require data to be stored sequentially, based on date-time values generated at either the sensor or ingestion level. In use cases such as smart factories, IoT data and time series data are being produced at a large […]
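As a minimal illustration of triggering notifications on time series data, the sketch below polls Amazon Timestream with a threshold query and publishes to an Amazon SNS topic when matching rows are found. The database, table, measure, and topic names are hypothetical, and the post's pipeline may differ:

```python
import boto3

# Hypothetical Timestream database, table, measure, and threshold.
QUERY = """
    SELECT device_id, measure_value::double AS temperature
    FROM "iot_db"."sensor_readings"
    WHERE measure_name = 'temperature'
      AND time > ago(5m)
      AND measure_value::double > 80.0
"""
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:temperature-alerts"  # hypothetical

def check_and_notify():
    timestream = boto3.client("timestream-query")
    sns = boto3.client("sns")

    result = timestream.query(QueryString=QUERY)
    for row in result["Rows"]:
        device_id, temperature = (col.get("ScalarValue") for col in row["Data"])
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="Temperature threshold exceeded",
            Message=f"Device {device_id} reported {temperature} in the last 5 minutes",
        )
```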

Read More

Evaluate Amazon DocumentDB (with MongoDB compatibility) configurations using AWS Config

It’s common practice for organizations to define compliance and standards for all applications and software they interact with, such as databases, storage, network, and compute. The key drivers are complying with different regulations and achieving the security and audit certifications required in the domain they operate in. AWS Config allows you to create rules and […]
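Custom AWS Config rules are backed by Lambda functions that report results with put_evaluations. The sketch below is a hypothetical rule that flags Amazon DocumentDB clusters still using a default parameter group; it is a stand-in check, and the post's actual rules may evaluate different settings:

```python
import json
import boto3

docdb = boto3.client("docdb")
config = boto3.client("config")

def lambda_handler(event, context):
    """Custom AWS Config rule: flag DocumentDB clusters that still use a
    default cluster parameter group. The check is illustrative only."""
    invoking_event = json.loads(event["invokingEvent"])
    evaluations = []

    for cluster in docdb.describe_db_clusters()["DBClusters"]:
        # Hypothetical check: default parameter groups cannot be customized,
        # so treat clusters using one as NON_COMPLIANT.
        compliant = not cluster["DBClusterParameterGroup"].startswith("default.")
        evaluations.append({
            "ComplianceResourceType": "AWS::RDS::DBCluster",
            "ComplianceResourceId": cluster["DBClusterIdentifier"],
            "ComplianceType": "COMPLIANT" if compliant else "NON_COMPLIANT",
            "OrderingTimestamp": invoking_event["notificationCreationTime"],
        })

    config.put_evaluations(
        Evaluations=evaluations,
        ResultToken=event["resultToken"],
    )
```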

Read More

Automate benchmark tests for Amazon Aurora PostgreSQL

Optimizing a database is an important activity for new and existing application workloads. You need to take cost, operations, performance, security, and reliability into consideration. Conducting benchmark tests helps with these considerations. With Amazon Aurora PostgreSQL-Compatible Edition, you can run multiple benchmark tests with different transaction characteristics matching your data access patterns. In this post, […]
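A common tool for such tests is pgbench. The sketch below, assuming pgbench is installed and the standard PG* environment variables point at the Aurora PostgreSQL cluster endpoint, initializes the benchmark schema and then runs the same workload at several client counts; the scale factor, durations, and concurrency levels are arbitrary examples:

```python
import subprocess

# Assumed environment: PGHOST points at the Aurora PostgreSQL cluster endpoint,
# PGUSER/PGPASSWORD/PGDATABASE are set, and pgbench is on the PATH.
SCALE = 100            # pgbench scale factor (roughly 1.5 GB of data)
DURATION = 300         # seconds per run
CLIENT_COUNTS = [16, 32, 64, 128]

# One-time initialization of the pgbench schema and data.
subprocess.run(["pgbench", "-i", "-s", str(SCALE)], check=True)

# Run the same workload with increasing concurrency and save each run's output.
for clients in CLIENT_COUNTS:
    with open(f"pgbench_{clients}_clients.log", "w") as out:
        subprocess.run(
            ["pgbench", "-c", str(clients), "-j", str(min(clients, 16)),
             "-T", str(DURATION), "-P", "30"],
            stdout=out, check=True,
        )
```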

Read More

Connect to Amazon Keyspaces from your desktop using IntelliJ, PyCharm, or DataGrip IDEs

AWS customers use Amazon Keyspaces (for Apache Cassandra) to modernize their Cassandra workloads. Amazon Keyspaces is a scalable, highly available, and managed Apache Cassandra-compatible database service that offers fast performance and a great end-user experience. With Amazon Keyspaces, you can run your Cassandra workloads on AWS using the same Cassandra […]
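The connection settings an IDE data source needs (the regional service endpoint, port 9142, TLS with the Starfield root certificate, and service-specific credentials) can also be exercised programmatically. The sketch below uses the Python Cassandra driver; the Region, certificate path, and credentials are placeholders:

```python
from ssl import SSLContext, PROTOCOL_TLSv1_2, CERT_REQUIRED
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

# TLS is required; Amazon Keyspaces uses the Starfield root certificate,
# assumed here to have been downloaded to the working directory.
ssl_context = SSLContext(PROTOCOL_TLSv1_2)
ssl_context.load_verify_locations("sf-class2-root.crt")
ssl_context.verify_mode = CERT_REQUIRED

# Service-specific credentials generated for an IAM user (placeholders);
# the same values go into an IDE data source configuration.
auth_provider = PlainTextAuthProvider(
    username="keyspaces-user-at-123456789012",
    password="REPLACE_WITH_SERVICE_SPECIFIC_PASSWORD",
)

# Regional service endpoint and the Keyspaces TLS port.
cluster = Cluster(
    ["cassandra.us-east-1.amazonaws.com"],
    port=9142,
    ssl_context=ssl_context,
    auth_provider=auth_provider,
)
session = cluster.connect()
print(list(session.execute("SELECT keyspace_name FROM system_schema.keyspaces")))
```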

Read More

Send webhooks to SaaS applications from Amazon Aurora via Amazon EventBridge

Customers developing software as a service (SaaS) applications often need to send outgoing webhooks (HTTP call-backs in response to events) to other SaaS applications such as Salesforce, Marketo, or ServiceNow. When processing webhooks, you often have to implement custom logic or services to enqueue and emit these events. This introduces additional complexity and operational overhead. […]
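Aurora can invoke an AWS Lambda function directly (for example, via the aws_lambda extension in Aurora PostgreSQL), and that function can forward the event to Amazon EventBridge, where a rule with an API destination delivers the webhook. The sketch below shows only the forwarding step; the bus, source, and detail-type names are hypothetical, and the post's architecture may differ:

```python
import json
import boto3

events = boto3.client("events")

def lambda_handler(event, context):
    """Forward a database event onto a custom EventBridge bus.
    An EventBridge rule with an API destination (configured separately)
    delivers the webhook to the downstream SaaS application."""
    events.put_events(
        Entries=[{
            "EventBusName": "saas-webhooks",   # hypothetical custom bus
            "Source": "myapp.aurora",          # hypothetical source name
            "DetailType": "order.created",     # hypothetical event type
            "Detail": json.dumps(event),       # payload passed in by the database call
        }]
    )
```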

Read More

Post-migration steps and best practices for Amazon RDS for SQL Server

Amazon Relational Database Service (Amazon RDS) makes it easy to set up, operate, and scale a relational database in the cloud. It provides cost-efficient and resizable capacity while automating time-consuming administration tasks, such as hardware provisioning, database setup, patching, and backups. It frees you to focus on your applications, so you can give them the […]

Read More

Stream time series data into Amazon Timestream using Apache NiFi

Time series data is one of the fastest growing categories of source data used by organizations to provide better services, analysis, and insights to their end-users. High on the list of requirements is the speed at which streaming data can be ingested and accessed by existing and new applications. One of the enabling technologies being […]
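Whatever tool performs the ingestion, records ultimately land in Timestream through its write API. The sketch below shows a single write with boto3 as a point of reference; the database, table, and record shape are hypothetical:

```python
import time
import boto3

timestream = boto3.client("timestream-write")

# Hypothetical record: two dimensions plus one double measure.
record = {
    "Dimensions": [
        {"Name": "device_id", "Value": "sensor-42"},
        {"Name": "site", "Value": "plant-1"},
    ],
    "MeasureName": "temperature",
    "MeasureValue": "21.7",
    "MeasureValueType": "DOUBLE",
    "Time": str(int(time.time() * 1000)),  # milliseconds since epoch
    "TimeUnit": "MILLISECONDS",
}

timestream.write_records(
    DatabaseName="iot_db",          # hypothetical database
    TableName="sensor_readings",    # hypothetical table
    Records=[record],
)
```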

Read More