AWS Database Blog

Implement event-driven architectures with Amazon DynamoDB – Part 3

In Part 1 of this series, we explored how to use Amazon EventBridge Scheduler for precise data eviction in Amazon DynamoDB. Part 2 discussed how to use a global secondary index (GSI) for strict data management within DynamoDB. In this post, we focus on using EventBridge Scheduler for fine-grained event scheduling based on data written to DynamoDB.

Throughout this series, we’ve examined various strategies for managing data within DynamoDB. This post shifts the focus to an event-driven pattern that reliably schedules future downstream actions using EventBridge Scheduler. One of the key advantages of this approach is its ability to invoke time-critical downstream events, such as sending reminder notifications for upcoming appointments, expiring offers, or subscription renewals. For example, when a user’s subscription is about to expire, EventBridge Scheduler can trigger an event that invokes an AWS Lambda function with relevant DynamoDB item details. This Lambda function can then use Amazon Simple Email Service (Amazon SES) to send a timely notification to the user.

This architecture helps ensure users receive timely reminders, enhancing their experience and improving engagement. The flexibility of EventBridge Scheduler allows for fine-tuned control over notification timing, accommodating various business needs.

Solution overview

This solution demonstrates how to use Amazon DynamoDB Streams and AWS Lambda to automatically schedule future actions based on item writes to a DynamoDB table. Writes are captured as stream records, which trigger a Lambda function that creates precise, time-based schedules using Amazon EventBridge Scheduler. These schedules can then invoke downstream services such as Amazon SES for email reminders, Amazon Simple Queue Service (Amazon SQS), AWS Step Functions, or other AWS services, enabling highly scalable and reliable event-driven workflows.

The following diagram illustrates the solution architecture.

Architecture Overview

A common use case for this pattern is scheduling future events, such as appointment reminders, expiring offers, or subscription renewals, with high reliability. EventBridge Scheduler provides the flexibility to schedule one-time or recurring actions based on attributes stored in DynamoDB items. Here's how the flow works:

  1. Data write – When an item is written to the DynamoDB table, a corresponding stream record is created in its associated DynamoDB stream.
  2. Trigger – This stream record, either individually or as part of a batch, triggers an AWS Lambda function.
  3. Schedule creation – The Lambda function uses EventBridge Scheduler to create a future schedule. The schedule includes the exact timestamp and relevant item data.
  4. Target invocation – At the designated time, EventBridge Scheduler invokes the configured target, such as sending an email with Amazon SES, queuing a message in Amazon SQS, or starting an execution in Step Functions.
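
As a preview of step 3, the following AWS CLI sketch creates an equivalent one-time schedule by hand. It uses the same SES universal target as the Lambda function shown later in this post; the schedule name, timestamp, email addresses, and role ARN are placeholder assumptions:

aws scheduler create-schedule \
  --name appointment-reminder-example \
  --schedule-expression "at(2025-07-01T09:00:00)" \
  --flexible-time-window Mode=OFF \
  --action-after-completion DELETE \
  --target '{
    "Arn": "arn:aws:scheduler:::aws-sdk:ses:sendEmail",
    "RoleArn": "arn:aws:iam::XXXX:role/SchedulerRole",
    "Input": "{\"Destination\":{\"ToAddresses\":[\"abc@example.com\"]},\"Message\":{\"Subject\":{\"Data\":\"Time for your appointment\"},\"Body\":{\"Text\":{\"Data\":\"This is a reminder\"}}},\"Source\":\"ses-verified-email@example.com\"}"
  }'

Note that at() expressions take a timestamp without a timezone suffix; by default, EventBridge Scheduler evaluates them in UTC.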

This architecture is designed to scale with your data, allowing fine-grained scheduling logic without over-complicating the DynamoDB data model or requiring constant polling. Let's review the steps to implement this solution.

Prerequisites

Before implementing the event-driven solution, make sure you have the following prerequisites in place:

  • AWS account – Access to an active AWS account
  • DynamoDB basics – A foundational understanding of DynamoDB concepts, including tables, items, attributes, and basic CRUD operations, is necessary to effectively configure and manage the database.
  • Lambda functions – Familiarity with Lambda is crucial, because you’ll be creating and deploying Lambda functions to process Amazon DynamoDB Streams events and create schedules through EventBridge Scheduler.
  • EventBridge Scheduler – Basic knowledge of Amazon EventBridge Scheduler is necessary for creating and configuring schedules that invoke specific API actions.
  • Amazon SES – Basic knowledge of Amazon SES and a verified sender email is required. To create and verify an email address identity, see Creating and verifying identities in Amazon SES, or use the AWS CLI commands shown after this list.
  • AWS CLI or console proficiency – Proficiency in using either the AWS Command Line Interface (AWS CLI) or the AWS Management Console is recommended for configuring services, creating resources, and monitoring logs.
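
If you prefer the AWS CLI, you can create and inspect an email identity as follows. The address is a placeholder; Amazon SES sends a verification email that you must act on before the identity becomes usable:

# Create the identity (triggers a verification email), then check its status
aws sesv2 create-email-identity --email-identity you@example.com
aws sesv2 get-email-identity --email-identity you@example.com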

Create a DynamoDB table

Complete the following steps to create a DynamoDB table:

  1. On the DynamoDB console, choose Tables in the navigation pane.
  2. Choose Create table.
  3. For Table name, enter a name for your new table (for example, Appointment-Table).
  4. For Partition key, enter PK as the name and choose String as the type.
  5. For Sort key, enter SK as the name and choose String as the type.
    Create Table
  6. Leave all other configurations as default and choose Create table.
    The table creation should complete within a few seconds.
  7. Choose Tables in the navigation pane, then open the table you created.
  8. In the DynamoDB stream details section, choose Turn on.
    Streams
  9. Select New and old images, then choose Turn on stream.
    Stream config

This enables DynamoDB Streams on the table and exposes both the old and new state of items in the stream records, so the Lambda function can access scheduling attributes, such as the reminder timestamp, and detect when they change on item updates.
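
If you prefer the AWS CLI, the following sketch creates an equivalent table with the stream enabled in a single step (the table name matches the one used later in this post; on-demand billing is an assumption):

aws dynamodb create-table \
  --table-name Appointment-Table \
  --attribute-definitions AttributeName=PK,AttributeType=S AttributeName=SK,AttributeType=S \
  --key-schema AttributeName=PK,KeyType=HASH AttributeName=SK,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST \
  --stream-specification StreamEnabled=true,StreamViewType=NEW_AND_OLD_IMAGES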

Create a Lambda function

Complete the following steps to create your Lambda function:

  1. On the Lambda console, choose Functions in the navigation pane.
  2. Choose Create function.
  3. Select Author from scratch.
  4. For Function name, enter a name (for example, DDBStreamTriggerEventScheduler).
  5. For Runtime, choose the latest supported Node.js version.
  6. To the Lambda service role attached to the function, add the AWS Identity and Access Management (IAM) managed policy AWSLambdaDynamoDBExecutionRole and an inline policy that grants the scheduler:CreateSchedule permission (an example inline policy follows these steps).
  7. Choose Create function.
    Create Lambda
  8. After you create the Lambda function, choose Add trigger to create an event source mapping between the table's stream and the function.
  9. Choose DynamoDB as the source.
  10. For DynamoDB table, enter the ARN of the table you created (Appointment-Table).
  11. Leave the remaining settings as default and choose Add to create the trigger.
    DynamoDB Trigger
  12. On the Code tab of your Lambda function, replace the default code with the following Node.js code. Be sure to update the placeholders with appropriate values—such as replacing ses-verified-email@example.com with your verified Amazon SES email address. Additionally, ensure that the IAM role used by EventBridge Scheduler has the ses:SendEmail permission.
    import { SchedulerClient, CreateScheduleCommand } from "@aws-sdk/client-scheduler";
    const client = new SchedulerClient({ region: "<region_name>" }); // such as: eu-west-1
    export const handler = async (event) => {
        try {
            for (const record of event.Records) {
                // Skip records that lack the attributes needed to build a schedule
                // (for example, REMOVE records carry no NewImage)
                if (!record.dynamodb.NewImage?.REMINDER_TIMESTAMP || !record.dynamodb.NewImage?.email) {
                    continue;
                }
                const params = {
                    eventID: record.eventID, // Used as the unique schedule name
                    sequenceNumber: record.dynamodb.SequenceNumber, // Used as an idempotency token
                    email: record.dynamodb.NewImage.email.S, // Recipient email address
                    subject: "Time for your appointment", // Email subject
                    body: "This is the email body, you have a reminder", // Email body
                    reminderTS: record.dynamodb.NewImage.REMINDER_TIMESTAMP.S, // Scheduled time in ISO format (yyyy-mm-ddThh:mm:ss)
                };
                await scheduleEmail(params);
            }
            const response = {
                statusCode: 200,
                body: JSON.stringify('Complete'),
            };
            return response;
        } catch (error) {
            console.error("Error processing event: ", error);
            const response = {
                statusCode: 500,
                body: JSON.stringify({
                    message: 'Error processing event',
                    error: error.message
                }),
            };
            return response;
        }
    };
    const scheduleEmail = async (params) => {
        try {
            // Request parameters for the SES SendEmail API call that the schedule invokes
            const sesParams = {
                Destination: {
                    ToAddresses: [params.email],
                },
                Message: {
                    Body: {
                        Text: {
                            Data: params.body,
                        },
                    },
                    Subject: {
                        Data: params.subject,
                    },
                },
                Source: "ses-verified-email@example.com", // Verified sender email
            };
            const target = {
                RoleArn: "<role_arn>", // such as: arn:aws:iam::XXXX:role/SchedulerRole
                Arn: "<target_arn>", // such as: arn:aws:scheduler:::aws-sdk:ses:sendEmail
                Input: JSON.stringify(sesParams),
                DeadLetterConfig: {
                    Arn: "<dlq_arn>", // such as: arn:aws:sqs:eu-west-1:XXXX:Appointment-DLQ
                },
            };
            const schedulerInput = {
                Name: params.eventID,
                FlexibleTimeWindow: {
                    Mode: "OFF", // Invoke at the exact scheduled time
                },
                ActionAfterCompletion: "DELETE", // Remove the schedule after it runs
                Target: target,
                ScheduleExpression: `at(${params.reminderTS})`, // One-time schedule, evaluated in UTC by default
                ClientToken: params.sequenceNumber, // Makes retried stream batches idempotent
            };
            const command = new CreateScheduleCommand(schedulerInput);
            const result = await client.send(command);
            return result;
        } catch (error) {
            console.error("Error scheduling email: ", error);
            throw new Error(`Failed to schedule email: ${error.message}`);
        }
    };
  13. Choose Deploy to deploy the latest function code.
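
For reference, the following AWS CLI sketch shows the inline policy from step 6. Because the function passes the scheduler execution role to EventBridge Scheduler when creating schedules, the policy also grants iam:PassRole on that role. The role and policy names are assumptions; replace the account ID and role ARN with your own, and scope the Resource down where possible:

aws iam put-role-policy \
  --role-name DDBStreamTriggerEventScheduler-role \
  --policy-name AllowCreateSchedule \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": "scheduler:CreateSchedule",
        "Resource": "*"
      },
      {
        "Effect": "Allow",
        "Action": "iam:PassRole",
        "Resource": "arn:aws:iam::XXXX:role/SchedulerRole"
      }
    ]
  }'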

Tip: To improve reliability and observability, you can configure dead-letter queues (DLQs) at two points in this architecture. First, you can add a DLQ to the Lambda event source mapping that consumes from the DynamoDB stream; this captures any failures that occur while processing stream records or creating schedules, such as malformed input or permission errors. Second, you can configure a DLQ as part of the EventBridge Scheduler target, as shown in the DeadLetterConfig section of the preceding code; this captures failures that happen at the time of execution, for example, if Amazon SES fails to send an email or the target service is unavailable. Using both DLQs allows you to track, analyze, and retry failures across the full scheduling and delivery lifecycle.
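
For example, the stream-side DLQ can be attached to the event source mapping as an on-failure destination. The following AWS CLI sketch assumes a hypothetical queue and uses the mapping UUID returned when you created the trigger; the retry limit is an illustrative choice:

aws lambda update-event-source-mapping \
  --uuid <esm_uuid> \
  --maximum-retry-attempts 3 \
  --destination-config '{"OnFailure":{"Destination":"arn:aws:sqs:eu-west-1:XXXX:Stream-Processing-DLQ"}}'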

Generate sample writes to see the solution in action

Run the following AWS CLI script to simulate writes to your DynamoDB table. The loop inserts 10 sample items into the table, each with a unique partition key (PK) and a static sort key (SK). Each item includes a REMINDER_TIMESTAMP set to 3 minutes from the current time and a test email address. These writes produce stream records that invoke your Lambda function, which schedules reminder emails through EventBridge Scheduler. Be sure to replace abc@example.com with an email address you have verified in Amazon SES to observe the full flow of the solution.

#!/bin/bash
TABLE="Appointment-Table"
for PK_VALUE in {1..10}
do
  # BSD/macOS date syntax; on Linux (GNU date), use: date -u -d "+3 minutes" +"%Y-%m-%dT%H:%M:%S"
  ISO_TIMESTAMP_PLUS_3_MINS=$(date -v+3M -u +"%Y-%m-%dT%H:%M:%S")
  aws dynamodb put-item --table-name $TABLE \
  --item '{"PK": {"S": "'$PK_VALUE'"}, "SK": {"S": "StaticSK"}, "REMINDER_TIMESTAMP": {"S": "'$ISO_TIMESTAMP_PLUS_3_MINS'"}, "email": {"S": "abc@example.com"}, "ATTR_1": {"S": "This is a static attribute"}}'
done
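
After the script completes, you can confirm that the Lambda function created the schedules. This is a quick check, assuming the schedules landed in the default schedule group:

aws scheduler list-schedules --max-results 10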

To monitor the reminder emails being sent, navigate to the Monitoring tab of the EventBridge schedule group. You can view the events invoked by EventBridge Scheduler at a specific time by looking at the InvocationAttemptCount metric. In our case, the invocations are appointment reminder emails to users through Amazon SES. For a list of all metrics available for a schedule group, refer to Monitoring Amazon EventBridge Scheduler with Amazon CloudWatch.

Invocation Metrics
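
You can query the same metric with the AWS CLI. This sketch assumes the default schedule group and the AWS/Scheduler namespace; adjust the time window to cover your test run:

aws cloudwatch get-metric-statistics \
  --namespace AWS/Scheduler \
  --metric-name InvocationAttemptCount \
  --dimensions Name=ScheduleGroup,Value=default \
  --start-time 2025-01-01T00:00:00Z \
  --end-time 2025-01-01T01:00:00Z \
  --period 60 \
  --statistics Sum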

Clean up

If you created a test environment to follow along with this post, make sure to delete the DynamoDB table, Lambda function, EventBridge schedule, and any other resources you created for testing the solution.
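
If you followed along with the CLI examples, the following commands remove the main resources (the names match the examples in this post; adjust them to your environment). Schedules created with ActionAfterCompletion set to DELETE remove themselves after they run:

aws dynamodb delete-table --table-name Appointment-Table
aws lambda delete-function --function-name DDBStreamTriggerEventScheduler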

Summary

In this post, we showed how to use EventBridge Scheduler for fine-grained event scheduling based on data written to DynamoDB. This solution lets you drive precise, timely actions from timestamps stored in your DynamoDB items, whether you want to send time-sensitive notifications or invoke downstream jobs.

Across this three-part series, we explored how to extend the native capabilities of Amazon DynamoDB using event-driven architecture patterns tailored to real-world needs:

  • Part 1 introduced a near real-time TTL solution using Amazon EventBridge Scheduler to delete expired items with greater precision than native TTL.
  • Part 2 demonstrated how to build a strict data management solution using a sharded global secondary index (GSI), EventBridge Scheduler, and Lambda to periodically query and evict expired records.
  • Part 3 focused on using DynamoDB Streams and EventBridge Scheduler to schedule future downstream actions based on data written to a DynamoDB table, for example, sending reminder emails for upcoming appointments.

To dive deeper and explore additional best practices for designing with DynamoDB and EventBridge, visit the Amazon DynamoDB documentation and Amazon EventBridge documentation.


About the authors

Lee Hannigan

Lee is a Sr. DynamoDB Specialist Solutions Architect based in Donegal, Ireland. He brings a wealth of expertise in distributed systems, backed by a strong foundation in big data and analytics technologies. In his role as a DynamoDB Specialist Solutions Architect, Lee excels in assisting customers with the design, evaluation, and optimization of their workloads using the capabilities of DynamoDB.

Aman Dhingra

Aman is a Sr. DynamoDB Specialist Solutions Architect based in Dublin, Ireland. He is passionate about distributed systems and has a strong background in big data and analytics. Aman is the author of Amazon DynamoDB – The Definitive Guide and helps customers design, evaluate, and optimize workloads running on Amazon DynamoDB.