AWS Database Blog
Implement event-driven architectures with Amazon DynamoDB – Part 3
In Part 1 of this series, we explored how to use Amazon EventBridge Scheduler for precise data eviction in Amazon DynamoDB. Part 2 discussed how to use a global secondary index (GSI) for strict data management within DynamoDB. In this post, we focus on using EventBridge Scheduler for fine-grained event scheduling based on data written to DynamoDB.
Throughout this series, we’ve examined various strategies for managing data within DynamoDB. This post shifts the focus to an event-driven pattern that reliably schedules future downstream actions using EventBridge Scheduler. One of the key advantages of this approach is its ability to invoke time-critical downstream events, such as sending reminder notifications for upcoming appointments, expiring offers, or subscription renewals. For example, when a user’s subscription is about to expire, EventBridge Scheduler can trigger an event that invokes an AWS Lambda function with relevant DynamoDB item details. This Lambda function can then use Amazon Simple Email Service (Amazon SES) to send a timely notification to the user.
This architecture helps users receive timely reminders, enhancing their experience and improving engagement. The flexibility of EventBridge Scheduler allows for fine-tuned control over notification timings, accommodating various business needs.
Solution overview
This solution demonstrates how to use Amazon DynamoDB Streams and AWS Lambda to automatically schedule future actions based on item writes to a DynamoDB table. By capturing these writes through stream records, a Lambda function is triggered to create precise, time-based schedules using Amazon EventBridge Scheduler. These schedules can then invoke downstream services such as Amazon SES for email reminders, Amazon Simple Queue Service (Amazon SQS), AWS Step Functions, or other AWS services, enabling highly scalable and reliable event-driven workflows.
The following diagram illustrates the solution architecture.
A common use case for this pattern is scheduling future events, such as appointment reminders, expiring offers, or subscription renewals, with high reliability. EventBridge Scheduler provides the flexibility to schedule one-time or recurring actions based on attributes stored in DynamoDB items.

Here's how the flow works:
- Data write – When an item is written to the DynamoDB table, a corresponding stream record is created in its associated DynamoDB stream.
- Trigger – This stream record, either individually or as part of a batch, triggers an AWS Lambda function.
- Schedule creation – The Lambda function uses EventBridge Scheduler to create a future schedule. The schedule includes the exact timestamp and relevant item data.
- Target invocation – At the designated time, EventBridge Scheduler invokes the configured target, such as sending an email with Amazon SES, queuing a message in Amazon SQS, or starting an execution in Step Functions.
This architecture is designed to scale with your data, allowing fine-grained scheduling logic without over-complicating the DynamoDB data model or requiring constant polling.

Let's review the steps to implement this solution.
Prerequisites
Before diving into implementing the event-driven solution, you should have the following prerequisites in place:
- AWS account – Access to an active AWS account is required.
- DynamoDB basics – A foundational understanding of DynamoDB concepts, including tables, items, attributes, and basic CRUD operations, is necessary to effectively configure and manage the database.
- Lambda functions – Familiarity with Lambda is crucial, because you’ll be creating and deploying Lambda functions to process Amazon DynamoDB Streams events and create schedules through EventBridge Scheduler.
- EventBridge Scheduler – Basic knowledge of Amazon EventBridge Scheduler is necessary for setting up schedules that invoke specific API actions.
- Amazon SES – Basic knowledge of Amazon SES and a verified sender email is required. To create and verify an email address identity, see Creating and verifying identities in Amazon SES.
- AWS CLI or console proficiency – Proficiency in using either the AWS Command Line Interface (AWS CLI) or the AWS Management Console is recommended for configuring services, creating resources, and monitoring logs.
Create a DynamoDB table
Complete the following steps to create a DynamoDB table:
- On the DynamoDB console, choose Tables in the navigation pane.
- Choose Create table.
- For Table name, enter a name for your new table.
- For Partition key, enter PK as the name and choose String as the type.
- For Sort key, enter SK as the name and choose String as the type.
- Leave all other configurations as default and choose Create table.
The table creation should complete within a few seconds.
- Choose Tables in the navigation pane, then open the table you created.
- In the DynamoDB stream details section, choose Turn on.
- Select New and old images, then choose Turn on stream.
This enables DynamoDB Streams on the table and exposes both the old and new state of items in the stream records, so your function can react to updates to an item's scheduling attributes (such as its reminder timestamp) as well as to new items.
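If you prefer the AWS CLI, a command along the following lines creates an equivalent table with the stream already enabled (the table name Appointment-Table matches the one referenced later in this post, and on-demand billing is an assumption):

```bash
aws dynamodb create-table \
  --table-name Appointment-Table \
  --attribute-definitions AttributeName=PK,AttributeType=S AttributeName=SK,AttributeType=S \
  --key-schema AttributeName=PK,KeyType=HASH AttributeName=SK,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST \
  --stream-specification StreamEnabled=true,StreamViewType=NEW_AND_OLD_IMAGES
```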
Create a Lambda function
Complete the following steps to create your Lambda function:
- On the Lambda console, choose Functions in the navigation pane.
- Choose Create function.
- Select Author from scratch.
- For Function name, enter a name (for example, DDBStreamTriggerEventScheduler).
- For Runtime, choose the latest Node.js version.
- To the Lambda execution role attached to the function, add the AWS Identity and Access Management (IAM) managed policy AWSLambdaDynamoDBExecutionRole and an inline policy that allows the scheduler:CreateSchedule action. Because the function passes a role to EventBridge Scheduler when creating schedules, it also needs iam:PassRole on that role.
- Choose Create function.
- After you create the Lambda function, choose Add trigger to create an event source mapping between the DynamoDB table and the function.
- Choose DynamoDB as the source.
- For DynamoDB table, enter the ARN of the table you created (for example, Appointment-Table).
- Leave the remaining settings as default and choose Add to create the trigger.
- On the Code tab of your Lambda function, replace the default code with Node.js code that creates the schedules; an example sketch follows this procedure. Be sure to update the placeholders with appropriate values, such as replacing ses-verified-email@example.com with your verified Amazon SES email address. Additionally, ensure that the IAM role used by EventBridge Scheduler has the ses:SendEmail permission.
- Choose Deploy to deploy the latest function code.
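The following is a minimal sketch of what such a handler can look like, using the AWS SDK for JavaScript v3 (bundled with recent Node.js Lambda runtimes). The attribute names REMINDER_TIMESTAMP and Email, the SCHEDULER_ROLE_ARN environment variable, and the choice of the SES sendEmail universal target are illustrative assumptions, not the only way to wire this up:

```javascript
// index.mjs — sketch of a handler that creates a one-time schedule per new item.
import { SchedulerClient, CreateScheduleCommand } from "@aws-sdk/client-scheduler";

const scheduler = new SchedulerClient({});

// Placeholder: replace with your verified Amazon SES sender address.
const SOURCE_EMAIL = "ses-verified-email@example.com";

export const handler = async (event) => {
  for (const record of event.Records) {
    // Handle new items only; MODIFY events could be handled similarly by
    // comparing the reminder timestamps in OldImage and NewImage.
    if (record.eventName !== "INSERT") continue;

    const item = record.dynamodb.NewImage;
    const reminderTime = item.REMINDER_TIMESTAMP?.S; // e.g. "2025-06-01T12:00:00" (UTC)
    const email = item.Email?.S;
    if (!reminderTime || !email) continue;

    await scheduler.send(new CreateScheduleCommand({
      // eventID is safe for schedule names, which only allow [0-9a-zA-Z-_.]
      Name: `reminder-${record.eventID}`,
      ScheduleExpression: `at(${reminderTime})`, // one-time schedule, UTC by default
      FlexibleTimeWindow: { Mode: "OFF" },
      Target: {
        // Universal target: EventBridge Scheduler calls SES SendEmail directly.
        Arn: "arn:aws:scheduler:::aws-sdk:ses:sendEmail",
        // Role that EventBridge Scheduler assumes; must allow ses:SendEmail.
        RoleArn: process.env.SCHEDULER_ROLE_ARN,
        Input: JSON.stringify({
          Source: SOURCE_EMAIL,
          Destination: { ToAddresses: [email] },
          Message: {
            Subject: { Data: "Appointment reminder" },
            Body: { Text: { Data: "You have an upcoming appointment." } },
          },
        }),
      },
    }));
  }
};
```

Using the stream record's eventID in the schedule name keeps it within the allowed character set even when partition key values contain characters such as #.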
Tip: To improve reliability and observability, you can configure dead-letter queues (DLQs) at two points in this architecture. First, you can add a DLQ to the Lambda function that consumes from the DynamoDB stream; this captures any failures that occur while processing stream records or creating schedules, such as malformed input or permission errors. Second, you can configure a DLQ as part of the EventBridge Scheduler target; this captures failures that happen at the time of execution, for example, if Amazon SES fails to send an email or the target service is unavailable. Using both DLQs allows you to track, analyze, and retry failures across the full scheduling and delivery lifecycle.
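As a sketch of the first point, you can attach an on-failure destination to the stream event source mapping; the UUID and queue ARNs below are placeholders:

```bash
# Look up the mapping UUID first:
#   aws lambda list-event-source-mappings --function-name DDBStreamTriggerEventScheduler
aws lambda update-event-source-mapping \
  --uuid 01234567-89ab-cdef-0123-456789abcdef \
  --destination-config '{"OnFailure":{"Destination":"arn:aws:sqs:us-east-1:123456789012:stream-processing-dlq"}}'
```

For the second point, include a DeadLetterConfig in the schedule's Target when calling CreateSchedule, for example "DeadLetterConfig": {"Arn": "arn:aws:sqs:us-east-1:123456789012:scheduler-dlq"}.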
Generate sample writes to see the solution in action
Run the following AWS CLI command to simulate writes to your DynamoDB table. This loop inserts 10 sample items into the table, each with a unique partition key (PK) and a static sort key (SK). Each item includes a REMINDER_TIMESTAMP set to 3 minutes from the current time and a test email address. These writes will trigger the DynamoDB Stream, which invokes your Lambda function to schedule reminder emails via EventBridge Scheduler. Be sure to replace abc@example.com with a valid, verified email address in Amazon SES to observe the full flow of the solution.
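The loop below is one way to express those writes; it assumes the Appointment-Table name and the attribute names used in the Lambda sketch earlier, plus GNU date syntax (on macOS, use date -u -v+3M '+%Y-%m-%dT%H:%M:%S' instead):

```bash
REMINDER=$(date -u -d '+3 minutes' '+%Y-%m-%dT%H:%M:%S')

for i in $(seq 1 10); do
  aws dynamodb put-item \
    --table-name Appointment-Table \
    --item '{
      "PK": {"S": "USER#'"$i"'"},
      "SK": {"S": "APPOINTMENT"},
      "REMINDER_TIMESTAMP": {"S": "'"$REMINDER"'"},
      "Email": {"S": "abc@example.com"}
    }'
done
```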
To monitor the reminder emails being sent, navigate to the Monitoring tab of the EventBridge schedule group. You can view the events invoked by EventBridge Scheduler at a specific time by looking at the InvocationAttemptCount metric. In our case, the invocations are appointment reminder emails to users through Amazon SES. For a list of all metrics available for a schedule group, refer to Monitoring Amazon EventBridge Scheduler with Amazon CloudWatch.
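You can also query the same metric from the AWS CLI; the ScheduleGroup dimension and default group name below are assumptions based on the CloudWatch documentation for EventBridge Scheduler:

```bash
aws cloudwatch get-metric-statistics \
  --namespace AWS/Scheduler \
  --metric-name InvocationAttemptCount \
  --dimensions Name=ScheduleGroup,Value=default \
  --start-time "$(date -u -d '-1 hour' '+%Y-%m-%dT%H:%M:%S')" \
  --end-time "$(date -u '+%Y-%m-%dT%H:%M:%S')" \
  --period 300 \
  --statistics Sum
```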
Clean up
If you created a test environment to follow along with this post, make sure to delete the DynamoDB table, Lambda function, EventBridge schedule, and any other resources you created for testing the solution.
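Assuming the resource names used in this post, cleanup from the AWS CLI might look like the following; note that the one-time schedules created by the function must be deleted individually:

```bash
aws dynamodb delete-table --table-name Appointment-Table
aws lambda delete-function --function-name DDBStreamTriggerEventScheduler

# Delete the schedules created by the function (default schedule group assumed)
for name in $(aws scheduler list-schedules --query 'Schedules[].Name' --output text); do
  aws scheduler delete-schedule --name "$name"
done
```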
Summary
In this post, we showed how to use EventBridge Scheduler for fine-grained event scheduling based on data written to DynamoDB. This solution lets you drive precise, timely actions from timestamps stored in your DynamoDB items, which is useful when you need to send time-sensitive notifications or invoke downstream jobs.
Across this three-part series, we explored how to extend the native capabilities of Amazon DynamoDB using event-driven architecture patterns tailored to real-world needs:
- Part 1 introduced a near real-time TTL solution using Amazon EventBridge Scheduler to delete expired items with greater precision than native TTL.
- Part 2 demonstrated how to build a strict data management solution using a sharded global secondary index (GSI), EventBridge Scheduler, and Lambda to periodically query and evict expired records.
- Part 3 focused on using DynamoDB Streams and EventBridge Scheduler to schedule future downstream actions based on data written to a DynamoDB table, for example, sending reminder emails for upcoming appointments.
To dive deeper and explore additional best practices for designing with DynamoDB and EventBridge, visit the Amazon DynamoDB documentation and Amazon EventBridge documentation.