AWS Compute Blog
Introducing logging support for Amazon EventBridge Pipes
Today, AWS is announcing support for logging with Amazon EventBridge Pipes. EventBridge Pipes is a point-to-point integration solution that connects event producers and consumers with optional filter, transform, and enrichment steps. EventBridge Pipes reduces the amount of integration code builders must write and maintain when building event-driven applications. Popular integrations include connecting Amazon Kinesis streams together with filtering, direct integrations between Amazon DynamoDB and Amazon EventBridge, and Amazon SQS integrations with AWS Step Functions.
EventBridge Pipes logging provides insight into the different stages of a pipe execution. It expands on the existing Amazon CloudWatch metrics support and gives you additional methods for troubleshooting and debugging.
You can now gain insights into successful and failed scenarios within the pipe execution steps. When event transformation or enrichment succeeds or fails, you can use logs to dig deeper and troubleshoot issues with your configured pipes.
EventBridge Pipes execution steps
Understanding pipe execution steps helps you select the appropriate log level, which determines how much information is logged.
A pipe execution is an event or batch of events received by a pipe that travel from the source to the target. As events travel through a pipe, they can be filtered, transformed, or enriched using AWS Step Functions, AWS Lambda, Amazon API Gateway, and EventBridge API Destinations.
A pipe execution consists of two main stages: enrichment and target. Both of these stages include transformation and invocation steps.
You can use input transformers to modify the payload of an event before it undergoes enrichment or is dispatched to the downstream target. This gives you fine-grained control over event data during the execution of your configured pipes.
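For example, a transformer template like the following reshapes an event from an Amazon SQS source before it reaches the target. This is an illustrative sketch: it assumes the standard SQS message fields (messageId and body), and the output field names are arbitrary.
{
  "id": "<$.messageId>",
  "message": "Received: <$.body>"
}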
When the pipe execution starts, the execution enters the enrichment stage. If you don’t configure an enrichment stage, the execution proceeds to the target stage.
At the pipe execution, transformation, enrichment, and target phases, EventBridge can log information to help you debug and troubleshoot. Pipe logs can include payloads, errors, transformations, AWS requests, and AWS responses.
To learn more about pipe executions, read this documentation.
Configuring log levels with EventBridge Pipes
When logging is enabled for your pipe, EventBridge produces a log entry for every execution step and sends these logs to the specified log destinations.
EventBridge Pipes supports three log destinations: Amazon CloudWatch Logs, Amazon Kinesis Data Firehose streams, and Amazon S3. You can customize which records are sent by configuring the log level of the pipe (OFF, ERROR, INFO, TRACE).
- OFF – EventBridge does not send any records.
- ERROR – EventBridge sends records related to errors generated during the pipe execution. Examples include Execution Failed, Execution Timeout, and Enrichment Failure.
- INFO – EventBridge sends records related to errors and selected informational records generated during the pipe execution. Examples include Execution Started, Execution Succeeded, and Enrichment Stage Succeeded.
- TRACE – EventBridge sends any record generated during any step in the pipe execution.
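You can configure the log level from the EventBridge console or programmatically. The following AWS CLI sketch sets the ERROR log level on an existing pipe and sends logs to a CloudWatch log group; the pipe name, role ARN, and log group ARN are placeholders for resources that already exist in your account.
# Configure ERROR-level logging to CloudWatch Logs for an existing pipe
aws pipes update-pipe \
  --name my-pipe \
  --role-arn arn:aws:iam::111122223333:role/my-pipe-role \
  --log-configuration '{
    "CloudwatchLogsLogDestination": {
      "LogGroupArn": "arn:aws:logs:us-east-1:111122223333:log-group:/aws/vendedlogs/pipes/my-pipe-logs"
    },
    "Level": "ERROR"
  }'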
The ERROR log level is useful for understanding why a pipe execution failed. Pipe executions can fail for various reasons, such as timeouts, enrichment failures, transformation failures, or target invocation failures. Enabling ERROR logging helps you identify the specific cause of a pipe error so you can resolve the issue.
The INFO log level supplements ERROR information with additional details. In addition to errors, it provides insight into the start of the pipe execution, entry into the enrichment phase, progression through the transformation phase, and the start and successful completion of the target stage.
For a more in-depth analysis, you can use the TRACE log level to obtain comprehensive insights into a pipe execution. This includes all supported pipe logs, offering a detailed view beyond the INFO and ERROR logs. The TRACE log level reveals information such as skipped pipe execution stages and the start of the transformation and enrichment steps.
For more details on the log levels and what logs are sent, you can read the documentation.
Including execution data with EventBridge Pipes logging
To help with further debugging, you can choose to include execution data within the pipe logs. This data comprises the event payloads, and the AWS requests and responses sent to and received from the configured enrichment and target components.
Including execution data gives you further insight into the payloads, requests, and responses sent to AWS services during the pipe execution, which deepens your understanding of the execution and aids in debugging any issues you encounter.
Execution data within a log contains three parts:
- payload: The content of the event itself. The payload may contain sensitive information, and EventBridge makes no attempt to redact it. Including execution data is optional and can be turned off.
- awsRequest: The request sent to the enrichment or target in serialized JSON format. For API destinations, this includes the HTTP request sent to that endpoint.
- awsResponse: The response returned by the enrichment or target in JSON format. For API Destinations, this is the response returned from the configured endpoint.
The payload of the event is populated at the stages where the event itself can be updated: the initial pipe execution, the enrichment phase, and the target phase. The awsRequest and awsResponse fields are both generated at the final steps of the enrichment and target stages.
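You can enable execution data from the console or programmatically. Building on the earlier sketch, the following AWS CLI example sets the TRACE log level and includes execution data for an existing pipe; the pipe name and ARNs are placeholders.
# Enable TRACE-level logging and include execution data in the logs
aws pipes update-pipe \
  --name my-pipe \
  --role-arn arn:aws:iam::111122223333:role/my-pipe-role \
  --log-configuration '{
    "CloudwatchLogsLogDestination": {
      "LogGroupArn": "arn:aws:logs:us-east-1:111122223333:log-group:/aws/vendedlogs/pipes/my-pipe-logs"
    },
    "Level": "TRACE",
    "IncludeExecutionData": ["ALL"]
  }'
Because execution data can contain sensitive event payloads, only enable it for log destinations with appropriate access controls.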
For more information on log levels and execution data, visit this documentation.
Getting started with EventBridge Pipes logs
This example creates a pipe with logging enabled, including execution data. The pipe connects two Amazon SQS queues using an input transformer on the target, with no enrichment step. The input transformer customizes the payload of the event before it reaches the target.
1. Create the source and target queues using the AWS CLI.
# Create a queue for the source
aws sqs create-queue --queue-name pipe-source
# Create a queue for the target
aws sqs create-queue --queue-name pipe-target
2. Navigate to EventBridge Pipes and choose Create pipe.
3. Select SQS as the Source and select pipe-source as the SQS Queue.
4. Skip the filter and enrichment phases and add a new Target. Select SQS as the Target service and pipe-target as the Queue.
5. Open the Target Input Transformer section and enter the following transformer code into the Transformer field.
{
"body": "Favorite food is <$.body>"
}
6. Choose Pipe settings to configure the log group for the new pipe.
7. Verify that CloudWatch Logs is set as the log destination, and select Trace as the log level. Select the Include execution data check box. This logs all traces to the new CloudWatch log group and includes the SQS messages that are sent through the pipe.
8. Choose Create pipe.
9. Send an SQS message to the source queue.
# Get the Queue URL
aws sqs get-queue-url --queue-name pipe-source
# Send a message to the queue using the URL
aws sqs send-message --queue-url {QUEUE_URL} --message-body "pizza"
10. All trace logs are shown in the Monitoring tab. Use the CloudWatch Logs section for more information.
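You can also read the pipe logs from the AWS CLI. The following sketch lists the most recent log events; replace the log group name with the one you configured in step 7.
# List recent log events from the pipe's log group
aws logs filter-log-events \
  --log-group-name /aws/vendedlogs/pipes/my-pipe-logs \
  --limit 20
With TRACE logging and execution data enabled, the records include the payload, awsRequest, and awsResponse parts described earlier.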
Conclusion
EventBridge Pipes enables point-to-point integration between event producers and consumers. With logging support for EventBridge Pipes, you can now gain insights into various stages of the pipe execution. Pipe log destinations can be configured to CloudWatch Logs, Kinesis Data Firehose, and Amazon S3.
EventBridge Pipes supports three log levels. The ERROR log level configures EventBridge to send records related to errors to the log destination. The INFO log level configures EventBridge to send records related to errors and selected information during the pipe execution. The TRACE log level configures EventBridge to send every record generated during the pipe execution, which is useful for debugging and gaining further insights.
You can include execution data in the logs, which includes the event itself, and AWS requests and responses made to AWS services configured in the pipe. This can help you gain further insights into the pipe execution. Read the documentation to learn more about EventBridge Pipes Logs.
For more serverless learning resources, visit Serverless Land.