AWS Compute Blog

Category: Amazon Managed Streaming for Apache Kafka (Amazon MSK)


The serverless attendee’s guide to AWS re:Invent 2024

AWS re:Invent 2024 offers an extensive selection of serverless and application integration content. For detailed descriptions and the schedule, visit the AWS re:Invent Session Catalog. Join AWS serverless experts and community members at the AWS Modern Apps and Open Source Zone in the AWS Expo Village. This serves as a hub for serverless […]


Triggering AWS Lambda function from a cross-account Amazon Managed Streaming for Apache Kafka

This post is written by Subham Rakshit, Senior Specialist Solutions Architect, and Ismail Makhlouf, Senior Specialist Solutions Architect. Many organizations use a multi-account strategy for stream processing applications. This involves decomposing the overall architecture into a single producer account and many consumer accounts. Within AWS, in the producer account, you can use Amazon Managed Streaming for […]
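Assuming the producer account's cluster already allows the consumer account to connect (for example through MSK multi-VPC connectivity and a cluster policy), a minimal sketch of the consumer-side wiring with the AWS SDK for Python might look like the following; the cluster ARN, function name, and topic are placeholders:

```python
import boto3

lambda_client = boto3.client("lambda")

# Placeholder ARN of the MSK cluster that lives in the producer account.
PRODUCER_CLUSTER_ARN = (
    "arn:aws:kafka:eu-west-1:111111111111:cluster/producer-cluster/abcd1234"
)

# Create the event source mapping in the consumer account, pointing at the
# cross-account cluster; Lambda polls the topic and invokes the function.
response = lambda_client.create_event_source_mapping(
    EventSourceArn=PRODUCER_CLUSTER_ARN,
    FunctionName="cross-account-consumer",
    Topics=["orders"],
    StartingPosition="LATEST",
)
print(response["UUID"])
```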

Converting Apache Kafka events from Avro to JSON using EventBridge Pipes

This post is written by Pascal Vogel, Solutions Architect, and Philipp Klose, Global Solutions Architect. Event streaming with Apache Kafka has become an important element of modern data-oriented and event-driven architectures (EDAs), unlocking use cases such as real-time analytics of user behavior, anomaly and fraud detection, and Internet of Things event processing. Stream producers and consumers […]
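As a rough sketch of the decoding step only (not the full pipeline from the post), the following EventBridge Pipes enrichment handler converts base64-encoded, schemaless Avro Kafka records to JSON. The record schema is hard-coded here for brevity, whereas in practice it would typically be fetched from a registry such as the AWS Glue Schema Registry; the fastavro dependency is assumed to be packaged with the function.

```python
import base64
import io
import json

from fastavro import schemaless_reader  # third-party Avro library, assumed available

# Hypothetical writer schema used by the producer.
ORDER_SCHEMA = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
}


def handler(events, context):
    """EventBridge Pipes enrichment: Avro-encoded Kafka records in, JSON strings out."""
    enriched = []
    for event in events:
        raw = base64.b64decode(event["value"])  # Kafka record value is base64-encoded
        record = schemaless_reader(io.BytesIO(raw), ORDER_SCHEMA)
        enriched.append(json.dumps(record))
    return enriched
```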


Scaling improvements when processing Apache Kafka with AWS Lambda

AWS Lambda is improving the automatic scaling behavior when processing data from Apache Kafka event sources. Lambda is increasing the default number of initial consumers, improving how quickly consumers scale up, and helping to ensure that consumers don’t scale down too quickly. There is no additional action that you must take, and there is no additional […]


Using custom consumer group ID support for AWS Lambda event sources for MSK and self-managed Kafka

This post shows how to use the new custom consumer group ID feature of the Lambda event source mapping for Amazon MSK and self-managed Kafka. This feature can be used to consume messages with Lambda starting at a specific timestamp or offset within a Kafka topic. It can also be used to consume messages from a consumer group that is replicated from another Kafka cluster using MirrorMaker v2.
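A minimal sketch of enabling this with the AWS SDK for Python might look like the following; the cluster ARN, function name, topic, and consumer group ID are placeholders:

```python
import boto3

lambda_client = boto3.client("lambda")

response = lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:kafka:eu-west-1:123456789012:cluster/demo-cluster/abcd1234",
    FunctionName="kafka-consumer",
    Topics=["clickstream"],
    StartingPosition="LATEST",
    # Reuse an existing consumer group, for example one replicated by MirrorMaker v2,
    # so Lambda continues from that group's committed offsets.
    AmazonManagedKafkaEventSourceConfig={"ConsumerGroupId": "mm2-replicated-group"},
)
print(response["State"])
```

For a self-managed Kafka cluster, the equivalent setting is SelfManagedKafkaEventSourceConfig.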


Introducing AWS Lambda batching controls for message broker services

This post is written by Mithun Mallick, Senior Specialist Solutions Architect. AWS Lambda now supports configuring a maximum batching window for instance-based message broker services to fine-tune when Lambda invocations occur. This feature gives you additional control over batching behavior when processing data. It applies to Amazon Managed Streaming for Apache Kafka (Amazon […]
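For example, assuming an existing Kafka event source mapping (the UUID below is a placeholder), the batching window and batch size could be tuned like this:

```python
import boto3

lambda_client = boto3.client("lambda")

# Wait up to 60 seconds, or until 500 records have accumulated, before invoking
# the function, whichever happens first.
response = lambda_client.update_event_source_mapping(
    UUID="a1b2c3d4-5678-90ab-cdef-EXAMPLE11111",
    MaximumBatchingWindowInSeconds=60,
    BatchSize=500,
)
print(response["MaximumBatchingWindowInSeconds"], response["BatchSize"])
```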

Offset lag metric for Amazon MSK as an event source for Lambda

This post is written by Adam Wagner, Principal Serverless Solutions Architect. Last year, AWS announced support for Amazon Managed Streaming for Apache Kafka (Amazon MSK) and self-managed Apache Kafka clusters as event sources for AWS Lambda. Today, AWS adds a new OffsetLag metric to Lambda functions with MSK or self-managed Apache Kafka event sources. Offset in Apache […]
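As a sketch of how the metric could be put to work, the following creates a CloudWatch alarm on OffsetLag for a hypothetical consumer function; the function name and threshold are placeholders to adjust for your workload:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm when the function falls more than 10,000 records behind the latest
# offsets for three consecutive five-minute periods.
cloudwatch.put_metric_alarm(
    AlarmName="kafka-consumer-offset-lag",
    Namespace="AWS/Lambda",
    MetricName="OffsetLag",
    Dimensions=[{"Name": "FunctionName", "Value": "kafka-consumer"}],
    Statistic="Maximum",
    Period=300,
    EvaluationPeriods=3,
    Threshold=10000,
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="notBreaching",
)
```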


Introducing mutual TLS authentication for Amazon MSK as an event source

This post is written by Uma Ramadoss, Senior Specialist Solutions Architect, Integration. Today, AWS Lambda is introducing mutual TLS (mTLS) authentication for Amazon Managed Streaming for Apache Kafka (Amazon MSK) and self-managed Kafka as an event source. Many customers use Amazon MSK for streaming data from multiple producers. Multiple subscribers can then consume the streaming […]
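A minimal sketch of turning this on with the AWS SDK for Python; the cluster and Secrets Manager ARNs are placeholders, and the secret is assumed to hold the client certificate and private key used for the mTLS handshake:

```python
import boto3

lambda_client = boto3.client("lambda")

response = lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:kafka:eu-west-1:123456789012:cluster/demo-cluster/abcd1234",
    FunctionName="kafka-consumer",
    Topics=["transactions"],
    StartingPosition="TRIM_HORIZON",
    SourceAccessConfigurations=[
        {
            # Secret containing the client certificate and private key for mTLS.
            "Type": "CLIENT_CERTIFICATE_TLS_AUTH",
            "URI": "arn:aws:secretsmanager:eu-west-1:123456789012:secret:kafka-client-cert-EXAMPLE",
        },
    ],
)
print(response["UUID"])
```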


Setting up AWS Lambda with an Apache Kafka cluster within a VPC

Using resources such as NAT Gateways and VPC endpoints with AWS PrivateLink, you can keep your data secure while still giving services such as Lambda access to your Kafka cluster so you can build a consumer application. This post provides tips to help you set up a Lambda function with Kafka as a trigger, and it explains the options available for sending data securely.
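As an illustrative sketch, the following creates an event source mapping for a self-managed Kafka cluster that is reachable only inside a VPC; the bootstrap broker, subnet IDs, and security group ID are placeholders for the networking described in the post:

```python
import boto3

lambda_client = boto3.client("lambda")

response = lambda_client.create_event_source_mapping(
    FunctionName="kafka-consumer",
    Topics=["sensor-data"],
    StartingPosition="LATEST",
    # Bootstrap broker reachable only from within the VPC.
    SelfManagedEventSource={
        "Endpoints": {"KAFKA_BOOTSTRAP_SERVERS": ["b-1.internal.example.com:9092"]}
    },
    # Subnets and security group that give Lambda network access to the brokers.
    SourceAccessConfigurations=[
        {"Type": "VPC_SUBNET", "URI": "subnet:subnet-0123456789abcdef0"},
        {"Type": "VPC_SUBNET", "URI": "subnet:subnet-0fedcba9876543210"},
        {"Type": "VPC_SECURITY_GROUP", "URI": "security_group:sg-0123456789abcdef0"},
    ],
)
print(response["UUID"])
```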