.NET on AWS Blog

AXA Belgium’s Journey to Running .NET Web APIs on AWS Lambda

Like many industries, the insurance sector is embracing serverless computing services to run code without the overhead of provisioning or managing servers. Moving on-premises .NET Web APIs to such platforms comes with a broad range of considerations. This blog post shares the experience of AXA Belgium’s engineering team and their approach to validating AWS Lambda for running .NET Web APIs. It gives guidance on addressing critical requirements such as standardization, portability, and performance, drawing on insights from the Proof of Concept (PoC) work they undertook. The PoC included exploring different optimization mechanisms, Native AOT compilation, and the new Lambda SnapStart feature.

Moving to the Cloud at AXA Belgium

AXA Group is a global insurer serving nearly 100 million customers in 50 countries. Back in 2017, the company launched a digital transformation program named ‘Move to the Cloud’ and defined the Cloud Binding Principles that all its subsidiaries, including AXA Belgium, use in their decision-making process. One of the strongest principles is to favor managed cloud services as much as possible to build and run applications: teams should leverage serverless capabilities rather than invest in maintaining VM or container infrastructure.

In line with this strategic guidance, AXA Belgium’s IT Department started exploring AWS Lambda, not only as a way to run fully managed, scalable, cost-effective cloud-native applications, but also as an opportunity to modernize and migrate as many of the legacy on-premises Web APIs as possible. AXA Belgium needs to do so in a corporate landscape where multiple feature teams operate within highly structured and regulated environments, and where those teams must identify and address stringent requirements such as cost, security, performance, portability, and adherence to industry standards.

At AXA Belgium, this task falls under the purview of the Engineering Excellence team, composed of seasoned technical leaders and software engineers. The Engineering Excellence team defines application patterns that adhere to the AXA Group’s guidelines, promotes actionable software development best practices across the organization and, above all, ensures the technology aligns with the most critical requirements of the IT and business domains.

Beginning of the Journey

Alongside Java, .NET (C#) has the largest footprint in the company for Web API development.

In 2023, the Engineering Excellence team investigated running .NET applications on AWS Lambda and, more precisely, how well this serverless computing service addresses three specific considerations:

  • The first consideration is to determine how much development effort it would take to move a complete .NET Web API from on-premises VMs to AWS Lambda. The underlying goal is to achieve migrations with the fewest code changes possible, minimizing the involvement of development teams. Fewer changes also mean less room for bugs and regressions that would keep developers busy fixing issues instead of building new features.
  • The second consideration is maintaining compatibility with existing .NET Web API standards and best practices. This involves not only structuring the code in a certain way, for example organizing an API into multiple controllers where each controller manages a specific resource, but also being able to use ASP.NET Core features like built-in attributes and helper methods.
  • The third consideration is preserving performance: guaranteeing that application performance remains at or above current levels throughout and after the migration process. In particular, it was important to understand whether the “cold start” had a negative impact on Web APIs and what mitigation methods are available against it.

Two important questions needed to be answered: how to verify that AWS Lambda meets those requirements, and what the testing procedure should be.

Building and fine-tuning a proof of concept

The Challenge

As an enterprise development team managing a diverse technology stack of Java, .NET, and Angular applications, the Engineering Excellence team needed to validate AWS Lambda’s capability to host the .NET applications effectively. The primary concerns included:

  • Cold-start performance for enterprise-grade applications
  • Maintaining compliance with Microsoft’s Web API development guidelines
  • Supporting complex dependency injection scenarios
  • Ensuring minimal code modifications for cross-platform deployment

Evaluating Implementation Approaches

The engineering team began investigating three approaches to running .NET Web APIs on AWS Lambda:

  • Traditional Amazon API Gateway and Lambda Function Handlers:
    • One endpoint per function handler per project.
    • Manual HTTP method binding through API Gateway.
  • API Gateway and Lambda Annotations Framework:
    • Multiple endpoints per controller.
    • Automatic endpoint-to-Lambda conversion.
    • Simplified API Gateway configuration through serverless templates.
  • AWS Lambda Hosting with Function URLs:
    • Single-package installation requirement (Amazon.Lambda.AspNetCoreServer.Hosting).
    • Minimal code changes.
    • Complete Web API support in a single Lambda function.

The third approach, using Amazon.Lambda.AspNetCoreServer.Hosting, proved the most promising, requiring minimal code changes while supporting the entire Web API structure in a single Lambda function.
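
To illustrate the scale of change involved, here is a minimal sketch of what the Program.cs of a Controller-based Web API looks like with this approach. It is an illustrative example, not AXA Belgium’s actual code, and it assumes the Amazon.Lambda.AspNetCoreServer.Hosting NuGet package is referenced and the standard ASP.NET Core implicit usings are enabled:

    using Amazon.Lambda.AspNetCoreServer.Hosting;

    var builder = WebApplication.CreateBuilder(args);

    builder.Services.AddControllers();

    // The only Lambda-specific line: it routes API Gateway or Function URL events
    // into the ASP.NET Core pipeline when running on Lambda, and does nothing when
    // the application runs locally on Kestrel.
    builder.Services.AddAWSLambdaHosting(LambdaEventSource.HttpApi);

    var app = builder.Build();

    app.MapControllers();
    app.Run();

Because the rest of the application is untouched, the same codebase can keep running on-premises or in a container, which speaks directly to the portability and minimal-code-change concerns listed above.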

Testing setup using .NET 6

A proof of concept was created with these key components:

  • .NET 6.0 SDK
  • Swagger (Swashbuckle library)
  • Controller-based OpenAPI-compliant RESTful API
  • Controller with GET, POST, PUT, PATCH and DELETE endpoints
  • Entity Framework Core with an in-memory database
  • Dependency Injection
  • Layered architecture with Service and Data layers
  • Custom mapping, validation, and exceptions
  • Request model validation using Data Annotations
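
To make this setup concrete, the sketch below shows the general shape of such a controller (only two of the five HTTP verbs are shown). It is illustrative only: the ProductsController, IProductService, ProductRequest, and ProductResponse names are hypothetical rather than taken from AXA Belgium’s code, and the EF Core-backed service and data layers are omitted for brevity:

    using System.ComponentModel.DataAnnotations;
    using Microsoft.AspNetCore.Mvc;

    [ApiController]
    [Route("api/[controller]")]
    public class ProductsController : ControllerBase
    {
        private readonly IProductService _service;

        // Constructor injection via the built-in dependency injection container.
        public ProductsController(IProductService service) => _service = service;

        [HttpGet("{id:int}")]
        public async Task<ActionResult<ProductResponse>> GetById(int id)
        {
            var product = await _service.GetByIdAsync(id);
            return product is null ? NotFound() : Ok(product);
        }

        [HttpPost]
        public async Task<ActionResult<ProductResponse>> Create([FromBody] ProductRequest request)
        {
            // [ApiController] automatically returns 400 Bad Request when the
            // Data Annotations below fail validation.
            var created = await _service.CreateAsync(request);
            return CreatedAtAction(nameof(GetById), new { id = created.Id }, created);
        }
    }

    // Request model validated with Data Annotations.
    public class ProductRequest
    {
        [Required, StringLength(100)]
        public string Name { get; set; } = string.Empty;

        [Range(0, 100000)] // illustrative bound
        public decimal Price { get; set; }
    }

    public record ProductResponse(int Id, string Name, decimal Price);

    public interface IProductService
    {
        Task<ProductResponse?> GetByIdAsync(int id);
        Task<ProductResponse> CreateAsync(ProductRequest request);
    }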

Performance Testing and Optimization Strategy

One of the major concerns was the well-known “cold start” limitation, where AWS Lambda takes additional time to initialize an execution environment after some inactivity or when scaling out to serve additional simultaneous requests. A slow cold start can negatively affect user experience, so the team worked on understanding it and finding possible remediation strategies.

To test and record warm and cold start latency, the team ran batches of 10 requests in a row, forcing a Lambda function reset each time to make sure every test started from a clean slate.

The CloudWatch Logs Insights query shown in Figure 1 was used to generate the results:

CloudWatch Logs Insights query

Figure 1: CloudWatch Logs Insights query

Prior to the availability of AWS Lambda SnapStart for .NET in November 2024, the team tested and compared different memory configurations as well as the Intel (x86_64) and ARM (Graviton) processor architectures:

  • Tested memory allocations: 256 MB, 512 MB, 1024 MB, and 2048 MB.
  • Monitored how the allocated CPU (which scales with the memory setting) correlated with performance.
  • Evaluated cost-performance trade-offs.

Apart from that, they also investigated two more factors:

  • API performance with various record counts in the underlying database.
  • Different compiler options, such as ReadyToRun (the PublishReadyToRun project property).

Test Results

Table showing 9 test cases with varying AWS Lambda memory allocations and code optimizations, resulting in changes in cold start and warm start times

Figure 2: .NET 6 test results

Key findings from these first tests:

  • Increasing memory reduced cold starts in the tests, but with diminishing returns past 1024 MB.
  • The ARM architecture didn’t provide performance gains for the team’s .NET code.
  • The cost-effective sweet spot for this use case was the Intel x86_64 architecture, with between 1024 MB and 1536 MB of memory and the PublishReadyToRun property enabled.

Additional Optimizations with Native Ahead-of-Time Compilation and SnapStart for .NET 8

.NET 8 Ahead-of-Time (AOT) Compilation

The engineering team was curious about what performance improvements they could get when migrating the APIs’ codebase to the latest .NET 8 LTS release.

After .NET 8 became available as a managed runtime on AWS Lambda, they ran additional performance tests comparing a .NET 8 Controller-based API against a Native AOT-compiled Minimal API (because Controller-based APIs aren’t supported by Native AOT in .NET 8).

To work around this limitation, they recreated the tests using a Minimal API with the same structure and functionality as the Controller-based one, removing EF Core and Data Annotations because of compatibility limitations.
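
As an illustration of what such a Native AOT-compatible Minimal API can look like when hosted with the same Amazon.Lambda.AspNetCoreServer.Hosting package, here is a hedged sketch. The Product and AppJsonSerializerContext names are hypothetical, the in-memory list stands in for the custom in-memory database, and the source-generated JSON serializer context replaces the reflection-based serialization that trimming and Native AOT cannot use (the project would also carry the PublishAOT-related settings shown in Figure 4):

    using System.Text.Json.Serialization;
    using Amazon.Lambda.AspNetCoreServer.Hosting;

    var builder = WebApplication.CreateSlimBuilder(args);

    // Source-generated JSON serialization, required once reflection-based
    // serialization is trimmed away by Native AOT.
    builder.Services.ConfigureHttpJsonOptions(options =>
        options.SerializerOptions.TypeInfoResolverChain.Insert(0, AppJsonSerializerContext.Default));

    // Same hosting call as the Controller-based API; a no-op when running locally.
    builder.Services.AddAWSLambdaHosting(LambdaEventSource.HttpApi);

    var app = builder.Build();

    // A simple in-memory store replaces EF Core, which did not support Native AOT at the time.
    var products = new List<Product> { new(1, "Sample product") };

    app.MapGet("/products", () => products);
    app.MapGet("/products/{id:int}", (int id) =>
        products.FirstOrDefault(p => p.Id == id) is { } product
            ? Results.Ok(product)
            : Results.NotFound());

    app.Run();

    public record Product(int Id, string Name);

    [JsonSerializable(typeof(Product))]
    [JsonSerializable(typeof(List<Product>))]
    internal partial class AppJsonSerializerContext : JsonSerializerContext { }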

The table below presents the implementation details for both APIs:

.NET 8 Controller-based API versus .NET 8 Native AOT Minimal API. The main difference between the two is that the Controller-based API uses Entity Framework Core and Data Annotations, while the Minimal API uses a custom in-memory database and custom request model validation.

Figure 3: Controller-based API vs. Native AOT Minimal API setup

[1] Not using EF Core on this API because it doesn’t support Native AOT yet.
[2] Not using Data Annotations because Minimal APIs don’t support them yet.

The following table summarizes the optimization techniques applied to mitigate cold start issues:

.NET 8 Controller-based API versus .NET 8 Native AOT Minimal API: The difference for the Native AOT Minimal API is that it has the following parameters set: <PublishAOT> = true, <InvariantGlobalization> = true, <TrimMode> = full, <StripSymbols> = true

Figure 4: Optimization techniques for Controller-based and AOT Minimal API

Comparing .NET 6 to .NET 8

This table shows the results for the same Controller-based API running on .NET 6 and after it was migrated to .NET 8 with no other changes:

This table shows that, based on the tests, the warm and the cold start times increase in .NET 8 by 10 to 20 percent compared to .NET 6

Figure 5: Start time comparison between .NET 6 and .NET 8

The results show there was a slight overall increase in both cold start and warm start latencies, which is consistent with other benchmarks, like the Lambda Cold Starts benchmark by maxday.

Comparing .NET 8 to .NET 8 Native AOT

The following table compares the results of the two APIs, one using the standard Controller-based approach and the other using a Minimal API approach, which makes it possible to use the new Native AOT feature.

Table shows that, based on the tests, the warm and the cold start times were reduced in .NET 8 with AOT by over 80 percent compared to .NET 8 without AOT

Figure 6: Start time comparison between .NET 8 without AOT and .NET 8 with AOT

Hugo Portela from the AXA Belgium engineering team commented:

“On average, we got a whopping 83% improvement in cold starts and 87% improvement in warm starts. It’s really impressive that not only AOT improved the cold start, but the warm starts as well, and by such large percentage values no less.”

.NET 8 Native AOT with Different Memory Amounts

Table showing that varying the memory size between 512 MB and 2048 MB has a very small impact on the cold start and the warm start times

Figure 7: Comparing startup times between different memory allocations

Hugo Portela continues:

“The difference in execution time between the memory sizes was minimal.

These tests show that for Native AOT, memory doesn’t seem to affect results that much, which means theoretically we can, for similar performance, save up to 50% or more in costs just by decreasing the memory amount.”

.NET Lambda SnapStart

As per the AWS Lambda SnapStart documentation: “The cold start duration is the sum of Restore Duration + Duration”. Therefore, the CloudWatch Logs Insights query shown in Figure 8 was used to generate the results:

CloudWatch Logs Insights query

Figure 8: CloudWatch Logs Insights query

.NET 8 API Without AOT

This table presents the results for the same Controller-based API, once with the SnapStart feature enabled and once without:

Table showing a test results comparison with SnapStart off and on; the cold start times are shorter with SnapStart on

Figure 9: Comparing startup times without and with SnapStart (without AOT)

As we can see, there was a slight increase in the warm start and a decrease in the cold start latencies. On average, the warm starts are 2 ms (12%) slower and the cold starts are 205 ms (15%) faster.

Note
Since AXA Belgium ran these tests, AWS has released improvements to cold start performance for .NET workloads that should further reduce cold start times.

.NET 8 Native AOT API

This table presents the results for the same Native AOT API, once with the SnapStart feature enabled and once without:

Table showing a test results comparison with SnapStart off and on; the cold start times are slightly longer with SnapStart on

Figure 10: Comparing startup times without and with SnapStart (with AOT)

In contrast, for this case we see that there was a slight decrease in the warm start and an increase in the cold start latencies. On average, the warm starts were 0.1 ms faster and the cold starts were 29 ms slower.

Summarizing the results of the SnapStart tests

Looking first at the results of the .NET 8 API without AOT, we see that even though there was a slight increase in warm start latency, the cold start latency decreased, which validates the promise of the SnapStart feature. For this specific case, this is evidence that if a project is affected by cold start issues and needs to be as performant as possible, this feature is a good solution to mitigate slow application starts.

In the .NET 8 Native AOT API use case, the results were not as pronounced, since the warm start latency remained practically unchanged and the cold start latency increased slightly. With such small changes, however, it’s hard to draw any relevant conclusion about the effects of the SnapStart feature for Native AOT projects, where activating AOT by itself brings a much bigger reduction in cold and warm starts.

Hugo Portela summarized the results of the tests done by his team at AXA Belgium:

“In our opinion, SnapStart is a welcome feature on AWS Lambda with .NET because, depending on the type of application (Controller-based vs Minimal) and its configuration (JIT vs AOT), it can indeed produce positive results. One of the most appealing aspects is that you can simply enable it in your AWS Console, typically with no changes to your function code, which is great for teams.

We just cannot forget about the SnapStart downsides, which include an added cost and some limitations regarding applications that need:

    • Uniqueness of State (which can be mitigated by saving state that depends on uniqueness during initialization).
    • Randomness (which can be mitigated by using cryptographically secure random number generators).

The SnapStart feature needs to be considered carefully depending on the workload and the use case. In some use cases, SnapStart won’t improve performance, and in others it will.

We, of course, cannot forget that Native AOT compilation is still fairly new and that several ASP.NET Core features and third-party libraries do not currently support it. However, we firmly believe that slowly but surely we’ll see more and more support being added to existing libraries, since its advantages are way too good to ignore.

Overall, we are beyond pleased with these results, and we could never have imagined that such latency improvements were even possible.

To sum up, migrating from .NET 6 to .NET 8 without any other change translates into slightly higher latency, but developing an API with Native AOT more than justifies the migration effort, since not only was it possible to gain an average of 80% in performance, but also to save 50% or more in costs by decreasing the memory amount of the Lambda function. As far as the ARM architecture is concerned, in the tests performed by AXA Belgium, Graviton-based Lambda did not reach a price-performance ratio that would justify using it instead of the x86 architecture.”
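
Stepping outside the quote for a moment, the randomness concern can be made concrete with a short illustrative C# sketch (the TokenGenerator class below is hypothetical, not AXA Belgium’s code). It contrasts a pseudo-random generator whose internal state would be captured in the SnapStart snapshot with a cryptographically secure generator that draws fresh entropy on every call:

    using System.Security.Cryptography;

    public static class TokenGenerator
    {
        // Risky with SnapStart: this instance and its internal state are created during
        // initialization, so execution environments restored from the same snapshot
        // could continue the same pseudo-random sequence.
        private static readonly Random SeededAtInit = new();

        public static int PredictableAfterRestore() => SeededAtInit.Next();

        // Safer: a cryptographically secure generator obtains fresh entropy on each
        // call instead of replaying state captured in the snapshot.
        public static int UniquePerInvocation() => RandomNumberGenerator.GetInt32(int.MaxValue);
    }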

Ready to Transform Your .NET Applications?

Native AOT brings a great cost advantage and significantly reduces cold starts.

The SnapStart feature needs to be considered carefully depending on the workload and the use case. In some use cases, SnapStart won’t improve performance, and in others it will.

Do not forget that Native AOT compilation is still fairly new and that several ASP.NET Core features and third-party libraries don’t currently support it. However, we firmly believe we’ll see more and more support being added to existing libraries, since its advantages are way too good to ignore.

Start Your SnapStart Journey

  • Enable SnapStart on your existing Lambda functions with just a few clicks – no code changes required.
  • Measure the impact using CloudWatch metrics to quantify your performance improvements.

Build Your Proof of Concept

  • Download the Amazon.Lambda.AspNetCoreServer.Hosting package to get started in minutes.
  • Follow AXA Belgium’s proven approach: start with minimal code changes, then optimize incrementally.
  • Test both SnapStart and Native AOT to find your optimal configuration.

Pro Tip

Start small with a non-critical API to build confidence and expertise. Once you’ve validated the approach, use your learnings to create a migration roadmap for your entire .NET portfolio.

For new applications:

  • Start with .NET 8 or later, targeting Native AOT where possible
  • Design with serverless architecture patterns in mind
  • Utilize minimal APIs for simpler deployment and better performance

For existing applications:

  • Evaluate migration to .NET 8 or later and test Native AOT compatibility
  • Implement SnapStart selectively based on function characteristics
  • Optimize JSON serialization and database access patterns
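
One way to act on the JSON serialization point above is System.Text.Json source generation, which removes reflection from the serialization path and also prepares the code for a later move to Native AOT. The sketch below is illustrative only, with hypothetical ApiJsonSerializerContext and ProductResponse names, and is not the approach AXA Belgium described:

    using System.Text.Json.Serialization;

    var builder = WebApplication.CreateBuilder(args);

    // Register a compile-time generated serializer context so responses are
    // serialized without runtime reflection.
    builder.Services.AddControllers()
        .AddJsonOptions(options =>
            options.JsonSerializerOptions.TypeInfoResolverChain.Insert(0, ApiJsonSerializerContext.Default));

    var app = builder.Build();
    app.MapControllers();
    app.Run();

    public record ProductResponse(int Id, string Name, decimal Price);

    [JsonSerializable(typeof(ProductResponse))]
    [JsonSerializable(typeof(List<ProductResponse>))]
    internal partial class ApiJsonSerializerContext : JsonSerializerContext { }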

By following these insights and continuously evaluating new technologies, organizations can successfully leverage AWS Lambda for their .NET applications, achieving both performance and cost-efficiency in their serverless architectures.

Join the growing community of enterprises running .NET on AWS Lambda. Your serverless transformation starts with a single function—why not deploy your first one today?

Hugo Portela

Hugo Portela is a Senior Software Engineer at ATS4IT, currently working as a Technical Lead at AXA Belgium. Most of his 12 years of experience have been spent as a Full-stack Developer, focusing primarily on the creation of complex Web Applications using .NET and Angular frameworks.

Christophe Gastineau

Christophe Gastineau is a Senior Solutions Architect at Amazon Web Services. He spent half of his 20-year career as a .NET developer before pivoting to solution architecture in the cloud. Christophe now supports customers in designing, evolving, and building well-architected solutions on AWS.

Alexey Dolganov

Alexey Dolganov is a Senior Solutions Architect at Amazon Web Services, working with global financial institutions and helping them in their journey to the cloud. He has 25 years of extensive experience in designing and operating hybrid IT architectures, from 100% on-premises to fully cloud-native.