AWS Developer Tools Blog
Reduce Lambda cold start times: migrate to AWS SDK for JavaScript v3
The AWS SDK for JavaScript (JS SDK) v3 is a rewrite of v2 with a modular architecture and frequently requested features, such as first-class TypeScript support and a new middleware stack.
As our customers migrate their applications from JS SDK v2 to v3, they have been requesting reliable benchmarks to assess the SDK's performance across common use cases. In response to these requests, the JS SDK team benchmarked cold start times on AWS Lambda, as it is both a common use case for customers and a good reference standard. Our benchmarks show that v3 has reduced cold start times compared to v2 in most common use cases. Although these benchmarks focus on Lambda cold start times, migrating to v3 generally improves application performance regardless of which compute service you use.
Lambda has made several performance optimizations to its Node.js 18 runtime since launch, and the data in this blog post is based on the latest runtime version. If you are sensitive to Lambda cold start times, we recommend writing your Lambda function with barebones JS SDK v3 clients and command objects, and bundling it before deployment. You should run your own benchmarks for your production applications, and use this blog post as a reference. You can also refer to the previous blog post on Optimizing Node.js dependencies, which dives deep into bundling and minifying Lambda functions.
What was benchmarked?
We benchmarked the Lambda cold start times for an example application using both JS SDK v3 and v2. A significant proportion of JS SDK customers send requests from Lambda, and many are sensitive to cold start times.
The following Lambda function example code imports the STS client from JS SDK v3, creates a client instance outside the handler, and returns the response of getCallerIdentity:
import { STS } from "@aws-sdk/client-sts";
const client = new STS();
export const handler = async () => client.getCallerIdentity();
The equivalent code in v2 imports from the "aws-sdk" package and calls promise() on the API call as follows:
import AWS from "aws-sdk";
const client = new AWS.STS();
export const handler = async () => client.getCallerIdentity().promise();
The benchmarks were obtained for three common use cases:
- As-is with Lambda provided SDK
- As-is with SDK in user uploaded node_modules
- Bundled using esbuild
We used a custom fork of measure-cold-starts, which gets cold start metrics for multiple related Lambda functions at once, and returns specific metrics and stats in a readable table format. It measures Init Duration as recorded by Lambda in CloudWatch Logs. Each benchmark was run for 100 invocations.
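For reference, Init Duration can also be aggregated directly with a CloudWatch Logs Insights query; the sketch below assumes the default Lambda log format, where the @initDuration field is present only on invocations that include an init phase (cold starts):
filter @type = "REPORT" and ispresent(@initDuration)
| stats avg(@initDuration) as avgInitMs,
        pct(@initDuration, 50) as p50InitMs,
        max(@initDuration) as maxInitMs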
The benchmarks shared in this blog post are for Lambda functions written in the ECMAScript module format (ESM). We gathered benchmarks for functions written in the CommonJS module format as well, and the results are similar. We use ESM as it's the official standard format for packaging JavaScript code for reuse.
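For reference, a CommonJS version of the same handler is a minimal sketch like the following, with the same behavior as the ESM example above:
// CommonJS equivalent of the ESM handler shown earlier.
const { STS } = require("@aws-sdk/client-sts");
const client = new STS();
exports.handler = async () => client.getCallerIdentity();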
As-is with Lambda Provided SDK
Lambda provides an SDK version in its runtimes as a convenience for developers building simpler functions or using the Lambda console for development. This allows customers to skip providing SDK artifacts in the node_modules folder. While this is the most convenient use case, it is not the most performant.
In this benchmark setup, the application is uploaded to Lambda with just the function source code. It has two file system nodes:
- package.json contains the project manifest
- index.mjs contains the function source code
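For illustration, the manifest can be minimal; the package name below is hypothetical, and with an .mjs entry point the type field is optional but makes the ESM intent explicit:
{
  "name": "sdk-cold-start-benchmark",
  "version": "1.0.0",
  "type": "module"
}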
The JS SDK v3 was introduced in the Lambda provided SDK with the Lambda Node.js 18 runtime. That's why the v2 benchmark is run on nodejs16.x, while the v3 benchmark is run on nodejs18.x.
The ESM imports from NODE_PATH are not available when using Lambda's Node.js 14 and 16 runtimes. For testing the Lambda provided JS SDK v2 on Node.js 16, we create a symlink to /var/runtime/node_modules in our test setup. For details, check the GitHub discussion aws/aws-sdk-js/#4432.
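As a sketch of that workaround (the exact packaging commands in our harness may differ), the symlink is created inside the deployment package and preserved when zipping:
# Point node_modules at the SDK that the Lambda runtime ships with.
ln -s /var/runtime/node_modules node_modules
# --symlinks stores the link itself instead of following it.
zip --symlinks -r function.zip index.mjs package.json node_modules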
The benchmarks for this setup show that cold start times for Lambda functions with v3 are more than 100 ms shorter than for functions with v2 when using the Lambda provided SDK.
The versions v2.1374.0 and v3.188.0 are the Lambda provided SDK versions at the time of writing. The Lambda provided SDK versions are regularly updated to the latest SDK releases.
As-is with SDK in user uploaded node_modules
Our benchmark setup for user uploaded node_modules has the following changes as compared to the Lambda provided SDK setup:
- The node_modules directory is added with SDK artifacts.
- Both Lambda functions are benchmarked on nodejs18.x.
- The symlink for /var/runtime/node_modules is removed.
The benchmarks for this setup show that cold start times for Lambda functions with v3 are ~140 ms shorter than for functions with v2. The Lambda function size is also reduced by ~10.8 MB.
For this setup, the improvement is expected to be consistently more than 100 ms for all services. The Lambda function size difference will depend on the service client, but the v3 client will always be smaller, as v3 is modular while v2 is not.
Bundled using esbuild
A bundler is a tool that combines multiple modules or files into a single file, typically for the purpose of optimizing the delivery and execution of a web application. Bundlers are commonly used in front-end development, specifically for JavaScript-based projects, but they can be used in back-end development as well.
Our benchmark setup for bundling with esbuild uses named imports in both v2 and v3, and bundles the application using the following command:
esbuild source.mjs --bundle --platform=node --format=esm --main-fields=module,main
For creating an ESM bundle of v2, we need to provide a polyfill for require. Following the workaround from the esbuild issue linked below, this can be done with esbuild's banner option (treat the exact flag as a sketch of our setup):
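--banner:js="import { createRequire } from 'module'; const require = createRequire(import.meta.url);"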
For details on why this is needed, see evanw/esbuild/issues/1921#issuecomment-1403107887.
The benchmarks for this setup show that cold start times for Lambda functions with v3 are more than 400 ms shorter than for functions with v2 when bundled with esbuild. There's also a difference of 2 MB in Lambda function size.
To reduce the function size in v2, some customers use a deep import of the STS client as follows:
import STS from "aws-sdk/clients/sts.js";
When we ran benchmarks with the deep import of the STS client, the cold start times in v3 were still ~10 ms shorter. Although the bundled v2 function with deep imports is smaller than the one with the global import, the bundled v3 function is still almost half the size.
The bundle size in v3 can be reduced further by using the barebones client with command objects. Under the hood, this imports a light client and only the operations that your application needs to call. This structure can be used even outside of Lambda functions.
import { STSClient, GetCallerIdentityCommand } from "@aws-sdk/client-sts";
const client = new STSClient();
export const handler = async () => client.send(new GetCallerIdentityCommand({}));
When the benchmarks were run with the command import of the STS client in v3, the bundled application size was reduced by ~2 KB, and the cold start times were reduced by ~3 ms.
What can we learn from these benchmarks?
The JS SDK v3 is faster than v2 when it comes to Lambda cold start times in common use cases.
The way you implement your application impacts performance. If you are using the JS SDK on Lambda and are sensitive to cold start times, use JS SDK v3 with barebones clients and command objects, and bundle your application before deploying to Lambda. This setup has smaller cold start times because Node.js needs to read just one file, which contains the entire source code of your application; no time is spent on module resolution or reading multiple files. When your application uses v3, the bundle size is smaller, as v3 is modular.
The JS SDK team recommends using the AWS Cloud Development Kit (CDK) for managing your Lambda functions. You can use the CDK NodejsFunction construct to bundle the application source code. Just remember to pass an empty array in the bundling.externalModules configuration so that the construct bundles the SDK instead of treating it as external. For details, check aws/aws-cdk/#25492.
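A minimal sketch of such a stack, assuming a hypothetical entry path for the handler shown earlier:
import { Stack } from "aws-cdk-lib";
import { Runtime } from "aws-cdk-lib/aws-lambda";
import { NodejsFunction } from "aws-cdk-lib/aws-lambda-nodejs";

export class ExampleStack extends Stack {
  constructor(scope, id, props) {
    super(scope, id, props);
    new NodejsFunction(this, "ExampleFunction", {
      entry: "src/index.mjs", // hypothetical path to the handler source
      runtime: Runtime.NODEJS_18_X,
      bundling: {
        // An empty array bundles the SDK with the function code instead
        // of treating it as external (the default for provided SDKs).
        externalModules: [],
      },
    });
  }
}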
Feedback
To get started with JS SDK v3, visit our Getting Started page. We value your feedback so if you have any questions, comments, concerns, or ideas, please open a discussion on GitHub.