
Figma uses Amazon SageMaker AI to accelerate innovation

Discover how collaborative design company Figma accelerated AI model development by using Amazon SageMaker AI.

Overview

Figma has built its reputation on making design collaboration seamless and accessible across organizations of all sizes. Today, Figma has evolved from a design tool to a platform that helps teams go from idea to shipped product. When generative AI emerged as a new capability, the company recognized an opportunity to further transform each step of the product development process.

To meet the needs of an expanding user base, Figma needed to rapidly develop and deploy innovative AI features to enhance product development workflows while maintaining the reliability and performance that its enterprise customers demanded. Figma turned to Amazon Web Services (AWS) and its comprehensive suite of AI services to accelerate that transformation.

About Figma

Figma is where teams come together to turn ideas into some of the world’s best digital products and experiences. Founded in 2012, Figma has evolved from a design tool to a connected, AI-powered platform that helps teams go from idea to shipped product.

Opportunity | Accelerating innovation using Amazon SageMaker AI

Figma is where companies of all sizes, from startups to Fortune 500 corporations, turn ideas into some of the world’s best digital products and experiences. The company’s web-based software makes it possible for designers, developers, and stakeholders to work together in near real time, removing the traditional bottlenecks that hinder product development and create friction between creative and technical teams.

“Figma is focused on how AI can empower designers and builders,” says Matt Ritter, engineering manager on Figma’s AI team. “Config—our annual flagship conference—is one of our main opportunities during the year to show customers what we’ve been building.”

Advances in generative AI made a new feature, AI Search, possible. AI Search would empower users to find design components and files by using natural language descriptions or screenshots instead of manually browsing through potentially thousands of design files. Figma planned to announce AI Search at Config 2024, which meant standing up a robust, scalable AI infrastructure on a fixed timeline.

Although Figma’s existing infrastructure was well suited for traditional workloads, it was not optimized for the intensive and dynamic compute requirements and specialized hardware needs of modern AI development. The company needed a foundation capable of supporting rapid experimentation, handling massive datasets securely, and serving AI models to millions of users with enterprise-grade reliability.

To meet this challenge, Figma chose to build its infrastructure by using Amazon SageMaker AI—a fully managed service that brings together a broad set of tools to facilitate high-performance, low-cost machine learning (ML) for virtually any use case—as the foundation. Using Amazon SageMaker AI model training features, Figma could access on-demand compute resources to post-train open source AI models to better understand Figma concepts, streamline the deployment of AI features at scale, and accelerate its pace of innovation.
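Submitting a training job of the kind described above comes down to assembling a request for the SageMaker `CreateTrainingJob` API. The sketch below builds such a request as a plain dictionary; every name, image URI, ARN, and bucket is a hypothetical placeholder, not Figma's actual configuration.

```python
# Sketch of a SageMaker training-job request, as one might be submitted for
# post-training an open source model. All names, images, ARNs, and S3 URIs
# below are hypothetical placeholders.

def build_training_job_request(job_name: str, image_uri: str, role_arn: str,
                               input_s3_uri: str, output_s3_uri: str) -> dict:
    """Assemble the payload for sagemaker.create_training_job()."""
    return {
        "TrainingJobName": job_name,
        "AlgorithmSpecification": {
            "TrainingImage": image_uri,      # custom training container
            "TrainingInputMode": "File",
        },
        "RoleArn": role_arn,
        "InputDataConfig": [{
            "ChannelName": "training",
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": input_s3_uri,
            }},
        }],
        "OutputDataConfig": {"S3OutputPath": output_s3_uri},
        "ResourceConfig": {                   # on-demand GPU compute
            "InstanceType": "ml.g5.12xlarge",
            "InstanceCount": 1,
            "VolumeSizeInGB": 200,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 86400},
    }

request = build_training_job_request(
    "post-train-demo",
    "123456789012.dkr.ecr.us-east-1.amazonaws.com/train:latest",
    "arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    "s3://example-bucket/prepared-data/",
    "s3://example-bucket/model-artifacts/",
)
# A boto3 client would then submit it with:
#   boto3.client("sagemaker").create_training_job(**request)
```

Keeping the request construction in a plain function like this makes it easy to vary instance type and count per experiment while reusing the rest of the configuration.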

Solution | Scaffolding an infrastructure for AI model development

In January 2024, Figma began to build the infrastructure necessary to accelerate the development of AI Search. First, the development team established secure data processing pipelines to handle the massive scale of design content needed for AI training. Using Amazon EMR with Apache Spark—a big-data processing service that accelerates analytics workloads—Figma processed billions of design elements from its production systems, filtering and preparing data while maintaining strict customer privacy controls. The processed data was stored in Amazon Simple Storage Service (Amazon S3), an object storage service built to retrieve virtually any amount of data from anywhere. That created a foundation for model training at scale.
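The heart of a filtering-and-preparation pipeline like this is a per-record transform that a Spark job can apply across billions of elements. The pure-Python sketch below shows what such a transform might look like; the field names and filtering rules are hypothetical illustrations, not Figma's actual schema, and in a real job the function would run inside a DataFrame transform or RDD map.

```python
# Illustrative per-record preparation step for a Spark data pipeline.
# Field names and filtering rules are hypothetical; in production this
# function would be applied in parallel via Spark.
from typing import Optional


def prepare_design_element(record: dict) -> Optional[dict]:
    """Drop unusable elements and keep only training-relevant fields."""
    name = (record.get("name") or "").strip()
    # Filter out elements with no usable label or flagged as private,
    # reflecting the privacy controls applied before training.
    if not name or record.get("private", False):
        return None
    return {
        "element_id": record["id"],
        "name": name.lower(),
        "kind": record.get("kind", "unknown"),
    }


raw = [
    {"id": "e1", "name": "  Primary Button ", "kind": "component"},
    {"id": "e2", "name": "", "kind": "frame"},         # no label: dropped
    {"id": "e3", "name": "Nav Bar", "private": True},  # private: dropped
]
prepared = [r for r in (prepare_design_element(x) for x in raw) if r]
# prepared -> [{"element_id": "e1", "name": "primary button", "kind": "component"}]
```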

Second, Figma built an AI training infrastructure that is based on jobs in Amazon SageMaker Model Training, which is used to train and fine-tune ML and generative AI models. On top of Amazon SageMaker Pipelines—a serverless workflow orchestration service that’s purpose-built for MLOps and LLMOps automation—Figma also created the FigmaStep framework. With FigmaStep, Figma’s engineers can compose training pipelines from a set of reusable steps, implemented as simple Python functions. The framework automatically orchestrates the underlying infrastructure, supporting everything from single-machine experiments to large-scale distributed training across multiple high-performance instances.
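FigmaStep itself is internal to Figma, but the idea of composing a pipeline from reusable steps written as simple Python functions can be sketched in a few lines. The toy framework below is an assumption-laden illustration of that pattern: where a real MLOps framework would launch each step as a managed job, this version simply calls the registered functions in order.

```python
# Minimal sketch of step composition in the spirit of a framework like
# FigmaStep. This is a toy illustration, not Figma's implementation: a real
# framework would run each step as an orchestrated SageMaker job rather
# than a local function call.
from typing import Any, Callable, List


class Pipeline:
    def __init__(self, name: str):
        self.name = name
        self.steps: List[Callable[[Any], Any]] = []

    def step(self, fn: Callable[[Any], Any]) -> Callable[[Any], Any]:
        """Decorator that registers a plain function as a pipeline step."""
        self.steps.append(fn)
        return fn

    def run(self, data: Any) -> Any:
        # Execute the registered steps in order, piping each output
        # into the next step's input.
        for fn in self.steps:
            data = fn(data)
        return data


pipeline = Pipeline("toy-training-pipeline")

@pipeline.step
def tokenize(texts):
    return [t.split() for t in texts]

@pipeline.step
def count_tokens(token_lists):
    return sum(len(tokens) for tokens in token_lists)

total = pipeline.run(["design system tokens", "auto layout"])
# total -> 5
```

The appeal of this pattern is that each step stays an ordinary, independently testable function, while the framework decides where and how the steps execute, whether on a single machine or across distributed training instances.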

“Amazon SageMaker AI has a wide range of features, but we also wanted guidance on how we could best help our developers build in a flexible manner,” says Ritter. “We worked with several AWS representatives to create a high-quality development experience.”

Third, for model deployment, Figma integrated Amazon SageMaker AI endpoints into its existing Kubernetes-based infrastructure by using AWS Controllers for Kubernetes. That integration meant that Figma’s developers could deploy AI models by using the same workflows and tools they use for traditional services. (See figure 1.)
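With the ACK SageMaker controller installed in a cluster, a SageMaker endpoint becomes an ordinary Kubernetes resource that can be applied with the same kubectl or GitOps workflow as any other service. The manifest below is a hedged sketch: the resource names are placeholders, and the exact fields depend on the controller version in use.

```yaml
# Hypothetical manifest for the ACK SageMaker controller; names are
# placeholders. It assumes a matching EndpointConfig resource exists.
apiVersion: sagemaker.services.k8s.aws/v1alpha1
kind: Endpoint
metadata:
  name: ai-search-endpoint
spec:
  endpointName: ai-search-endpoint
  endpointConfigName: ai-search-endpoint-config  # references an EndpointConfig
```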

By the end of April 2024, 4 months after starting the project, Figma had successfully trained and deployed the models powering AI Search. In June, the feature launched at Config 2024. The infrastructure’s impact extends far beyond AI Search. It has accelerated Figma’s pace of innovation. In 2025 alone, Figma’s team ran more than 10,000 Amazon SageMaker AI training jobs, with dozens of engineers running nearly 100 workflows per month—proving the infrastructure’s ability to support continual experimentation and iteration.

Outcome | Scaling AI product innovation with a flexible foundation

At Config 2025, one year after launching AI Search, Figma expanded its platform with four new products. Those capabilities included Figma Make, an AI prompt-to-app tool that helps users generate functional designs and prototypes. The foundation that Figma built on Amazon SageMaker AI has been a catalyst for innovation, empowering the company to rapidly develop and deploy AI-powered features that continue transforming the product development experience.

“Amazon SageMaker AI has the tools to quickly scaffold an AI infrastructure with relatively low initial investment,” says Ritter. “At the same time, we have the flexibility to customize our infrastructure so that we can build upon it as our use cases become more advanced.”

Figure 1. The data flow in Figma’s AI-deployment solution

