Intercom revolutionizes customer service using AWS and Anthropic
Learn how customer service software provider Intercom helps customers achieve resolution rates of up to 90 percent by working with Anthropic and using AWS.
Overview
Intercom, a leading artificial intelligence (AI) customer service software provider, wanted to improve its operations by automating customer support while maintaining high accuracy and reliability. Working alongside Amazon Web Services (AWS) and AWS Partner Anthropic, Intercom developed Fin, an AI agent that achieves average resolution rates of 56 percent within 30 days of deployment—with some customers reaching resolution rates of up to 90 percent. Intercom’s solution has been adopted by thousands of customers and has generated tens of millions of dollars in revenue since its launch.
Opportunity
Defining the requirements for an accurate, complex agent
Since its founding in 2011, Intercom has relied on the scalable infrastructure of AWS to support its hypergrowth trajectory. As AI emerged as a transformative force in customer service, the company recognized the need to develop a sophisticated solution that could automate support while maintaining high accuracy standards. The challenge was particularly complex, given the need to handle various types of customer inquiries and protect against AI hallucinations.
The company needed a solution that could effectively handle the multiple components of the customer support process, including determining whether a question is answerable, applying guidance that complies with customer support policies, and generating accurate responses. This in turn meant that Intercom needed to find the right AI partner to help develop an architecture that could scale reliably while maintaining the company’s high performance standards.
Solution
Developing a sophisticated solution for customer support
Through extensive testing, Anthropic’s Claude emerged as the highest-performing large language model for Intercom’s needs. Intercom worked with Anthropic to develop Fin on AWS, using services including Amazon Bedrock, a fully managed service that offers a choice of high-performing foundation models, and Amazon SageMaker, which delivers an integrated experience for analytics and AI. The solution is made up of 15–20 different subcomponents, each paired with a specific large language model optimized for tasks like answer verification, policy compliance checking, and response generation.
The architecture lets Fin analyze customer inquiries, access knowledge repositories, fact-check answers, and verify compliance with policies and guidelines. The solution can also identify opportunities for further automation. “Fin will say, ‘Hey, here’s knowledge I think you should add. Here’s an API that, if I had, I could resolve more,’” says Des Traynor, cofounder of Intercom.
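To make the subcomponent design concrete, the sketch below chains a few illustrative stages—an answerability check, answer generation, and a policy compliance check—each invoking a Claude model through the Amazon Bedrock Converse API. The component names, prompts, and model ID are assumptions for illustration only; this is not Intercom’s actual Fin implementation.

```python
# Illustrative sketch of a multi-stage support pipeline in which each subcomponent
# calls a Claude model on Amazon Bedrock via the boto3 Converse API.
# All component names, prompts, and the model ID are hypothetical.
import boto3

# Placeholder model ID; any Claude model enabled in your Bedrock account works.
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")


def ask_claude(system_prompt: str, user_text: str) -> str:
    """Send one prompt to Claude on Bedrock and return the text of the reply."""
    response = bedrock.converse(
        modelId=MODEL_ID,
        system=[{"text": system_prompt}],
        messages=[{"role": "user", "content": [{"text": user_text}]}],
        inferenceConfig={"maxTokens": 512, "temperature": 0.0},
    )
    return response["output"]["message"]["content"][0]["text"]


def is_answerable(question: str, knowledge: str) -> bool:
    """Stage 1: decide whether the question can be answered from the knowledge base."""
    verdict = ask_claude(
        "Answer only YES or NO: can the question be answered from the provided knowledge?",
        f"Knowledge:\n{knowledge}\n\nQuestion: {question}",
    )
    return verdict.strip().upper().startswith("YES")


def draft_answer(question: str, knowledge: str) -> str:
    """Stage 2: generate an answer grounded in the retrieved knowledge."""
    return ask_claude(
        "Answer the customer's question using only the provided knowledge.",
        f"Knowledge:\n{knowledge}\n\nQuestion: {question}",
    )


def passes_policy_check(answer: str, policy: str) -> bool:
    """Stage 3: verify that the drafted answer complies with support policy."""
    verdict = ask_claude(
        "Answer only YES or NO: does the answer comply with the policy?",
        f"Policy:\n{policy}\n\nAnswer:\n{answer}",
    )
    return verdict.strip().upper().startswith("YES")


def handle_inquiry(question: str, knowledge: str, policy: str) -> str:
    """Chain the stages: answerability -> generation -> policy check."""
    if not is_answerable(question, knowledge):
        return "Escalate to a human agent."
    answer = draft_answer(question, knowledge)
    if not passes_policy_check(answer, policy):
        return "Escalate to a human agent."
    return answer
```

In a production system of this kind, each stage would typically use a model and prompt tuned to its task and would fall back to a human agent whenever a check fails, which is the general pattern the sketch mirrors.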
Outcome
Helping customers achieve resolution rates of up to 90 percent
The implementation has delivered significant results for Intercom and its customers. Within 30 days of deployment, customers achieve, on average, a resolution rate of 56 percent, with some customers reaching 80–90 percent. The success has led to rapid adoption, with over 4,000 customers implementing Fin since its March 2023 launch. The solution’s effectiveness is further validated by Anthropic becoming one of Fin’s largest customers, using it to handle Anthropic’s own customer support volume.
Looking ahead, Intercom will continue to innovate, exploring opportunities in proactive support, visual processing, and audio capabilities. “We want to keep pushing the boundaries,” says Traynor. “We want to always be on the edge, testing to see what’s possible using AWS along with Anthropic.”