Affinda rewrites the rules of document AI with Amazon Bedrock
Learn how Affinda uses generative AI on AWS to reduce setup time by 90 percent and deliver faster, more accurate document processing.
Benefits
90% reduction in setup time for new use cases

90% cost savings for product delivery team

Overview
AI-driven software is increasingly being used to extract data from documents, and Affinda provides this capability to organizations worldwide. The company recently set out to reduce technical configuration time and reach high data-accuracy levels faster. By transitioning to generative AI on Amazon Web Services (AWS), Affinda revamped its document AI processing solution—cutting setup time for new use cases by 90 percent, reducing costs by 90 percent, and improving the customer experience.
About Affinda
Affinda is an AI transformation partner to organizations globally. With deep expertise in business process automation and a proprietary intelligent document processing platform, Affinda enables organizations to integrate AI workflows and agents, driving operational efficiency and margin improvements.
Opportunity | Accelerating accuracy with generative AI
Organizations that process high volumes of documents are adopting AI-based tools to extract important information quickly and accurately. Affinda meets this need with intelligent document processing software that automates the extraction of information from physical documents. The company initially relied on traditional machine learning (ML) models that required extensive technical setup and data annotation, making the process time-consuming and complex for both Affinda and its customers.
Andrew Bird, head of AI at Affinda, explains, “We used open-source models that we fine-tuned for specific document types and schemas, and it took a lot of engineering effort to do that. Even when we eventually created a self-service model for our customers, they still needed to produce their own training data sets.” To overcome these challenges and achieve faster accuracy, Affinda shifted its focus to generative AI technology.
Solution | Revamping document AI with Amazon Bedrock
Affinda had already used Amazon SageMaker to support its ML models and has since expanded its approach with AWS generative AI technologies, using large language models (LLMs) on Amazon Bedrock, including Anthropic's Claude 3.5 Sonnet v2, Claude 3.7 Sonnet, and Claude Sonnet 4. “AWS is our preferred cloud provider, and technologies like Amazon SageMaker and Amazon Bedrock give us confidence in data sovereignty, regional data processing, and seamless integration with our existing AWS environment,” Bird says.
 
The company runs its Kubernetes clusters on Amazon Elastic Kubernetes Service (Amazon EKS) with Amazon Elastic Compute Cloud (Amazon EC2) and trains hundreds of custom models on Amazon SageMaker. Affinda also uses AWS CloudFormation to automate infrastructure management across its technology stack.
Affinda revamped its AI document processing solution on its new generative AI environment and rolled it out to customers in late 2024. The setup supports asynchronous inferences triggered by customer actions and maintains a hybrid approach that combines traditional and generative AI models. “We initially targeted a specific customer use case where we needed the model to learn more rapidly and dynamically based on user input,” says Bird. “That success convinced us to rebuild the entire application around this new paradigm, moving away from reliance on fine-tuned models.”
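The hybrid approach described above can be pictured as a simple routing decision. The sketch below is illustrative only—the document types, path names, and routing rule are assumptions, not Affinda's actual logic—but it shows the idea of keeping existing fine-tuned models in place while sending new or fast-changing document types to a generative model.

```python
# Hypothetical sketch of a hybrid routing step: document types that
# already have a fine-tuned model stay on the traditional ML path,
# while everything else goes to a Bedrock-hosted LLM with in-context
# examples. Types and path names are illustrative assumptions.

FINE_TUNED_MODELS = {"resume", "invoice"}  # types with trained custom models

def route_document(doc_type: str) -> str:
    """Pick a processing path for an incoming customer document."""
    if doc_type in FINE_TUNED_MODELS:
        return "traditional-ml"   # e.g., an existing SageMaker-hosted model
    return "generative-llm"       # e.g., a Bedrock LLM with few-shot examples
```

In a production system the routing result would typically enqueue an asynchronous inference job rather than call a model inline, matching the customer-action-triggered setup the article describes.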
Outcome | Accelerating document processing with a 90% faster setup
Affinda reduced upfront engineering effort for new use cases while maintaining high data accuracy. “With Amazon Bedrock, there’s very little need for specific model training data. As a result, we’ve seen a 90 percent reduction in configuration time for new document extraction use cases,” says Bird. “Now, you just need a small number of idiomatic examples that show the LLM in its context window what the problem is.” Affinda also simplified model correction through natural-language explanations, cutting engineering overhead. “Engineers don’t even have to be involved,” Bird explains. “Someone in our product or implementation team can manage it on their own.”
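The "small number of examples in the context window" approach Bird describes can be sketched as a few-shot prompt assembled for the Amazon Bedrock Converse API. Everything here is a hypothetical illustration—the field names, example documents, and model ID are assumptions, not Affinda's configuration.

```python
# Hypothetical sketch: assembling a few-shot extraction request for an
# Anthropic Claude model on Amazon Bedrock. Field names, example
# documents, and the model ID are illustrative assumptions.
import json

MODEL_ID = "anthropic.claude-3-5-sonnet-20241022-v2:0"  # Claude 3.5 Sonnet v2

def build_extraction_messages(document_text, examples):
    """Build a Converse-API message list: a few worked examples in the
    context window, then the document to extract fields from."""
    shots = "\n\n".join(
        f"Document:\n{doc}\nExtracted fields:\n{json.dumps(fields)}"
        for doc, fields in examples
    )
    prompt = (
        "Extract the fields shown in the examples as JSON.\n\n"
        f"{shots}\n\nDocument:\n{document_text}\nExtracted fields:"
    )
    return [{"role": "user", "content": [{"text": prompt}]}]

# Example usage; the actual inference call would be something like:
#   boto3.client("bedrock-runtime").converse(modelId=MODEL_ID, messages=messages)
examples = [
    ("Invoice #123 from Acme, total $90.00",
     {"invoice_number": "123", "vendor": "Acme", "total": "90.00"}),
]
messages = build_extraction_messages(
    "Invoice #456 from Initech, total $12.50", examples
)
```

Because the examples live in the prompt rather than in a training set, a non-engineer can swap them out to reconfigure extraction behavior—the dynamic the article credits for the 90 percent setup-time reduction.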
The company reports a 90 percent cost savings for its product delivery team in the effort and resources required to roll out a custom use case. The platform now allows customers to self-serve, configuring new data extraction models themselves in a matter of minutes—a task that previously took Affinda’s internal team weeks or even months.
 
By lowering the barrier to entry, Affinda has made intelligent document processing accessible to organizations across industries and use cases. This approach has reduced time to value and increased return on investment for Affinda’s customers. Meanwhile, customers also benefit from instant learning and adaptation, which improves accuracy and allows for straight-through processing at scale.
Affinda is now advancing its AI capabilities by developing agents that act like seasoned data-entry professionals—detecting anomalies, tracing answers through internal systems, asking smart questions of the right people, and delivering resolved outcomes. “It’s important for us to stay at the forefront of AI advancements,” Bird says. “We can do that with AWS.”
 
 
“AWS is our preferred cloud provider, and technologies like Amazon SageMaker and Amazon Bedrock give us confidence in data sovereignty, regional data processing, and seamless integration with our existing AWS environment.”

Andrew Bird
Head of AI, Affinda
Get Started
Organizations of all sizes across all industries are transforming their businesses and delivering on their missions every day using AWS. Contact our experts and start your own AWS journey today.