
Fractal Analytics Reduces Call Handling Time by Up to 15% with Generative AI on AWS
Fractal Analytics used generative AI powered by Amazon Bedrock and Amazon EKS to develop a unified knowledge base for knowledge workers, increasing employee and customer satisfaction while improving call center agent efficiency.
10–15%
average reduction in call handling time
30%
call deflection rate thanks to self-service capability
>200,000
monthly queries served
Overview
Fractal Analytics is a prominent provider of artificial intelligence solutions to Fortune 500 companies. To accelerate development while ensuring security at scale, the firm chose to build its Knowledge Assist solution on AWS.
Knowledge Assist relies on Amazon Bedrock to run large language models, Amazon EKS for container orchestration, and Amazon OpenSearch Service for semantic search. Customers using Knowledge Assist reported lower call handling times, higher customer and employee satisfaction, and improved compliance with less supervisor intervention.

Opportunity | Using Generative AI to Empower Innovation
Fractal Analytics believes human imagination is at the heart of every decision, whether or not that decision is aided by artificial intelligence (AI). The global firm, one of India’s original AI unicorns, uses technologies including generative AI to empower imagination with data-driven intelligence.
Fractal offers an array of customizable solutions to improve customer and business outcomes across industries such as finance and healthcare. Patient Jarvis, for example, is an AI-powered data accelerator that puts a dynamic view of patient healthcare history at practitioners’ fingertips.
One of the company’s latest innovations, Knowledge Assist, uses large language models (LLMs) to make knowledge retrieval more efficient within large enterprises. Traditionally, knowledge workers have spent considerable time retrieving and consolidating data from numerous internal sources. This data is often unstructured, which adds complexity to each query.

“Being able to choose from various LLMs on Amazon Bedrock, which we can swiftly implement or experiment with, along with the ability to use the platform as an API without hosting concerns, helps us experiment and scale faster.”
Ritesh Radhakrishnan
Client Partner for Products and Accelerators, Fractal Analytics
Solution | Experimenting with Multiple Large Language Models
Fractal’s client base comprises numerous Fortune 500 companies, so data security and privacy are paramount when developing new products. Most clients deploy Fractal’s solutions within their private networks, so platform flexibility is also important. Having already executed many large-scale software as a service (SaaS) implementations on Amazon Web Services (AWS), including a chat solution that handles 15 million queries per month, Fractal chose to build Knowledge Assist on AWS.
The company accesses LLMs through Amazon Bedrock, a fully managed service with a broad set of capabilities for building generative AI applications with security, privacy, and responsible AI. Fractal’s engineering team experimented extensively with the available models, evaluating cost and performance against their intended use cases.
Ritesh Radhakrishnan, client partner for Products and Accelerators at Fractal Analytics, says, “The generative AI space is evolving rapidly. Being able to choose from various LLMs on Amazon Bedrock, which we can swiftly implement or experiment with, along with the ability to use the platform as an API without hosting concerns, helps us experiment and scale faster.”
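To illustrate the kind of integration Radhakrishnan describes, the sketch below shows how an application might call a foundation model through the Amazon Bedrock Converse API using the AWS SDK for Python. The model ID, prompt, and inference settings are illustrative placeholders rather than Fractal’s actual configuration.

import boto3

# Hypothetical sketch: calling a foundation model through the Amazon Bedrock
# Converse API. The model ID, prompt, and settings below are placeholders.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # swap IDs to experiment with other models
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize our returns policy for a contact center agent."}],
    }],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])

Because the Converse API uses the same request shape across models, swapping the modelId string is often all that is needed to compare providers, which is the flexibility the team highlights.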
Fractal worked closely with AWS during the development process, receiving valuable guidance that included benchmarking service costs. “For nearly seven years we’ve gotten constant support from AWS, whether for getting credits for experimentation, conducting technology workshops and training, or simply having access to experts who can help with a deeper understanding of AWS services and AI best practices,” adds Radhakrishnan.
In addition to Amazon Bedrock, Fractal used Amazon Elastic Container Service (Amazon ECS) to build connectors for Knowledge Assist and Amazon OpenSearch Service for vector/semantic search. The SaaS application layer runs on Amazon Elastic Kubernetes Service (Amazon EKS) and AWS Lambda serverless compute. Radhakrishnan comments, “The robustness of each of these services helps us focus on the application, rather than having to worry about the scaling and stability of the underlying framework. That speeds up our entire development process.”
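The semantic search piece can be pictured with the following sketch, which queries an OpenSearch k-NN index for the document chunks closest to a user question. The domain endpoint, index name, field names, and embed() helper are assumptions for demonstration, not details of Knowledge Assist.

from opensearchpy import OpenSearch

# Hypothetical sketch of semantic retrieval against an OpenSearch k-NN index.
# The endpoint, index, field names, and embed() helper are illustrative only.
client = OpenSearch(
    hosts=[{"host": "search-knowledge.us-east-1.es.amazonaws.com", "port": 443}],
    use_ssl=True,
)

def semantic_search(question, embed, k=5):
    # embed() is an assumed helper that turns text into a query vector,
    # for example by calling an embedding model.
    body = {
        "size": k,
        "query": {"knn": {"embedding": {"vector": embed(question), "k": k}}},
        "_source": ["title", "chunk_text"],
    }
    return client.search(index="knowledge-base", body=body)["hits"]["hits"]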
Knowledge Assist upholds the same stringent security and privacy requirements that make Fractal trusted by its clients. The platform uses private endpoints to protect data within each client’s network, and data is encrypted end to end. Additionally, personally identifiable information is masked before storage in the application’s analytics layer.
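As a simplified illustration of that masking step, the snippet below redacts obvious identifiers such as email addresses and phone numbers before a record would reach an analytics store. The patterns are deliberately minimal and do not represent Fractal’s production PII handling.

import re

# Hypothetical sketch of PII masking before a record reaches the analytics layer.
# Production systems typically rely on dedicated PII detection rather than
# two regular expressions; these patterns are for illustration only.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{8,}\d"),
}

def mask_pii(text):
    # Replace each detected identifier with a labeled placeholder.
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}_REDACTED]", text)
    return text

print(mask_pii("Reach me at jane.doe@example.com or +1 (415) 555-0137."))
# "Reach me at [EMAIL_REDACTED] or [PHONE_REDACTED]."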
Outcome | Empowering Employees to Serve Customers Faster, Better
During a pilot program in the first six months after launch, nearly 500 knowledge workers in contact centers adopted Knowledge Assist as a unified knowledge base. The solution fielded hundreds of thousands of queries per month and seamlessly handled complex data from more than 10,000 documents in three file formats: .pdf, .docx, and .pptx.
From the pilot group, Fractal’s client observed a 10–15 percent reduction in average data retrieval time. Clients utilizing Knowledge Assist for customer-facing information queries reported a 30 percent call deflection rate, thanks to its self-service capability.
All of Fractal’s clients reported improved customer and employee satisfaction, less supervisor involvement in calls, and enhanced upsell opportunities due to more time available on each call. Radhakrishnan explains, “With Knowledge Assist, customers not only get faster, but also better answers, leading to CSAT [customer satisfaction score] improvement. Likewise, agents are less frustrated because they don’t need to go into different systems looking for answers.”
Another benefit of the solution is enhanced compliance. “Knowledge Assist always provides the latest information, reducing the instances where customers are given wrong or outdated information from a knowledge worker’s best guesses or oversight,” Radhakrishnan explains. “We’re experiencing an overall higher level of first-time issue resolution.”
Today, Fractal is looking at implementing more automated LLM evaluations and ways to generate fresh insights into calls—helping clients proactively address recurring issues to reduce call volume.
Learn More
To learn more, visit aws.amazon.com/ai/generative-ai/.
About Fractal Analytics
Fractal Analytics is a provider of AI solutions. Its vision is to power every decision in the enterprise and to bring AI, engineering, and design to help the world’s most admired Fortune 500 companies.
AWS Services Used
Amazon Bedrock
Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI.
Amazon OpenSearch Service
Amazon OpenSearch Service makes it easy for you to perform interactive log analytics, real-time application monitoring, website search, and more.
Amazon Elastic Kubernetes Service
Amazon Elastic Kubernetes Service (Amazon EKS) is a managed Kubernetes service to run Kubernetes in the AWS cloud and on-premises data centers.
Amazon Elastic Container Service
Amazon Elastic Container Service (Amazon ECS) is a fully managed container orchestration service that helps you to more efficiently deploy, manage, and scale containerized applications.