AWS Partner Network (APN) Blog
How Infosys built an AWS generative AI-based assistant for a healthcare payer company
By Pramit Saha, Senior Technology Architect – Infosys
Vignesh Venkatachalam, Senior Consultant – Infosys
Shrinivas Damodar Kudva, Technology Architect – Infosys
Tina Purushotam, Digital Specialist Engineer – Infosys
Catherine Alexander, Principal Customer Solutions Manager – AWS
Ashutosh Dubey, Gen AI Specialist Solutions Architect – AWS
Customer Service Representatives (CSRs) working for healthcare payers face challenges with response times while fielding questions from end users. These questions can concern policy details, claims clarifications, coverage, and more. Answering them in a timely and accurate manner is difficult because it requires a manual search across various policy and member-specific documents. This inefficiency not only extends customer wait times, negatively impacting their experience, but also hurts CSR productivity.
In this blog post, you will explore the solution "Generative AI Assistant for Customer Service Representatives," aimed at enhancing CSR efficiency by using generative AI (Gen AI) hosted on Amazon Web Services (AWS).
Business Case
CSRs play a crucial role in shaping the customer experience. For one of our healthcare insurance customers, CSRs were handling a high volume of inquiries (about 2,000 per day), of which roughly 50 to 60 percent were repetitive questions. The knowledge required to answer customer queries was spread across five different systems, making it difficult to provide complete answers in a timely manner. Additionally, CSRs often lacked access to the latest product information and troubleshooting guidelines, impacting their ability to resolve issues promptly.
These challenges resulted in decreased customer satisfaction, lower agent productivity, and increased operational expenses.
Solution Implementation Challenges
Implementing Gen AI for semantic search applications presents several challenges. The solution needs to handle diverse document sources and types (PDF, DOCX, XML, and more), large volumes of data (hundreds of documents totaling more than 250,000 pages), and periodic data synchronization. Additionally, extracting structured data from unstructured sources and optimizing query performance against vector stores and LLMs are crucial considerations. Addressing these challenges requires extensive, iterative testing and fine-tuning of the system.
Business and Technology Drivers
Digital transformation and competitive pressure are driving businesses to seek advanced solutions that enhance operational efficiency and customer engagement. Rapid advances in artificial intelligence (AI) are enabling organizations to innovate and implement sophisticated tools that address these demands. The specific business and technology drivers for this use case were:

- Shorten the time required to respond to customer queries
- Automated ingestion of documents from various sources and document formats with semantic AI (metadata tagging, classification)
- Natural language query search
- Human-like language generation capabilities in a healthcare context
- Intuitive user experience
- Application security
- Infrastructure security
Business Capabilities of the Solution
The application utilizes AI and Gen AI to automate document ingestion, extraction, and enrichment, enhancing search capabilities. Natural Language Processing (NLP) extracts meaningful content, while ontology-based tagging organizes and categorizes information. Cognitive search enables intuitive interactions using natural language queries. Sophisticated ranking and boosting mechanisms refine content recommendations, and LLMs provide tailored summaries. These features collectively improve search efficiency and user experience.
Figure 1. The Solution Features
Solution Functional Architecture
Let us discuss the functional architecture of this solution using the following figure.
Figure 2. Solution Functional Architecture
The application consists of four layers, each serving a specific role: User Interface (UI), Service, AI, and Data Indexing. The UI layer handles user interactions, while the Service layer manages APIs and conversation storage. The Data Indexing layer connects Amazon Kendra to various data repositories to index documents. The AI layer uses Retrieval Augmented Generation (RAG) to retrieve relevant documents and generate answers using Amazon Kendra, Amazon Bedrock, and the Anthropic Claude LLM. This layered approach ensures efficient handling of customer queries, leveraging AI capabilities and seamless integration with diverse data sources to optimize service delivery and user experience.
Solution: Reference Architecture
The following is the reference architecture for the solution as deployed on AWS.
Figure 3. Solution Reference Architecture
Front End: The QnA portal is a single-page application built using Vue.js and JavaScript, with Vuex for state management. It is deployed on Amazon Elastic Container Service (ECS) as a Docker container and integrates with the organization’s existing SSO for authentication and authorization.
API Management: The application exposes APIs through Amazon API Gateway and enforces access control using a Lambda authorizer that validates access tokens from the user interface. The API expects user-selected parameters as filters for document searches. The application uses Amazon DynamoDB to store and retrieve data in JSON format.
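A Lambda authorizer of the kind described above returns an IAM policy that allows or denies the `execute-api:Invoke` action for the caller. The following is a minimal sketch; `validate_token` is a placeholder for the real check against the organization's SSO provider (for example, verifying a JWT signature and expiry).

```python
def generate_policy(principal_id, effect, method_arn):
    """Build the IAM policy document that API Gateway expects from a Lambda authorizer."""
    return {
        "principalId": principal_id,
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": method_arn,
            }],
        },
    }

def handler(event, context):
    """Authorizer entry point: validate the bearer token, then allow or deny."""
    token = event.get("authorizationToken", "")
    effect = "Allow" if validate_token(token) else "Deny"
    return generate_policy("user", effect, event["methodArn"])

def validate_token(token):
    """Placeholder check; replace with real verification against the SSO provider."""
    return token.startswith("Bearer ")
```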
Data Ingestion: The application utilizes Amazon Kendra as a vector database to index data extracted from various enterprise sources, using built-in connectors for Salesforce and Adobe Experience Manager and a custom connector for Nuxeo DMS (document management system). Amazon SQS, Amazon SNS, and AWS Lambda functions are employed for data processing and messaging.
Semantic Search: The application employs Amazon Kendra for semantic search, with the retrieved results sent to an LLM on Amazon Bedrock for answer generation. LangChain is used to build chains that combine the retrieved information into LLM responses and return answers to the user interface. Prompts are engineered to the requirements, and the Bedrock API facilitates interaction with the selected LLM (for example, Anthropic Claude 2). Custom synonym lists (a thesaurus) are added to Kendra to improve semantic search results.
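The user-selected parameters mentioned under API Management can be translated into a Kendra `AttributeFilter` before querying. The following is a minimal sketch, assuming string-valued document attributes; the attribute names such as `plan_type` are hypothetical.

```python
def build_attribute_filter(selections: dict) -> dict:
    """Translate the user's filter selections into a Kendra AttributeFilter.

    `selections` maps attribute names (e.g. a hypothetical 'plan_type') to the
    value the CSR picked in the UI; all conditions must hold (AndAllFilters).
    """
    clauses = [
        {"EqualsTo": {"Key": key, "Value": {"StringValue": value}}}
        for key, value in selections.items()
    ]
    return {"AndAllFilters": clauses}
```

The returned structure is passed as the `AttributeFilter` parameter of Kendra's `Query` or `Retrieve` call, narrowing results to documents that match every selection.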
Security Measures: To ensure secure and efficient access management within the AWS infrastructure, the application uses several IAM roles that grant permissions for accessing and managing services such as Amazon Bedrock and the Kendra data sources. AWS Secrets Manager securely stores the secrets required for authentication and authorization when connecting the systems. An AWS KMS key encrypts sensitive data at rest in AWS data services such as Amazon S3 and Amazon DynamoDB. A private subnet hosts the data source, isolating it from public access. This configuration adheres to security and network architecture best practices, and the solution is designed to meet HIPAA compliance requirements.
Application observability and monitoring are handled using Amazon CloudWatch.
Conclusion
The development of the "Generative AI Assistant for Customer Service Representatives" marks a significant step forward in addressing the challenges faced by CSRs in the healthcare sector. By leveraging advanced AI technologies hosted on AWS, this solution not only shortens response times for customer inquiries but also enhances the overall efficiency and productivity of CSRs. The benefits fall into three key areas:
- Rapid response times: Most queries are now answered within 20 seconds, compared to minutes previously.
- Enhanced resolution rates: Over 75% of queries are completely resolved, and 100% of responses are supported with source links.
- Reduced training time: CSR training has been shortened from three weeks to one week.
Moving forward, the continuous refinement and optimization of this AI-powered assistant will be crucial in maintaining its effectiveness and relevance in an evolving healthcare landscape. As AI technologies continue to advance, solutions like these promise to redefine customer service standards, setting new benchmarks for responsiveness, accuracy, and efficiency in the industry.
References
How Infosys Built an Enterprise Knowledge Management Assistant Using Generative AI on AWS
Infosys — AWS Partner Spotlight
Infosys is an AWS Premier Consulting Partner that helps enterprises transform through strategic consulting, operational leadership, and co-creation of solutions in mobility, sustainability, big data, and cloud computing.
Contact Infosys | Partner Overview
*Already worked with Infosys? Rate the Partner
*To review an AWS Partner, you must be a customer that has worked with them directly on a project.