Guidance for Omnichannel Claims Processing Powered by Generative AI on AWS
How it works
These technical details include an architecture diagram that illustrates how to use this solution effectively. The diagram shows the key components and their interactions, providing a step-by-step overview of the architecture's structure and functionality.
Deploy with confidence
Ready to deploy? Review the sample code on GitHub for detailed deployment instructions, then deploy the Guidance as-is or customize it to fit your needs.
Well-Architected Pillars
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
Lambda facilitates the integration of various AWS services, eliminating the need for manual infrastructure management and reducing the operational overhead associated with on-premises servers. Similarly, AWS Fargate, a serverless compute engine, abstracts away the underlying infrastructure, allowing teams to concentrate on the application logic rather than managing the foundational resources.
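As a minimal illustration of this serverless operational model, the following sketch shows a Lambda handler that retrieves a supporting claim document from Amazon S3. The event shape, bucket, and key names are assumptions for this example, not details of the Guidance.

import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Hypothetical event: an API Gateway proxy request carrying a claim ID.
    claim_id = json.loads(event["body"])["claim_id"]
    # Hypothetical bucket and key layout for the claim's supporting document.
    obj = s3.get_object(Bucket="claims-documents-bucket", Key=f"{claim_id}/report.pdf")
    return {
        "statusCode": 200,
        "body": json.dumps({"claim_id": claim_id, "document_size": obj["ContentLength"]}),
    }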
Amazon Bedrock's capabilities include analysis of vehicle damage images and estimation of repair and replacement costs. Amazon Bedrock also helps teams proactively monitor and maintain the health and performance of their AI applications, deploy changes, and identify and resolve any issues that arise.
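A minimal sketch of such an image-analysis call using the Bedrock Converse API through boto3 follows; the region, model ID, prompt, and file name are assumptions, and the Guidance may invoke the model differently.

import boto3

# Assumed region and model ID, for illustration only.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

with open("damage_photo.jpg", "rb") as f:  # hypothetical vehicle damage photo
    image_bytes = f.read()

response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{
        "role": "user",
        "content": [
            {"image": {"format": "jpeg", "source": {"bytes": image_bytes}}},
            {"text": "Describe the visible vehicle damage and suggest a rough repair cost range."},
        ],
    }],
)
print(response["output"]["message"]["content"][0]["text"])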
Read the Operational Excellence whitepaper
Security
AWS Identity and Access Management (IAM) controls access to the services used in this Guidance through granular, role-based permissions. IAM policies have been scoped down to the minimum permissions the application requires to function. Furthermore, CloudFront, the content delivery network (CDN) service, improves the overall security of the web applications by providing traffic encryption, access controls, and integration with AWS Shield, a managed service that protects against distributed denial of service (DDoS) attacks, further bolstering the security of the application. Lastly, AWS WAF is integrated with CloudFront to provide an additional layer of security: it allows teams to define custom rules that inspect web traffic and block requests matching specific patterns, such as those originating from known malicious IP addresses or exhibiting suspicious behavior. This helps protect the web applications from common web-based threats.
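As a sketch of what a scoped-down policy might look like, the example below attaches an inline policy that allows only object reads and writes on a single document bucket; the role, policy, and bucket names are hypothetical.

import json
import boto3

iam = boto3.client("iam")

# Hypothetical names, used purely for illustration.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject"],
        "Resource": "arn:aws:s3:::claims-documents-bucket/*",
    }],
}

iam.put_role_policy(
    RoleName="claims-processing-task-role",
    PolicyName="claims-documents-access",
    PolicyDocument=json.dumps(policy),
)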
Read the Security whitepaper
Reliability
Amazon S3 provides reliable, fault-tolerant storage for critical customer documents, owing to its highly durable and redundant storage architecture and its ability to seamlessly replicate data across multiple Availability Zones (AZs).
Moreover, the Application Load Balancer distributes the workload across multiple Fargate tasks, improving availability and fault tolerance.
CloudFront is used to globally distribute the frontend, caching content closer to the geographical locations of users. Lastly, monitoring and observability through services such as Amazon CloudWatch helps identify and resolve any reliability issues that arise.
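As one possible monitoring touchpoint, the sketch below creates a CloudWatch alarm on the load balancer's target 5XX error count; the alarm name, threshold, and load balancer dimension value are assumptions for illustration.

import boto3

cloudwatch = boto3.client("cloudwatch")

# Hypothetical alarm guarding against elevated backend error rates.
cloudwatch.put_metric_alarm(
    AlarmName="claims-api-5xx-errors",
    Namespace="AWS/ApplicationELB",
    MetricName="HTTPCode_Target_5XX_Count",
    Dimensions=[{"Name": "LoadBalancer", "Value": "app/claims-alb/0123456789abcdef"}],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=10,
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="notBreaching",
)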
Read the Reliability whitepaper
Performance Efficiency
The services integrated throughout this Guidance are designed to accommodate high-volume traffic, provide low-latency responses, and scale automatically to meet the evolving performance requirements of the application. For instance, deploying DynamoDB in on-demand capacity mode gives the application a high-performance, low-latency database with a scalable and efficient approach to data storage, helping to ensure fast and reliable data access.
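A minimal sketch of creating a table in on-demand capacity mode follows; the table name and key schema are hypothetical.

import boto3

dynamodb = boto3.client("dynamodb")

# Hypothetical claims table keyed by claim ID; billing mode is on-demand.
dynamodb.create_table(
    TableName="claims",
    AttributeDefinitions=[{"AttributeName": "claim_id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "claim_id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)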
Cost Optimization
Amazon Connect and Amazon Lex use a pay-as-you-go pricing model, so users pay only for the resources they consume. This optimizes costs by eliminating upfront investments and reducing licensing costs.
OpenSearch Serverless is used as the vector database for the generative AI-powered agent assistant. This fully managed, serverless search and analytics service offers a scalable and cost-effective foundation by automatically provisioning and scaling resources based on demand, reducing the overhead of infrastructure management.
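To illustrate how an agent assistant might retrieve relevant knowledge from that vector store, the sketch below runs a k-NN query against an OpenSearch Serverless collection using the opensearch-py client; the endpoint, index, field names, and embedding dimension are assumptions, and in practice the query embedding would come from an embedding model rather than a placeholder.

import boto3
from opensearchpy import OpenSearch, RequestsHttpConnection, AWSV4SignerAuth

credentials = boto3.Session().get_credentials()
auth = AWSV4SignerAuth(credentials, "us-east-1", "aoss")  # "aoss" = OpenSearch Serverless

client = OpenSearch(
    hosts=[{"host": "example-collection.us-east-1.aoss.amazonaws.com", "port": 443}],  # hypothetical endpoint
    http_auth=auth,
    use_ssl=True,
    connection_class=RequestsHttpConnection,
)

# Placeholder embedding; generate it with an embedding model (for example, via Amazon Bedrock) in practice.
question_embedding = [0.0] * 1536

results = client.search(
    index="claims-knowledge-base",  # hypothetical index with a knn_vector field named "embedding"
    body={
        "size": 3,
        "query": {"knn": {"embedding": {"vector": question_embedding, "k": 3}}},
    },
)
for hit in results["hits"]["hits"]:
    print(hit["_source"].get("text", ""))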
Read the Cost Optimization whitepaper
Sustainability
This Guidance uses a variety of serverless services, including Amazon Lex, Lambda, Amazon S3, DynamoDB, Fargate, and OpenSearch Serverless, which are designed to consume resources only as necessary, thereby helping to reduce the user's carbon footprint. The dynamic scaling inherent to these serverless and managed services further contributes to sustainability by helping to ensure that resources are provisioned and scaled based on actual demand, avoiding the need to over-provision and maintain excess capacity. In contrast, traditional contact centers that operate within on-premises data centers, with provisioned compute resources and online data stores, often have a larger carbon footprint due to their energy consumption.
Finally, the Customer Carbon Footprint Tool, which enables users to measure, review, and forecast the carbon emissions generated from their AWS usage, facilitates informed decision-making and the implementation of sustainable practices.
Read the Sustainability whitepaper