2025

inGenious.ai Improves Chatbot Comprehension by 80 percent with Amazon Nova and Amazon Bedrock

Learn how inGenious.ai developed a generative AI chatbot that improves comprehension of complex queries by 80 percent using Amazon Nova and Amazon Bedrock

Benefits

83%

decrease in latency

20%

lower AWS Lambda function cost per gigabyte-second

About inGenious.ai

inGenious.ai specializes in the development of artificial intelligence (AI)-powered chatbots. Designed for non-technical users, the chatbots seamlessly integrate with multiple business systems and provide companies with real-time insights into customer experiences.

Opportunity | Elevating Traditional Chatbot Intelligence

inGenious.ai, an Australia-based startup, provides a platform that helps organizations create and manage generative AI chatbots without needing in-house AI expertise. As demand grows for AI-powered customer engagement, enterprises turn to the platform to meet rising expectations for responsiveness—while maintaining enterprise-grade security, transparency, and control. 

Traditional natural language processing (NLP) chatbots often struggle to handle lengthy, complex queries or those with poor syntax—leading to frustrating user experiences. In addition, when reviewing the market, inGenious.ai noted that businesses building their own chatbots frequently faced challenges balancing cost and performance. Development was even slower in risk-averse sectors like financial services, where security and compliance concerns added further friction to innovation. 

To address these limitations, the team sought to enhance chatbot performance by integrating large language models (LLMs) capable of answering complex questions with greater accuracy. Mark Chatterton, founder and chief executive officer at inGenious.ai, explains, “We wanted to test and validate multiple LLMs to get to a solution that responded to queries in less than one second without sacrificing comprehension. Merging LLM and NLP technology to maximize speed, accuracy, and cost would be vital to our mission.” 

Solution | Simplifying Chatbot Deployment While Meeting Compliance Needs

To scale its generative AI workloads while meeting data residency and security requirements—including System and Organization Controls (SOC) 2 compliance and Australian Prudential Regulation Authority (APRA) standards for financial services customers—inGenious.ai selected AWS to power the infrastructure behind its enterprise platform. “There’s a lot to be said for the reliability and maturity of AWS services because they give you the confidence to innovate,” says Chatterton. “Furthermore, knowing AWS deployment patterns inside and out helped us avoid additional compliance overheads.”

inGenious.ai built its next-generation chatbots using Amazon Bedrock, a fully managed service that provides access to a range of high-performing LLMs. “We had played around with Amazon Bedrock within a week of its release, which gave us a head start going into development,” says Chatterton. By using Amazon Bedrock, developers could experiment with multiple LLMs through a single interface, without needing to manage third-party integrations or complex infrastructure. “Having one point of integration through AWS meant we could keep our generative AI architecture simple, without any security or compliance changes,” Chatterton adds. 
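The single-interface approach described above can be illustrated with a minimal sketch. Amazon Bedrock's Converse API accepts the same request shape for every supported model, so comparing candidates comes down to swapping the model ID. The helper function and query below are illustrative assumptions, not inGenious.ai's actual code; the model IDs are public Bedrock identifiers.

```python
# Sketch: one request shape, many models. Only the modelId changes when
# experimenting with different LLMs through Bedrock's unified Converse API.
# The helper and sample query are hypothetical.

CANDIDATE_MODELS = [
    "amazon.nova-micro-v1:0",
    "anthropic.claude-3-5-haiku-20241022-v1:0",
    "mistral.mistral-7b-instruct-v0:2",
]

def build_converse_request(model_id: str, user_query: str) -> dict:
    """Return keyword arguments for the bedrock-runtime converse() call."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": user_query}]}],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.2},
    }

# The same payload works across every candidate model:
requests = [build_converse_request(m, "Where is my order?") for m in CANDIDATE_MODELS]

# To actually invoke a model (requires AWS credentials and model access):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="ap-southeast-2")
# response = client.converse(**requests[0])
# print(response["output"]["message"]["content"][0]["text"])
```

Because the request shape is model-agnostic, an evaluation harness can loop over candidates and measure latency and answer quality without per-vendor integration work.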

Thanks to the multi-model flexibility of Amazon Bedrock, the team evaluated Gemini 2, Llama 3.2, Mistral 7B, and Anthropic Claude 3.5 Haiku before opting for Amazon Nova Micro. “The first benefit of Amazon Nova was its speed,” recalls Chatterton. “Our customers want fast response times, and Amazon Nova Micro was perfect for this use case, delivering subsecond performance.” inGenious.ai also valued the model’s reliability in adhering to prompt instructions. “The context window for responses isn’t huge, but each instruction plays a critical role in quality and compliance,” Chatterton explains. “Amazon Nova is great at accurately following these instructions across hundreds of thousands of requests.”

With the right model selected, inGenious.ai quickly moved into production. “With Amazon Bedrock, a single developer built and deployed a production-grade generative AI chatbot feature in just eight weeks,” says Chatterton. “Integrating models felt like plug and play—just a couple of clicks and off you go.” To streamline integration with customer data sources and reduce implementation costs, inGenious.ai uses Amazon Bedrock Knowledge Bases. “It helps us avoid costly custom development and deploy secure, scalable RAG [retrieval-augmented generation] systems,” states Chatterton. Furthermore, the business engages Amazon Bedrock Guardrails to protect sensitive information, further accelerating development. “Keeping everything within the AWS infrastructure meant we didn’t have to make changes to our security controls, and we could keep our architecture clean,” Chatterton adds.
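The Knowledge Bases plus Guardrails pattern above can be sketched as a single request to the `retrieve_and_generate` API, which handles retrieval, generation, and guardrail enforcement in one managed call. The knowledge base ID, guardrail ID, region, and query below are placeholders, not inGenious.ai's configuration; the request shape follows the public `bedrock-agent-runtime` API.

```python
# Sketch: a managed RAG request via Amazon Bedrock Knowledge Bases, with a
# guardrail attached to the generation step. All IDs are placeholders.

def build_rag_request(kb_id: str, model_arn: str, guardrail_id: str,
                      guardrail_version: str, query: str) -> dict:
    """Payload for bedrock-agent-runtime's retrieve_and_generate() call."""
    return {
        "input": {"text": query},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
                "generationConfiguration": {
                    "guardrailConfiguration": {
                        "guardrailId": guardrail_id,
                        "guardrailVersion": guardrail_version,
                    }
                },
            },
        },
    }

request = build_rag_request(
    kb_id="KB12345678",            # placeholder knowledge base ID
    model_arn="arn:aws:bedrock:ap-southeast-2::foundation-model/amazon.nova-micro-v1:0",
    guardrail_id="GR12345678",     # placeholder guardrail ID
    guardrail_version="1",
    query="What is the fee on my savings account?",
)

# To send (requires AWS credentials):
# import boto3
# client = boto3.client("bedrock-agent-runtime", region_name="ap-southeast-2")
# response = client.retrieve_and_generate(**request)
# print(response["output"]["text"])
```

Keeping retrieval and guardrails inside one managed call is what avoids the "costly custom development" Chatterton mentions: there is no separate vector store, retriever, or content-filtering layer to build and secure.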

Outcome | Scaling Customer Engagement with Precise, High-Performing AI Chatbots

After deploying its new generative AI chatbot into its platform, inGenious.ai gained the ability to rewrite and add context to customer queries before routing them to the appropriate NLP workflows. The solution also improved resolution speeds by generating precise summaries for smoother agent handovers, and it gave businesses deeper insights into chatbot performance and customer engagement outcomes. “Using Amazon Bedrock, we built a chatbot that understands the nuance and unpredictability of human language better than ever before,” says Chatterton.

Major Australian businesses—including those in financial services and online bookmaking—adopted inGenious.ai chatbots to strengthen customer service operations. At National Australia Bank (NAB), the technology led to an 87 percent decrease in “I don’t know” chatbot responses and enabled approximately 10,000 more queries to be answered correctly on the first attempt. The bank also achieved sub-300 millisecond latency for real-time intent classification. Following these improvements, NAB recorded a 14-point increase in its Net Promoter Score.

At Latitude Financial Services, inGenious.ai chatbots helped reduce agent handover rates by 24 percent by automatically summarizing customer interactions. With more queries resolved without escalation, support ticket volume dropped by 33 percent—allowing agents to focus on complex issues. Compared to a traditional NLP-based solution, the updated platform delivered a 28 percent decrease in comprehension failures, an 80 percent improvement in handling long and complex queries, and a 31 percent gain in overall accuracy. 

inGenious.ai plans to build on the chatbot’s success by adopting new Amazon Bedrock features, such as prompt caching and built-in knowledge base connectors that eliminate the need for custom RAG implementations. As LLMs continue to advance, the team also expects to use prompt distillation to streamline long and complex inputs—helping reduce costs while preserving accuracy. “We’ll be leaning into Amazon Bedrock even more as the service evolves,” Chatterton concludes. “With AWS, a startup like ours gets the opportunity to build rapidly at the forefront of generative AI.”

Using Amazon Nova and Amazon Bedrock, we built a chatbot that understands the nuance and unpredictability of human language better than ever before.

Mark Chatterton

Founder and Chief Executive Officer, inGenious.ai