To scale its generative AI workloads while meeting data residency and security requirements—including System and Organization Controls (SOC) 2 compliance and Australian Prudential Regulation Authority (APRA) standards for financial services customers—inGenious.ai selected AWS to power the infrastructure behind its enterprise platform. “There’s a lot to be said for the reliability and maturity of AWS services because they give you the confidence to innovate,” says Chatterton. “Furthermore, knowing AWS deployment patterns inside and out helped us avoid additional compliance overheads.”
inGenious.ai built its next-generation chatbots using Amazon Bedrock, a fully managed service that provides access to a range of high-performing LLMs. “We had played around with Amazon Bedrock within a week of its release, which gave us a head start going into development,” says Chatterton. By using Amazon Bedrock, developers could experiment with multiple LLMs through a single interface, without needing to manage third-party integrations or complex infrastructure. “Having one point of integration through AWS meant we could keep our generative AI architecture simple, without any security or compliance changes,” Chatterton adds.
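To illustrate what that single point of integration looks like in practice, here is a minimal sketch of a chat request through the Bedrock Converse API using boto3. The region, model ID, prompts, and inference settings are illustrative assumptions, not details drawn from inGenious.ai's deployment.

```python
# Minimal sketch: one Amazon Bedrock Converse API call via boto3.
# Region, model ID, and prompts are illustrative placeholders.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="ap-southeast-2")

response = bedrock.converse(
    modelId="amazon.nova-micro-v1:0",  # trying another model is just a different ID
    system=[{"text": "You are a concise, compliant customer-support assistant."}],
    messages=[
        {"role": "user", "content": [{"text": "How do I reset my password?"}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```

Because every Bedrock-hosted model sits behind the same Converse interface, switching to a different provider's model is a one-line change rather than a new third-party integration.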
Thanks to the multi-model flexibility of Amazon Bedrock, the team evaluated Gemini 2, Llama 3.2, Mistral 7B, and Anthropic Claude 3.5 Haiku before opting for Amazon Nova Micro. “The first benefit of Amazon Nova was its speed,” recalls Chatterton. “Our customers want fast response times, and Amazon Nova Micro was perfect for this use case, delivering subsecond performance.” inGenious.ai also valued the model’s reliability in adhering to prompt instructions. “The context window for responses isn’t huge, but each instruction plays a critical role in quality and compliance,” Chatterton explains. “Amazon Nova is great at accurately following these instructions across hundreds of thousands of requests.”
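A model bake-off like this one can be reduced to a small script, since each candidate is reached through the same interface. The sketch below times the same prompt against several Bedrock model IDs; the candidate list is illustrative, exact IDs and regional availability vary, and some models are only reachable through cross-region inference profiles. (Gemini 2 is not a Bedrock-hosted model, so it is omitted here.)

```python
# Hedged sketch: time the same prompt across candidate Bedrock model IDs.
# The IDs below are illustrative; some models require a cross-region
# inference profile identifier instead of the bare model ID.
import time
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="ap-southeast-2")

CANDIDATES = [
    "amazon.nova-micro-v1:0",
    "anthropic.claude-3-5-haiku-20241022-v1:0",
    "meta.llama3-2-3b-instruct-v1:0",
    "mistral.mistral-7b-instruct-v0:2",
]

messages = [
    {"role": "user", "content": [{"text": "Summarise the refund policy in two sentences."}]}
]

for model_id in CANDIDATES:
    start = time.perf_counter()
    bedrock.converse(
        modelId=model_id,
        messages=messages,
        inferenceConfig={"maxTokens": 256},
    )
    print(f"{model_id}: {time.perf_counter() - start:.2f}s")
```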
With the right model selected, inGenious.ai quickly moved into production. “With Amazon Bedrock, a single developer built and deployed a production-grade generative AI chatbot feature in just eight weeks,” says Chatterton. “Integrating models felt like plug and play—just a couple of clicks and off you go.” To streamline integration with customer data sources and reduce implementation costs, inGenious.ai uses Amazon Bedrock Knowledge Bases. “It helps us avoid costly custom development and deploy secure, scalable RAG [retrieval-augmented generation] systems,” states Chatterton. The business also uses Amazon Bedrock Guardrails to protect sensitive information, which accelerated development further. “Keeping everything within the AWS infrastructure meant we didn’t have to make changes to our security controls, and we could keep our architecture clean,” Chatterton adds.
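As a rough sketch of how these pieces can fit together, the call below queries an Amazon Bedrock knowledge base and applies a guardrail to the generated answer in a single managed RAG request. Every resource identifier here (knowledge base ID, guardrail ID, model ARN, region) is a placeholder, not one of inGenious.ai's actual resources.

```python
# Sketch of a managed RAG query against Amazon Bedrock Knowledge Bases,
# with an Amazon Bedrock Guardrail applied to the generation step.
# All resource identifiers below are placeholders.
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime", region_name="ap-southeast-2")

response = agent_runtime.retrieve_and_generate(
    input={"text": "What documents do I need to open a business account?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KB1234567890",  # placeholder knowledge base
            "modelArn": (
                "arn:aws:bedrock:ap-southeast-2::foundation-model/"
                "amazon.nova-micro-v1:0"
            ),
            "generationConfiguration": {
                "guardrailConfiguration": {  # placeholder guardrail
                    "guardrailId": "gr0abcdefghi",
                    "guardrailVersion": "1",
                },
            },
        },
    },
)

print(response["output"]["text"])  # grounded, guardrail-filtered answer
for citation in response.get("citations", []):
    for ref in citation["retrievedReferences"]:
        print(ref["location"])  # where each supporting passage came from
```

Keeping retrieval, generation, and content filtering inside one managed call is what lets the architecture stay clean: there is no separate vector database, orchestration layer, or redaction service to secure and audit.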