Generative AI Application Builder on AWS

Rapidly develop and deploy production-ready generative AI applications

Overview

One of the main challenges in building generative AI applications is the complexity of cloud setup combined with the need for deep AI expertise. Generative AI Application Builder on AWS simplifies this process, helping you develop, test, and deploy AI applications without extensive AI knowledge. The solution speeds up your AI development by making it easy to incorporate your business data, compare the performance of large language models (LLMs), run multi-step tasks through AI agents, build extensible applications quickly, and deploy them on an enterprise-grade architecture. Generative AI Application Builder comes with a ready-to-use generative AI chatbot and API that you can quickly integrate into your business processes or applications.

This solution includes integrations with Amazon Bedrock and its LLMs in addition to LLMs deployed on Amazon SageMaker. It uses Amazon Bedrock tools for Retrieval Augmented Generation (RAG) to enhance AI responses, Amazon Bedrock Guardrails to implement safeguards and reduce hallucinations, and Amazon Bedrock Agents to create workflows for complex tasks. You can also connect to other AI models using LangChain or AWS Lambda. Start with the simple, no-code wizard to build AI applications for conversational search, AI-generated chatbots, text generation, and text summarization.
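The solution's chatbot and API handle these model interactions for you, but it can help to see the shape of the underlying call. The following is a minimal sketch of a direct Amazon Bedrock request using the Converse API with boto3; the Region, model ID, and prompt are illustrative assumptions rather than values prescribed by the solution.

```python
# Minimal sketch of the kind of Amazon Bedrock call this solution orchestrates
# on your behalf. Assumes boto3 is configured with credentials and that access
# to the example model ID has been granted in the Bedrock console; the Region,
# model ID, and prompt are placeholders, not values taken from the solution.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
    messages=[
        {"role": "user", "content": [{"text": "Summarize our return policy in two sentences."}]}
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

# The Converse API returns the assistant reply as a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```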

Benefits

Rapid experimentation

This solution allows users to experiment quickly by removing the heavy lifting required to deploy multiple instances with different configurations and compare outputs and performance. Experiment with multiple configurations of various LLMs, prompt engineering, enterprise knowledge bases, guardrails, AI agents, and other parameters.

Configurability

With pre-built connectors to a variety of LLMs, such as models available through Amazon Bedrock, this solution gives you the flexibility to deploy the model of your choice, as well as the AWS services and leading foundation model (FM) providers you prefer. You can also enable Amazon Bedrock Agents to fulfill various tasks and workflows.

Production-ready

Built with AWS Well-Architected design principles, this solution offers enterprise-grade security and scalability with high availability and low latency, ensuring seamless integration into your applications with high performance standards.

Extensible modular architecture

Extend this solution’s functionality by integrating your existing projects or natively connecting additional AWS services. Because this is an open-source application, you can use the included LangChain orchestration layer or Lambda functions to connect with the services of your choice.
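As one illustration of the Lambda extension point, the sketch below shows a hypothetical connector that forwards a prompt to an LLM hosted on an Amazon SageMaker endpoint. The handler wiring, endpoint name, event shape, and payload schema are assumptions for illustration only; the solution's actual Lambda and LangChain interfaces are defined in its open-source repository.

```python
# Hypothetical AWS Lambda handler illustrating the kind of custom connector the
# Lambda extension point allows: forwarding a prompt to an LLM hosted on an
# Amazon SageMaker endpoint. The endpoint name, payload schema, and event shape
# are placeholders, not the solution's actual interface.
import json
import os

import boto3

sm_runtime = boto3.client("sagemaker-runtime")
ENDPOINT_NAME = os.environ.get("SAGEMAKER_ENDPOINT_NAME", "my-llm-endpoint")  # placeholder


def handler(event, context):
    prompt = event.get("prompt", "")

    # Many SageMaker-hosted text-generation containers accept a JSON body with
    # an "inputs" field; adjust to match the model container you deploy.
    response = sm_runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=json.dumps({"inputs": prompt, "parameters": {"max_new_tokens": 256}}),
    )

    result = json.loads(response["Body"].read())
    return {"statusCode": 200, "body": json.dumps(result)}
```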

Technical details

You can automatically deploy this architecture using the implementation guide and the accompanying AWS CloudFormation template that deploys three separate architectures:

  1. Agent Use Case – The Agent Use Case enables users to hand off tasks for completion using Amazon Bedrock Agents. You select a model and write a few instructions in natural language, and Amazon Bedrock Agents analyze, orchestrate, and complete the tasks by connecting to your data sources or other APIs to fulfill your request (see the invocation sketch after this list).
  2. Text Use Case – The Text Use Case enables users to experience a natural language interface using generative AI. This use case can be integrated into new or existing applications, and is deployable through the Deployment Dashboard or independently through a provided URL.
  3. Deployment Dashboard – The Deployment Dashboard is a web UI that serves as a management console for admin users to view, manage, and create their use cases. This dashboard enables customers to rapidly experiment, iterate, and deploy generative AI applications using multiple configurations of LLMs and data.
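For a sense of what the Agent Use Case automates, the following is a minimal sketch of invoking an Amazon Bedrock agent directly with boto3. The agent ID, alias ID, and prompt are placeholders; in the deployed solution these values come from the use case's configuration rather than from hard-coded literals.

```python
# Minimal sketch of calling an Amazon Bedrock agent directly, the same style of
# interaction the Agent Use Case wires up behind its API. The agent ID, alias
# ID, Region, and prompt are placeholders for illustration.
import uuid

import boto3

agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = agent_runtime.invoke_agent(
    agentId="AGENT_ID_PLACEHOLDER",
    agentAliasId="AGENT_ALIAS_ID_PLACEHOLDER",
    sessionId=str(uuid.uuid4()),
    inputText="Check the status of order 1234 and draft an update for the customer.",
)

# invoke_agent streams the completion back as chunks of bytes.
completion = ""
for event in response["completion"]:
    chunk = event.get("chunk")
    if chunk:
        completion += chunk["bytes"].decode("utf-8")

print(completion)
```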
Deployment options
Ready to get started?
Deploy this solution by launching it in your AWS Console
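The console launch is the documented path, but the template can also be launched programmatically. The sketch below shows what such a launch might look like with boto3; the stack name, template URL, and parameter names are placeholders, so use the template link and parameter list published in the implementation guide.

```python
# Sketch of launching the solution's CloudFormation template programmatically
# instead of through the console. The stack name, template URL, parameter keys,
# and email value are placeholders; the real values come from the solution's
# implementation guide.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

cfn.create_stack(
    StackName="generative-ai-application-builder",
    TemplateURL="https://example.s3.amazonaws.com/path/to/solution-template.template",  # placeholder
    Parameters=[
        # Parameter keys depend on the template version; see the implementation guide.
        {"ParameterKey": "AdminUserEmail", "ParameterValue": "admin@example.com"},
    ],
    Capabilities=["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM", "CAPABILITY_AUTO_EXPAND"],
)
```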
AWS Service
Amazon Bedrock

The easiest way to build and scale generative AI applications with foundation models.

Learn more 
Guidance
Guidance for Generative AI Deployments using Amazon SageMaker JumpStart

This Guidance demonstrates how to deploy a generative artificial intelligence (AI) model provided by Amazon SageMaker JumpStart to create an asynchronous SageMaker endpoint with the ease of the AWS Cloud Development Kit (AWS CDK).

Learn more 
Guidance
Guidance for Natural Language Queries of Relational Databases on AWS

This Guidance demonstrates how to build an application enabling users to ask questions directly of relational databases using natural language queries (NLQ).

Learn more 
AWS Service
Generative AI for every business

Boost productivity, build differentiated experiences, and innovate faster with AWS.

Learn more 
Case Study
Launching a High-Accuracy Chatbot Using Generative AI Solutions on AWS with Megamedia

This case study demonstrates how broadcast company Megamedia created a generative AI–powered chatbot to simplify access to important public information using AWS.

Read the case study 
Video
Solving with AWS Solutions: Generative AI Application Builder on AWS
Watch the video 
