AWS Marketplace
Generative AI partner offerings in AWS Marketplace: Core & Infrastructure Software
Amazon Web Services (AWS) Marketplace helps you adopt leading-edge technology with a selection of 15,000+ listings from more than 4,000 sellers across 70 categories, including generative artificial intelligence (generative AI). In part one of this three-part blog post series on AWS Partner offerings for generative AI in AWS Marketplace, we will explore the available core and infrastructure software offerings. The next two posts will cover use case-specific generative AI offerings and generative AI professional services offerings categorized by customer stage. This series will help you navigate the hundreds of generative AI offerings in AWS Marketplace using a straightforward classification system.
How soon can you start using generative AI?
There’s no doubt that 2023 was the tipping point for generative AI. The wide applicability of use cases, combined with user demand, has created urgency for organizations of all types and sizes to adopt this game-changing technology. The AWS Partner community is ready to help you on your journey, with products and solutions that help you deploy generative AI workloads quickly and efficiently and realize their full potential faster.
Since the announcement of generative AI services on AWS in April 2023, the generative AI partner offerings listed in AWS Marketplace have continued to evolve rapidly. These offerings give you access to the right tools and resources across each layer of the stack (see image 1 below for the three layers of the AWS generative AI stack) to build successful generative AI solutions and scale them.
Image 1: AWS Generative AI Stack
To help you navigate the breadth of partner solutions, I am sharing a classification framework for AWS Marketplace partner offerings. I have grouped them into core and infrastructure software offerings (those core to generative AI, such as foundation models and vector databases), use case-specific software and consulting offerings, and generative AI professional services offerings categorized by customer stage. In this post, I will focus on the core and infrastructure software offerings (see image 2 below for core and infrastructure partner offerings).
Image 2: Core and Infrastructure Software Offerings in AWS Marketplace
Core and infrastructure software offerings fall into three broad categories:
1. Foundation models (FMs) and vector databases
2. Large language model operations (LLMOps), observability, and security and privacy tools
3. Platforms, compute services, inference endpoints, and prepackaged offerings
Foundation models and vector databases
FMs and vector databases are the core components of a generative AI system. The FM gives the system the ability to understand and respond to prompts, while the vector database stores and retrieves relevant data, which the FM uses to improve the accuracy and reliability of its responses. Tapping into partner offerings for FMs and vector databases in AWS Marketplace helps you get started quickly, with the confidence that you’re getting expertly developed and supported technology. These offerings integrate seamlessly with AWS to simplify deployment and management and to optimize performance and scalability.
Foundation models are sophisticated machine learning (ML) models trained on a broad spectrum of generalized data and capable of generating original content and conversing in natural language. You can choose from dozens of cutting-edge FMs available on Amazon SageMaker and Amazon Bedrock. In addition to those, AWS Marketplace offers FMs from partners such as AI21 Labs, Cohere, Stability.ai, LightOn, Jina AI, Voyage AI, Upstage, Alt Inc, Nomic, Maritaca AI, NCSOFT, and Twelve Labs.
Amazon Bedrock is a fully managed service with a choice of FMs from leading AI companies such as AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability.ai, and Amazon through a single API.
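As a minimal sketch of what that single API looks like in practice, the following Python snippet invokes an Anthropic Claude model through the Amazon Bedrock runtime with boto3. The model ID, request schema, and prompt are illustrative; each model provider defines its own request format, and the models available depend on the access enabled in your account.

```python
import json
import boto3

# Create a Bedrock runtime client (region and credentials come from your AWS configuration).
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Illustrative model ID and request body; each provider defines its own schema.
response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [
            {"role": "user", "content": "Explain Retrieval Augmented Generation in two sentences."}
        ],
    }),
)

# The response body is a stream; parse it and print the generated text.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```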
Amazon SageMaker JumpStart is an ML hub that provides access to FMs, built-in algorithms, and pre-built ML solutions that you can deploy in a few steps. It provides access to FMs from popular model providers such as AI21 Labs, Amazon, Cohere, Hugging Face, Stability.ai, Meta AI, LightOn, and more.
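The sketch below shows one way those few steps might look with the SageMaker Python SDK, assuming a JumpStart text-generation model. The model ID, payload format, and EULA requirement are illustrative and differ by model; browse SageMaker JumpStart for the models and terms available to you.

```python
from sagemaker.jumpstart.model import JumpStartModel

# Illustrative JumpStart model ID; replace with a model listed in SageMaker JumpStart.
model = JumpStartModel(model_id="huggingface-llm-falcon-7b-instruct-bf16")

# Deploy to a real-time endpoint (JumpStart selects a default instance type);
# accept_eula is required for some gated models.
predictor = model.deploy(accept_eula=True)

# Invoke the endpoint with a simple text-generation payload.
print(predictor.predict({"inputs": "What is a vector database?"}))

# Delete the endpoint when you are done to stop incurring charges.
predictor.delete_endpoint()
```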
Partner Story: Cohere
“Cohere is proud to be a part of the AWS Partner Network, where enterprise customers can access our leading LLMs on AWS Marketplace. We believe generative AI holds immense potential for enterprises, and we’re committed to working with AWS to enable companies to move beyond proof of concept and into production. Cohere’s Command and Embed models are highly scalable and optimized for advanced RAG capabilities to deliver accurate information with citations that mitigate hallucinations. We’ve found a like-minded partner that prioritizes data privacy and security for customers. By partnering with AWS, we’re able to offer our state-of-the-art LLMs to a vast number of enterprises, which can leverage the power of AI to streamline workflows, personalize customer experiences, and drive real business value. We look forward to continuing to help enterprise customers leverage the full potential of AI to unlock productivity and efficiency gains.”
Vinod Devan, Global Head of Partnerships – Cohere
Vector databases give you the ability to store and retrieve vectors as high-dimensional points. They are typically used to power visual, semantic, and multimodal search. They can also be paired with generative AI text models to create intelligent agents that provide conversational search experiences. Vector databases play a critical role in Retrieval Augmented Generation (RAG), one of the most common implementation patterns for generative AI, and are well suited to recommendation and personalization use cases.
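To illustrate the retrieval step that a vector database performs in a RAG pipeline, here is a deliberately simplified, in-memory sketch using NumPy and cosine similarity. The embeddings are random placeholders; a production system would use real embeddings from an embedding model and delegate storage and similarity search to one of the vector database offerings described in this section.

```python
import numpy as np

# Hypothetical precomputed embeddings: 1,000 document vectors and one query vector,
# as an embedding model would produce.
doc_embeddings = np.random.rand(1000, 384)
query_embedding = np.random.rand(384)

# Cosine similarity between the query and every document vector.
scores = (doc_embeddings @ query_embedding) / (
    np.linalg.norm(doc_embeddings, axis=1) * np.linalg.norm(query_embedding)
)

# Indices of the three most similar documents; their text would be added to the
# prompt so the FM can ground its answer in the retrieved context.
top_k = np.argsort(scores)[-3:][::-1]
print("Most relevant document indices:", top_k)
```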
Some vector databases are available in AWS Marketplace as software-as-a-service (SaaS). These include DataStax Astra DB, Elastic Cloud (Elasticsearch Service), Weaviate, Pinecone, MongoDB Atlas, Zilliz, Qdrant, Redis, Neo4j Aura, SingleStore, Kx Kdb Insights, CrateDB, and Couchbase Capella. Franz Inc. AllegroGraph and Unum UniStore CE are available as Amazon Machine Images (AMIs).
Partner Story: MongoDB
“By choosing AWS as its cloud provider, Scalestack has been able to keep its cloud costs low and its data usage elastic. Scalestack also uses MongoDB Atlas Vector Search for RAG to perform quick searches over large datasets using vector similarity. The integration between Atlas Vector Search and Amazon Bedrock makes it easier for developers to create applications on AWS that use generative AI to complete complex tasks for a wide range of use cases and deliver up-to-date responses based on proprietary data processed by MongoDB Atlas Vector Search. Scalestack started using Amazon Bedrock in Q1 of 2024. The ability to customize Amazon Bedrock models allows the company to fine-tune AI models to specific use cases, enhancing the Spotlight copilot’s relevance and performance. Scalestack’s mission is to help organizations unlock sales productivity, and our relationship with MongoDB and AWS has been integral to that. Using Scalestack’s AI, customers have seen drastically reduced research times and significant increases in productivity. Customers saw a 40% increase in rep productivity, and GTM teams saw a 53X ROI on average on Scalestack, measured as delta revenues influenced by Scalestack.”
Elio Narciso, Co-founder and CEO – Scalestack
LLMOps, observability, and security and privacy tools
These tools are used to optimize the reliability, performance, and security of generative AI systems. LLMOps starts by reducing the complexity of deploying and operating LLMs, while observability tools provide insights into model behavior once they are up and running. Advanced security and privacy tools are critical for mitigating the risks inherent in handling sensitive or private data. AWS Marketplace allows you to find, buy, and start using these tools quickly and easily. Better yet, you get the benefits of a curated selection from trusted partners to help you accelerate innovation in the cloud.
LLMOps and observability tools are critical for accelerating time to value and reducing operational management complexity for generative AI projects. They can also provide explainability for enterprises in highly regulated industries. SaaS offerings in AWS Marketplace include CraftAI MLOps Platform, Comet MLOps, Arthur, Abacus.AI Platform, Weights & Biases AI Development Platform, and Union Cloud. There is also an AMI option, LangGenius Dify, and an Amazon SageMaker option, Mphasis Factual Consistency Metric – LLMOps.
Security and privacy tools help secure generative AI applications and maintain privacy. SaaS solutions in AWS Marketplace include AIShield GuArdIan from Bosch, Add Value Machine’s Enterprise Generative AI Security Hub, Portal26’s AI TRiSM, MixMode Real-time Threat Detection and Response, Compliance.sh Security Compliance Automated with AI, and Blink Ops Blink security automation copilot.
Partner Story: AIShield
“A large multinational engineering and technology company was looking to leverage an LLM and generative AI–powered internal chatbot. Realizing that software engineers’ productivity is significantly enhanced with the assistance of a gen AI chatbot for building, explaining, and reviewing code, they started evaluating the implementation and usage challenges. For a responsible and careful implementation, there was a critical need to implement stringent controls within the chatbot utilized by software engineers during coding to prevent IP loss, data leaks, jailbreaking attempts, copyright infringement, and introducing code vulnerabilities to ensure a compliant coding environment and adhere to secure code practices.
GuArdIan has been seamlessly integrated as a middleware solution, equipped with preset policies to prevent the leakage of IP and sensitive information and mitigate code security vulnerabilities and manage the risks with a higher degree of control and customization. GuArdIan interfaces directly with Amazon Bedrock, providing access to a comprehensive suite of LLMs, complementing the underlying guardrails provided by the target LLM.
Leveraging the AWS infrastructure and our innovative approach has led to a system that is proficient in assimilating organizational coding standards and administering compliance checks within generated code—both as input to the LLM as well as output from the LLM. As outcomes, this company is now able to reduce the risk of IP and data leakage by 90%, and thwart jailbreaking attempts with an effectiveness boost of almost 400%, while maintaining the developer productivity gains with generative AI. The company IT operations team was able to install GuArdIan and set their policies in less than 30 minutes with our interactive GUI for three different applications connected to two different LLMs on Amazon Bedrock. The chatbot developers in the IT development team were able to integrate the GuArdIan middleware with five lines of code changes and deploy the GuArdIan features in less than 2 hours. Furthermore, GuArdIan’s interactive response serves as a vital learning tool, enhancing developer behavior while interacting with LLMs by differentiating between appropriate and inappropriate queries. Real-time monitoring allowed for immediate action against potential safety and security risks. The reasoning and observability feature provided detailed explanations for query decisions, aiding in compliance audits and reinforcing responsible AI usage.”
Manojkumar Parmar, CEO & CTO – AIShield
In addition to SaaS, AWS Marketplace offers different deployment options tailored to specific needs and preferences, such as launching virtual machines (VMs) with preconfigured software stacks using AMIs, running containerized applications in a managed environment with Amazon Elastic Container Service (Amazon ECS), or using Kubernetes orchestration capabilities through Amazon Elastic Kubernetes Service (Amazon EKS). These options are available from partners such as Philterd Airlock, SpinSys MDACA PrivateGPT with Keycloak, Private AI: Detect, Redact and Anonymize PII, and Tonic.
Partner Story: Elastic
“Randstad has been using Elastic Security to run their security operations center (SOC), monitor with in-depth visibility, and respond to threats in our cloud workloads for years. Now, we’re looking at ways to safely leverage generative AI in production for detection engineering, especially with our small but mighty security team. We look forward to implementing Elastic’s AI Assistant and Amazon Bedrock to decrease our investigation time, reduce skill shortage impact, and improve our detection capabilities.”
Stijn Holzhauer, Lead Global Security Operations – Randstad Group Netherlands
Platforms, compute services, inference endpoints, and prepackaged offerings
The offerings in this category facilitate the development, deployment, and management of generative AI models. They provide the necessary tools, infrastructure, and solutions to create, train, deploy, and use generative AI. Because they run on cloud computing, they give you the scalability, flexibility, and accessibility to harness the power of generative AI without needing extensive hardware resources or specialized in-house expertise.
Generative AI platform offerings provide applications, libraries, or frameworks along with infrastructure for developing, deploying, and managing generative AI models. They range from end-to-end platforms to low-code and no-code development platforms. SaaS solutions include Fireworks AI, Stratio Generative AI Data Fabric, Arcee.ai SLM Adaptation System, IBM watsonx Orchestrate, FriendliAI Friendli Dedicated Endpoints, Reconify, Vectara GenAI Platform, Saturn Cloud, TrustPortal for AWS Generative AI, and Articul8 AI.
Also available are solutions delivered through AMIs, AWS CloudFormation templates, Amazon Elastic Compute Cloud (Amazon EC2), and Amazon Elastic Kubernetes Service (Amazon EKS), including NVIDIA AI Enterprise, Intel Distribution of OpenVINO Toolkit, Ai Bloks llmware for Enterprise-Grade LLM Applications, ZS Max.AI Platform, Datasaur.ai LLM Labs, and ecosystem.Ai platform.
Compute services, inference endpoints, and prepackaged offerings come with various fulfillment options, such as OctoAI in a SaaS model and MK1 Flywheel on Amazon SageMaker. AMI and AWS CloudFormation template-based solutions include Meetrix.io and Apps4Rent offerings.
Fulfill all your generative AI needs in one place
Here at AWS, we are excited to see how we can work together to use generative AI to help you reinvent customer experiences, create new applications, and boost productivity. We recently launched the AWS Marketplace generative AI partner solutions page to help you quickly and easily find trustworthy and secure generative AI solutions. The page features listings for generative AI models, tools, applications, and professional services designed to accelerate innovation. You can use it as your starting point to find a generative AI solution provider that can help you kick-start your transformation journey.
Author bio
Amit Singh
Amit is the WW Head of GTM & Use Cases, Generative AI & ML Partnerships at AWS. Outside of work, he likes to play soccer, read books, explore new beers, and enjoy outdoor activities like paddling and hiking.