Posted On: Mar 1, 2024

Mistral AI’s Mixtral 8x7B and Mistral 7B foundation models are now generally available on Amazon Bedrock. With this launch, Mistral AI joins other leading AI companies in Amazon Bedrock, including AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon. You now have an even wider choice of high-performing models available in Amazon Bedrock through a single API, so you can choose the model best suited to building generative AI applications with security, privacy, and responsible AI.

Mistral AI’s Mixtral 8x7B and Mistral 7B models elevate publicly available models to state-of-the-art performance. Mixtral 8x7B is a popular, high-quality sparse Mixture-of-Experts (MoE) model that is ideal for text summarization, question answering, text classification, text completion, and code generation. Mistral 7B is Mistral AI’s first foundation model. It supports English text generation tasks with natural coding abilities and can be quickly and easily fine-tuned with your custom data to address specific tasks. The model is optimized for low latency, a low memory requirement, and high throughput for its size, and it supports a variety of use cases, from text summarization and classification to text completion and code completion.
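
To illustrate the single-API point above, here is a minimal sketch of calling Mixtral 8x7B through the Amazon Bedrock runtime with the AWS SDK for Python (boto3). The model ID, request fields, and [INST] prompt wrapping follow the conventions documented for Mistral models at launch; treat them as assumptions and confirm against the current Bedrock documentation and console listing.

```python
import json

import boto3

# Bedrock runtime client in the launch Region (US West, Oregon).
client = boto3.client("bedrock-runtime", region_name="us-west-2")

# Mixtral 8x7B Instruct model ID as listed at launch (assumption; verify in the console).
model_id = "mistral.mixtral-8x7b-instruct-v0:1"

# Mistral instruct models expect prompts wrapped in [INST] ... [/INST] tags.
prompt = "<s>[INST] Summarize the benefits of sparse Mixture-of-Experts models in two sentences. [/INST]"

body = json.dumps({
    "prompt": prompt,
    "max_tokens": 256,
    "temperature": 0.5,
})

response = client.invoke_model(modelId=model_id, body=body)

# The response body is a JSON document with an "outputs" list of generations.
result = json.loads(response["body"].read())
print(result["outputs"][0]["text"])
```

Swapping in Mistral 7B is a matter of changing the model ID; the request and response schema are shared across the Mistral models on Bedrock.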

Mistral AI’s Mixtral 8x7B and Mistral 7B models in Amazon Bedrock are available in the US West (Oregon) AWS Region. To learn more, read the AWS News launch blog, the Mistral AI on Amazon Bedrock product page, and the documentation. To get started with Mistral AI models on Amazon Bedrock, visit the Amazon Bedrock console.
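
For interactive applications such as chat or code completion, the Bedrock runtime also exposes a streaming variant of the same call, so you can render tokens as they are generated. The sketch below uses Mistral 7B; the model ID and chunk schema are assumptions based on the launch documentation, so verify them before relying on this in production.

```python
import json

import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")

body = json.dumps({
    "prompt": "<s>[INST] Write a Python function that reverses a string. [/INST]",
    "max_tokens": 512,
})

# Mistral 7B Instruct model ID at launch (assumption; check the console listing).
response = client.invoke_model_with_response_stream(
    modelId="mistral.mistral-7b-instruct-v0:2",
    body=body,
)

# Each streamed event carries a JSON chunk with partial "outputs" text.
for event in response["body"]:
    chunk = json.loads(event["chunk"]["bytes"])
    for output in chunk.get("outputs", []):
        print(output.get("text", ""), end="", flush=True)
print()
```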