Benefits
Meet Mistral AI
Mistral AI is on a mission to push AI forward. Its cutting-edge models reflect the company's ambition to become the leading supporter of the generative AI community and to elevate publicly available models to state-of-the-art performance.
Use cases
Model versions
Mistral Large 2 (24.07)
The latest version of Mistral AI's flagship large language model, with significant improvements in multilingual accuracy, conversational behavior, coding capabilities, reasoning, and instruction following.
Max tokens: 128K
Languages: Dozens of languages supported, including English, French, German, Spanish, Italian, Chinese, Japanese, Korean, Portuguese, Dutch, Polish, Arabic and Hindi
Fine-tuning supported: No
Supported use cases: multilingual translation, text summarization, complex multilingual reasoning tasks, and math and coding tasks including code generation (see the example call below)
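By way of illustration, here is a minimal sketch of sending a code-generation prompt to Mistral Large 2. It assumes the official mistralai Python SDK (v1.x), an API key in the MISTRAL_API_KEY environment variable, and the model identifier "mistral-large-2407"; the client and identifier are assumptions and may differ depending on the platform through which you access the model.

```python
import os

from mistralai import Mistral  # assumes the official mistralai v1.x SDK

# Assumption: the API key is available in the MISTRAL_API_KEY environment variable.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Assumption: "mistral-large-2407" identifies Mistral Large 2 (24.07);
# the exact model name may differ on your platform.
response = client.chat.complete(
    model="mistral-large-2407",
    messages=[
        {
            "role": "user",
            "content": "Write a Python function that merges two sorted lists.",
        }
    ],
)

print(response.choices[0].message.content)
```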
Mistral Large (24.02)
Mistral Large is a cutting-edge text generation model with top-tier reasoning capabilities. Its precise instruction-following abilities enable application development and tech-stack modernization at scale.
Max tokens: 32K
Languages: Natively fluent in English, French, Spanish, German, and Italian
Fine-tuning supported: No
Supported use cases: precise instruction following, text summarization, translation, complex multilingual reasoning tasks, math and coding tasks including code generation
Mistral Small (24.02)
Mistral Small is a highly efficient large language model optimized for high-volume, low-latency language-based tasks. It provides outstanding performance at a cost-effective price point. Key features of Mistral Small include RAG specialization, coding proficiency, and multilingual capabilities.
Max tokens: 32K
Languages: English, French, German, Spanish, Italian
Fine-tuning supported: No
Supported use cases: Optimized for straightforward tasks that can be performed in bulk, such as classification, customer support, or text generation (see the classification sketch below)
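As a sketch of the kind of high-volume work Mistral Small targets, the snippet below classifies a small batch of support messages. It again assumes the mistralai Python SDK (v1.x); the model identifier "mistral-small-2402", the label set, and the sample tickets are illustrative assumptions rather than platform-specific values.

```python
import os

from mistralai import Mistral  # assumes the official mistralai v1.x SDK

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Illustrative label set and messages; replace with your own taxonomy and data.
LABELS = ["billing", "shipping", "returns", "other"]
tickets = [
    "My package never arrived.",
    "I was charged twice for the same order.",
]

for ticket in tickets:
    # Assumption: "mistral-small-2402" identifies Mistral Small (24.02) on your platform.
    response = client.chat.complete(
        model="mistral-small-2402",
        messages=[
            {
                "role": "user",
                "content": (
                    f"Classify this support message into one of {LABELS}. "
                    f"Reply with the label only.\n\n{ticket}"
                ),
            }
        ],
    )
    print(ticket, "->", response.choices[0].message.content.strip())
```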
Mixtral 8x7B
A 7B sparse Mixture-of-Experts model with stronger capabilities than Mistral AI 7B. Uses 12B active parameters out of 45B total.
Max tokens: 32K
Languages: English, French, German, Spanish, Italian
Fine-tuning supported: No
Supported use cases: Text summarization, text structuring, question answering, and code completion
Mistral 7B
A 7B dense Transformer, fast-deployed and easily customizable. Small, yet powerful for a variety of use cases.
Max tokens: 32K
Languages: English
Fine-tuning supported: No
Supported use cases: Text summarization, text structuring, question answering, and code completion
Customers
- Zalando
Zalando is building the leading pan-European ecosystem for fashion and lifestyle e-commerce. They offer an inspiring, high-quality multi-brand shopping experience for fashion and lifestyle products to about 50 million active customers across 25 markets.
- BigDataCorp