Mistral Large 3 (675B Instruct 2512)
Description
Mistral Large 3 (675B Instruct 2512) is a state-of-the-art, general-purpose multimodal granular Mixture-of-Experts model with 41B active and 675B total parameters, trained from scratch on 3,000 H200s. This release is the instruct post-trained version in FP8, fine-tuned for instruction following, making it well suited to chat, agentic, and instruction-based use cases. The lossless FP8 quantization reduces resource requirements, and the model can be deployed on a single node of B200s or H200s. Designed for reliability and long-context comprehension, it is engineered for production-grade assistants, retrieval-augmented systems, scientific workloads, and complex enterprise workflows.
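The "41B active of 675B total parameters" figure reflects sparse Mixture-of-Experts routing: each token passes through only a top-k subset of experts, so roughly 6% of the weights run per forward step. A minimal sketch of top-k expert routing in NumPy, with hypothetical sizes (`n_experts`, `top_k`, `d_model` are illustrative only; the real expert count and routing of Mistral Large 3 are not stated on this page):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration only.
d_model, n_experts, top_k, n_tokens = 64, 8, 2, 4

tokens = rng.standard_normal((n_tokens, d_model))
router_w = rng.standard_normal((d_model, n_experts))
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

def moe_layer(x):
    logits = x @ router_w                          # (tokens, experts)
    idx = np.argsort(logits, axis=-1)[:, -top_k:]  # top-k expert ids per token
    sel = np.take_along_axis(logits, idx, axis=-1)
    gate = np.exp(sel - sel.max(axis=-1, keepdims=True))
    gate /= gate.sum(axis=-1, keepdims=True)       # softmax over selected experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for j in range(top_k):                     # only top_k experts execute
            out[t] += gate[t, j] * (x[t] @ experts[idx[t, j]])
    return out

y = moe_layer(tokens)
print(y.shape)  # (4, 64)
```

Only `top_k` of the `n_experts` expert matrices are multiplied per token, which is what keeps active parameters (and FLOPs) far below the total parameter count.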
Capability radar
Science uses a reasoning-based proxy when dedicated science benchmarks are unavailable.
Rankings
| Domain | Rank | Score | Source |
|---|---|---|---|
| Code Ranking | 355 | 19.0 | AA |
| General Ranking | 410 | 24.0 | AA |
| Math Reasoning | 288 | 25.0 | AA |
| Science | 404 | 23.0 | AA |
Benchmark scores (LLM Stats)
Categories: Biology, Code, Factuality, General, Math.
AA evaluation indices
LLM Stats category scores
Pricing
Speed
Available providers
Prices are given in internal LS units.

| Provider | Input price | Output price |
|---|---|---|
| Mistral AI | 500K | 1.5M |
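A minimal sketch of estimating request cost from the table above. The per-token scale of the internal LS units is not specified on this page, so the sketch assumes the listed prices are per 1M tokens (a labeled assumption, not a stated fact):

```python
# Prices from the providers table, in internal LS units.
# ASSUMPTION for this sketch: prices are per 1M tokens.
PRICE_IN = 500_000     # input price, LS units
PRICE_OUT = 1_500_000  # output price, LS units

def cost(tokens_in: int, tokens_out: int) -> float:
    """Estimated cost in LS units under the per-1M-tokens assumption."""
    return tokens_in / 1e6 * PRICE_IN + tokens_out / 1e6 * PRICE_OUT

print(cost(10_000, 2_000))  # 5000 + 3000 = 8000.0
```

Output tokens dominate the bill at a 3:1 price ratio, so long generations cost disproportionately more than long prompts.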