Mistral Large (Feb '24)
Description
Mistral Large 3 (675B Instruct 2512) is a state-of-the-art general-purpose multimodal granular mixture-of-experts model with 41B active parameters and 675B total parameters, trained from scratch on 3,000 H200 GPUs. This is the instruct post-trained version in FP8, fine-tuned for instruction following, which makes it well suited to chat, agentic, and instruction-based use cases. The lossless FP8 version reduces resource requirements and can be deployed on a single node of B200s or H200s. Designed for reliability and long-context comprehension, it is engineered for production-grade assistants, retrieval-augmented systems, scientific workloads, and complex enterprise workflows.
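The deployment claim above can be sanity-checked with a back-of-the-envelope memory estimate. The sketch below uses the figures from the description (675B total parameters, FP8, i.e. one byte per parameter); the per-GPU HBM capacities and the 8-GPU node size are assumptions, and KV cache and activation memory are ignored.

```python
# Rough weight-memory estimate for a 675B-parameter FP8 checkpoint.
# ASSUMPTIONS: H200 ~141 GB HBM, B200 ~192 GB HBM, 8 GPUs per node.
TOTAL_PARAMS = 675e9
BYTES_PER_PARAM = 1  # FP8 = 1 byte per parameter

weights_gb = TOTAL_PARAMS * BYTES_PER_PARAM / 1e9  # ~675 GB of weights

for gpu, hbm_gb in {"H200": 141, "B200": 192}.items():
    node_hbm_gb = 8 * hbm_gb
    # Weights alone must fit; real serving needs headroom for KV cache.
    fits = weights_gb <= node_hbm_gb
    print(f"8x{gpu}: {node_hbm_gb} GB HBM, weights fit: {fits}")
```

On these assumptions the 675 GB of FP8 weights fit in a single 8-GPU node of either type, which is consistent with the description.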
Capability radar
Science uses a reasoning proxy when dedicated science benchmarks are unavailable.
Rankings
| Domain | Rank | Score | Source |
|---|---|---|---|
| Code Ranking | 355 | 19.0 | AA |
| General Ranking | 410 | 24.0 | AA |
| Math Reasoning | 288 | 25.0 | AA |
| Science | 404 | 23.0 | AA |
Benchmark scores (LLM Stats)
Biology
Code
Factuality
General
Math
AA evaluation indices
LLM Stats category scores
Pricing
Speed
Available providers (prices in LS internal units)
| Provider | Input price | Output price |
|---|---|---|
| Mistral AI | 500K | 1.5M |
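The per-direction prices above can be turned into a per-request cost estimate. The sketch below is purely illustrative: the prices are in the site's internal LS units, and the assumption that they are quoted per 1M tokens is hypothetical, as is the example request size.

```python
# Hypothetical request-cost estimate from the provider table above.
# ASSUMPTION: prices are per 1,000,000 tokens, in LS internal units.
INPUT_PRICE = 500_000     # Mistral AI, input
OUTPUT_PRICE = 1_500_000  # Mistral AI, output

def request_cost(input_tokens: int, output_tokens: int,
                 per_tokens: int = 1_000_000) -> float:
    """Cost of one request, assuming prices are quoted per `per_tokens` tokens."""
    return (input_tokens * INPUT_PRICE
            + output_tokens * OUTPUT_PRICE) / per_tokens

# Example: a 12k-token prompt with an 800-token completion.
print(request_cost(12_000, 800))  # 7200.0 LS units under these assumptions
```

Output tokens dominate the bill at a 3:1 output-to-input price ratio, so long completions cost disproportionately more than long prompts.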