Mistral Large (Feb '24)

Mistral · Open Weight · Apache 2.0 · Commercial OK

Description

Mistral Large 3 (675B Instruct 2512) is a state-of-the-art, general-purpose, multimodal, granular Mixture-of-Experts model with 41B active parameters and 675B total parameters, trained from scratch on 3000 H200s. This is the instruct post-trained version in FP8, fine-tuned for instruction following, making it well suited to chat, agentic, and instruction-based use cases. The lossless FP8 quantization reduces resource requirements, so the model can be deployed on a single node of B200s or H200s. Designed for reliability and long-context comprehension, it is engineered for production-grade assistants, retrieval-augmented systems, scientific workloads, and complex enterprise workflows.
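The claim that the FP8 model fits on a single node can be sanity-checked with a back-of-the-envelope memory estimate. This is a rough sketch, assuming FP8 stores 1 byte per parameter and the commonly quoted HBM capacities (141 GB per H200, 192 GB per B200, 8 GPUs per node); it counts weights only, ignoring KV cache and activations.

```python
# Rough memory check for the FP8 weights (assumptions: 1 byte/param,
# 141 GB HBM per H200, 192 GB HBM per B200, 8 GPUs per node).
TOTAL_PARAMS = 675e9
BYTES_PER_PARAM_FP8 = 1

weights_gb = TOTAL_PARAMS * BYTES_PER_PARAM_FP8 / 1e9  # 675 GB of weights

h200_node_gb = 8 * 141  # 1128 GB per 8x H200 node
b200_node_gb = 8 * 192  # 1536 GB per 8x B200 node

print(f"FP8 weights: {weights_gb:.0f} GB")
print(f"Fits on 8x H200 node: {weights_gb < h200_node_gb}")
print(f"Fits on 8x B200 node: {weights_gb < b200_node_gb}")
```

The weights alone leave a few hundred GB of headroom on either node type, which is what makes single-node deployment plausible once KV cache is added on top.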

Release date: 2024-02-26
Parameters: 675.0B
Context length: 128K
Modalities: image, text

Capability radar

general: 21
coding: 18
reasoning: 23
science (est.): 24
agents: 0
multimodal: 75

Science uses a reasoning proxy when dedicated science benchmarks are not available.

Rankings

Domain           Position  Score  Source
Code Ranking     #355      19.0   AA
General Ranking  #410      24.0   AA
Math Reasoning   #288      25.0   AA
Science          #404      23.0   AA

Benchmark scores (LLM Stats)

Biology: GPQA 43.9% (auto)
Code: LiveCodeBench 34.4% (auto)
Factuality: SimpleQA 23.8% (auto)
General: MMMLU 85.5% (auto)
Math: AMC_2022_23 52.0% (auto)

AA evaluation indices

Intelligence Index: 9.9
MATH 500: 0.5
MMLU Pro: 0.5
GPQA: 0.4
SciCode: 0.2
LiveCodeBench: 0.2
HLE: 0.0
AIME: 0.0

LLM Stats category scores

Language: 90
Math: 70
General: 50
Reasoning: 50
Biology: 40
Chemistry: 40
Physics: 40
Code: 30
Factuality: 20

Pricing

Input price: $4 / 1M tokens
Output price: $12 / 1M tokens
Blended price (3:1): $6 / 1M tokens
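The blended figure above is just a weighted average of the input and output prices. A minimal sketch, assuming "3:1" means three input tokens for every output token; the example request sizes are hypothetical:

```python
# Blended price as a weighted average (assumption: 3:1 = 3 input tokens
# per 1 output token).
INPUT_PRICE = 4.0    # $ per 1M input tokens
OUTPUT_PRICE = 12.0  # $ per 1M output tokens

blended = (3 * INPUT_PRICE + 1 * OUTPUT_PRICE) / 4  # $6 per 1M tokens

# Cost of a hypothetical request: 12K input tokens, 1K output tokens.
cost = 12_000 / 1e6 * INPUT_PRICE + 1_000 / 1e6 * OUTPUT_PRICE
print(f"Blended: ${blended:.2f}/1M tokens; example request: ${cost:.4f}")
```

Since output tokens cost three times as much as input tokens here, output-heavy workloads (long generations) will land well above the blended figure.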

Speed

Tokens/sec: 0.0 tokens/s
First-token latency: 0.00s
Time to response: 0.00s

Available providers

(LS internal units)
Provider    Input price  Output price
Mistral AI  500K         1.5M

External sources