
Mistral Large (Feb '24)

Mistral · Open Weight · Apache 2.0 · Commercial OK

Description

Mistral Large 3 (675B Instruct 2512) is a state-of-the-art, general-purpose, multimodal granular Mixture-of-Experts model with 41B active parameters and 675B total parameters, trained from scratch on 3,000 H200s. This release is the instruct post-trained version in FP8, fine-tuned for instruction tasks, making it well suited to chat, agentic, and instruction-based use cases. The no-loss FP8 version reduces resource requirements and can be deployed on a single node of B200s or H200s. Designed for reliability and long-context comprehension, it is engineered for production-grade assistants, retrieval-augmented systems, scientific workloads, and complex enterprise workflows.
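The single-node deployment claim comes down to simple memory arithmetic: an FP8 copy of 675B parameters is roughly 675 GB of weights. The sketch below is a rough, back-of-the-envelope illustration rather than anything from the model card; the per-GPU memory and node size are assumptions.

```python
# Back-of-the-envelope sketch (assumptions, not vendor specs): why an FP8 copy of
# a 675B-parameter model can fit on a single 8-GPU H200 node.

TOTAL_PARAMS_BILLIONS = 675   # total parameter count, from the description
BYTES_PER_PARAM_FP8 = 1       # FP8 stores one byte per parameter
H200_HBM_GB = 141             # approximate HBM per H200 GPU (assumption)
GPUS_PER_NODE = 8             # typical node size (assumption)

weights_gb = TOTAL_PARAMS_BILLIONS * BYTES_PER_PARAM_FP8   # ~675 GB of weights
node_hbm_gb = H200_HBM_GB * GPUS_PER_NODE                  # ~1128 GB per node

print(f"FP8 weights:      ~{weights_gb} GB")
print(f"8x H200 node HBM: ~{node_hbm_gb} GB")
print(f"Headroom for KV cache / activations: ~{node_hbm_gb - weights_gb} GB")
```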

Release Date: 2024-02-26
Parameters: 675.0B
Context Length: 128K tokens
Modalities: image, text

Capability Radar

general: 21 · coding: 18 · reasoning: 23 · science (est.): 24 · agents: 0 · multimodal: 75

Science uses a reasoning proxy when dedicated science benchmarks are unavailable.

Rankings

Domain           Rank   Score  Source
Code Ranking     #355   19.0   AA
General Ranking  #410   24.0   AA
Math Reasoning   #288   25.0   AA
Science          #404   23.0   AA

Benchmark Scores (LLM Stats)

Category     Benchmark      Score   Source
Biology      GPQA           43.9%   SR
Code         LiveCodeBench  34.4%   SR
Factuality   SimpleQA       23.8%   SR
General      MMMLU          85.5%   SR
Math         AMC_2022_23    52.0%   SR

AA Evaluation Indices

Intelligence Index: 9.9
MATH 500: 0.5
MMLU-Pro: 0.5
GPQA: 0.4
SciCode: 0.2
LiveCodeBench: 0.2
HLE: 0.0
AIME: 0.0

LLM Stats Category Scores

Language: 90
Math: 70
General: 50
Reasoning: 50
Biology: 40
Chemistry: 40
Physics: 40
Code: 30
Factuality: 20

Pricing

Input Price: $4 / 1M tokens
Output Price: $12 / 1M tokens
Blended Price (3:1): $6 / 1M tokens
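The blended figure is a 3:1 weighted average of the input and output prices. A minimal sketch of that calculation, using the prices from the table above:

```python
# Minimal sketch: "blended (3:1)" price as a weighted average assuming
# three input tokens for every output token.

input_price = 4.0    # $ per 1M input tokens (from the table above)
output_price = 12.0  # $ per 1M output tokens (from the table above)

blended = (3 * input_price + 1 * output_price) / 4
print(f"Blended price: ${blended:.2f} / 1M tokens")  # $6.00, matching the table
```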

Speed

Tokens/sec: 0.0
Time to First Token: 0.00 s
Time to Answer: 0.00 s

Available Providers

Prices listed in LS internal units.

Provider     Input Price  Output Price
Mistral AI   500K         1.5M

External Sources