Ministral 3 14B

Mistral · Open Weight · Apache 2.0 · Commercial OK

Description

A balanced model in the Ministral 3 family, Ministral 3 14B is a powerful, efficient compact language model with vision capabilities. This is the reasoning post-trained version, tuned for math, coding, and STEM-related use cases. The Ministral 3 family is designed for edge deployment and runs on a wide range of hardware: Ministral 3 14B can even be deployed locally, fitting in 24GB of VRAM in BF16 and in less than 12GB of RAM/VRAM when quantized.
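As a rough sanity check on those memory figures, a weights-only footprint is approximately parameter count × bytes per parameter. This is a back-of-envelope sketch only: real deployments add KV-cache and activation overhead, and published fit claims may assume offloading or different accounting.

```python
def weights_gb(param_count: float, bytes_per_param: float) -> float:
    """Rough weights-only memory footprint in GB (decimal), ignoring
    KV-cache, activations, and runtime overhead."""
    return param_count * bytes_per_param / 1e9

# BF16 uses 2 bytes per parameter; a typical 4-bit quantization ~0.5 bytes.
bf16_gb = weights_gb(14.0e9, 2.0)  # 28.0 GB of raw weights
q4_gb = weights_gb(14.0e9, 0.5)    # 7.0 GB of raw weights
```

The 4-bit estimate is consistent with the "less than 12GB when quantized" claim; exact numbers depend on the quantization scheme and runtime.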

Release Date
2025-12-02
Parameters
14.0B
Context Length
262K
Modalities
image, text

Capability Radar

General: 30
Coding: 21
Reasoning: 35
Science: 35 (est.)
Agents: 0
Multimodal: 10

Science uses a reasoning proxy when dedicated science benchmarks are unavailable.

Rankings

Domain | Rank | Score | Source
Code Ranking | #318 | 22.0 | AA
General Ranking | #322 | 34.0 | AA
Math Reasoning | #265 | 30.0 | AA
Science | #306 | 35.0 | AA

Benchmark Scores (LLM Stats)

Biology

GPQA: 71.2% (SR)

Code

LiveCodeBench: 64.6% (SR)

Math

AIME 2024: 89.8% (SR)
AIME 2025: 85.0% (SR)

AA Evaluation Indices

Math Index: 30.0
Intelligence Index: 16.0
Coding Index: 10.9
MMLU Pro: 0.7
GPQA: 0.6
LiveCodeBench: 0.4
IFBench: 0.3
AIME 25: 0.3
Tau2: 0.3
SciCode: 0.2
LCR: 0.2
HLE: 0.0
TerminalBench Hard: 0.0

LLM Stats Category Scores

Math: 90
Reasoning: 80
Biology: 70
Chemistry: 70
General: 70
Physics: 70
Code: 60

Pricing

Input Price: $0.20 / 1M tokens
Output Price: $0.20 / 1M tokens
Blended Price (3:1): $0.20 / 1M tokens
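The blended figure is a usage-weighted average of input and output prices. Assuming the common convention that "3:1" means three input tokens per output token, it can be computed as follows (a sketch; the exact weighting convention is an assumption):

```python
def blended_price(input_per_m: float, output_per_m: float, ratio: float = 3.0) -> float:
    """Usage-weighted $/1M tokens, assuming `ratio` input tokens per output token."""
    return (ratio * input_per_m + output_per_m) / (ratio + 1.0)

blended_price(0.2, 0.2)  # 0.2 — input and output prices are equal here, so the blend matches
```

With asymmetric pricing the blend differs from both rates, e.g. $0.30 in / $0.10 out blends to $0.25.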

Speed

Tokens/sec: 143.4 tokens/s
Time to First Token: 0.33s
Time to Answer: 0.33s
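Given these throughput and latency numbers, end-to-end response time can be approximated as time-to-first-token plus steady-state decode time. This is a simple model that ignores queueing and prompt-length effects; the helper name is illustrative.

```python
def est_response_seconds(n_tokens: int, tps: float = 143.4, ttft: float = 0.33) -> float:
    """Approximate wall-clock time for a response: TTFT plus decoding
    n_tokens at a constant tokens-per-second rate."""
    return ttft + n_tokens / tps

est_response_seconds(1000)  # ~7.3 s for a 1000-token answer
```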

Available Providers

No provider data available

External Sources