
Sarvam 30B (high)

Sarvam · Open Weight · Apache 2.0 · Commercial OK

Description

Sarvam-30B is an open-source 30B-parameter Mixture-of-Experts reasoning model from Sarvam AI, trained from scratch and optimized for Indian languages, coding, and conversational workloads. It uses 128 sparse experts with 2.4B active parameters per token and Grouped Query Attention, and was pre-trained on 16 trillion tokens spanning code, mathematics, multilingual text, and web data.
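The sparse routing described above (128 experts, only a small fraction active per token) can be sketched roughly as follows. The top-k value of 8 and the router details are illustrative assumptions for the sketch, not figures published for Sarvam-30B:

```python
import math
import random

def top_k_route(logits, k):
    """Keep only the k highest-scoring experts; softmax over the survivors."""
    idx = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    exps = [math.exp(logits[i]) for i in idx]
    z = sum(exps)
    return {i: e / z for i, e in zip(idx, exps)}

# One router step over 128 experts: only k of them run for this token,
# which is why the active-parameter count stays far below the 30B total.
random.seed(0)
gates = top_k_route([random.gauss(0.0, 1.0) for _ in range(128)], k=8)
```

With k=8 of 128 experts, only about 6% of the expert parameters run per token; adding the dense shared layers (attention, embeddings) is what brings a 30B model down to roughly 2.4B active parameters.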

Release Date
2026-03-06
Parameters
30.0B
Context Length
Modalities

Capability Radar

general: 11
coding: 10
reasoning: 63
science (est.): 37
agents: 40
multimodal: 0

Science uses a reasoning proxy when dedicated science benchmarks are unavailable.

Rankings

Domain           Rank  Score  Source
Agents & Tools   #90   36.0   LS
Code Ranking     #436  8.0    AA
General Ranking  #431  21.0   AA
Science          #297  36.0   AA

Benchmark Scores (LLM Stats)

Agents

BrowseComp: 35.5% (SR)

Biology

GPQA: 66.5% (SR)

Code

HumanEval: 92.1% (SR)
SWE-Bench Verified: 34.0% (SR)

Creativity

Arena-Hard v2: 49.0% (SR)

Finance

MMLU: 85.1% (SR)
MMLU-Pro: 80.0% (SR)

General

MBPP: 0.93 / 100 (SR)
LiveCodeBench v6: 70.0% (SR)

Math

MATH-500: 97.0% (SR)
AIME 2025: 96.7% (SR)
HMMT25: 74.2% (SR)
HMMT 2025: 73.3% (SR)
Beyond AIME: 58.3% (SR)

AA Evaluation Indices

Intelligence Index
12.3
Coding Index
7.9
GPQA
0.6
Tau2
0.3
IFBench
0.3
SciCode
0.2
HLE
0.1
TerminalBench Hard
0.0
LCR
0.0

LLM Stats Category Scores

Finance
80
Healthcare
80
Language
80
Legal
80
Math
80
Biology
70
Chemistry
70
General
70
Physics
70
Reasoning
70
Code
60
Writing
50
Creativity
50
Agents
40
Search
40
Frontend Development
30

Pricing

Input Price: Free
Output Price: Free
Blended Price (3:1): Free
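A 3:1 blended price is conventionally a token-weighted average of the input and output prices. A minimal sketch under that standard weighting (the function name and signature are mine, not this site's API):

```python
def blended_price(input_per_mtok: float, output_per_mtok: float,
                  ratio: tuple = (3, 1)) -> float:
    """Weighted-average price assuming a ratio[0]:ratio[1] input:output token mix."""
    w_in, w_out = ratio
    return (w_in * input_per_mtok + w_out * output_per_mtok) / (w_in + w_out)

# For Sarvam-30B both prices are 0, so the blend is trivially 0.
blend = blended_price(0.0, 0.0)
```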

Speed

Tokens/sec: 168.7
Time to First Token: 1.23s
Time to Answer: 13.09s
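The three speed figures above are consistent with a simple streaming-latency model: time to answer is roughly time to first token plus output length divided by throughput. A sketch under that assumption (the ~2,000-token output length is inferred from the numbers, not reported on this page):

```python
def time_to_answer(ttft_s: float, output_tokens: int, tokens_per_s: float) -> float:
    """Total latency: time to first token, then steady-state decoding."""
    return ttft_s + output_tokens / tokens_per_s

# 1.23s TTFT + ~2000 tokens at 168.7 tok/s lands very close to the
# reported 13.09s time to answer.
est = time_to_answer(1.23, 2000, 168.7)
```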

Available Providers


No provider data available

External Sources