Llama 4 Maverick
Open Source · Multimodal · API Available
Benchmark Scores
Overall: 88.1/100
- MMLU: 89.2
- HumanEval: 91
- Math: 82.5
- Reasoning: 88
- Coding: 90
About Llama 4 Maverick
Llama 4 Maverick is Meta's larger Llama 4 mixture-of-experts (MoE) model, with 128 experts and 400B total parameters (17B active per token). It competes with GPT-4o and Gemini 2.0 Flash on many benchmarks while remaining open source.
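The reason a 400B-parameter MoE model can run with far fewer active parameters is that a router selects only a few experts per token. A minimal sketch of top-k gating in NumPy; the function name `top_k_routing`, the choice of k, and the softmax renormalization are illustrative assumptions, not Llama 4's published routing scheme.

```python
import numpy as np

def top_k_routing(logits, k=1):
    """Pick the top-k experts per token and renormalize their gate weights.

    Generic MoE routing sketch; the real model's gating details
    (k, normalization, load balancing) are not given on this card.
    """
    idx = np.argsort(logits, axis=-1)[..., -k:]            # top-k expert indices
    top = np.take_along_axis(logits, idx, axis=-1)
    w = np.exp(top - top.max(axis=-1, keepdims=True))      # softmax over the
    w /= w.sum(axis=-1, keepdims=True)                     # chosen experts only
    return idx, w

# 4 tokens routed across 128 experts, 1 expert active per token
tokens = np.random.default_rng(0).normal(size=(4, 128))
idx, w = top_k_routing(tokens, k=1)
print(idx.shape)  # (4, 1): one expert index per token
```

Only the selected experts' weights are touched for each token, so compute scales with the active parameter count, not the total.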
Strengths
- Top-tier open-source performance
- 128-expert MoE architecture
- Strong coding and reasoning
- Multimodal
Weaknesses
- Large total parameter count
- Complex deployment
- Newer model
Pricing
Per 1M tokens
- Input: $0.30
- Output: $0.30
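At these rates, cost is linear in token counts: tokens divided by one million, times the per-1M rate. A quick sketch using the listed prices; `request_cost` is a hypothetical helper for illustration, not a provider API.

```python
def request_cost(input_tokens, output_tokens,
                 input_rate=0.30, output_rate=0.30):
    """USD cost of one request at per-1M-token rates (card's listed prices)."""
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# e.g. 50k input + 10k output tokens
print(round(request_cost(50_000, 10_000), 4))  # → 0.018
```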
Quick Facts
- Context Window: 1M tokens
- Parameters: 17B active (400B total)
- Release Date: 2025-04-05
- Category: Open Source