Mixtral 8x22B
Open Source · API Available
Benchmark Scores
- Overall: 74.9/100
- MMLU: 77.8
- HumanEval: 79
- Math: 64.2
- Reasoning: 75.5
- Coding: 78
About Mixtral 8x22B
Mixtral 8x22B is Mistral AI's mixture-of-experts model with 141B total parameters, of which only 39B are active per inference. It offers a strong performance-to-cost ratio and is fully open source under the Apache 2.0 license.
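The 141B-total / 39B-active split follows from sparse expert routing. A minimal back-of-envelope sketch, assuming Mixtral's published top-2-of-8 expert routing (the per-expert and shared figures it derives are rough inferences from the two totals, not official numbers):

```python
# Back-of-envelope MoE parameter math for Mixtral 8x22B.
# Assumption: 8 experts per MoE layer, 2 routed per token (top-2);
# the real per-layer breakdown differs, this only shows why 141B
# total can mean ~39B active per token.

TOTAL_PARAMS_B = 141   # total parameters (billions), from the card
ACTIVE_PARAMS_B = 39   # parameters used per token (billions)
NUM_EXPERTS = 8        # experts per MoE layer
TOP_K = 2              # experts routed per token

# total  = shared + NUM_EXPERTS * expert
# active = shared + TOP_K * expert
# Subtracting gives (NUM_EXPERTS - TOP_K) * expert = total - active.
expert_b = (TOTAL_PARAMS_B - ACTIVE_PARAMS_B) / (NUM_EXPERTS - TOP_K)
shared_b = TOTAL_PARAMS_B - NUM_EXPERTS * expert_b

print(f"~{expert_b:.0f}B per expert, ~{shared_b:.0f}B shared")
print(f"active fraction per token: {ACTIVE_PARAMS_B / TOTAL_PARAMS_B:.0%}")
```

So each token pays compute for only about 28% of the weights, which is the source of the cost advantage noted under Strengths.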
Strengths
- MoE efficiency
- Open weights
- Good throughput
- Strong code and math for its cost
Weaknesses
- Older architecture
- Below current frontier
- Complex hosting
Pricing
Per 1M tokens:
- Input: $0.90
- Output: $0.90
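A quick sketch of what the flat $0.90-per-million rate means per request (the example token counts are hypothetical; real provider pricing and tokenization vary):

```python
# Rough per-request cost at the listed rates.
INPUT_PER_M = 0.90    # USD per 1M input tokens (from the card)
OUTPUT_PER_M = 0.90   # USD per 1M output tokens (from the card)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request at the per-million-token rates."""
    return (input_tokens * INPUT_PER_M
            + output_tokens * OUTPUT_PER_M) / 1_000_000

# e.g. a 2,000-token prompt with an 800-token completion
print(f"${request_cost(2_000, 800):.4f}")
```

Because input and output are priced identically here, cost depends only on total tokens, which simplifies budgeting relative to models with asymmetric rates.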
Quick Facts
- Context Window: 64K
- Parameters: 141B (39B active)
- Release Date: 2024-04-17
- Category: Open Source