Mixtral 8x22B

By Mistral AI · Released 2024-04-17 · 64K context · 141B (39B active) parameters
Open Source · API Available

Benchmark Scores

Overall: 74.9/100

  • MMLU: 77.8
  • HumanEval: 79.0
  • Math: 64.2
  • Reasoning: 75.5
  • Coding: 78.0
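The overall score matches the unweighted mean of the five benchmark scores above; assuming that is how the site computes it, the arithmetic is:

```python
# Overall score as the unweighted mean of the listed benchmarks
# (assumption: the site averages them without weights).
scores = {"MMLU": 77.8, "HumanEval": 79.0, "Math": 64.2,
          "Reasoning": 75.5, "Coding": 78.0}
overall = sum(scores.values()) / len(scores)
print(round(overall, 1))  # 74.9
```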

About Mixtral 8x22B

Mixtral 8x22B is Mistral AI's mixture-of-experts model with 141B total parameters, of which only 39B are active per inference. It offers a strong performance-to-cost ratio and is fully open source under the Apache 2.0 license.
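The "141B total, 39B active" figure comes from sparse routing: each layer holds 8 expert networks, but a router sends every token to only the top 2 of them. A minimal sketch of that top-2 routing, with toy logits and no relation to Mistral's actual implementation:

```python
# Illustrative top-2 mixture-of-experts routing (toy sketch, not
# Mistral's code). Mixtral routes each token to 2 of 8 experts per
# layer, so only a fraction of total parameters run per token.
import math

NUM_EXPERTS = 8  # experts per MoE layer in Mixtral
TOP_K = 2        # experts actually executed per token

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def route(router_logits):
    """Pick the top-k experts for one token and renormalize their weights."""
    probs = softmax(router_logits)
    top = sorted(range(NUM_EXPERTS), key=lambda i: probs[i], reverse=True)[:TOP_K]
    total = sum(probs[i] for i in top)
    return [(i, probs[i] / total) for i in top]

# Example token whose (made-up) router logits favor experts 3 and 5:
logits = [0.1, -0.2, 0.0, 2.0, 0.3, 1.5, -1.0, 0.2]
for expert_id, weight in route(logits):
    print(f"expert {expert_id}: weight {weight:.2f}")
```

Because only 2 of 8 expert blocks execute, the per-token compute scales with the ~39B active parameters rather than the full 141B.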

Strengths

  • MoE efficiency
  • Open weights
  • Good throughput
  • Strong code and math for its cost

Weaknesses

  • Older architecture
  • Below current frontier
  • Complex hosting

Pricing

Per 1M tokens:

  • Input: $0.90
  • Output: $0.90
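At these rates, estimating a job's cost is simple arithmetic; the token counts below are made-up example values:

```python
# Cost estimate at the listed rates (USD per 1M tokens).
INPUT_RATE = 0.90   # $ per 1M input tokens
OUTPUT_RATE = 0.90  # $ per 1M output tokens

def cost_usd(input_tokens, output_tokens):
    return (input_tokens / 1_000_000) * INPUT_RATE \
         + (output_tokens / 1_000_000) * OUTPUT_RATE

# Example: 500K input tokens + 100K output tokens
print(f"${cost_usd(500_000, 100_000):.2f}")  # $0.54
```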

Quick Facts

  • Context Window: 64K
  • Parameters: 141B (39B active)
  • Release Date: 2024-04-17
  • Category: Open Source
