Benchmark Scores
Overall: 84/100
- MMLU: 87
- HumanEval: 84.5
- Math: 79
- Reasoning: 86
- Coding: 83.5
About Jamba 2
Jamba 2 is AI21 Labs' hybrid Mamba-Transformer model with a 512K-token context window whose compute cost scales linearly with sequence length. It excels at long-document understanding tasks where traditional transformers struggle with quadratic compute costs.
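The linear-scaling claim can be made concrete with a back-of-the-envelope sketch. The cost functions and constants below are illustrative assumptions, not measurements of Jamba 2: full self-attention is modeled as growing quadratically in sequence length, Mamba-style state-space layers as growing linearly.

```python
# Illustrative scaling comparison (assumed cost models, arbitrary units):
# full attention ~ n^2, Mamba-style layers ~ n.

def attention_cost(n_tokens: int) -> int:
    """Quadratic growth typical of full self-attention."""
    return n_tokens ** 2

def linear_cost(n_tokens: int) -> int:
    """Linear growth claimed for state-space (Mamba) layers."""
    return n_tokens

for n in (4_096, 65_536, 524_288):  # up to the 512K window
    ratio = attention_cost(n) // linear_cost(n)
    print(f"{n:>7} tokens: quadratic/linear cost ratio = {ratio:,}x")
```

At the full 512K window the quadratic term is roughly half a million times larger than the linear one, which is why long-context efficiency is the headline feature here.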
Strengths
- 512K context window with linear compute scaling
- Efficient hybrid Mamba architecture
- Strong performance on long-document tasks
- Fast inference on long contexts
Weaknesses
- Below frontier performance on short-context tasks
- Newer architecture with less mature tooling
- Limited community adoption
Pricing
Per 1M tokens
- Input: $2.00
- Output: $8.00
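The per-1M-token rates above translate directly into a per-request estimate. A minimal sketch, using only the listed prices (the example token counts are hypothetical):

```python
# Cost estimator from the listed rates: $2.00 per 1M input tokens,
# $8.00 per 1M output tokens.
INPUT_RATE = 2.00   # USD per 1M input tokens
OUTPUT_RATE = 8.00  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request."""
    return (input_tokens / 1_000_000) * INPUT_RATE + \
           (output_tokens / 1_000_000) * OUTPUT_RATE

# Example: summarizing a 400K-token document into a 2K-token answer.
print(f"${estimate_cost(400_000, 2_000):.2f}")  # → $0.82
```

Note the asymmetry: output tokens cost 4x input tokens, so long-document summarization (large input, small output) is comparatively cheap.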
Quick Facts
- Context Window: 512K
- Parameters: 398B (MoE-Mamba)
- Release Date: 2025-12-01
- Category: Mid-Tier