Jamba 2

Mid-Tier
By AI21 Labs · Released 2025-12-01 · 512K context · 398B (MoE-Mamba) parameters
API Available

Benchmark Scores

Overall: 84/100
MMLU: 87
HumanEval: 84.5
Math: 79
Reasoning: 86
Coding: 83.5

About Jamba 2

Jamba 2 is AI21 Labs' hybrid Mamba-Transformer model with a 512K context window whose compute cost scales linearly with sequence length. It excels at long-document understanding tasks, where the quadratic attention of traditional transformers makes compute costs prohibitive.
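The linear-scaling claim can be made concrete with a toy cost model. This is an illustrative sketch only, not AI21's actual kernel costs: standard self-attention mixes every token with every other token (O(n²)), while a Mamba-style state-space scan touches each token once (O(n)).

```python
# Illustrative cost model (hypothetical unit costs, not vendor benchmarks):
# self-attention token mixing grows quadratically with sequence length,
# while a Mamba-style selective state-space scan grows linearly.

def attention_cost(n_tokens: int) -> int:
    """Quadratic token-mixing cost of standard self-attention."""
    return n_tokens * n_tokens

def ssm_cost(n_tokens: int) -> int:
    """Linear cost of a state-space (Mamba-style) scan."""
    return n_tokens

for n in (4_096, 65_536, 524_288):  # up to the 512K context
    ratio = attention_cost(n) / ssm_cost(n)
    print(f"{n:>7} tokens: attention is {ratio:,.0f}x the linear cost")
```

At the full 512K window the gap is roughly half a million fold in this toy model, which is why hybrid architectures target long-context workloads.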

Strengths

  • 512K context with linear scaling
  • Efficient Mamba architecture
  • Strong on long-document tasks
  • Fast inference on long contexts

Weaknesses

  • Below frontier on short-context tasks
  • Newer architecture with less tooling
  • Limited community adoption

Pricing

Per 1M tokens

Input: $2.00
Output: $8.00
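Per-request cost follows directly from the listed rates. A minimal sketch (rates copied from the table above; the helper name is illustrative, not part of any SDK):

```python
INPUT_PER_M = 2.00   # USD per 1M input tokens (from the pricing table)
OUTPUT_PER_M = 8.00  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate a single request's cost in USD at the listed rates."""
    return (input_tokens * INPUT_PER_M + output_tokens * OUTPUT_PER_M) / 1_000_000

# Filling the full 512K context with a 2K-token reply:
print(f"${estimate_cost(512_000, 2_000):.4f}")  # → $1.0400
```

Note that output tokens cost 4x input tokens, so long generations dominate the bill even against a maxed-out context.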

Quick Facts

Context Window: 512K
Parameters: 398B (MoE-Mamba)
Release Date: 2025-12-01
Category: Mid-Tier
