Jamba 1.5 Large

Open Source
By AI21 Labs · Released 2024-08-22 · 256K context · 398B (94B active) parameters

Open Source · API Available

Benchmark Scores

Overall: 73.9/100

  • MMLU: 80
  • HumanEval: 75.5
  • Math: 62.4
  • Reasoning: 76.8
  • Coding: 75
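The overall score appears to be the unweighted mean of the five benchmark scores above (a computed check, not a formula documented on this page):

```python
scores = {"MMLU": 80, "HumanEval": 75.5, "Math": 62.4,
          "Reasoning": 76.8, "Coding": 75}

# Unweighted average, rounded to one decimal place
overall = round(sum(scores.values()) / len(scores), 1)
print(overall)  # 73.9 — matches the listed overall score
```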

About Jamba 1.5 Large

Jamba 1.5 Large uses AI21's SSM-Transformer hybrid architecture, which combines Mamba-based state-space layers with Transformer attention. This design, together with its 256K-token context window, makes processing very long documents faster and more memory-efficient than attention-only models of comparable size.
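As an illustrative toy sketch (not AI21's actual implementation), the hybrid idea is to interleave linear-time SSM-style recurrences with occasional full attention layers, so most of the stack scales linearly with sequence length while attention is applied only sparingly:

```python
import numpy as np

def ssm_layer(x, a=0.9):
    # Linear recurrence h_t = a * h_{t-1} + x_t:
    # O(n) time and O(1) state per step in sequence length n.
    h = np.zeros_like(x)
    state = np.zeros(x.shape[-1])
    for t in range(x.shape[0]):
        state = a * state + x[t]
        h[t] = state
    return h

def attention_layer(x):
    # Full self-attention: O(n^2) score matrix in sequence length.
    scores = x @ x.T / np.sqrt(x.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

def hybrid_block(x, ssm_per_attn=7):
    # Several cheap SSM layers for every expensive attention layer,
    # with residual connections (the exact ratio here is illustrative).
    for _ in range(ssm_per_attn):
        x = x + ssm_layer(x)
    return x + attention_layer(x)

tokens = np.random.default_rng(0).normal(size=(16, 8))
out = hybrid_block(tokens)
print(out.shape)  # (16, 8) — sequence and model dimensions preserved
```

Because only a small fraction of layers pay the quadratic attention cost, memory grows far more slowly with context length than in a pure Transformer.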

Strengths

  • Massive context window
  • SSM-Transformer hybrid architecture
  • Efficient long-context processing
  • Open weights

Weaknesses

  • Below frontier models on benchmarks
  • Complex architecture
  • Limited ecosystem

Pricing

Per 1M tokens

  • Input: $2.00
  • Output: $8.00
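At the listed rates ($2.00 per 1M input tokens, $8.00 per 1M output tokens), the cost of a request can be estimated with a small helper (the function name is illustrative):

```python
def jamba_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Estimate request cost at $2.00/1M input and $8.00/1M output tokens."""
    return input_tokens / 1e6 * 2.00 + output_tokens / 1e6 * 8.00

# e.g. a 100K-token document summarized into 2K output tokens:
print(round(jamba_cost_usd(100_000, 2_000), 4))  # 0.216
```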

Quick Facts

  • Context Window: 256K
  • Parameters: 398B (94B active)
  • Release Date: 2024-08-22
  • Category: Open Source
