Jamba 1.5 Large
Open Source · API Available
Benchmark Scores
- Overall: 73.9/100
- MMLU: 80
- HumanEval: 75.5
- Math: 62.4
- Reasoning: 76.8
- Coding: 75
About Jamba 1.5 Large
Jamba 1.5 Large is built on AI21's SSM-Transformer hybrid architecture, which interleaves Mamba (state-space) layers with attention layers. Because the Mamba layers scale linearly with sequence length, the model supports a 256K-token context window and processes very long documents more efficiently than attention-only models of comparable size.
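The efficiency claim comes down to asymptotics: a state-space layer is a linear recurrence over the sequence (O(n) time, constant-size state), while self-attention builds an n×n score matrix (O(n²)). A minimal numpy sketch with toy dimensions and made-up matrices (not the actual Jamba layer) illustrates the contrast:

```python
import numpy as np

def ssm_scan(x, A, B, C):
    """Toy state-space scan: h_t = A h_{t-1} + B x_t, y_t = C h_t.
    A single pass over the sequence -> O(n) time with O(1) state."""
    h = np.zeros(A.shape[0])
    ys = []
    for x_t in x:
        h = A @ h + B * x_t   # recurrent state update
        ys.append(C @ h)      # readout
    return np.array(ys)

def attention_scores(x):
    """Toy self-attention scores: every token attends to every token,
    so compute and memory grow as O(n^2) with sequence length."""
    q = k = x[:, None]
    return q @ k.T            # (n, n) matrix

n = 1024
x = np.random.default_rng(0).standard_normal(n)
d = 4                          # toy state dimension
A = 0.9 * np.eye(d)            # stable decay (illustrative values)
B = np.ones(d)
C = np.ones(d) / d

y = ssm_scan(x, A, B, C)       # shape (1024,)  -- linear in n
S = attention_scores(x)        # shape (1024, 1024) -- quadratic in n
```

At 256K tokens the attention matrix alone would have 256K² ≈ 65 billion entries per head, which is why hybrids push most of the sequence mixing into the linear-time SSM layers.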
Strengths
- Massive context window
- SSM-Transformer hybrid
- Efficient long-context processing
- Open weights
Weaknesses
- Below frontier models on benchmarks
- Complex architecture
- Limited ecosystem
Pricing
Per 1M tokens
- Input: $2.00
- Output: $8.00
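Per-request cost is just a linear combination of the two rates above. A small helper (the function name and example token counts are ours, not part of any AI21 SDK) shows the arithmetic:

```python
def jamba_cost_usd(input_tokens: int, output_tokens: int,
                   input_rate: float = 2.00, output_rate: float = 8.00) -> float:
    """Estimate request cost in USD from the listed per-1M-token rates."""
    return (input_tokens / 1_000_000 * input_rate
            + output_tokens / 1_000_000 * output_rate)

# e.g. summarizing a long document: 200K tokens in, 1K tokens out
print(round(jamba_cost_usd(200_000, 1_000), 4))  # 0.408
```

Long-context workloads are input-heavy, so the lower input rate dominates: the 200K-token example costs $0.40 for input but under a cent for output.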
Quick Facts
- Context Window: 256K tokens
- Parameters: 398B total (94B active)
- Release Date: 2024-08-22
- Category: Open Source