Monday, January 5, 2026 · Major Release · High Impact
DeepSeek · Version V3
DeepSeek V3 Released as Open-Source with MoE Architecture
DeepSeek has released V3, an open-source mixture-of-experts (MoE) model with 671B total parameters, of which 37B are active per token. It achieves performance competitive with proprietary models while remaining freely available for commercial use.
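The gap between total and active parameters comes from MoE routing: each token is sent to only a few experts, so most weights sit idle on any given forward pass. A minimal top-k routing sketch in Python (illustrative only; DeepSeek V3's actual router adds load-balancing mechanisms not shown here):

```python
import math
import random

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def top_k_route(router_logits, k):
    """Pick the k experts with the highest router scores.

    Returns (expert indices, normalized gate weights). Only these k
    experts run for the token, which is why active parameters are a
    small fraction of the total.
    """
    ranked = sorted(range(len(router_logits)),
                    key=lambda i: router_logits[i], reverse=True)
    chosen = ranked[:k]
    gates = softmax([router_logits[i] for i in chosen])
    return chosen, gates

# Toy example: 8 experts, route one token to the top 2.
random.seed(0)
logits = [random.gauss(0, 1) for _ in range(8)]
experts, gates = top_k_route(logits, k=2)
print(experts, [round(g, 3) for g in gates])
```

The gate weights sum to 1 and scale each chosen expert's output before they are combined.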
Key Highlights
1. 671B total parameters, 37B active per token (MoE)
2. Competitive with GPT-4o and Claude 3.5 Sonnet
3. Apache 2.0 license for commercial use
4. Runs on consumer hardware with quantization
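The quantization point is easy to quantify with back-of-the-envelope arithmetic: weight storage is roughly parameter count times bits per weight. A rough estimate (ignores activations, KV cache, and quantization-format overhead):

```python
def weight_footprint_gb(n_params_billion, bits_per_weight):
    """Approximate weight storage in GB: params x bits / 8.

    Ignores activations, KV cache, and per-format overhead, so treat
    the result as a lower bound on required memory.
    """
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

# 671B parameters at common precisions.
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{weight_footprint_gb(671, bits):.0f} GB")
```

At 16 bits the weights alone need well over a terabyte, so aggressive quantization (and typically offloading across devices) is what brings a model of this size within reach of non-datacenter setups.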