Mistral AI Unveils Mistral 3: Flagship Open Multimodal Models Including Massive 675B Large 3
Mistral AI released the Mistral 3 family today. The lineup pairs the efficient Ministral 3 dense models (3B, 8B, and 14B), optimized for edge devices, with the flagship Mistral Large 3, a sparse mixture-of-experts model with 675B total parameters and 41B active parameters per token. All models support multimodal image understanding and more than 40 languages, are fully open under the Apache 2.0 license, and debuted at #2 on LMSYS's open-source leaderboards. They are available now on major platforms with NVIDIA optimizations.
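One way to read the mixture-of-experts figures above is that only a small fraction of Large 3's parameters does work on any given token. A minimal back-of-envelope sketch, assuming "active" means the parameters used in a single forward pass per token:

```python
# Back-of-envelope sparsity of Mistral Large 3, using the announced figures.
# Assumption: "active" = parameters engaged per token in one forward pass.
total_params_b = 675   # total parameters, in billions
active_params_b = 41   # active parameters per token, in billions

active_fraction = active_params_b / total_params_b
print(f"Active per token: {active_fraction:.1%}")  # prints "Active per token: 6.1%"
```

In other words, despite its 675B footprint, each token is processed by roughly 6% of the network, which is what keeps inference cost closer to a ~41B dense model.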
This open frontier-model release intensifies competition: it puts top-tier performance in the hands of developers and enterprises at no cost, directly challenging closed labs.