Arcee AI Launches Trinity-Large-Thinking: A Rare 399B Open-Source, US-Made Reasoning Model for Enterprises
San Francisco-based Arcee AI has released Trinity-Large-Thinking, a 399-billion-parameter Mixture-of-Experts model licensed under Apache 2.0 and optimized for multi-step reasoning, long-context agents, and tool use. The company reports that it rivals frontier models such as Claude Opus on benchmarks including PinchBench (91.9%) and AIME25 (96.3%), while remaining fully customizable and efficient, activating only 13B parameters per token. Available now on Hugging Face and OpenRouter, it addresses growing demand for sovereign US alternatives amid geopolitical shifts.
The release gives enterprises a powerful, inspectable open-weight model from a US lab, filling a void as Chinese models go proprietary and major labs restrict access.