Zyphra's ZAYA1-8B: Tiny On-Device Model Outperforms Claude Sonnet 4.5 and GPT-5 on Reasoning Benchmarks
Palo Alto startup Zyphra released ZAYA1-8B, an 8B-parameter Mixture-of-Experts model trained on AMD GPUs that reportedly beats much larger frontier models on math, coding, and reasoning benchmarks while running efficiently on edge devices. Released open source under the Apache 2.0 license, it highlights the efficiency gains possible with specialized architectures.