Updated 27 mins ago · 18 visible stories · 1 fresh this run · 9 active categories

Top Story · Signal 9/10
Models · blogs.nvidia.com · 16 hours ago · 100% confidence

NVIDIA Launches Nemotron 3 Super: Open 120B MoE Model Optimized for Agentic AI Systems

NVIDIA released Nemotron 3 Super, a fully open 120B-parameter (12B active) hybrid Mamba-Transformer MoE model with a 1M-token context, designed for efficient multi-agent applications and high-throughput reasoning. It features latent MoE, multi-token prediction, and NVFP4 training, and outperforms larger models in speed and efficiency; NVIDIA is also releasing the weights, data, and training recipes. Available now on Hugging Face and inference platforms.
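The throughput claim for multi-token prediction comes down to simple accounting: if each forward pass emits a block of k tokens instead of one, the pass count drops roughly by k. A minimal sketch of that arithmetic (the function name and numbers are illustrative, not NVIDIA's implementation):

```python
# Toy model of multi-token prediction (MTP): count forward passes needed
# to emit n_tokens when each pass predicts tokens_per_pass tokens.
# Illustrative only; real MTP also verifies/accepts proposed tokens.

def generate(n_tokens: int, tokens_per_pass: int) -> int:
    """Return the number of forward passes needed to emit n_tokens."""
    passes = 0
    emitted = 0
    while emitted < n_tokens:
        emitted += tokens_per_pass  # each pass proposes a block of tokens
        passes += 1
    return passes

baseline = generate(300, 1)  # standard next-token decoding: 300 passes
mtp = generate(300, 3)       # 3 tokens per pass: 100 passes, ~3x fewer
```

This is why "3x faster generation" tracks the prediction width: latency is dominated by sequential forward passes, and MTP amortizes each pass over several tokens.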

Why it matters

Delivers state-of-the-art efficiency for open agentic AI, accelerating multi-agent development on NVIDIA hardware and broadening access to advanced capabilities.

Monitor

Intelligence Stream

Quick Bites

Perplexity Personal Computer: secure 24/7 AI on your Mac Mini · Big Tech
Cursor nears $50B valuation on $2B ARR, 125x growth in 19 months · Startups
Yann LeCun's AMI: $1.03B seed, $3.5B valuation, JEPA world models for robotics · Startups
Nemotron 3 Super: latent MoE activates 4x experts at 1x cost; multi-token prediction for 3x faster generation · Models
Nemotron 3 Super: 85.6% on agentic PinchBench, 5x throughput on Blackwell · Models
Oracle RPO surges 325% YoY to $553B on AI deals · Big Tech
Nemotron 3 Super: 120B MoE hits top agentic benchmarks, 5x throughput on NVIDIA NIM · Models
Nemotron 3 Super on Workers AI: blazing-fast inference ⚡️ · Models
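
The "4x experts at 1x cost" quick bite reflects the core sparse-MoE trade: the router scores every expert but only the top-k actually run per token, so compute tracks k while capacity tracks the expert count. NVIDIA has not published the details of its "latent MoE" variant, so this is a generic top-k routing sketch, not the Nemotron recipe:

```python
# Hedged sketch of sparse top-k MoE routing (generic, not NVIDIA's
# latent-MoE design): score all experts, run only the top-k, and
# renormalize their gate weights so they sum to 1.
import math

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def route(scores, k):
    """Pick the k highest-scoring experts and renormalize their weights."""
    probs = softmax(scores)
    top = sorted(range(len(scores)), key=lambda i: probs[i], reverse=True)[:k]
    z = sum(probs[i] for i in top)
    return {i: probs[i] / z for i in top}

# 8 experts in the layer, but only 2 execute for this token, so the
# per-token FLOP cost scales with k=2, not with 8.
weights = route([0.1, 2.0, -1.0, 0.5, 1.5, 0.0, -0.5, 0.3], k=2)
```

The point of the headline claim is that a latent factorization lets the router draw on more effective experts per token without growing k's compute budget.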

Market Pulse

AI Pulse: 55/100 (neutral)

AI stocks steady premarket; focus on custom chips and agentic models amid infrastructure hype

META +0.5% (Meta Platforms)
NVDA +0.8% (NVIDIA)

Recurring Movers

NVDA 8 hits · +1.2%
ORCL 3 hits · +8.0%
META 2 hits · -1.5%
LWLG 2 hits · +12.0%