NVIDIA Shatters MoE AI Performance Records With a Massive 10x Leap on GB200 ‘Blackwell’ NVL72 Servers, Fueled by Co-Design Breakthroughs

03.12.2025 at 19:02
Hard news

Scaling performance on 'Mixture of Experts' AI models is one of the biggest industry constraints, but it appears that NVIDIA has made a breakthrough, credited to co-design performance scaling laws.

NVIDIA's GB200 NVL72 AI Cluster Manages to Bring In 10x Higher Performance on the MoE-Focused Kimi K2 Thinking LLM

The AI world has been racing to scale up foundational LLMs by ramping up tok ...

Author: wccftech.com
Source: https://wccftech.com/nvidia-shatters-moe-ai-performance-records-with-a-massive-10x-leap-on-gb200-nvl72/