Scaling performance on Mixture of Experts (MoE) AI models is one of the industry's biggest constraints, but NVIDIA appears to have made a breakthrough, credited to co-design performance scaling laws.

NVIDIA's GB200 NVL72 AI Cluster Manages to Bring In 10x Higher Performance on the MoE-Focused Kimi K2 Thinking LLM

The AI world has been racing to scale up foundational LLMs by ramping up tok ...
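For context on why MoE models are hard to scale: only a small subset of "expert" sub-networks runs for each token, so throughput hinges on routing tokens to experts efficiently across GPUs. Below is a minimal, illustrative sketch of top-k expert routing in plain NumPy; the names (moe_forward, gate_w) and the toy setup are assumptions for illustration, not NVIDIA's or Kimi K2's actual implementation.

import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Route each token to its top-k experts and mix their outputs.

    x:       (tokens, d_model) activations
    gate_w:  (d_model, n_experts) router weights -- hypothetical names
    experts: list of callables, each mapping (d_model,) -> (d_model,)
    """
    logits = x @ gate_w                            # (tokens, n_experts) router scores
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # indices of the top-k experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        scores = logits[t, top[t]]
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                   # softmax over only the chosen experts
        for w, e in zip(weights, top[t]):
            out[t] += w * experts[e](x[t])         # only k of n experts run per token
    return out

# Toy usage: 4 tokens, 8 experts, only 2 experts active per token.
rng = np.random.default_rng(0)
d, n = 16, 8
experts = [lambda v, W=rng.standard_normal((d, d)) * 0.1: v @ W for _ in range(n)]
y = moe_forward(rng.standard_normal((4, d)), rng.standard_normal((d, n)), experts)
print(y.shape)  # (4, 16)

The property this illustrates: because only k of n experts fire per token, total parameter count can grow without a proportional rise in per-token compute, which is what makes serving such models a communication and routing problem rather than a raw-FLOPs one.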
Author: wccftech.com
Source: https://wccftech.com/nvidia-shatters-moe-ai-performance-records-with-a-massive-10x-leap-on-gb200-nvl72/