Nvidia debuts Nemotron 3 with hybrid MoE and Mamba-Transformer to drive efficient agentic AI
Positive · Technology

- Nvidia has launched Nemotron 3, the latest generation of its AI model family, featuring a hybrid Mamba-Transformer mixture-of-experts architecture designed to improve accuracy and reliability for agentic AI applications. The models come in three sizes: Nano (30B parameters), Super (100B parameters), and Ultra (around 500B parameters), covering AI tasks of varying complexity.
- The release is significant for Nvidia because it reinforces the company's position in a competitive AI landscape, giving enterprises improved scalability and performance for multi-agent autonomous systems, which are increasingly in demand.
- Nemotron 3 arrives in a rapidly evolving AI market, where competitors such as Mistral AI are launching open-source models and companies such as Meta are exploring alternatives to Nvidia's chips. This reflects a broader trend toward diversification in AI infrastructure, as firms seek to balance performance with cost and accessibility.
— via World Pulse Now AI Editorial System
