Not enough good American open models? Nvidia wants to help

By Tobias Mann
Publication Date: 2025-12-16 19:31:00

For many, enterprise AI adoption hinges on the availability of high-quality open-weights models. Exposing sensitive customer data or hard-won intellectual property to third-party APIs just to use closed models like ChatGPT is a non-starter.

Outside of those from Chinese AI labs, the few open-weights models available today don't compare favorably to proprietary models from the likes of OpenAI or Anthropic.

This isn’t just a problem for enterprise adoption; it’s a roadblock to Nvidia’s agentic AI vision that the GPU giant is keen to clear. On Monday, the company added three new open-weights models of its own design to its arsenal.

Open-weights models are nothing new for Nvidia; despite its reputation as a chipmaker, most of the company's headcount is made up of software engineers. However, its latest generation of Nemotron LLMs is by far its most capable and most open yet.

When they launch, the models will be available in three sizes: Nano, Super, and Ultra, which weigh in at about 30, 100, and 500 billion parameters, respectively.

In addition to the model weights, which will roll out on popular AI repos like Hugging Face over the next few months beginning with Nemotron 3 Nano this week, Nvidia has committed to…