By Timothy Prickett Morgan
Publication Date: 2026-01-16 20:11:00
No, we did not miss the fact that Nvidia did an “acquihire” of AI accelerator and system startup and rival Groq on Christmas Eve. But, because our family was traveling on Christmas Day and The Next Platform was on holiday, we knew we would have to circle back to suss out what Nvidia was shelling out $20 billion to get its hands on.
We did, we are embarrassed to say, entirely miss that Nvidia did a similar and much smaller acquihire of key personnel and a license to key intellectual property from network convergence startup Enfabrica back in the middle of September 2025 for a reported $900 million.
Both deals point to what could end up being a totally new approach to AI inferencing for the GPU accelerator and interconnect maker, so much so that the devices Nvidia ultimately makes a few generations from now might not properly be called GPUs at all.
One could almost make that case with the current crop of datacenter-class GPU accelerators from Nvidia, which look less and less like graphics processing units and more like complex aggregations of vector and tensor engines, caches, and fabric interconnects for doing the relatively low precision mathematics that underpins GenAI and other kinds of machine learning and sometimes HPC.
This deal with Groq is peculiar in a number of ways, the first being why Groq’s investors sold at all. As we pointed out in our analysis of the $10 billion deal between AI model maker OpenAI and AI hardware upstart Cerebras Systems (which…