By Katie Tarasov
Publication Date: 2025-11-21 13:00:00
Nvidia outperformed all expectations, reporting soaring profits Wednesday thanks to its graphics processing units that excel at AI workloads. But more categories of AI chips are gaining ground.
Custom ASICs, or application-specific integrated circuits, are now being designed by all the major hyperscalers, from Google's TPU to Amazon's Trainium and OpenAI's plans with Broadcom. These chips are smaller, cheaper and more accessible, and could reduce these companies' reliance on Nvidia GPUs. Daniel Newman of the Futurum Group told CNBC that he sees custom ASICs "growing even faster than the GPU market over the next few years."
Besides GPUs and ASICs, there are also field-programmable gate arrays, which can be reconfigured with software after manufacturing for use in all sorts of applications, like signal processing, networking and AI. There's also an entire group of AI chips that power AI on devices rather than in the cloud. Qualcomm, Apple and others have championed those on-device AI chips.
CNBC talked to experts and insiders at the Big Tech companies to break down the crowded space and the various kinds of AI chips out there.
GPUs for general compute
Once used primarily for gaming, GPUs made Nvidia the world’s most valuable public company after their use shifted toward AI workloads. Nvidia shipped some 6 million current-generation Blackwell GPUs over the past year.
Nvidia senior director of AI infrastructure Dion Harris shows CNBC’s Katie Tarasov how 72 Blackwell GPUs work together as…