Nvidia sells the lion’s share of the parallel compute engines underpinning AI training, and it holds a very large – and probably dominant – share of AI inference. But will those positions hold?
This is a reasonable question as we watch the rise of homegrown XPUs for AI processing among the hyperscalers and cloud builders. They are all in various stages of creating their own Arm-based CPUs and vector/tensor math engines for AI workloads and, maybe someday, for supporting some traditional HPC simulation…
Article source: https://www.nextplatform.com/2025/03/07/broadcom-and-marvell-ride-the-compute-engine-independence-wave/