By Anthony Di Pizio
Publication Date: 2026-04-08 23:11:00
Developing artificial intelligence (AI) requires substantial computing power, which is why it typically occurs in large, centralized data centers. This infrastructure includes thousands of specialized chips called graphics processing units (GPUs) specifically designed for data-intensive AI workloads.
High-bandwidth memory (HBM) is also a critical component in the hardware stack because it holds data in a ready state until GPUs can process it. A low memory capacity can cause bottlenecks and force GPUs to pause their workloads while they wait to receive more data, whereas a high capacity keeps data flowing smoothly to unlock maximum processing speeds.
Nvidia (NVDA) makes the world’s best GPUs, while Micron Technology (MU) makes some of the highest-capacity HBM. Both companies have soared in value thanks to the AI boom, but which one has more upside potential from here?
The case for Nvidia
Nvidia has led the market for AI data center chips since it launched the H100 GPU in 2022, which was based on its Hopper architecture. Today, the company’s Blackwell-based GB300 GPU delivers up to 50 times more performance than the H100 in certain configurations, a staggering pace of innovation.
In the second half of this year, Nvidia will start shipping commercial quantities of its new Vera Rubin semiconductor platform, which includes the Rubin GPU, the Vera CPU, and a slate of new networking components. The company says…