By Jim O'Dorisio, senior vice president and general manager, HPE Storage
Publication Date: 2026-02-19 17:53:00
Partner Content
Artificial intelligence has a habit of simplifying narratives. Models get bigger. GPUs get faster. Everything else takes a backseat.
Enterprise AI budgets have long reflected that thinking. In early projects, roughly 80 percent of spending went to compute, most of the remainder went to networking, and storage was funded with whatever was left over. It was treated as necessary infrastructure, but rarely as a strategic constraint.
That allocation made sense at the time. Early AI efforts were experimental, narrow in scope, and modeled after hyperscale environments where data was assumed to be local, curated, and disposable. That assumption does not survive contact with enterprise AI.
As organizations move from experimentation to production, they are finding that the limiting factor in AI is not model capability, but data preparation. And that understanding is pulling storage…