By TradingView
Publication Date: 2026-03-13 12:32:00
Major cloud companies are increasing efforts to design in-house artificial intelligence chips as alternatives to Nvidia hardware, a shift that could reshape the competitive landscape for AI infrastructure, according to Wedbush Securities.
The analysts said large hyperscalers including Alphabet GOOG, Amazon AMZN, and Meta Platforms META are accelerating development of proprietary application-specific integrated circuits, or ASICs, custom chips designed for particular computing tasks, according to a note summarizing discussions from a recent investor call.
Wedbush said purchasing priorities among cloud providers appear to be moving away from raw computing capacity toward energy efficiency. The firm highlighted a metric known as joules per token, which measures how much power is required to generate a unit of AI output, suggesting companies are placing greater emphasis on throughput and power consumption as generative AI workloads expand.
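The joules-per-token metric the firm highlights is straightforward arithmetic: total energy consumed divided by the number of tokens generated. A minimal sketch, using hypothetical figures that are illustrative only and not drawn from any vendor specification:

```python
# Illustrative sketch: joules per token = energy consumed / tokens generated.
# All numbers below are hypothetical examples, not actual chip specifications.

def joules_per_token(avg_power_watts: float, seconds: float, tokens: int) -> float:
    """Energy per generated token: (watts * seconds) / tokens."""
    return (avg_power_watts * seconds) / tokens

# Example: a hypothetical accelerator drawing an average of 700 W
# while generating 10,000 tokens over 20 seconds.
print(joules_per_token(700, 20, 10_000))  # 1.4 joules per token
```

Lowering this figure, whether by raising throughput at the same power draw or cutting power at the same throughput, is what an efficiency-focused purchasing priority rewards.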
The analysts added that demand patterns for AI chips may change as workloads shift from model training to inference, the stage where trained systems generate responses. That shift could favor specialized processors designed for particular applications over the general-purpose chips supplied by Nvidia Corp. NVDA.
For emerging uses such as robotics and physical AI systems, the firm expects demand to lean toward processors optimized for latency, speed, and efficiency, reflecting evolving requirements across AI computing workloads.