AMD’s CTO says AI inference will move out of data centers and increasingly to phones and laptops
The lion’s share of artificial intelligence workloads moving from training to inference is great news for AMD, its CTO said.

AI training workloads — the gargantuan task of building large language models and imbuing them with knowledge and a familiar writing or speaking style — used to make up most of AI computing. Inference is the computing process that happens when AI generates outputs, such as answering questions or creating…

Article Source
https://www.businessinsider.com/ai-workloads-transition-inference-amd-mark-papermaster-edge-devices-2025-4