By Wayne Ma
Publication Date: 2026-03-02 05:49:00
The chip is aimed at speeding up inference computing, the term for running fully trained AI models on servers, the newspaper said. The industry’s demand for AI chips is gradually shifting from training AI models to operating them.