Amazon partners with AI startup Hugging Face to utilize Amazon chips


Amazon’s cloud unit has announced a partnership with AI startup Hugging Face to make it easier to run thousands of AI models on Amazon’s custom computing chips. Hugging Face, valued at $4.5 billion, is a prominent platform where AI researchers and developers access and test open-source AI models. Backed by tech giants such as Amazon, Google, and Nvidia, Hugging Face plays a central role in the AI ecosystem by providing access to models like Meta Platforms’ Llama 3.

Typically, developers modify open-source AI models before using them in their software applications. The collaboration between Amazon and Hugging Face aims to streamline this process by enabling developers to run those models on Amazon Web Services’ (AWS) custom chip, Inferentia2. The partnership is intended to make running AI models more efficient, more accessible, and more cost-effective for developers.
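To give a rough sense of what that workflow could look like in practice, the sketch below loads a Hugging Face model for inference on Inferentia2. It assumes Hugging Face’s optimum-neuron package and its NeuronModelForSequenceClassification wrapper, neither of which is named in the article, and the exact arguments and model ID are illustrative only and may differ by version.

```python
# Hypothetical sketch: running a Hugging Face model on AWS Inferentia2.
# Assumes the optimum-neuron package; argument names and defaults may
# differ across versions, and the model ID is only an example.
from optimum.neuron import NeuronModelForSequenceClassification
from transformers import AutoTokenizer

model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # example model

# export=True compiles the model for the Neuron runtime used by Inferentia2;
# static input shapes (batch size, sequence length) are fixed at compile time.
model = NeuronModelForSequenceClassification.from_pretrained(
    model_id,
    export=True,
    batch_size=1,
    sequence_length=128,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Pad to the compiled sequence length so inputs match the static shapes.
inputs = tokenizer(
    "Running inference on Inferentia2 keeps costs down.",
    return_tensors="pt",
    padding="max_length",
    max_length=128,
    truncation=True,
)
logits = model(**inputs).logits
print(logits.argmax(dim=-1))  # predicted class index
```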

Jeff Boudier, head of product and growth at Hugging Face, emphasized that efficiency is key to making AI models accessible to a wider audience. For AWS, the deal is a way to attract more AI developers to its cloud services for delivering AI products. While Nvidia dominates the market for training AI models, AWS argues that its chips can run those trained models, a step known as inference, at a lower cost over time.

Overall, the collaboration between Amazon and Hugging Face marks a significant step toward making AI technologies more accessible and affordable for developers. The partnership aims to optimize how AI models run on AWS infrastructure, improving the scalability and cost-efficiency of AI applications.

Article Source: https://www.thehindu.com/sci-tech/technology/amazon-ai-startup-hugging-face-pair-to-use-amazon-chips/article68206407.ece