Amazon Web Services (AWS) has announced a partnership with artificial intelligence startup Hugging Face to make it easier for developers to run AI models on Amazon’s custom computing chips. Hugging Face, valued at $4.5 billion, is a popular platform where AI researchers and developers access and test open-source AI models. The partnership will let developers run modified AI models on AWS’s Inferentia2 chip to power their software applications.
Efficiency is a key focus for Hugging Face, which wants to ensure that developers can run models profitably. By partnering with AWS, Hugging Face aims to make its models easier to deploy on the cloud platform. While Nvidia dominates the market for training AI models, AWS is positioning itself as a cost-effective option for running inference on already-trained models.
Matt Wood, who leads AI products at AWS, highlighted the importance of inference in AI applications, emphasizing the efficiency and cost savings of running models on Inferentia2. The partnership between Hugging Face and AWS is expected to attract more AI developers to AWS’s cloud services by offering a competitive alternative for inference workloads.
In conclusion, the collaboration between AWS and Hugging Face aims to streamline the process of running AI models on custom computing chips, giving developers a cost-effective and efficient path to deploying AI applications. The partnership underscores the growing importance of inference in AI development and AWS’s ambition to become a leading platform for AI services.
Article Source
https://finance.yahoo.com/news/amazon-ai-startup-hugging-face-005732908.html