Google's Hugging Face Deal Harnesses 'Supercomputer' Power for Open-Source AI

Google Cloud is collaborating with Hugging Face to offer developers cost-effective access to TPUs and GPUs for AI model development. Hugging Face, valued at $4.5 billion, welcomes the expansion as a boost to the open-source AI ecosystem.

Google Cloud and Hugging Face Join Forces: Developers Get Access to TPUs and GPUs Without a Google Cloud Subscription

In a new collaboration, Google Cloud is partnering with Hugging Face to let developers harness tensor processing units (TPUs) and GPU supercomputers without a Google Cloud subscription. The move gives outside developers on Hugging Face's platform cost-effective access to Google's advanced computing resources.

Hugging Face, a prominent AI model repository valued at $4.5 billion, hosts over 350,000 models, including open-source foundation models such as Meta's Llama 2 and Stability AI's Stable Diffusion. The platform serves as a go-to resource for AI developers, much as GitHub does for software developers, offering a diverse range of models for training and deployment.

Google says Hugging Face users will be able to integrate with its AI app-building platform Vertex AI and with Google Kubernetes Engine to train and fine-tune models. The integration is expected to roll out in the first half of 2024, giving developers enhanced tools and resources for their AI projects.

The partnership underscores Google Cloud's commitment to supporting the development of the open-source AI ecosystem. While some of Google's models are already available on Hugging Face, the collaboration extends access to Google's TPUs and GPU supercomputers, fostering innovation and collaboration within the AI community.