Hugging Face Gets Hyperscalers and AI Vendors to Work (and Invest) Together

Sept. 19, 2023
Hugging Face, which defines itself as an AI community building the future, has significant support from cloud hyperscalers and the AI hardware community.

When we think of the leading cloud hyperscalers (Google, AWS, and Microsoft), we most often paint them as giants locked in a fierce battle for market domination, taking no action that does not directly benefit their own efforts.

If only for putting a dent in that image, Hugging Face is worth a look as the place to start your AI exploration and investment.

What is Hugging Face?

Hugging Face provides tools for building machine learning applications and serves as a repository of freely accessible open source AI models and datasets.

Hugging Face makes its money by offering paid features such as direct access to computing resources and customer support for the development of natural language processing (NLP) models and large language models (LLMs). Currently, there are over 300,000 AI models, 100,000 applications, and 50,000 datasets available to its customers and community.

Uniquely, those models include contributions from hyperscalers and others who are in direct competition with each other, such as Google, Microsoft, and AWS.

Still in the start-up stage, Hugging Face had raised over $160 million across five rounds of funding from a variety of venture capital funds and angel investors to get its community-based AI development model (much like GitHub's) off the ground.

That investment more than doubled in its most recent round of funding, completed last month. The latest round raised $235 million, primarily from an interesting group of technology companies, including Amazon, AMD, Google, IBM, Intel, Nvidia, and Salesforce.

Look at the models open-sourced on the Hugging Face site and you will see that all of these companies have supported Hugging Face's efforts, with some of these competitors contributing hundreds of open-source models and datasets to the community.

Why use Hugging Face?

The platform allows users and developers to upload and share their models and projects.
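As a sketch of what sharing looks like in practice, the `huggingface_hub` Python client can create a repository and push a file to it. The repository name, file path, and token below are hypothetical placeholders, and an access token from your account settings is assumed.

```python
# Hypothetical sketch of uploading a model file with the
# huggingface_hub client library. The repo name, file path, and
# token are placeholders, not real resources.
from huggingface_hub import HfApi

def share_model(token: str) -> None:
    api = HfApi(token=token)
    # Create a private model repository under your account.
    api.create_repo(repo_id="your-username/demo-model",
                    private=True, exist_ok=True)
    # Push a local weights file into the new repository.
    api.upload_file(
        path_or_fileobj="model.safetensors",
        path_in_repo="model.safetensors",
        repo_id="your-username/demo-model",
    )

if __name__ == "__main__":
    share_model(token="hf_...")  # substitute a real access token
```

Once uploaded, the repository can be kept private, shared with collaborators, or opened to the whole community.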

Hugging Face maintains a collection of software tools it calls libraries, which users can employ to accelerate their work: evaluating model performance, cleaning up selected datasets, and taking advantage of the open source code provided on the site.
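As one hedged example of what those libraries look like in use, the Transformers library's `pipeline` helper loads a pretrained model in a few lines of Python. The task name here is one of the library's built-in defaults; the first run downloads model weights from the Hugging Face Hub.

```python
# Minimal sketch of the Transformers "pipeline" helper. The
# sentiment-analysis task uses the library's default pretrained
# model, downloaded from the Hugging Face Hub on first run.
from transformers import pipeline

def classify(texts):
    """Return sentiment labels and scores for a list of strings."""
    classifier = pipeline("sentiment-analysis")
    return classifier(texts)

if __name__ == "__main__":
    print(classify(["Hugging Face makes model evaluation easy."]))
```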

Users can avail themselves of the commercial side of Hugging Face by paying for access to compute and storage locally (within the Hugging Face platform), or continue using their existing cloud services by making use of the integration with Azure, Google Cloud, and AWS.

There is also an educational side to Hugging Face via their Classrooms app, which offers free resources, teaching materials and support for teachers and students.

This capability does a good job of highlighting the collaborative aspects of the site. Students and teachers can work together on the models, datasets, and demos that are hosted within the shared classroom space.

Classrooms appears to be an extension of Spaces, the feature that allows users to create and deploy ML demos quickly.

Spaces can be used to demonstrate, showcase, and share projects for anyone from a conference audience to involved stakeholders who wish to collaborate on the project. By default, Spaces are private, so you are not automatically sharing your work with the world.

Given that Hugging Face can be accessed at no cost, it is a pretty straightforward way for your developers to evaluate ML projects and build demos for LLM and other AI-focused projects.

If your business has already embraced the open-source development model, it is unlikely you will find a simpler way to start integrating AI development into your projects.


About the Author

David Chernicoff

David Chernicoff is an experienced technologist and editorial content creator. He focuses on the connections between technology and business, helping each get the most from the other by explaining the needs of business to IT and the needs of IT to business.

