What the Hugging Face and Microsoft Collaboration Means for Applied AI

This article is part of our series exploring the artificial intelligence business.

Last week, Hugging Face announced a new product in collaboration with Microsoft, called Hugging Face Endpoints on Azure, which allows users to configure and run thousands of machine learning models on Microsoft’s cloud platform.

Having started as a chatbot app, Hugging Face rose to prominence as a hub for transformer models, a type of deep learning architecture that has been behind many recent advances in artificial intelligence, including large language models like OpenAI’s GPT-3 and DeepMind’s protein-folding model AlphaFold.

Big tech companies like Google, Facebook, and Microsoft have been using transformer models for several years. But the past two years have seen growing interest in transformers among smaller companies, many of which lack in-house machine learning talent.

This is a great opportunity for companies like Hugging Face, whose vision is to become the GitHub of machine learning. The company recently raised $100 million in a Series C round at a $2 billion valuation. It wants to provide a wide range of machine learning services, including ready-to-use transformer models.

However, building a business around transformers presents challenges that favor big tech companies and put companies like Hugging Face at a disadvantage. Hugging Face’s collaboration with Microsoft may be the start of market consolidation and a possible acquisition in the future.

Transformer models can perform many tasks, including text classification, summarization, and generation; question answering; translation; writing software source code; and speech-to-text conversion. More recently, transformers have also branched out into other fields, such as drug research and computer vision.

One of the main advantages of transformer models is their ability to scale. Recent years have shown that the performance of transformers grows as they are scaled up and trained on larger datasets. However, training and running large transformers is very difficult and expensive. A recent Facebook post shows some of the behind-the-scenes challenges of training very large language models. And while not all transformers are as big as OpenAI’s GPT-3 and Facebook’s OPT-175B, they are still tricky to get right.

Hugging Face provides a large repository of pre-trained ML models to ease the burden of deploying transformers. Developers can load transformers directly from the Hugging Face library and run them on their own servers.
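
To make that concrete, here is a minimal sketch of loading a pre-trained model from the Hugging Face Hub with the transformers library and running it locally; the sentiment-analysis task and the model name are illustrative choices, not anything specific to the Azure product:

# Minimal sketch: load a pre-trained transformer from the Hugging Face Hub
# and run it on your own machine. The model name below is only an example.
from transformers import pipeline

# The first call downloads the weights from the Hub and caches them locally;
# after that, inference runs entirely on your own hardware.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Hugging Face makes it easy to experiment with transformers."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]

Running a model this way leaves the developer in charge of the hardware, scaling, and maintenance, which is exactly the burden discussed below.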

Pre-trained models are ideal for experimenting with transformers and fine-tuning them for downstream applications. However, when it comes to using ML models in real products, developers must consider many other parameters, including integration, infrastructure, scaling, and retraining costs. If not configured properly, transformers can be expensive to run, which can significantly affect the product’s business model.
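
Fine-tuning itself is well supported by the library. A rough sketch with the transformers Trainer API might look like the following, where the dataset, model name, and hyperparameters are placeholder choices rather than a recommendation:

# Minimal fine-tuning sketch using the Hugging Face Trainer API.
# Dataset, model, and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("imdb")  # a small public sentiment dataset
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Convert raw text into token IDs the model understands.
    return tokenizer(batch["text"], truncation=True, padding="max_length")

tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned-model", num_train_epochs=1),
    # A small subset keeps the example cheap to run.
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
)
trainer.train()

Even this toy example hints at the hidden costs: preparing data, provisioning GPUs, and re-running training as requirements change are precisely the integration, scaling, and retraining costs mentioned above.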

Therefore, while transformers are very useful, many organizations that stand to benefit from them lack the talent and resources to train them or run them cost-effectively.

Hugging Face Endpoints on Azure

An alternative to running your own transformers is to use ML models hosted on cloud servers. In recent years, several companies have launched services that let developers use machine learning models through API calls, without needing to know how to train, configure, and deploy ML models.

Two years ago, Hugging Face launched its own ML service, called the Inference API, which provides access to thousands of pre-trained models (mostly transformers), as opposed to the limited options of other services. Customers can use the Inference API on shared infrastructure or have Hugging Face set up and maintain dedicated infrastructure for them. Hosted models make ML accessible to a wide range of organizations, just as cloud hosting services brought blogs and websites to organizations that couldn’t set up their own web servers.
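
In practice, consuming such a hosted model comes down to a single HTTP request. Here is a minimal sketch against the Inference API, assuming you have a Hugging Face access token; the model name is again just an example:

# Minimal sketch: call a hosted model through the Hugging Face Inference API.
# Requires a Hugging Face access token; the model name is an example.
import requests

API_URL = (
    "https://api-inference.huggingface.co/models/"
    "distilbert-base-uncased-finetuned-sst-2-english"
)
headers = {"Authorization": "Bearer hf_xxx"}  # replace with your own token

response = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "Hosted inference means no servers to manage."},
)
print(response.json())

The caller never touches model weights or GPUs; scaling and hardware are the provider’s problem, which is the whole appeal for teams without ML infrastructure expertise.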

So why did Hugging Face turn to Microsoft? Turning hosted ML into a profitable business is very complicated (see, for example, OpenAI’s GPT-3 API). Companies like Google, Facebook, and Microsoft have invested billions of dollars in creating specialized processors and servers that reduce the costs of running transformers and other machine learning models.

Hugging Face Endpoints takes advantage of key Azure features, including flexible scaling options, global availability, and security standards. The interface is easy to use, and it takes only a few clicks to set up a model for consumption and configure it to handle different volumes of demand. Microsoft has already built massive infrastructure for running transformers, which will likely reduce the cost of serving Hugging Face’s ML models. (Currently in beta, Hugging Face Endpoints is free, and users pay only Azure infrastructure costs. The company plans a usage-based pricing model when the product becomes generally available.)

More importantly, Microsoft has access to a large share of the market targeted by Hugging Face.

According to the Hugging Face blog, “As 95% of Fortune 500 companies trust Azure for their business, it made perfect sense for Hugging Face and Microsoft to tackle this issue together.”

Many businesses find it frustrating to sign up and pay for various cloud services. Integrating Hugging Face’s hosted ML product with Microsoft Azure reduces the barriers to delivering its product’s value and expands the company’s market reach.

Hugging Face Endpoints could be the start of many more product integrations in the future, as Microsoft’s suite of tools (Outlook, Word, Excel, Teams, etc.) has billions of users and offers many use cases for transformer models. Company executives have already hinted at plans to expand their partnership with Microsoft.

“This is the start of the collaboration between Hugging Face and Azure that we are announcing today, as we work together to make our solutions, machine learning platform, and models accessible and easy to use on Azure. Hugging Face Endpoints on Azure is our first solution available on Azure Marketplace, but we are working hard to bring more Hugging Face solutions to Azure,” Jeff Boudier, product director at Hugging Face, told TechCrunch. “We recognized [the] barriers to deploying machine learning solutions in production [emphasis mine] and began working with Microsoft to meet growing interest in a simple, out-of-the-box solution.”

This can be hugely beneficial to Hugging Face, which needs to find a business model to justify its $2 billion valuation.

But Hugging Face’s collaboration with Microsoft won’t come without trade-offs.

Earlier this month, in an interview with Forbes, Clément Delangue, co-founder and CEO of Hugging Face, said he had turned down several “significant acquisition offers” and would not sell his company the way GitHub was sold to Microsoft.

However, the direction his company is taking right now will make its business model increasingly dependent on Azure (again, OpenAI provides a good example of where things are going) and will possibly shrink the market for its independent Inference API product.

Without Microsoft’s market reach, Hugging Face’s products would face greater barriers to adoption, a weaker value proposition, and higher costs (the barriers mentioned above). And Microsoft could always launch a competing product that is better, faster, and cheaper.

If a Microsoft acquisition proposal comes along, Hugging Face will have to make a tough choice. It is also a reminder of where the market for large language models and applied machine learning is heading.

In comments posted to the Hugging Face blog, Delangue said, “Hugging Face’s mission is to democratize good machine learning. We strive to help every developer and organization build high-quality, ML-powered applications that positively impact society and business.”

Indeed, products like Hugging Face Endpoints will democratize machine learning for developers.

But transformers and large language models are also inherently undemocratic, giving outsized power to the few companies that have the resources to build and run them. As more people build products on Azure-powered transformers, Microsoft will continue to secure and expand its share of what looks to be the future of applied machine learning. Companies like Hugging Face will have to live with the consequences.

This article was originally published by Ben Dickson on TechTalks, a publication that examines trends in technology, how they affect the way we live and do business, and the problems they solve. But we also discuss the evil side of technology, the darker implications of new technologies, and what we need to watch out for. You can read the original article here.
