
Amazon AWS, Hugging Face team up to spread open-source deep learning

Amazon and startup Hugging Face say the ability to rapidly assemble thousands of neural networks inside Amazon SageMaker, and train them with greater ease, will "democratize" use of deep learning.
Written by Tiernan Ray, Senior Contributing Writer

Two years ago, the New York-based startup Hugging Face burst onto the natural language processing scene with a way to let many more parties participate in state-of-the-art deep learning. Transformers, a programming kit for quickly grabbing any of a number of natural language neural networks, including Google's BERT, was posted as a library to be invoked from the dominant programming frameworks, PyTorch and TensorFlow.
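For readers unfamiliar with the library, the basic pattern looks roughly like this; a minimal sketch in which the checkpoint name and the classification task are illustrative examples, not details from the announcement:

```python
# Minimal sketch of using the Transformers library: download a pretrained
# checkpoint such as BERT and run it on a sentence. The checkpoint name and
# task below are illustrative, not specifics from the announcement.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

inputs = tokenizer("Hugging Face makes NLP models easy to grab.", return_tensors="pt")
outputs = model(**inputs)          # PyTorch tensors in, classification logits out
print(outputs.logits.shape)        # e.g. torch.Size([1, 2])
```

The same few lines work for thousands of other checkpoints on the Hugging Face hub, which is the ease-of-use argument the company makes.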

It has grown and grown in popularity. More than 5,000 organizations are now using the library, the company says. 

The popularity of that toolkit has been recognized by Amazon AWS, the world's biggest cloud computing provider, and the two companies on Tuesday announced a partnership to combine Transformers with the best aspects of Amazon AWS's programming tools.

"As we became the leading open platform for NLP [natural language processing], it's important for us to collaborate really heavily with AWS as the leader in cloud platforms," said Clemente Delangue, co-founder and chief executive of Hugging Face, in an interview with ZDNet Monday.

"Many of our customers have been doing Hugging Face for NLP, and this was really about getting the capabilities into the hands of those customers," said Bratin Saha,  Amazon's vice president of machine learning, in the same interview.

The partnership consists of a few parts. Two new Hugging Face services, AutoNLP and an Accelerated Inference API, have been built on Amazon's SageMaker suite for AI. Hugging Face said it has selected AWS as its "preferred cloud provider."

The partnership offers a container service specialized for deep learning, referred to as the "DLC," or deep learning container, to speed up deployment of Hugging Face's library of over 7,000 natural language processing models.
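As a rough illustration of what deploying a hub model through that container looks like with the SageMaker Python SDK, here is a hedged sketch; the model ID, task, IAM role, library versions, and instance type are assumptions for illustration, not details from the announcement:

```python
# Hedged sketch: deploying a Hugging Face hub model to a SageMaker endpoint
# backed by the Hugging Face deep learning container (DLC). Model ID, task,
# role ARN, versions, and instance type are illustrative assumptions.
from sagemaker.huggingface import HuggingFaceModel

hub_config = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # example hub model
    "HF_TASK": "text-classification",
}

huggingface_model = HuggingFaceModel(
    env=hub_config,
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # hypothetical IAM role
    transformers_version="4.6",
    pytorch_version="1.7",
    py_version="py36",
)

predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
)

print(predictor.predict({"inputs": "This partnership makes deployment much simpler."}))
```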

And all of it is integrated into the SageMaker development environment. The partnership is a nice high-profile win for Amazon, which in December, during its annual re:Invent conference for AWS, said that more and more sophisticated practitioners of machine learning are using SageMaker to speed up development. 

SageMaker, introduced in 2017, can automate a lot of the grunt work that goes into setting up and running such tasks. 

"Compared to software 1.0 open source, the main challenge is ease of use," said Delangue. "Machine learning is a science-driven topic; it is fairly hard to use for people who are not NLP researchers."


The collaboration, said Delangue, "is really about a step-function change in making it easier to use, bridging the different tech stacks, making sure they are well-integrated; that's what we did with the deep learning here."

Also: Amazon AWS says 'Very, very sophisticated practitioners of machine learning' are moving to SageMaker

A range of issues, of course, come along with such democratization, including bias in AI models and data, and the question of whether training data comes with copyright issues. Both Delangue and Saha said their respective platforms have various measures to address issues such as bias.

"If you look at the models, we have a section about deep learning, the potential biases of your model," said Delangue, "which data sets the model has been trained on, and we've integrated our platform with a lot of explainability and ethical analyses for you to be able to really look at what these models are and what kind of biases they can present."

SageMaker Clarify, one of numerous tools introduced in December, allows for bias analysis and provides explainability, said Saha. "This is why we have the most comprehensive set of capabilities for building and training models." Saha noted that AWS requires all customers to adhere to an "acceptable use" policy, the guidelines of which are published on the AWS website.

From a performance standpoint, the integration of Hugging Face and SageMaker means SageMaker can bring users of Transformers deep learning models the same kinds of benefits Amazon showed off in December, such as a dramatic speed-up in the time it takes neural networks to converge during training.

"We've brought purpose-built entry points, an estimator, that indicates the places where you can invoke these models," explained Saha. That estimator means that model parallelism that SageMaker has applied to large transformers such as the "T5" can work across the various Hugging Face models. 

"The same optimizations that made possible the fastest time on Mask R-CNN are going to be available across Hugging Face," said Saha, referring to the popular Facebook neural net for image segmentation.

Both Delangue and Saha described the partnership as an effort to "democratize" use of deep learning technology. 

"One hundred percent of companies are going to use natural language processing," said Delangue, "But one hundred percent of companies do not have NLP researchers; if you are not a researcher but more like a practitioner, you need easier-to-use tools. That's what this container is."

Also: What is GPT-3? Everything your business needs to know about OpenAI's breakthrough AI language program

Asked about the commercial field of packaged AI software companies, such as Moveworks, which build tools based on corporate data, Delangue said such software can benefit from the Hugging Face library.

"These companies to me are doing a really good job, they are leveraging this, thanks to specialization."

Delangue indicated Hugging Face is already expanding to a broader platform. The company began its journey in so-called conversational AI. During a bake-off of chat bots in 2018, called ConvAI2, Hugging Face's use of Transformer-based technology swept the boards in several tests of chat bot quality, establishing the company's chops in that domain.

"We started in conversational AI, but we have evolved into a larger set of use cases," said Delangue, "meaning that we realized that what we were building for conversational AI was actually useful for a really broad range of NLP tasks, from text classification and information extraction,  text generation, to now some sort of platform for machine learning specialized in NLP. 

For its part, AWS continues to support numerous partners, said Saha. "We want to make sure that everyone understands that AWS and SageMaker provides the best platform for machine learning, and we are going to collaborate with a lot of people."

Also: Why chatbots still leave us cold

Hugging Face's Transformers library is open source, licensed under the Apache License, version 2.0. As a result, some deep learning work that is closed source is not available to the platform. That includes OpenAI's GPT-3, a language model that has been grabbing headlines over the past twelve months for seeming to generate surprisingly realistic natural language compositions. 

Asked about the inability to incorporate such closed source projects, Delangue said the bulk of breakthrough research remains open source in the deep learning community.

"It's an epi-phenomenon rather than the norm," said Delangue of OpenAI's decision not to release its model publicly. "What always works in the science field is publishing openly, that's how science makes progress; it's rarely one company just pushing the state of the art." The older versions of GPT are available in Transformers, suggesting that over time, some closed source can find its way into the library.

Hugging Face, which also has offices in Paris, has received venture capital financing of roughly $64 million in six rounds since 2016, according to data compiled by FactSet. That includes a $40 million Series B investment round announced on March 11th. That round included investors Lux Capital, A Capital Management LLC, and Betaworks Investments LLC.

Asked whether the partnership includes any financial arrangements, including any equity investment by Amazon in Hugging Face, or guarantees for revenue or usage, Delangue and Saha declined to comment. 
