Hugging Face, Inc.
Type | Private |
---|---|
Industry | Artificial intelligence, machine learning, software development |
Founded | 2016 |
Headquarters | Manhattan, New York City |
Area served | Worldwide |
Products | Models, datasets, spaces |
Website | huggingface.co |
Hugging Face, Inc. is a French-American company based in New York City that develops tools for building applications using machine learning. It is most notable for its Transformers library, built for natural language processing applications, and for its platform that allows users to share machine learning models and datasets and showcase their work.
History
The company was founded in 2016 in New York City by French entrepreneurs Clément Delangue, Julien Chaumond, and Thomas Wolf, originally to develop a chatbot app targeted at teenagers.[1] It was named after the "hugging face" emoji.[1] After open-sourcing the model behind the chatbot, the company pivoted to focus on being a platform for machine learning.
In March 2021, Hugging Face raised US$40 million in a Series B funding round.[2]
On April 28, 2021, the company launched the BigScience Research Workshop in collaboration with several other research groups to release an open large language model.[3] In 2022, the workshop concluded with the announcement of BLOOM, a multilingual large language model with 176 billion parameters.[4][5]
In December 2021, the company acquired Gradio, an open-source library for building machine learning applications in Python.[6]
On May 5, 2022, the company announced its Series C funding round led by Coatue and Sequoia.[7] The company received a $2 billion valuation.
On August 3, 2022, the company announced the Private Hub, an enterprise version of its public Hugging Face Hub that supports SaaS or on-premises deployment.[8]
In February 2023, the company announced a partnership with Amazon Web Services (AWS) to make Hugging Face's products available to AWS customers as building blocks for their custom applications. The company also said the next generation of BLOOM would run on Trainium, a proprietary machine learning chip created by AWS.[9][10][11]
In August 2023, the company announced that it had raised $235 million in a Series D funding round at a $4.5 billion valuation. The round was led by Salesforce, with participation from Google, Amazon, Nvidia, AMD, Intel, IBM, and Qualcomm.[12]
Services and technologies
Transformers Library
The Transformers library is a Python package that contains open-source implementations of transformer models for text, image, and audio tasks. It is compatible with the PyTorch, TensorFlow, and JAX deep learning libraries and includes implementations of notable models such as BERT and GPT-2.[13] The library was originally named "pytorch-pretrained-bert";[14] it was later renamed "pytorch-transformers" and finally "transformers".
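A minimal usage sketch (not drawn from the official documentation; it assumes the transformers package and a backend such as PyTorch are installed) shows the library's high-level pipeline interface, which downloads a pretrained model from the Hugging Face Hub and runs inference in a few lines:

```python
# Minimal sketch of the Transformers pipeline API.
# Assumes `pip install transformers` plus a backend such as PyTorch;
# the default sentiment-analysis model is fetched from the Hugging Face Hub.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Hugging Face makes it easy to share pretrained models.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```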
Hugging Face Hub
The Hugging Face Hub is a platform (centralized web service) for hosting:[15]
- Git-based code repositories, including discussions and pull requests for projects;
- models, also with Git-based version control;
- datasets, mainly in text, image, and audio formats;
- web applications ("Spaces" and "widgets"), intended for small-scale demos of machine learning applications.
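Files in these repositories can also be fetched programmatically. The following is a minimal sketch, assuming the companion huggingface_hub Python package is installed; the repository and file names are illustrative examples, not part of the article's sources:

```python
# Minimal sketch of programmatic access to the Hugging Face Hub using the
# companion huggingface_hub package (pip install huggingface-hub).
# The repository and filename below are illustrative examples.
from huggingface_hub import hf_hub_download

# Download a single file from a public model repository and cache it locally.
local_path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")
print(local_path)  # path to the cached copy of config.json
```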
Other libraries
In addition to Transformers and the Hugging Face Hub, the Hugging Face ecosystem contains libraries for other tasks, such as dataset processing ("Datasets"), model evaluation ("Evaluate"), simulation ("Simulate"), and machine learning demos ("Gradio").[16]
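As a brief illustration of how these pieces fit together, the sketch below (an assumption-laden example, not official documentation) uses Datasets to load a public dataset from the Hub and Gradio to wrap a Python function in a small web demo; the dataset name is an example:

```python
# Minimal sketch combining two ecosystem libraries: Datasets loads a public
# dataset from the Hub, and Gradio turns a Python function into a web demo.
# Assumes `pip install datasets gradio`; "rotten_tomatoes" is an example dataset.
from datasets import load_dataset
import gradio as gr

dataset = load_dataset("rotten_tomatoes", split="train")

def show_example(index):
    # Return the text of one training example from the dataset.
    return dataset[int(index)]["text"]

# Build a simple UI with a number input and a text output;
# launch() starts a local web server for the demo.
demo = gr.Interface(fn=show_example, inputs="number", outputs="text")
demo.launch()
```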
See also
- OpenAI
- Station F
References
1. "Hugging Face wants to become your artificial BFF". TechCrunch. 9 March 2017. https://techcrunch.com/2017/03/09/hugging-face-wants-to-become-your-artificial-bff/.
2. "Hugging Face raises $40 million for its natural language processing library". TechCrunch. 11 March 2021. https://techcrunch.com/2021/03/11/hugging-face-raises-40-million-for-its-natural-language-processing-library.
3. "Inside BigScience, the quest to build a powerful open language model". VentureBeat. 10 January 2022. https://venturebeat.com/2022/01/10/inside-bigscience-the-quest-to-build-a-powerful-open-language-model/.
4. "BLOOM". BigScience. https://bigscience.huggingface.co/blog/bloom.
5. "Inside a radical new project to democratize AI". MIT Technology Review. https://www.technologyreview.com/2022/07/12/1055817/inside-a-radical-new-project-to-democratize-ai/.
6. Nataraj, Poornima (23 December 2021). "Hugging Face Acquires Gradio, A Customizable UI Components Library For Python". Analytics India Magazine. https://analyticsindiamag.com/hugging-face-acquires-gradio-a-customizable-ui-components-library-for-python/.
7. Cai, Kenrick. "The $2 Billion Emoji: Hugging Face Wants To Be Launchpad For A Machine Learning Revolution". Forbes. https://www.forbes.com/sites/kenrickcai/2022/05/09/the-2-billion-emoji-hugging-face-wants-to-be-launchpad-for-a-machine-learning-revolution/.
8. "Introducing the Private Hub: A New Way to Build With Machine Learning". Hugging Face. https://huggingface.co/blog/introducing-private-hub.
9. Bass, Dina (21 February 2023). "Amazon's Cloud Unit Partners With Startup Hugging Face as AI Deals Heat Up". Bloomberg News. https://www.bloomberg.com/news/articles/2023-02-21/amazon-s-aws-joins-with-ai-startup-hugging-face-as-chatgpt-competition-heats-up.
10. Nellis, Stephen (21 February 2023). "Amazon Web Services pairs with Hugging Face to target AI developers". Reuters. https://www.reuters.com/technology/amazon-web-services-pairs-with-hugging-face-target-ai-developers-2023-02-21/.
11. "AWS and Hugging Face collaborate to make generative AI more accessible and cost efficient". AWS Machine Learning Blog. 21 February 2023. https://aws.amazon.com/blogs/machine-learning/aws-and-hugging-face-collaborate-to-make-generative-ai-more-accessible-and-cost-efficient/.
12. Leswing, Kif (24 August 2023). "Google, Amazon, Nvidia and other tech giants invest in AI startup Hugging Face, sending its valuation to $4.5 billion". CNBC. https://www.cnbc.com/2023/08/24/google-amazon-nvidia-amd-other-tech-giants-invest-in-hugging-face.html.
13. "🤗 Transformers". Hugging Face. https://huggingface.co/docs/transformers/index.
14. "First release". GitHub. 17 November 2018. https://github.com/huggingface/transformers/releases/tag/v0.1.2.
15. "Hugging Face Hub documentation". https://huggingface.co/docs/hub/index.
16. "Hugging Face - Documentation". https://huggingface.co/docs.
Original source: https://en.wikipedia.org/wiki/Hugging_Face