Tuesday, June 06, 2023

AI, ML: Hugging Face


Hugging Face - Wikipedia

Hugging Face, Inc. is an American company that develops tools for building applications using machine learning.[1] It is most notable for its transformers library built for natural language processing applications and its platform that allows users to share machine learning models and datasets.

Transformers Library

The Transformers library is a Python package that contains open-source implementations of transformer models for text, image, and audio tasks. It is compatible with the PyTorch, TensorFlow, and JAX deep learning libraries and includes implementations of notable models like BERT and GPT.[13] The library was originally called "pytorch-pretrained-bert",[14] then renamed to "pytorch-transformers", and finally to "transformers".
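As a rough sketch of how the library is typically used (the checkpoint name below is just one example of a pretrained model hosted on the Hub, and a backend such as PyTorch must be installed), loading and running a pretrained model can be as short as:

  from transformers import pipeline

  # Download a pretrained sentiment-analysis model from the Hub and run it.
  # "distilbert-base-uncased-finetuned-sst-2-english" is one example checkpoint.
  classifier = pipeline("sentiment-analysis",
                        model="distilbert-base-uncased-finetuned-sst-2-english")
  print(classifier("Hugging Face's Transformers library is easy to use."))
  # -> [{'label': 'POSITIVE', 'score': ...}]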

Hugging Face Hub

The Hugging Face Hub is a platform (centralized web service) for hosting:[15]

  • Git-based code repositories, with features similar to GitHub, including discussions and pull requests for projects;
  • models, also with Git-based version control;
  • datasets, mainly in text, images, and audio;
  • web applications ("spaces" and "widgets"), intended for small-scale demos of machine learning applications.
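A minimal sketch of interacting with the Hub programmatically, using the separate huggingface_hub client library (the search query, repository, and file names below are illustrative):

  from huggingface_hub import HfApi, hf_hub_download

  api = HfApi()
  # Search the Hub for a few model repositories matching a query.
  for model in api.list_models(search="bert", limit=5):
      print(model.id)

  # Download a single file from a model repository into the local cache.
  config_path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")
  print(config_path)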

Other Libraries

In addition to Transformers and the Hugging Face Hub, the Hugging Face ecosystem contains libraries for other tasks, such as dataset processing ("Datasets"), model evaluation ("Evaluate"), simulation ("Simulate"), and machine learning demos ("Gradio").[16]
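A brief sketch of the Datasets and Evaluate libraries working together (the dataset name and metric are illustrative examples of resources hosted on the Hub):

  from datasets import load_dataset
  import evaluate

  # Load a small slice of a public dataset from the Hub.
  dataset = load_dataset("imdb", split="train[:100]")
  print(dataset[0]["text"][:80])

  # Compute a standard metric with the Evaluate library.
  accuracy = evaluate.load("accuracy")
  print(accuracy.compute(predictions=[0, 1, 1], references=[0, 1, 0]))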


🤗 All things transformers with Hugging Face featuring Sasha Rush (Practical AI #98) |> Changelog



