Friday, July 19, 2024

AI: Transformers.js, with WASM

xenova/transformers.js: State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server! @GitHub

Transformers.js uses ONNX Runtime to run models in the browser (and in Node.js).
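From the caller's side the code is the same in both environments; a minimal sketch, assuming the @xenova/transformers npm package and its jsDelivr CDN build (the CDN URL is shown only for illustration):

```js
// Node.js: `npm i @xenova/transformers`, then a regular ES-module import.
import { pipeline } from '@xenova/transformers';

// Browser: the same API, loaded as an ES module straight from a CDN, no server needed:
//   <script type="module">
//     import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers';
//   </script>

// In the browser, ONNX Runtime Web executes the models via WebAssembly by default.
```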

The best part is that you can easily convert your pretrained PyTorch, TensorFlow, or JAX models to ONNX using 🤗 Optimum.

For more information, check out the full documentation.


Transformers.js @Hugging Face

State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!

Transformers.js is designed to be functionally equivalent to Hugging Face’s transformers Python library, meaning you can run the same pretrained models using a very similar API. These models support common tasks in different modalities, such as the following (a short usage sketch appears after the list):

📝 Natural Language Processing: text classification, named entity recognition, question answering, language modeling, summarization, translation, multiple choice, and text generation.

🖼️ Computer Vision: image classification, object detection, and segmentation.

🗣️ Audio: automatic speech recognition and audio classification.

🐙 Multimodal: zero-shot image classification.
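Every task above is exposed through the same pipeline() entry point. A minimal sketch, assuming the @xenova/transformers package, using sentiment analysis (a text-classification task) and letting the library pick its default model; the printed output is illustrative:

```js
import { pipeline } from '@xenova/transformers';

// Build a pipeline for the task; the model is downloaded and cached on first use.
const classifier = await pipeline('sentiment-analysis');

// Call it much like the Python library's pipeline API.
const output = await classifier('I love running transformers in the browser!');
console.log(output); // e.g. [{ label: 'POSITIVE', score: 0.99 }]
```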

Although Transformers.js was originally designed to be used in the browser, it’s also able to run inference on the server. In this tutorial, we will design a simple Node.js API that uses Transformers.js for sentiment analysis.
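The tutorial itself has the full code; purely as a sketch of the shape such an API can take (the route, port, and task here are assumptions, not the tutorial's choices), using only Node's built-in http module:

```js
// Run as an ES module (e.g. save as server.mjs) so top-level await works.
import http from 'node:http';
import { pipeline } from '@xenova/transformers';

// Load the pipeline once at startup and reuse it for every request.
const classifier = await pipeline('sentiment-analysis');

const server = http.createServer(async (req, res) => {
  const url = new URL(req.url, `http://${req.headers.host}`);
  if (url.pathname !== '/classify') {
    res.writeHead(404).end();
    return;
  }
  // Classify the text passed as a query parameter, e.g. /classify?text=hello
  const text = url.searchParams.get('text') ?? '';
  const result = await classifier(text);
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify(result));
});

server.listen(3000, () => console.log('Listening on http://localhost:3000'));
```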
