Transformers.js

Run 🤗 Transformers in your browser!

Demo

Play around with some of these models:

[Interactive demo: a question-answering widget with adjustable generation parameters (max length, number of beams, number of samples, temperature, top-k) and fields for context, question, and answer.]

Notes:
  • Clicking Generate for the first time will download the corresponding model from the Hugging Face Hub. All subsequent requests will use the cached model (a short sketch of how this caching is controlled in the library follows these notes).
  • For more information about the different parameters, check out Hugging Face's guide to text generation.
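
The same download-once-then-cache behaviour applies when you use the library in your own code. As a minimal sketch (the property names are the library's env settings, but defaults may differ between versions), the exported env object controls where models are fetched from and whether the browser cache is used:

import { env, pipeline } from '@xenova/transformers';

// Cache downloaded model files in the browser's Cache API, so repeated
// calls reuse the local copy instead of re-downloading from the Hub.
env.useBrowserCache = true;

// Always fetch models from the Hugging Face Hub rather than a local path.
env.allowLocalModels = false;

// The first call downloads the model; later calls hit the cache.
let pipe = await pipeline('sentiment-analysis');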

Quick tour

Installation
To install via NPM, run:
npm i @xenova/transformers
Alternatively, you can use it in vanilla JS, without any bundler, by using a CDN or static hosting. For example, using ES Modules, you can import the library with:
<script type="module">
    import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers';
</script>
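
To avoid unexpected changes when a new version of the library is published, you can pin a specific version in the CDN URL (the version number below is only an example):

<script type="module">
    // Pinning the version keeps the imported library stable across releases.
    import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.6.0';
</script>
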
Basic example
It's super easy to translate from existing code!
Python (original):

from transformers import pipeline

# Allocate a pipeline for sentiment-analysis
pipe = pipeline('sentiment-analysis')

out = pipe('I love transformers!')
# [{'label': 'POSITIVE', 'score': 0.999806941}]

JavaScript (ours):

import { pipeline } from '@xenova/transformers';

// Allocate a pipeline for sentiment-analysis
let pipe = await pipeline('sentiment-analysis');

let out = await pipe('I love transformers!');
// [{'label': 'POSITIVE', 'score': 0.999817686}]

As with the Python library, you can use a different model by providing its name as the second argument to the pipeline function. For example:
// Use a different model for sentiment-analysis
let pipe = await pipeline('sentiment-analysis', 'nlptown/bert-base-multilingual-uncased-sentiment');
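
For instance, calling the multilingual model above on non-English text (this model rates sentiment on a 1-5 star scale; the output shown is illustrative, not an exact result):

let out = await pipe('¡Me encantan los transformers!');
// e.g. [{'label': '5 stars', 'score': ...}] (illustrative output)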

For the full list of available tasks and architectures, check out the documentation.
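
Beyond sentiment analysis, the same pipeline API covers other tasks. As one more hedged sketch, here is text generation using a few of the generation parameters mentioned in the demo notes above (max_new_tokens, temperature, top_k, and do_sample are generation options; the model name is given only as an example):

import { pipeline } from '@xenova/transformers';

// Allocate a text-generation pipeline (model name is an example)
let generator = await pipeline('text-generation', 'Xenova/distilgpt2');

// Generation options roughly correspond to the demo parameters above
let out = await generator('I enjoy walking with my cute dog,', {
    max_new_tokens: 50,
    temperature: 0.7,
    top_k: 50,
    do_sample: true,
});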