huggingface.co (urlscan.io public scan)
2600:1f18:147f:e850:698:5a11:1789:e209

Submitted URL: http://huggingface.co/
Effective URL: https://huggingface.co/
Submission: on December 22, via manual submission from IN; scanned from DE



Hugging Face

 * Models
 * Datasets
 * Spaces
 * Docs
 * Solutions
 * Pricing

 * Log In
 * Sign Up


THE AI COMMUNITY BUILDING THE FUTURE.

Build, train and deploy state-of-the-art models powered by the reference open-source libraries in machine learning.

Star: 76,657

More than 5,000 organizations are using Hugging Face

ALLEN INSTITUTE FOR AI

non-profit • 148 models

META AI

company • 435 models

GRAPHCORE

company • 33 models

GOOGLE AI

company • 553 models

INTEL

company • 67 models

SPEECHBRAIN

non-profit • 59 models

MICROSOFT

company • 218 models

GRAMMARLY

company

Hub


HOME OF MACHINE LEARNING



Create, discover and collaborate on ML better.
Join the community to start your ML journey.


Sign Up


Tasks


PROBLEM SOLVERS



Thousands of creators work as a community to solve Audio, Vision, and Language
with AI.


Explore tasks
Audio Classification

229 models

Image Classification

1,536 models

Object Detection

103 models

Question Answering

2,859 models

Summarization

689 models

Text Classification

13,478 models

Translation

1,806 models

Open Source


TRANSFORMERS



Transformers is our natural language processing library and our hub is now open
to all ML models, with support from libraries like Flair, Asteroid, ESPnet,
Pyannote, and more to come.


Read documentation
huggingface@transformers:~

from transformers import AutoTokenizer, AutoModelForMaskedLM

# Download the pretrained BERT tokenizer and masked-language-model weights
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
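Once loaded, a masked-LM head produces one raw logit per vocabulary token at the [MASK] position; the widget shown further down ranks these by softmax probability. A minimal sketch of that ranking step, using illustrative logits and a toy vocabulary rather than actual bert-base-uncased output:

```python
import math

def top_k_predictions(logits, vocab, k=5):
    """Softmax over raw logits, then return the k most likely tokens with scores."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(zip(vocab, probs), key=lambda p: p[1], reverse=True)
    return [(tok, round(p, 3)) for tok, p in ranked[:k]]

# Illustrative values only, not real model output
vocab = ["happiness", "survival", "salvation", "freedom", "unity", "money"]
logits = [3.2, 3.05, 2.45, 2.45, 2.33, 1.0]
print(top_k_predictions(logits, vocab, k=5))
```

With a real model, the logits would come from `model(**tokenizer(text, return_tensors="pt")).logits` at the mask position, and `vocab` from the tokenizer.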

On demand


INFERENCE API



Serve your models directly from Hugging Face infrastructure and run large scale
NLP models in milliseconds with just a few lines of code.


Learn more
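The hosted Inference API is an HTTP endpoint: you POST a JSON body with your input text to the model's URL, authenticated with a user API token. A minimal sketch of assembling such a request (the token value `hf_xxx` is a placeholder, and the request is only built here, not sent):

```python
import json

API_ROOT = "https://api-inference.huggingface.co/models"

def build_inference_request(model_id, text, token):
    """Assemble the URL, headers, and JSON body for a hosted-inference call.

    Sending it is one more line with a client of your choice, e.g.
    requests.post(url, headers=headers, data=body).
    """
    url = f"{API_ROOT}/{model_id}"
    headers = {"Authorization": f"Bearer {token}"}
    body = json.dumps({"inputs": text})
    return url, headers, body

url, headers, body = build_inference_request(
    "distilbert-base-uncased", "The goal of life is [MASK].", "hf_xxx")
print(url)
```

For a fill-mask model, the response is a JSON list of candidate tokens with scores, matching the widget output shown below.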
distilbert-base-uncased
Fill-Mask
Examples
Mask token: [MASK]
Input: The goal of life is [MASK].
Computation time on Intel Xeon 3rd Gen Scalable CPU: cached

happiness
0.036

survival
0.031

salvation
0.017

freedom
0.017

unity
0.015
dbmdz/bert-large-cased-finetuned-conll03-english
Token Classification
Examples
Input: My name is Clara and I live in Berkeley, California. I work at this cool company called Hugging Face.
Computation time on Intel Xeon 3rd Gen Scalable CPU: cached
Output: My name is Clara[PER] and I live in Berkeley[LOC], California[LOC]. I work at this cool company called Hugging Face[ORG].
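The token-classification widget returns entity spans as JSON (entity group plus character offsets) and renders them inline as tagged text. A minimal sketch of that rendering step, using hand-written spans in the shape of the widget's output (the offsets below are illustrative, not a captured API response):

```python
def tag_entities(text, entities):
    """Render token-classification output by appending [TAG] after each entity span.

    `entities` mimics the widget's JSON shape: entity_group, start, end.
    """
    out, cursor = [], 0
    for ent in sorted(entities, key=lambda e: e["start"]):
        out.append(text[cursor:ent["end"]])   # text up to and including the entity
        out.append(f'[{ent["entity_group"]}]')
        cursor = ent["end"]
    out.append(text[cursor:])                 # trailing text after the last entity
    return "".join(out)

text = "My name is Clara and I live in Berkeley, California."
entities = [
    {"entity_group": "PER", "start": 11, "end": 16},
    {"entity_group": "LOC", "start": 31, "end": 39},
    {"entity_group": "LOC", "start": 41, "end": 51},
]
print(tag_entities(text, entities))
```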
Science


OUR RESEARCH CONTRIBUTIONS

We’re on a journey to advance and democratize NLP for everyone. Along the way,
we contribute to the development of technology for the better.

🌸

T0


MULTITASK PROMPTED TRAINING ENABLES ZERO-SHOT TASK GENERALIZATION



Open source state-of-the-art zero-shot language model out of BigScience.


Read more

🐎

DistilBERT


DISTILBERT, A DISTILLED VERSION OF BERT: SMALLER, FASTER, CHEAPER AND LIGHTER



A smaller, faster, lighter, cheaper version of BERT obtained via model
distillation.


Read more

📚

HMTL


HIERARCHICAL MULTI-TASK LEARNING



Learning embeddings from semantic tasks for multi-task learning. We have
open-sourced code and a demo.


Read more

🐸

Dynamical Language Models


META-LEARNING FOR LANGUAGE MODELING



A meta-learner is trained via gradient descent to continuously and dynamically update language model weights.


Read more

🤖

State of the art


NEURALCOREF



Our open-source library for coreference resolution. You can train it on your own dataset and language.


Read more

🦄

Auto-complete your thoughts


WRITE WITH TRANSFORMERS



This web app is the official demo of the Transformers repository's text
generation capabilities.


Start writing
Website
 * Model Hub
 * Inference API
 * AutoTrain
 * Organizations
 * Contributors
 * Expert Acceleration Program

Company
 * About
 * HF Store
 * Terms of service
 * Privacy
 * Jobs
 * Press

Resources
 * Course
 * Transformers docs
 * Blog
 * Forum
 * Newsletter
 * Service Status

Social
 * GitHub
 * Twitter
 * LinkedIn
 * Discord