Unverified commit 01b8cd59 authored by Sylvain Gugger, committed by GitHub

Revert open-in-colab and add perceiver (#14683)

parent f6b87c5f
@@ -218,6 +218,8 @@
   title: GPT Neo
 - local: model_doc/hubert
   title: Hubert
+- local: model_doc/perceiver
+  title: Perceiver
 - local: model_doc/pegasus
   title: Pegasus
 - local: model_doc/phobert
@@ -12,8 +12,6 @@ specific language governing permissions and limitations under the License.
 # Benchmarks
 
-[[open-in-colab]]
-
 Let's take a look at how 🤗 Transformers models can be benchmarked, along with best practices and already available benchmarks.
 A notebook explaining in more detail how to benchmark 🤗 Transformers models can be found [here](https://github.com/huggingface/transformers/tree/master/notebooks/05-benchmark.ipynb).
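For context on what that page covers, here is a minimal sketch of the library's benchmark utilities, assuming the `PyTorchBenchmark`/`PyTorchBenchmarkArguments` classes shipped with this version of transformers; the model name and input sizes are illustrative:

```python
from transformers import PyTorchBenchmark, PyTorchBenchmarkArguments

# Declare which checkpoints and input shapes to measure.
args = PyTorchBenchmarkArguments(
    models=["bert-base-uncased"],
    batch_sizes=[8],
    sequence_lengths=[8, 32, 128],
)

# Runs inference-speed and memory measurements and prints a summary table.
benchmark = PyTorchBenchmark(args)
results = benchmark.run()
```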
@@ -12,8 +12,6 @@ specific language governing permissions and limitations under the License.
 # How to fine-tune a model for common downstream tasks
 
-[[open-in-colab]]
-
 This guide will show you how to fine-tune 🤗 Transformers models for common downstream tasks. You will use the 🤗
 Datasets library to quickly load and preprocess the datasets, getting them ready for training with PyTorch and
 TensorFlow.
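As a taste of the workflow that guide describes, a minimal sketch of loading and preprocessing a dataset with 🤗 Datasets; the GLUE/MRPC dataset and `bert-base-cased` checkpoint are illustrative choices:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Load a small sentence-pair classification dataset from the Hub.
dataset = load_dataset("glue", "mrpc")
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

# Tokenize in batches; the result is ready for a PyTorch or TensorFlow training loop.
def tokenize(batch):
    return tokenizer(batch["sentence1"], batch["sentence2"], truncation=True, padding="max_length")

encoded = dataset.map(tokenize, batched=True)
```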
@@ -12,8 +12,6 @@ specific language governing permissions and limitations under the License.
 # Multi-lingual models
 
-[[open-in-colab]]
-
 Most of the models available in this library are mono-lingual models (English, Chinese and German). A few multi-lingual
 models are available and have a different mechanism than mono-lingual models. This page details the usage of these
 models.
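As one illustration of such a model, a minimal sketch that loads a multi-lingual checkpoint; `bert-base-multilingual-cased` is one example from the Hub, and the sample sentences are arbitrary:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# A single tokenizer/model pair trained on many languages at once.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# The same vocabulary covers text in different languages.
print(tokenizer.tokenize("Hello, world!"))
print(tokenizer.tokenize("Bonjour le monde !"))
```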
@@ -12,8 +12,6 @@ specific language governing permissions and limitations under the License.
 # Perplexity of fixed-length models
 
-[[open-in-colab]]
-
 Perplexity (PPL) is one of the most common metrics for evaluating language models. Before diving in, we should note
 that the metric applies specifically to classical language models (sometimes called autoregressive or causal language
 models) and is not well defined for masked language models like BERT (see [summary of the models](model_summary)).
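For reference, the standard definition that page builds on: for a tokenized sequence X = (x_1, …, x_t), perplexity is the exponentiated average negative log-likelihood of the sequence under the model p_θ:

```latex
\mathrm{PPL}(X) = \exp\left( -\frac{1}{t} \sum_{i=1}^{t} \log p_\theta\left(x_i \mid x_{<i}\right) \right)
```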
@@ -12,8 +12,6 @@ specific language governing permissions and limitations under the License.
 # Preprocessing data
 
-[[open-in-colab]]
-
 In this tutorial, we'll explore how to preprocess your data using 🤗 Transformers. The main tool for this is what we
 call a [tokenizer](main_classes/tokenizer). You can build one using the tokenizer class associated with the model
 you would like to use, or directly with the [`AutoTokenizer`] class.
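A minimal sketch of the [`AutoTokenizer`] entry point mentioned above; the checkpoint name is an illustrative choice:

```python
from transformers import AutoTokenizer

# AutoTokenizer resolves the right tokenizer class from the checkpoint name.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

# Returns input_ids and attention_mask (plus token_type_ids for BERT-like models).
encoded = tokenizer("Hello, how are you?")
print(encoded["input_ids"])
```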
@@ -12,8 +12,6 @@ specific language governing permissions and limitations under the License.
 # Quick tour
 
-[[open-in-colab]]
-
 Let's have a quick look at the 🤗 Transformers library features. The library downloads pretrained models for Natural
 Language Understanding (NLU) tasks, such as analyzing the sentiment of a text, and Natural Language Generation (NLG),
 such as completing a prompt with new text or translating into another language.
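A minimal sketch of the kind of NLU feature the quick tour opens with, using the `pipeline` API; the example sentence is arbitrary and the printed score will vary with the default model version:

```python
from transformers import pipeline

# pipeline() downloads a default pretrained model for the task on first use.
classifier = pipeline("sentiment-analysis")

# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
print(classifier("We are very happy to show you the 🤗 Transformers library."))
```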
@@ -12,8 +12,6 @@ specific language governing permissions and limitations under the License.
 # Summary of the tasks
 
-[[open-in-colab]]
-
 This page shows the most frequent use cases when using the library. The models available allow for many different
 configurations and great versatility in use cases. The simplest ones are presented here, showcasing usage for
 tasks such as question answering, sequence classification, named entity recognition and others.
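As one example from that list, a minimal question-answering sketch with the `pipeline` API; the question and context strings are illustrative:

```python
from transformers import pipeline

question_answerer = pipeline("question-answering")

# The result holds the answer span, a confidence score, and character offsets.
result = question_answerer(
    question="Which tasks does the library cover?",
    context="🤗 Transformers provides models for question answering, classification and NER.",
)
print(result)
```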
@@ -12,8 +12,6 @@ specific language governing permissions and limitations under the License.
 # Summary of the tokenizers
 
-[[open-in-colab]]
-
 On this page, we will have a closer look at tokenization.
 
 <Youtube id="VFp38yj8h3A"/>
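To make "a closer look at tokenization" concrete, a minimal sketch of inspecting a subword tokenizer; the checkpoint and sentence are illustrative, and the exact pieces depend on the vocabulary:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# WordPiece splits out-of-vocabulary words into pieces marked with "##",
# e.g. "gpu" may come out as "gp", "##u".
print(tokenizer.tokenize("I have a new GPU!"))
```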
@@ -12,8 +12,6 @@ specific language governing permissions and limitations under the License.
 # Fine-tuning a pretrained model
 
-[[open-in-colab]]
-
 In this tutorial, we will show you how to fine-tune a pretrained model from the Transformers library. In TensorFlow,
 models can be directly trained using Keras and the `fit` method. In PyTorch, there is no generic training loop, so
 the 🤗 Transformers library provides an API with the class [`Trainer`] to let you fine-tune or train
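A minimal sketch of the [`Trainer`] workflow that tutorial builds up to; the dataset, checkpoint, and subset size are illustrative choices:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length")

encoded = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=2)

# Trainer wraps the PyTorch training loop: optimization, logging, checkpointing.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="test_trainer"),
    train_dataset=encoded["train"].shuffle(seed=42).select(range(1000)),  # small subset for speed
)
trainer.train()
```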