# Examples

This section puts together a few examples. All of them work with several models, taking advantage of the very
similar API shared by the different models.
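
As a rough illustration of that shared API, the sketch below loads a model and tokenizer and runs a forward pass.
It uses the `Auto*` classes and a placeholder checkpoint name; the scripts in this folder may wire things up
differently, so treat it as a sketch rather than a recipe.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Swapping "bert-base-uncased" for e.g. "roberta-base" or "xlnet-base-cased"
# is usually all that changes -- the calling code stays the same.
model_name = "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Encode a sentence and run a forward pass; the outputs contain the
# classification logits (the exact return type depends on the library version).
input_ids = tokenizer.encode("Hello, world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(input_ids)
```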

**Important**
To run the latest versions of the examples, you have to install the library from source and install the
example-specific requirements. Execute the following steps in a new virtual environment:

```bash
git clone https://github.com/huggingface/transformers
cd transformers
pip install .
pip install -r ./examples/requirements.txt
```
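
To confirm that the source install (rather than a previously installed release) is being picked up, a quick sanity
check is to print the version and install location; this is optional and not required by the examples:

```python
import transformers

# The reported path should point into your local `transformers` checkout.
print(transformers.__version__)
print(transformers.__file__)
```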

| Section                    | Description                                                                                                                                                |
|----------------------------|-----------------------------------------------------|
| [TensorFlow 2.0 models on GLUE](#TensorFlow-2.0-Bert-models-on-GLUE) | Examples running BERT TensorFlow 2.0 model on the GLUE tasks. |
| [Running on TPUs](#running-on-tpus) | Examples on running fine-tuning tasks on Google TPUs to accelerate workloads. |
| [Language Model training](#language-model-training) | Fine-tuning (or training from scratch) the library models for language modeling on a text dataset. Causal language modeling for GPT/GPT-2, masked language modeling for BERT/RoBERTa. |
| [Language Generation](#language-generation) | Conditional text generation using the auto-regressive models of the library: GPT, GPT-2, Transformer-XL and XLNet. |
| [GLUE](#glue) | Examples running BERT/XLM/XLNet/RoBERTa on the 9 GLUE tasks. Examples feature distributed training as well as half-precision. |
| [SQuAD](#squad) | Using BERT/RoBERTa/XLNet/XLM for question answering, examples with distributed training. |
| [Multiple Choice](#multiple-choice) | Examples running BERT/XLNet/RoBERTa on the SWAG/RACE/ARC tasks. |
| [Named Entity Recognition](https://github.com/huggingface/transformers/tree/master/examples/ner) | Using BERT for Named Entity Recognition (NER) on the CoNLL 2003 dataset, examples with distributed training. |
| [XNLI](#xnli) | Examples running BERT/XLM on the XNLI benchmark. |
| [Adversarial evaluation of model performances](#adversarial-evaluation-of-model-performances) | Testing a model with adversarial evaluation of natural language inference on the Heuristic Analysis for NLI Systems (HANS) dataset (McCoy et al., 2019). |