@@ -86,6 +86,18 @@ When TensorFlow 2.0 and/or PyTorch has been installed, you can install from sour
pip install [--editable] .
```
### Run the examples
Examples are included in the repository but are not shipped with the library.
Therefore, to run the examples you will first need to clone the
repository and install the bleeding-edge version of the library. To do so, create a new virtual environment and follow these steps:
```bash
git clone git@github.com:huggingface/transformers
cd transformers
pip install .
```
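To confirm that the install above is actually visible to your environment, a quick check with the standard library's `importlib.metadata` works (a minimal sketch, not part of the library itself; the package name `transformers` matches the repository, and Python 3.8+ is assumed for `importlib.metadata`):

```python
from importlib import metadata

def installed_version(package):
    """Return the installed version of *package*, or None if it is absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

# After `pip install .` from the repository root, this should report a
# version string rather than None.
print(installed_version("transformers"))
```

The same check distinguishes a regular install from a missing one, which is handy when juggling several virtual environments.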
### Tests
A series of tests is included for the library and the example scripts. Library tests can be found in the [tests folder](https://github.com/huggingface/transformers/tree/master/transformers/tests) and examples tests in the [examples folder](https://github.com/huggingface/transformers/tree/master/examples).
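Both suites can be driven with `pytest`. The sketch below is one way to run them in sequence, guarding each path so it degrades gracefully when invoked outside the repository root (the directory paths follow the repository layout linked above; `pytest` itself is assumed to be installed):

```shell
# Run from the root of the cloned transformers repository.
for dir in ./transformers/tests/ ./examples/; do
  if [ -d "$dir" ]; then
    python -m pytest -sv "$dir"
  else
    echo "skipping $dir (not found; run from the repository root)"
  fi
done
```

Using `python -m pytest` rather than bare `pytest` ensures the tests run against the same interpreter (and virtual environment) into which the library was installed.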
...
...
@@ -253,6 +265,11 @@ print("sentence_2 is", "a paraphrase" if pred_2 else "not a paraphrase", "of sen
## Quick tour of the fine-tuning/usage scripts
**Important**
Before running the fine-tuning scripts, please read the
[instructions](#run-the-examples) on how to
set up your environment to run the examples.
The library comprises several example scripts with state-of-the-art (SOTA) performance on NLU and NLG tasks:
- `run_glue.py`: an example of fine-tuning Bert, XLNet and XLM on nine different GLUE tasks (*sequence-level classification*)