Commit 6f70bb8c authored by Rémi Louf's avatar Rémi Louf Committed by Lysandre Debut

add instructions to run the examples

parent 0cdfcca2
@@ -86,6 +86,18 @@ When TensorFlow 2.0 and/or PyTorch has been installed, you can install from sour
pip install [--editable] .
```
### Run the examples
Examples are included in the repository but are not shipped with the library.
To run them, you will therefore first need to clone the repository and install
the bleeding-edge version of the library from source. To do so, create a new
virtual environment and follow these steps:
```bash
git clone git@github.com:huggingface/transformers
cd transformers
pip install .
```
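The steps above assume the virtual environment already exists. A minimal end-to-end sketch, including creating and activating one first (the environment name `.env` and the use of the `venv` module are arbitrary choices; any virtual-environment tool will do), might look like:

```shell
# Create and activate a fresh virtual environment
python3 -m venv .env
source .env/bin/activate

# Clone the repository and install the bleeding-edge version from source
git clone git@github.com:huggingface/transformers
cd transformers
pip install .
```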
### Tests
A series of tests is included for the library and the example scripts. Library tests can be found in the [tests folder](https://github.com/huggingface/transformers/tree/master/transformers/tests) and examples tests in the [examples folder](https://github.com/huggingface/transformers/tree/master/examples).
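As a minimal sketch (assuming `pytest` is installed in your environment, and that the test paths match the folders linked above), the two suites can be run from the repository root with:

```shell
# Run the library tests (tests folder linked above)
python -m pytest -sv ./transformers/tests/
# Run the examples tests
python -m pytest -sv ./examples/
```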
@@ -253,6 +265,11 @@ print("sentence_2 is", "a paraphrase" if pred_2 else "not a paraphrase", "of sen
## Quick tour of the fine-tuning/usage scripts
**Important**
Before running the fine-tuning scripts, please read the
[instructions](#run-the-examples) above on how to
set up your environment to run the examples.
The library comprises several example scripts with SOTA performance for NLU and NLG tasks:
- `run_glue.py`: an example fine-tuning Bert, XLNet and XLM on nine different GLUE tasks (*sequence-level classification*)
...