"...gpu/git@developer.sourcefind.cn:gaoqiong/migraphx.git" did not exist on "90cfe47499d4c1f3a52ddb4ad4109d931978038d"
installation.rst 1.73 KB
Newer Older
1
2
3
Installation
================================================

PyTorch-Transformers is tested on Python 2.7 and 3.5+ (the examples are tested only on Python 3.5+) and PyTorch 1.1.0.
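
To check that a compatible Python and PyTorch are available, you can print their versions (a quick sanity check, not part of the original instructions):

.. code-block:: bash

    python --version
    python -c "import torch; print(torch.__version__)"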

With pip
^^^^^^^^

PyTorch-Transformers can be installed using pip as follows:

.. code-block:: bash

   pip install pytorch-transformers
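
Once installed, a minimal sanity check (an illustrative command, not part of the original instructions) is to import one of the model classes:

.. code-block:: bash

    python -c "from pytorch_transformers import BertModel, BertTokenizer; print('OK')"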

From source
^^^^^^^^^^^

To install from source, clone the repository and install it with pip:

.. code-block:: bash

    git clone https://github.com/huggingface/pytorch-transformers.git
    cd pytorch-transformers
    pip install [--editable] .
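
The square brackets indicate that ``--editable`` is optional: with the flag, pip installs the package in development mode so that changes to the cloned sources take effect without reinstalling. Below is a minimal sketch of the editable variant together with an import check (the check is illustrative, not part of the original instructions):

.. code-block:: bash

    # Editable ("develop") install: the installed package points at the cloned sources
    pip install --editable .

    # If the editable install worked, the printed path points into the clone
    python -c "import pytorch_transformers; print(pytorch_transformers.__file__)"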


Tests
^^^^^

An extensive test suite is included to test the library's behavior and several examples. Library tests can be found in the `tests folder <https://github.com/huggingface/pytorch-transformers/tree/master/pytorch_transformers/tests>`_ and tests for the examples in the `examples folder <https://github.com/huggingface/pytorch-transformers/tree/master/examples>`_.

Tests can be run using ``pytest`` (install it if needed with ``pip install pytest``).

Run all the tests from the root of the cloned repository with the commands:

.. code-block:: bash

    python -m pytest -sv ./pytorch_transformers/tests/
    python -m pytest -sv ./examples/
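
``pytest`` can also run a subset of the suite, for example by filtering tests with the ``-k`` keyword option or stopping at the first failure with ``-x`` (the keyword below is only an illustration):

.. code-block:: bash

    # Run only tests whose names contain "bert" (example keyword)
    python -m pytest -sv ./pytorch_transformers/tests/ -k "bert"

    # Stop at the first failing test
    python -m pytest -x ./pytorch_transformers/tests/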


OpenAI GPT original tokenization workflow
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

If you want to reproduce the original tokenization process of the ``OpenAI GPT`` paper, you will need to install ``ftfy`` (use version 4.4.3 if you are using Python 2) and ``SpaCy``:

.. code-block:: bash

   pip install spacy ftfy==4.4.3
   python -m spacy download en
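
To verify that both dependencies and the English model are available, a quick check (illustrative only, assuming the ``en`` shortcut model was downloaded as above) is:

.. code-block:: bash

    python -c "import ftfy, spacy; spacy.load('en'); print('OK')"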

If you don't install ``ftfy`` and ``SpaCy``\ , the ``OpenAI GPT`` tokenizer falls back to BERT's ``BasicTokenizer`` followed by Byte-Pair Encoding (which should be fine for most use cases, don't worry).
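
As a brief illustration of that fallback (a hypothetical usage sketch, not part of the original instructions), the tokenizer can be exercised either way; without ``ftfy`` and ``SpaCy`` it simply uses the ``BasicTokenizer`` + BPE path:

.. code-block:: bash

    # Downloads the pretrained GPT vocabulary on first use
    python -c "from pytorch_transformers import OpenAIGPTTokenizer; print(OpenAIGPTTokenizer.from_pretrained('openai-gpt').tokenize('Hello world!'))"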