"test/git@developer.sourcefind.cn:gaoqiong/migraphx.git" did not exist on "200c7038aa837a222d8b0069a8650294ab7a9c90"
installation.md 5.21 KB
Newer Older
Sylvain Gugger's avatar
Sylvain Gugger committed
1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
<!---
Copyright 2020 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->

# Installation

🤗 Transformers is tested on Python 3.6+, and PyTorch 1.1.0+ or TensorFlow 2.0+.

You should install 🤗 Transformers in a [virtual environment](https://docs.python.org/3/library/venv.html). If you're
unfamiliar with Python virtual environments, check out the [user guide](https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/). Create a virtual environment with the version of Python you're going
to use and activate it.
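
For instance, a minimal sketch using Python's built-in `venv` module (the directory name `.env` is just an example):

```bash
# Create a virtual environment in the .env directory
python -m venv .env
# Activate it (Linux/macOS; on Windows run .env\Scripts\activate instead)
source .env/bin/activate
```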

Now, if you want to use 🤗 Transformers, you can install it with pip. If you'd like to play with the examples, you
must install it from source.

## Installation with pip

First you need to install TensorFlow 2.0, PyTorch, or both.
Please refer to [TensorFlow installation page](https://www.tensorflow.org/install/pip#tensorflow-2.0-rc-is-available), 
[PyTorch installation page](https://pytorch.org/get-started/locally/#start-locally) and/or 
[Flax installation page](https://github.com/google/flax#quick-install)
regarding the specific install command for your platform.
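
As an illustration, a plain pip install of PyTorch often looks like the sketch below; the exact command depends on your OS and accelerator, so prefer the command generated by the selector on the PyTorch page:

```bash
# Install the default PyTorch wheel from PyPI
pip install torch
```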

When TensorFlow 2.0 and/or PyTorch have been installed, 🤗 Transformers can be installed using pip as follows:

```bash
pip install transformers
```

Alternatively, for CPU support only, you can install 🤗 Transformers and PyTorch in one line with:

```bash
pip install transformers[torch]
```
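
(If your shell is zsh, quote the extras so the square brackets aren't expanded as a glob pattern: `pip install 'transformers[torch]'`.)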

or 🤗 Transformers and TensorFlow 2.0 in one line with:

```bash
pip install transformers[tf-cpu]
```

or 🤗 Transformers and Flax in one line with:

```bash
pip install transformers[flax]
```

To check that 🤗 Transformers is properly installed, run the following command:

```bash
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('we love you'))"
```

It should download a pretrained model and then print something like:

```bash
[{'label': 'POSITIVE', 'score': 0.9998704791069031}]
```

(Note that TensorFlow will print additional log output before that last statement.)

## Installing from source

To install from source, clone the repository and install with the following commands:

```bash
git clone https://github.com/huggingface/transformers.git
cd transformers
pip install -e .
```
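
(The `-e` flag installs the package in editable mode: changes you make in the cloned repository take effect without reinstalling.)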

Again, you can run 

```bash
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I hate you'))"
```

to check that 🤗 Transformers is properly installed.


## With conda

Since Transformers version v4.0.0, we now have a conda channel: `huggingface`.

馃 Transformers can be installed using conda as follows:

```bash
conda install -c huggingface transformers
```

Follow the installation pages of TensorFlow, PyTorch or Flax to see how to install them with conda. 
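
For instance, a CPU-only PyTorch install via conda typically looks like the sketch below (an illustration; check the PyTorch page for the command matching your platform):

```bash
# Install CPU-only PyTorch from the official pytorch channel
conda install pytorch cpuonly -c pytorch
```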

## Caching models

This library provides pretrained models that will be downloaded and cached locally. Unless you specify a location with
`cache_dir=...` when you use methods like `from_pretrained`, these models will automatically be downloaded to the
folder given by the shell environment variable `TRANSFORMERS_CACHE`. The default value for it will be the Hugging
Face cache home followed by `/transformers/`. This is (by order of priority):

  * shell environment variable `HF_HOME`
  * shell environment variable `XDG_CACHE_HOME` + `/huggingface/`
  * default: `~/.cache/huggingface/`

So if you don't have any specific environment variable set, the cache directory will be at
`~/.cache/huggingface/transformers/`.
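
For example, you can either export the environment variable for the session or pass `cache_dir` per call; the path below is purely illustrative:

```bash
# Send all cached models to a custom location for this shell session
export TRANSFORMERS_CACHE=/mnt/big-disk/hf-cache
# Or override the cache location for a single download
python -c "from transformers import AutoModel; AutoModel.from_pretrained('bert-base-uncased', cache_dir='/mnt/big-disk/hf-cache')"
```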

**Note:** If you have set a shell environment variable for one of the predecessors of this library
(`PYTORCH_TRANSFORMERS_CACHE` or `PYTORCH_PRETRAINED_BERT_CACHE`), those will be used if there is no shell
environment variable for `TRANSFORMERS_CACHE`.

### Note on model downloads (Continuous Integration or large-scale deployments)

If you expect to be downloading large volumes of models (more than 1,000) from our hosted bucket (for instance through
your CI setup, or a large-scale production deployment), please cache the model files on your end. It will be way
faster, and cheaper. Feel free to contact us privately if you need any help.

## Do you want to run a Transformer model on a mobile device?

You should check out our [swift-coreml-transformers](https://github.com/huggingface/swift-coreml-transformers) repo.

It contains a set of tools to convert PyTorch or TensorFlow 2.0 trained Transformer models (currently `GPT-2`,
`DistilGPT-2`, `BERT`, and `DistilBERT`) to CoreML models that run on iOS devices.

At some point in the future, you'll be able to seamlessly move from pretraining or fine-tuning models in PyTorch or
TensorFlow 2.0 to productizing them in CoreML, or prototype a model or an app in CoreML and then research its
hyperparameters or architecture from PyTorch or TensorFlow 2.0. Super exciting!