- 24 Jan, 2020 1 commit
  - Nicholas Lourie authored:
    T5WithLMHeadModel's doc string claims that indices of -1 are ignored while computing the cross-entropy loss in the forward pass; however, indices of -1 throw an error, while indices of -100 are ignored. This commit updates the doc string to match the class's behavior.
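For context, PyTorch's `CrossEntropyLoss` defaults to `ignore_index=-100`, which is why -100 is skipped and -1 is not. A minimal stdlib-only sketch of the masking behavior (no torch dependency; the function name is illustrative):

```python
import math

IGNORE_INDEX = -100  # PyTorch's default ignore_index for CrossEntropyLoss

def cross_entropy(logits, targets, ignore_index=IGNORE_INDEX):
    """Mean negative log-likelihood, skipping positions equal to ignore_index."""
    total, count = 0.0, 0
    for row, target in zip(logits, targets):
        if target == ignore_index:
            continue  # ignored positions contribute nothing to the loss
        # cross-entropy of the target class: log(sum(exp(row))) - row[target]
        log_z = math.log(sum(math.exp(x) for x in row))
        total += log_z - row[target]
        count += 1
    return total / count

logits = [[2.0, 0.5], [0.1, 3.0], [1.0, 1.0]]
targets = [0, -100, 1]  # the middle position is ignored
loss = cross_entropy(logits, targets)
```

Note that in torch itself a target of -1 is an out-of-range class index, which is what raises the error the commit message describes; this plain-Python sketch only illustrates the -100 masking.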
-
- 15 Jan, 2020 1 commit
  - Julien Chaumond authored
- 10 Jan, 2020 1 commit
  - Martin Schrimpf authored:
    Otherwise, `rp_bucket` will always be on CPU and fail if `self.relative_attention_bias` is on CUDA.
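A minimal sketch of the device-alignment pattern this fix applies, using a stand-in tensor class rather than torch (the real code lives in T5's relative position bias; names here are illustrative):

```python
class FakeTensor:
    """Tiny stand-in for a torch tensor that only tracks its device."""
    def __init__(self, device="cpu"):
        self.device = device

    def to(self, device):
        # torch's Tensor.to returns a copy on the target device
        return FakeTensor(device)

def compute_bias(relative_attention_bias, rp_bucket):
    # Without this line, rp_bucket stays on CPU and indexing the
    # embedding weights fails when the module lives on CUDA.
    rp_bucket = rp_bucket.to(relative_attention_bias.device)
    return rp_bucket

weights = FakeTensor("cuda:0")              # e.g. self.relative_attention_bias
bucket = compute_bias(weights, FakeTensor("cpu"))
```

The general rule the commit follows: tensors created on the fly inside `forward` should be moved to the device of the module's parameters, not assumed to share it.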
- 08 Jan, 2020 1 commit
  - thomwolf authored
- 07 Jan, 2020 1 commit
  - Genta Indra Winata authored
- 06 Jan, 2020 2 commits
  - alberduris authored
  - alberduris authored
- 24 Dec, 2019 1 commit
  - thomwolf authored
- 23 Dec, 2019 1 commit
  - Aymeric Augustin authored
- 22 Dec, 2019 8 commits
  - Aymeric Augustin authored
  - Aymeric Augustin authored:
    This prevents transformers from being importable simply because the CWD is the root of the git repository, while not being importable from other directories. That led to inconsistent behavior, especially in examples. Once you fetch this commit, run the following in your dev environment:
    $ pip uninstall transformers
    $ pip install -e .
  - Aymeric Augustin authored:
    This change is mostly autogenerated with:
    $ python -m autoflake --in-place --recursive --remove-all-unused-imports --ignore-init-module-imports examples templates transformers utils hubconf.py setup.py
    I made minor changes in the generated diff.
  - Aymeric Augustin authored:
    This change is mostly autogenerated with:
    $ python -m autoflake --in-place --recursive examples templates transformers utils hubconf.py setup.py
    I made minor changes in the generated diff.
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored:
    Fixes flake8 warning W291 (x224).
  - Aymeric Augustin authored:
    This is the result of:
    $ isort --recursive examples templates transformers utils hubconf.py setup.py
- 21 Dec, 2019 1 commit
  - Aymeric Augustin authored:
    This is the result of:
    $ black --line-length 119 examples templates transformers utils hubconf.py setup.py
    There are a lot of fairly long lines in the project, so I'm picking the longest widely accepted line length, 119 characters. This is also Thomas' preference, because it allows for explicit variable names, which make the code easier to understand.
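If one wanted to pin these black and isort settings so contributors don't need to remember the command-line flags, a `pyproject.toml` fragment along these lines would work (a sketch only; the repository may configure the tools elsewhere, e.g. in `setup.cfg`):

```toml
[tool.black]
line-length = 119

[tool.isort]
line_length = 119
```

With this in place, running plain `black .` and `isort .` from the repository root picks up the 119-character limit automatically.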
- 19 Dec, 2019 1 commit
  - patrickvonplaten authored
- 13 Dec, 2019 2 commits
- 11 Dec, 2019 1 commit
  - thomwolf authored
- 10 Dec, 2019 2 commits
- 09 Dec, 2019 2 commits
- 02 Dec, 2019 1 commit
  - thomwolf authored
- 08 Nov, 2019 2 commits
- 07 Nov, 2019 1 commit
  - thomwolf authored
- 06 Nov, 2019 1 commit
  - thomwolf authored
- 05 Nov, 2019 3 commits