- 22 Dec, 2019 (22 commits)
  - Aymeric Augustin authored: Do manually what autoflake couldn't manage.
  - Aymeric Augustin authored: This change is mostly autogenerated with: $ python -m autoflake --in-place --recursive --remove-all-unused-imports --ignore-init-module-imports examples templates transformers utils hubconf.py setup.py. I made minor changes in the generated diff.
  - Aymeric Augustin authored: This change is mostly autogenerated with: $ python -m autoflake --in-place --recursive examples templates transformers utils hubconf.py setup.py. I made minor changes in the generated diff.
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored: Ignore warnings related to Python 2, because it's going away soon.
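    A minimal sketch of one way such warnings can be silenced, assuming the standard `warnings` module is used; the commit may instead configure the test runner, and the message pattern below is an assumption:

    ```python
    import warnings

    # Ignore deprecation noise from libraries still warning about the
    # upcoming end of Python 2 support (hypothetical message pattern).
    warnings.filterwarnings(
        "ignore", message=".*Python 2.*", category=DeprecationWarning
    )
    ```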
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored: Fixes flake8 warning W291, trailing whitespace (x224).
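    For context, a hypothetical helper showing what fixing W291 amounts to, namely stripping trailing whitespace from each line; the commit itself was a mechanical cleanup and need not have used a script like this:

    ```python
    from pathlib import Path

    def strip_trailing_whitespace(filename: str) -> None:
        """Remove trailing whitespace (flake8 W291) from every line."""
        path = Path(filename)
        lines = path.read_text().splitlines()
        path.write_text("\n".join(line.rstrip() for line in lines) + "\n")
    ```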
  - Aymeric Augustin authored
  - Aymeric Augustin authored: Submodules shouldn't import from their parent in general.
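    To illustrate the guideline, with hypothetical file and symbol names rather than the repository's actual layout:

    ```python
    # Inside a submodule such as transformers/tokenization_foo.py:

    # Avoid: importing from the parent package, which can trigger circular
    # imports while the package's __init__ is still executing.
    # from transformers import cached_path

    # Prefer: a relative import of the sibling module that defines the name.
    # from .file_utils import cached_path
    ```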
  - Aymeric Augustin authored
  - Aymeric Augustin authored: We need https://github.com/timothycrosley/isort/pull/1000 but there's no release with this fix yet, so we'll install from GitHub.
  - Aymeric Augustin authored: This is the result of: $ isort --recursive examples templates transformers utils hubconf.py setup.py
- 21 Dec, 2019 (18 commits)
  - Aymeric Augustin authored: lines_after_imports = 2 is a matter of taste; I like it.
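    For readers unfamiliar with this isort option, the setting enforces two blank lines between the import block and the first definition, as in this hypothetical file:

    ```python
    import os
    import sys


    def main():  # exactly two blank lines separate imports from code
        print(os.name, sys.version)
    ```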
  - Aymeric Augustin authored
  - Aymeric Augustin authored
  - Aymeric Augustin authored: This is the result of: $ black --line-length 119 examples templates transformers utils hubconf.py setup.py. There are a lot of fairly long lines in the project; as a consequence, I'm picking the longest widely accepted line length, 119 characters. This is also Thomas' preference, because it allows for explicit variable names, which make the code easier to understand.
  - Aymeric Augustin authored: Likely it was added by accident.
  - Thomas Wolf authored: Adding positional embeds masking to TFRoBERTa.
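    A rough sketch of the idea, not the library's actual code (function and argument names are assumptions): position ids advance only on real tokens, so padding slots never consume a position embedding, and RoBERTa-style models additionally offset positions past the padding index:

    ```python
    import tensorflow as tf

    def position_ids_from_mask(attention_mask, padding_idx=1):
        """Cumulative positions over non-padding tokens only (illustrative)."""
        mask = tf.cast(attention_mask, tf.int32)
        # padded slots stay at 0 and are then offset to the padding index
        return tf.cumsum(mask, axis=1) * mask + padding_idx
    ```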
  - Thomas Wolf authored: [WIP] Add MMBT Model to Transformers Repo
  - thomwolf authored
  - thomwolf authored
  - Thomas Wolf authored: Add saving and resuming functionality for remaining examples (closes #1960).
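    A hedged sketch of what save-and-resume support typically involves in a PyTorch training loop (illustrative names, not the commit's exact code): persist the model, optimizer, and progress counter, then restore them before continuing:

    ```python
    import torch

    def save_checkpoint(model, optimizer, global_step, path="checkpoint.pt"):
        torch.save({"model": model.state_dict(),
                    "optimizer": optimizer.state_dict(),
                    "global_step": global_step}, path)

    def resume_from_checkpoint(model, optimizer, path="checkpoint.pt"):
        state = torch.load(path)
        model.load_state_dict(state["model"])
        optimizer.load_state_dict(state["optimizer"])
        return state["global_step"]  # step to resume training from
    ```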
  - Thomas Wolf authored: [BREAKING CHANGE] Setting all ignored indices to the PyTorch standard.
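    The PyTorch standard referred to here is the default ignore_index of -100 in PyTorch's loss functions: label positions set to -100 are skipped when the loss is computed. A minimal sketch:

    ```python
    import torch
    from torch import nn

    loss_fct = nn.CrossEntropyLoss()        # ignore_index defaults to -100
    logits = torch.randn(4, 10)             # (batch, num_labels)
    labels = torch.tensor([1, 2, -100, 3])  # third position is ignored
    loss = loss_fct(logits, labels)
    ```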
  - thomwolf authored
  - thomwolf authored
  - Thomas Wolf authored
  - Thomas Wolf authored: run_squad with roberta
  - Thomas Wolf authored: fix: wrong architecture count in README
  - Thomas Wolf authored: :zip: #2106 tokenizer.tokenize speed improvement (3-8x) by caching added_tokens in a set
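    The gist of the optimization, sketched with hypothetical names: membership tests run in O(1) on a set versus O(n) on a list, and tokenize() performs such a test for every candidate token:

    ```python
    class TokenizerSketch:
        def __init__(self, added_tokens):
            # cache added tokens in a set once, instead of scanning a
            # list on every lookup
            self.unique_added_tokens = set(added_tokens)

        def is_added_token(self, token: str) -> bool:
            return token in self.unique_added_tokens  # O(1) membership test
    ```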
  - Thomas Wolf authored