1. 03 Mar, 2021 1 commit
  2. 09 Dec, 2020 1 commit
  3. 07 Dec, 2020 1 commit
  4. 17 Nov, 2020 2 commits
  5. 15 Nov, 2020 1 commit
    • [breaking|pipelines|tokenizers] Adding slow-fast tokenizers equivalence tests... · f4e04cd2
      Thomas Wolf authored
      
      [breaking|pipelines|tokenizers] Adding slow-fast tokenizers equivalence tests pipelines - Removing sentencepiece as a required dependency (#8073)
      
      * Fixing roberta for slow-fast tests
      
      * WIP getting equivalence on pipelines
      
      * slow-to-fast equivalence - working on question-answering pipeline
      
      * optional FAISS tests
      
      * Pipeline Q&A
      
      * Move pipeline tests to their own test job again
      
      * update tokenizer to add sequence id methods
      
      * update to tokenizers 0.9.4
      
      * set sentencepiece as optional
      
      * clean up squad
      
      * clean up pipelines to use sequence_ids
      
      * style/quality
      
      * wording
      
      * Switch to use_fast = True by default
      
      * update tests for use_fast at True by default
      
      * fix rag tokenizer test
      
      * removing protobuf from required dependencies
      
      * fix NER test for use_fast = True by default
      
      * fixing example tests (Q&A examples use slow tokenizers for now)
      
      * protobuf in main deps extras["sentencepiece"] and example deps
      
      * fix protobuf install test
      
      * try to fix seq2seq by switching to slow tokenizers for now
      
      * Update src/transformers/tokenization_utils_base.py
      Co-authored-by: Lysandre Debut <lysandre@huggingface.co>
      
      * Update src/transformers/tokenization_utils_base.py
      Co-authored-by: Lysandre Debut <lysandre@huggingface.co>
      f4e04cd2
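      This commit makes fast tokenizers the default (use_fast=True) and reworks the question-answering pipeline around the new sequence_ids() method. As a rough illustration of what that enables from user code, a minimal sketch assuming a transformers install recent enough to include this change (tokenizers 0.9.4 or later); the model name is picked purely for illustration:

          from transformers import AutoTokenizer

          # A fast (Rust-backed) tokenizer is now returned by default; no use_fast=True needed.
          tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

          encoding = tokenizer("Who wrote it?", "It was written by an anonymous author.")

          # sequence_ids() maps each token back to its source sequence: None for special
          # tokens, 0 for the question, 1 for the context. The cleaned-up QA pipeline
          # relies on this to separate question tokens from context tokens.
          print(encoding.sequence_ids())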
  6. 29 Oct, 2020 1 commit
  7. 26 Oct, 2020 1 commit
    • Doc styling (#8067) · 08f534d2
      Sylvain Gugger authored
      * Important files
      
      * Styling them all
      
      * Revert "Styling them all"
      
      This reverts commit 7d029395fdae8513b8281cbc2a6c239f8093503e.
      
      * Styling them for realsies
      
      * Fix syntax error
      
      * Fix benchmark_utils
      
      * More fixes
      
      * Fix modeling auto and script
      
      * Remove new line
      
      * Fixes
      
      * More fixes
      
      * Fix more files
      
      * Style
      
      * Add FSMT
      
      * More fixes
      
      * More fixes
      
      * More fixes
      
      * More fixes
      
      * Fixes
      
      * More fixes
      
      * More fixes
      
      * Last fixes
      
      * Make sphinx happy
      08f534d2
  8. 23 Oct, 2020 1 commit
  9. 13 Oct, 2020 1 commit
  10. 06 Oct, 2020 1 commit
  11. 05 Oct, 2020 1 commit
  12. 26 Aug, 2020 1 commit
  13. 22 Jul, 2020 1 commit
  14. 20 Jul, 2020 1 commit
  15. 09 Jul, 2020 1 commit
  16. 30 Jun, 2020 1 commit
    • Fix TensorFlow dataset generator (#4881) · fcf06524
      Julien Plu authored
      * fix TensorFlow generator
      
      * Better features handling
      
      * Apply style
      
      * Apply style
      
      * Fix squad as well
      
      * Apply style
      
      * Better factorization of TF Tensors creation
      fcf06524
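      The generator fix here concerns how the TF data helpers turn feature dicts into a tf.data.Dataset. A generic sketch of that pattern, not the library's exact code, assuming TensorFlow 2.x:

          import tensorflow as tf

          # Each example is yielded as (dict of inputs, label); from_generator needs the
          # element types and shapes declared up front.
          def gen():
              yield {"input_ids": [101, 2023, 102], "attention_mask": [1, 1, 1]}, 0

          dataset = tf.data.Dataset.from_generator(
              gen,
              output_types=({"input_ids": tf.int32, "attention_mask": tf.int32}, tf.int64),
              output_shapes=(
                  {"input_ids": tf.TensorShape([None]), "attention_mask": tf.TensorShape([None])},
                  tf.TensorShape([]),
              ),
          )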
  17. 26 Jun, 2020 1 commit
  18. 25 Jun, 2020 1 commit
  19. 04 Jun, 2020 1 commit
    • Tensorflow improvements (#4530) · f9414f75
      Julien Plu authored
      
      
      * Better None gradients handling
      
      * Apply Style
      
      * Apply Style
      
      * Create a loss class per task to compute its respective loss
      
      * Add loss classes to the ALBERT TF models
      
      * Add loss classes to the BERT TF models
      
      * Add question answering and multiple choice to TF Camembert
      
      * Remove prints
      
      * Add multiple choice model to TF DistilBERT + loss computation
      
      * Add question answering model to TF Electra + loss computation
      
      * Add token classification, question answering and multiple choice models to TF Flaubert
      
      * Add multiple choice model to TF Roberta + loss computation
      
      * Add multiple choice model to TF XLM + loss computation
      
      * Add multiple choice and question answering models to TF XLM-Roberta
      
      * Add multiple choice model to TF XLNet + loss computation
      
      * Remove unused parameters
      
      * Add task loss classes
      
      * Reorder TF imports + add new model classes
      
      * Add new model classes
      
      * Bugfix in TF T5 model
      
      * Bugfix for TF T5 tests
      
      * Bugfix in TF T5 model
      
      * Fix TF T5 model tests
      
      * Fix T5 tests + some renaming
      
      * Fix inheritance issue in the AutoX tests
      
      * Add tests for TF Flaubert and TF XLM Roberta
      
      * Add tests for TF Flaubert and TF XLM Roberta
      
      * Remove unused piece of code in the TF trainer
      
      * bugfix and remove unused code
      
      * Bugfix for TF 2.2
      
      * Apply Style
      
      * Divide TFSequenceClassificationAndMultipleChoiceLoss into its two respective classes
      
      * Apply style
      
      * Mirror the PT Trainer in the TF one: fp16, optimizers and tb_writer as class parameters, and better dataset handling
      
      * Fix TF optimizations tests and apply style
      
      * Remove useless parameter
      
      * Bugfix and apply style
      
      * Fix TF Trainer prediction
      
      * Now the TF models return the loss, like their PyTorch counterparts
      
      * Apply Style
      
      * Ignore some tests output
      
      * Take into account the SQuAD cls_index, p_mask and is_impossible parameters for the QuestionAnswering task models.
      
      * Fix names for SQuAD data
      
      * Apply Style
      
      * Fix conflicts with 2.11 release
      
      * Fix conflicts with 2.11
      
      * Fix wrong name
      
      * Add better documentation on the new create_optimizer function
      
      * Fix isort
      
      * logging_dir: use same default as PyTorch
      Co-authored-by: Julien Chaumond <chaumond@gmail.com>
      f9414f75
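      Two user-visible pieces of this commit are the per-task loss computation (TF models now return a loss when labels are passed, matching their PyTorch counterparts) and the newly documented create_optimizer helper. A hedged sketch of the helper, assuming a transformers release from around this change; keyword names may differ slightly between versions:

          from transformers import create_optimizer

          # AdamW-style optimizer plus a learning-rate schedule with linear warmup and decay.
          optimizer, lr_schedule = create_optimizer(
              init_lr=3e-5,
              num_train_steps=1000,
              num_warmup_steps=100,
          )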
  20. 14 May, 2020 1 commit
  21. 20 Apr, 2020 2 commits
  22. 26 Mar, 2020 1 commit
  23. 21 Feb, 2020 1 commit
  24. 11 Feb, 2020 1 commit
  25. 10 Feb, 2020 1 commit
  26. 21 Jan, 2020 2 commits
  27. 06 Jan, 2020 2 commits
  28. 23 Dec, 2019 1 commit
  29. 22 Dec, 2019 5 commits
    • Move source code inside a src subdirectory. · 6be7cdda
      Aymeric Augustin authored
      This prevents transformers from being importable simply because the CWD
      is the root of the git repository, while not being importable from other
      directories. That led to inconsistent behavior, especially in examples.
      
      Once you fetch this commit, in your dev environment, you must run:
      
          $ pip uninstall transformers
          $ pip install -e .
      6be7cdda
    • Fix F401 flake8 warning (x88 / 116). · 783a6169
      Aymeric Augustin authored
      This change is mostly autogenerated with:
      
          $ python -m autoflake --in-place --recursive --remove-all-unused-imports --ignore-init-module-imports examples templates transformers utils hubconf.py setup.py
      
      I made minor changes in the generated diff.
      783a6169
    • Fix F401 flake8 warning (x152 / 268). · 80327a13
      Aymeric Augustin authored
      This change is mostly autogenerated with:
      
          $ python -m autoflake --in-place --recursive examples templates transformers utils hubconf.py setup.py
      
      I made minor changes in the generated diff.
      80327a13
    • Remove trailing whitespace from all Python files. · 28e608a2
      Aymeric Augustin authored
      Fixes flake8 warning W291 (x224).
      28e608a2
    • Sort imports with isort. · 158e82e0
      Aymeric Augustin authored
      This is the result of:
      
          $ isort --recursive examples templates transformers utils hubconf.py setup.py
      158e82e0
  30. 21 Dec, 2019 1 commit
    • Reformat source code with black. · fa84ae26
      Aymeric Augustin authored
      This is the result of:
      
          $ black --line-length 119 examples templates transformers utils hubconf.py setup.py
      
      There are a lot of fairly long lines in the project. As a consequence, I'm
      picking the longest widely accepted line length, 119 characters.
      
      This is also Thomas' preference, because it allows for explicit variable
      names, which make the code easier to understand.
      fa84ae26
  31. 19 Dec, 2019 1 commit
  32. 17 Dec, 2019 1 commit