1. 22 Feb, 2021 (2 commits)
  2. 19 Feb, 2021 (9 commits)
  3. 18 Feb, 2021 (4 commits)
  4. 17 Feb, 2021 (4 commits)
  5. 16 Feb, 2021 (2 commits)
  6. 15 Feb, 2021 (5 commits)
  7. 13 Feb, 2021 (1 commit)
    • Conversion from slow to fast for BPE spm vocabs contained an error. (#10120) · c9837a0d
      Nicolas Patry authored
      * Conversion from slow to fast for BPE spm vocabs contained an error.
      
      - Only one test currently (tokenizers + slow) exercised the modified path,
      and it is Reformer, which does not perform any id modification, so the
      bug was silent until now.
      - The real issue is that the vocab variable was overwritten by
      SentencePieceExtractor, causing the slow-tokenizer-specific vocab oddities to be
      completely ignored (a minimal sketch of this pitfall follows this entry).
      - The bug was reported here https://github.com/huggingface/transformers/issues/9518
      - Ran the complete tokenization test suite with slow without error
      (`RUN_SLOW=1 pytest -sv tests/test_tokenization_*`)
      
      * Remove rebase error.
      
      * Adding the fixture.
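      As a purely illustrative aside, the pitfall described above can be sketched as follows. This is not the actual conversion code in convert_slow_tokenizer.py; the extractor interface (an extract() method returning a vocab and merges, as SentencePieceExtractor provides) is simplified here.

      ```python
      # Illustrative only: rebinding `vocab` silently drops the slow-tokenizer
      # specific entries prepared earlier in the conversion. Not the library code.

      def convert_buggy(slow_vocab_overrides, extractor):
          vocab = dict(slow_vocab_overrides)      # slow-specific oddities (e.g. remapped ids)
          vocab, merges = extractor.extract()     # BUG: `vocab` is rebound, the overrides are lost
          return vocab, merges

      def convert_fixed(slow_vocab_overrides, extractor):
          sp_vocab, merges = extractor.extract()  # keep the SentencePiece vocab separate
          sp_vocab.update(slow_vocab_overrides)   # then re-apply the slow-specific entries
          return sp_vocab, merges
      ```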
  8. 12 Feb, 2021 (2 commits)
  9. 11 Feb, 2021 (1 commit)
  10. 10 Feb, 2021 (3 commits)
    • remove adjust_logits_during_generation method (#10087) · c130e67d
      Suraj Patil authored
      * add forced logits processors (a usage sketch follows this entry)
      
      * delete adjust_logits method
      
      * add forced_eos_token_id argument in config
      
      * add tests for forced logits processors
      
      * update gen utils tests
      
      * add forced option to tf generate
      
      * remove adjust_logits method from tf models
      
      * update adjust_logits for marian
      
      * delete _force_token_id_to_be_generated method
      
      * style
      
      * import warnings
      
      * pass max_length to _get_logits_processor
      
      * set forced_eos_token_id to None
      
      * set forced attributes in conf utils
      
      * typo
      
      * fix rag generate
      
      * add forced_eos_token_id in rag config
      
      * remove force_bos_token_to_be_generated from BartConfig
      
      * remove _force_token_ids_generation from FSMT
      
      * nit
      
      * fix negative constant
      
      * apply suggestions from code review
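      A minimal usage sketch of the resulting API, assuming the facebook/bart-large-cnn checkpoint purely as an example: the forced token ids now live in the config, and generate() builds the corresponding ForcedBOSTokenLogitsProcessor / ForcedEOSTokenLogitsProcessor from them instead of the removed adjust_logits_during_generation hook.

      ```python
      from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

      # Example checkpoint (assumed); its config exposes the new forced token ids.
      tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn")
      model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn")
      print(model.config.forced_bos_token_id, model.config.forced_eos_token_id)

      inputs = tokenizer("Transformers provides thousands of pretrained models.",
                         return_tensors="pt")

      # generate() accepts forced_eos_token_id directly (falling back to the config
      # value when omitted) and forces EOS once max_length is reached.
      summary_ids = model.generate(
          **inputs,
          max_length=20,
          forced_eos_token_id=model.config.eos_token_id,
      )
      print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
      ```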
    • Fix TF LED/Longformer attentions computation (#10007) · 22a32cf4
      Julien Plu authored
      * Fix test
      
      * Remove commented test
      
      * Fix name
      
      * Apply style
      
      * Fix check copies
      
      * Remove prints
      
      * Restore boolean
      
      * Fix reshape
  11. 09 Feb, 2021 (3 commits)
  12. 08 Feb, 2021 (4 commits)