- 17 Feb, 2021 8 commits

Stas Bekman authored

Stas Bekman authored
* refactor place_model_on_device logic, add deepspeed
* doc
* style

Stas Bekman authored
* fix invalid port
* missing requirements

Julien Plu authored
* Fix XLA and AMP
* Apply style
* Remove useless cast

Julien Plu authored
* Fix Flaubert and XLM
* Remove useless cast
* Tiny fix
* Tiny fix

Julien Plu authored
* Update BART
* Update Blenderbot
* Update BlenderbotSmall
* Update Marian
* Update MBart
* Update MBart
* Update Pegasus
* Update template
* Fix Marian and Pegasus
* Apply style
* Default initializer
* Default initializer
* Default initializer
* Remove int32 casts
* Fix template
* Remove more casts

Daniel Stancl authored
* Fix head_mask and decoder_head_mask in TFT5 models
* Enable test_headmasking both for the TFT5 tester and the TFT5EncoderOnly tester
Co-authored-by: patrickvonplaten <patrick.v.platen@gmail.com>

Lysandre Debut authored

- 16 Feb, 2021 5 commits

Stas Bekman authored
* [trainer] fix ignored columns logger
  This PR fixes a confusing log entry that says:
  ```
  The following columns in the evaluation set don't have a corresponding argument in `T5ForConditionalGeneration.forward` and have been ignored: .
  ```
  when everything is in order.
* Update src/transformers/trainer.py
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

Joe Davison authored

Sylvain Gugger authored

Zhang Cheng authored

Julien Plu authored

- 15 Feb, 2021 12 commits

Suraj Patil authored
* move old s2s scripts to legacy
* add the tests back
* proper rename
* restore
* Apply suggestions from code review
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
Co-authored-by: Stas Bekman <stas@stason.org>
Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>

Stas Bekman authored

Lysandre Debut authored
Co-authored-by: Quentin Lhoest <lhoest.q@gmail.com>

Stas Bekman authored
* fix run_seq2seq.py; porting DeepSpeed tests to it
* unrefactor
* defensive programming
* defensive programming 2
* port the rest of the trainer tests
* style
* a cleaner scripts dir finder
* cleanup

Julien Plu authored

Suraj Patil authored
* add tokenizer for mBART-50
* update tokenizers
* make src_lang and tgt_lang optional
* update tokenizer test
* add setter
* update docs
* update conversion script
* update docs
* update conversion script
* update tokenizer
* update test
* update docs
* doc
* address Sylvain's suggestions
* fix test
* fix formatting
* nits
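A minimal usage sketch of the optional src_lang/tgt_lang described in the commit above; the class and checkpoint names (`MBart50TokenizerFast`, `facebook/mbart-large-50`) are assumed here for illustration:

```python
from transformers import MBart50TokenizerFast

# src_lang and tgt_lang are optional and can also be set after loading.
tokenizer = MBart50TokenizerFast.from_pretrained(
    "facebook/mbart-large-50", src_lang="en_XX", tgt_lang="ro_RO"
)

# Tokenize a source sentence; the source language code is prepended automatically.
batch = tokenizer(
    "UN Chief Says There Is No Military Solution in Syria",
    return_tensors="pt",
)
```
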
Julien Plu authored
* Fix template
* Update Seq2Seq tests

Suraj Patil authored

Julien Plu authored
* Add check-ops script
* Finish implementing check_tf_ops and start the test
* Make the test mandatory only for BERT
* Update tf_ops folder
* Remove useless classes
* Add the ONNX test for GPT2 and BART
* Add an onnxruntime slow test + better opset flexibility
* Fix test + apply style
* fix tests
* Switch min opset from 12 to 10
* Update src/transformers/file_utils.py
* Fix GPT2
* Remove extra shape_list usage
* Fix GPT2
* Address Morgan's comments
Co-authored-by: Lysandre Debut <lysandre@huggingface.co>

Lysandre Debut authored

Nicolas Patry authored
Fixes #10168

Sylvain Gugger authored

- 13 Feb, 2021 6 commits

Stas Bekman authored
* save fast tokenizer + add info logs
* fix tests
* remove the saving of fast tokenizer

Sylvain Gugger authored

Manuel Romero authored

Manuel Romero authored

Nicolas Patry authored
* Conversion from slow to fast for BPE spm vocabs contained an error.
  - There is only one test currently (tokenizers + slow) that used the modified path, and it's Reformer, which does not contain any ids modification, so the bug was silent until now.
  - The real issue is that the vocab variable was overloaded by SentencePieceExtractor, leading to slow-specific vocab oddities being completely ignored.
  - The bug was reported here: https://github.com/huggingface/transformers/issues/9518
  - Ran the complete tokenization test suite including slow tests without error (`RUN_SLOW=1 pytest -sv tests/test_tokenization_*`).
* Remove rebase error.
* Adding the fixture.

Lysandre Debut authored

- 12 Feb, 2021 4 commits

Julien Chaumond authored

Julien Chaumond authored
* [hf_api] delete deprecated methods and tests cc @lhoestq
* Update test_hf_api.py

Mohamed Al Salti authored
* Fix typo
* apply suggestion
Co-authored-by: Suraj Patil <surajp815@gmail.com>

Suraj Patil authored
* fix rouge metrics and task specific params
* fix typo
* round metrics
* typo
* remove task_specific_params

- 11 Feb, 2021 5 commits

Sylvain Gugger authored
* Refactor things out of main train
* Store signature
* Add SageMakerTrainer
* Init + Copyright
* Address review comments

Stas Bekman authored
* init devices/setup explicitly
* docs + test
* simplify
* cleanup
* cleanup
* cleanup
* correct the required dist setup
* derive local_rank from env LOCAL_RANK
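A small illustration of the last bullet (a sketch only, not the exact Trainer code): distributed launchers such as the `deepspeed` launcher export each process's rank in the LOCAL_RANK environment variable, so it can be read from the environment instead of being passed as an explicit argument.

```python
import os

# Fall back to -1 (no distributed setup) when LOCAL_RANK is not set.
local_rank = int(os.environ.get("LOCAL_RANK", "-1"))
```
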
Sylvain Gugger authored

Patrick von Platen authored

Patrick von Platen authored