1. 17 Oct, 2022 1 commit
  2. 12 Oct, 2022 3 commits
  3. 10 Oct, 2022 2 commits
  4. 07 Oct, 2022 1 commit
  5. 05 Oct, 2022 1 commit
  6. 03 Oct, 2022 1 commit
  7. 28 Sep, 2022 2 commits
  8. 26 Sep, 2022 1 commit
  9. 23 Sep, 2022 1 commit
  10. 21 Sep, 2022 1 commit
  11. 20 Sep, 2022 1 commit
  12. 14 Sep, 2022 1 commit
  13. 13 Sep, 2022 1 commit
  14. 09 Sep, 2022 1 commit
  15. 07 Sep, 2022 1 commit
  16. 06 Sep, 2022 1 commit
  17. 01 Sep, 2022 1 commit
  18. 25 Aug, 2022 1 commit
  19. 24 Aug, 2022 1 commit
  20. 22 Aug, 2022 1 commit
  21. 18 Aug, 2022 2 commits
  22. 17 Aug, 2022 1 commit
  23. 16 Aug, 2022 1 commit
    • Update run_translation_no_trainer.py (#18637) · 25e651a2
      zhoutang776 authored
      * Update run_translation_no_trainer.py
      
Fixed an error in selecting `no_decay` parameters and made some small modifications for when the user continues training from a checkpoint
      
* Fix `no_decay` and `resume_step` issues
      
1. Changed the `no_decay` list
2. If users continue training their model from a provided checkpoint, `resume_step` will not be initialized properly when `args.gradient_accumulation_steps != 1`
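The two fixes above follow patterns common to the HF `no_trainer` example scripts. Below is a minimal sketch of both; the parameter-name substrings, weight-decay value, and variable names are illustrative assumptions, not copied from `run_translation_no_trainer.py`:

```python
# Sketch of the two fixes described above (names are illustrative,
# not copied from run_translation_no_trainer.py).
import torch

model = torch.nn.Linear(4, 2)

# 1. Exclude biases (and, in real scripts, layer-norm weights) from
#    weight decay by splitting parameters into two optimizer groups.
no_decay = ["bias", "layer_norm.weight"]
optimizer_grouped_parameters = [
    {
        "params": [p for n, p in model.named_parameters()
                   if not any(nd in n for nd in no_decay)],
        "weight_decay": 0.01,
    },
    {
        "params": [p for n, p in model.named_parameters()
                   if any(nd in n for nd in no_decay)],
        "weight_decay": 0.0,
    },
]
optimizer = torch.optim.AdamW(optimizer_grouped_parameters, lr=5e-5)

# 2. A checkpoint records completed *optimizer* steps. To resume at the
#    right dataloader batch, scale by gradient_accumulation_steps --
#    otherwise resume_step is wrong whenever that value is not 1.
completed_optimizer_steps = 100
gradient_accumulation_steps = 4
resume_step = completed_optimizer_steps * gradient_accumulation_steps
```

The second point is exactly why the bug only surfaced with `args.gradient_accumulation_steps != 1`: with a value of 1, optimizer steps and batch indices coincide.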
  24. 08 Aug, 2022 3 commits
    • Update no_trainer.py scripts to include accelerate gradient accumulation wrapper (#18473) · a765b68a
      Rasmus Arpe Fogh Jensen authored
      * Added accelerate gradient accumulation wrapper to run_image_classification_no_trainer.py example script
      
      * make fixup changes
      
      * PR comments
      
* Changed input to Accelerator based on PR comment, ran make fixup
      
      * Added comment explaining the sync_gradients statement
      
      * Fixed lr scheduler max steps
      
      * Changed run_clm_no_trainer.py script to use accelerate gradient accum wrapper
      
      * Fixed all scripts except wav2vec2 pretraining to use accelerate gradient accum wrapper
      
      * Added accelerate gradient accum wrapper for wav2vec2_pretraining_no_trainer.py script
      
      * make fixup and lr_scheduler step inserted back into run_qa_beam_search_no_trainer.py
      
      * removed changes to run_wav2vec2_pretraining_no_trainer.py script and fixed using wrong constant in qa_beam_search_no_trainer.py script
    • Fix compatibility with 1.12 (#17925) · 70b0d4e1
      Sylvain Gugger authored

      * Fix compatibility with 1.12
      
      * Remove pin from examples requirements
      
      * Update torch scatter version
      
      * fix torch.onnx.symbolic_opset12 import
      
      * Reject bad version
Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
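The "Reject bad version" bullet describes a common guard pattern: parse the installed dependency version and refuse a release with a known regression. A minimal sketch using `packaging`; the exact version bound rejected by the commit is an assumption here, not taken from the diff:

```python
# Sketch of rejecting a known-bad dependency version, as in the
# "Reject bad version" bullet above. The bound 1.12.0 is a
# hypothetical example, not the exact check from the commit.
from packaging import version

def torch_version_ok(torch_version: str) -> bool:
    """Return True if the given torch version is usable; reject the
    hypothetical known-bad release 1.12.0 while allowing patches."""
    v = version.parse(torch_version)
    return v != version.parse("1.12.0")
```

In practice such a check runs at import time and raises an informative error telling the user which versions are supported.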
    • regisss authored · 88a0ce57
  25. 06 Aug, 2022 2 commits
  26. 04 Aug, 2022 2 commits
  27. 03 Aug, 2022 1 commit
  28. 02 Aug, 2022 1 commit
  29. 01 Aug, 2022 3 commits