"docs/source/it/model_sharing.md" did not exist on "7dbee87e09748366765d18b9161c9957881775da"
  1. 12 Oct, 2023 1 commit
  2. 10 Oct, 2023 1 commit
  3. 03 Oct, 2023 1 commit
  4. 29 Sep, 2023 1 commit
    • [Flax Examples] Seq2Seq ASR Fine-Tuning Script (#21764) · 68e85fc8
      Sanchit Gandhi authored
      * from seq2seq speech
      
      * [Flax] Example script for speech seq2seq
      
      * tests and fixes
      
      * make style
      
      * fix: label padding tokens
      
      * fix: label padding tokens over list
      
      * update ln names for Whisper
      
      * try datasets iter loader
      
      * create readme and append results
      
      * style
      
      * make style
      
      * adjust lr
      
      * use pt dataloader
      
      * make fast
      
      * pin gen max len
      
      * finish
      
      * add pt to requirements for test
      
      * fix pt -> torch
      
      * add accelerate
  5. 22 Sep, 2023 1 commit
  6. 18 Sep, 2023 1 commit
  7. 11 Sep, 2023 2 commits
  8. 04 Sep, 2023 1 commit
  9. 21 Aug, 2023 1 commit
  10. 07 Aug, 2023 1 commit
    • Allow `trust_remote_code` in example scripts (#25248) · 14510938
      Jackmin801 authored
      * pytorch examples
      
      * pytorch mim no trainer
      
      * cookiecutter
      
      * flax examples
      
      * missed line in pytorch run_glue
      
      * tensorflow examples
      
      * tensorflow run_clip
      
      * tensorflow run_mlm
      
      * tensorflow run_ner
      
      * tensorflow run_clm
      
      * pytorch example from_configs
      
      * pytorch no trainer examples
      
      * Revert "tensorflow run_clip"
      
      This reverts commit 261f86ac1f1c9e05dd3fd0291e1a1f8e573781d5.
      
      * fix: duplicated argument
  11. 02 Aug, 2023 1 commit
  12. 28 Jul, 2023 2 commits
  13. 17 Jul, 2023 1 commit
  14. 12 Jul, 2023 1 commit
  15. 07 Jun, 2023 1 commit
  16. 22 May, 2023 1 commit
  17. 09 May, 2023 1 commit
  18. 02 May, 2023 1 commit
  19. 13 Apr, 2023 1 commit
  20. 23 Mar, 2023 1 commit
  21. 22 Mar, 2023 1 commit
  22. 14 Mar, 2023 1 commit
  23. 22 Feb, 2023 1 commit
  24. 06 Feb, 2023 1 commit
    • Update quality tooling for formatting (#21480) · 6f79d264
      Sylvain Gugger authored
      * Result of black 23.1
      
      * Update target to Python 3.7
      
      * Switch flake8 to ruff
      
      * Configure isort
      
      * Configure isort
      
      * Apply isort with line limit
      
      * Put the right black version
      
      * adapt black in check copies
      
      * Fix copies
  25. 23 Jan, 2023 1 commit
  26. 19 Jan, 2023 1 commit
  27. 18 Jan, 2023 1 commit
  28. 04 Jan, 2023 1 commit
  29. 20 Dec, 2022 1 commit
    • Fix tiny typo (#20841) · ae3cbbca
      fzyzcjy authored
      * Fix typo
      
      * Update README.md
      
      * Update run_mlm_flax_stream.py
      
      * Update README.md
  30. 01 Dec, 2022 1 commit
  31. 28 Nov, 2022 1 commit
  32. 01 Nov, 2022 1 commit
  33. 13 Oct, 2022 1 commit
  34. 10 Oct, 2022 2 commits
  35. 14 Sep, 2022 1 commit
  36. 09 Sep, 2022 1 commit
  37. 14 Aug, 2022 1 commit
    • Flax Remat for LongT5 (#17994) · d6eeb871
      Karim Foda authored
      * [Flax] Add remat (gradient checkpointing)
      
      * fix variable naming in test
      
      * flip: checkpoint using a method
      
      * fix naming
      
      * fix class naming
      
      * apply PVP's suggestions from code review
      
      * add gradient_checkpointing to examples
      
      * Add gradient_checkpointing to run_mlm_flax
      
      * Add remat to longt5
      
      * Add gradient checkpointing test longt5
      
      * Fix args errors
      
      * Fix remaining tests
      
      * Make fixup & quality fixes
      
      * replace kwargs
      
      * remove unnecessary kwargs
      
      * Make fixup changes
      
      * revert long_t5_flax changes
      
      * Remove return_dict and copy to LongT5
      
      * Remove test_gradient_checkpointing
      Co-authored-by: default avatarsanchit-gandhi <sanchit@huggingface.co>
      d6eeb871