1. 29 Oct, 2023 1 commit
  2. 29 Mar, 2023 1 commit
  3. 21 Dec, 2022 1 commit
  4. 15 Nov, 2022 1 commit
  5. 09 Sep, 2022 1 commit
  6. 06 Aug, 2022 1 commit
  7. 02 May, 2022 1 commit
  8. 27 Apr, 2022 1 commit
    • Add semantic segmentation script, trainer (#16834) · 479fdc49
      NielsRogge authored
      * Add first draft
      
      * Improve script and README
      
      * Improve README
      
      * Apply suggestions from code review
      
      * Improve script, add link to resulting model
      
      * Add corresponding test
      
      * Adjust learning rate
  9. 19 Apr, 2022 1 commit
    • Add image classification script, no trainer (#16727) · b96e82c8
      NielsRogge authored
      * Add first draft
      
      * Improve README and run fixup
      
      * Make script aligned with other scripts, improve README
      
      * Improve script and add test
      
      * Remove print statement
      
      * Apply suggestions from code review
      
      * Add num_labels to make test pass
      
      * Improve README
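The script added in this commit is not reproduced in the log, but the "no trainer" variants all follow the same pattern: a plain PyTorch training loop driven by the accelerate library instead of the Trainer class. Below is a minimal sketch of that pattern only, not the script itself; the checkpoint name, label count, and toy data are placeholder assumptions.

```python
import torch
from accelerate import Accelerator
from torch.utils.data import DataLoader, TensorDataset
from transformers import AutoModelForImageClassification

# Toy stand-in data so the sketch runs end to end; replace with a real image dataset.
pixel_values = torch.randn(16, 3, 224, 224)
labels = torch.randint(0, 10, (16,))
train_dataloader = DataLoader(TensorDataset(pixel_values, labels), batch_size=4, shuffle=True)

# num_labels must match the dataset (cf. the "Add num_labels" bullet above);
# the checkpoint name here is only an example.
model = AutoModelForImageClassification.from_pretrained(
    "google/vit-base-patch16-224-in21k", num_labels=10
)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# accelerate handles device placement and, if configured, distributed training.
accelerator = Accelerator()
model, optimizer, train_dataloader = accelerator.prepare(model, optimizer, train_dataloader)

model.train()
for batch_pixels, batch_labels in train_dataloader:
    outputs = model(pixel_values=batch_pixels, labels=batch_labels)
    accelerator.backward(outputs.loss)
    optimizer.step()
    optimizer.zero_grad()
```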
  10. 25 Mar, 2022 1 commit
  11. 23 Mar, 2022 1 commit
  12. 17 Feb, 2022 1 commit
  13. 21 Oct, 2021 1 commit
  14. 20 Oct, 2021 1 commit
  15. 08 Oct, 2021 1 commit
  16. 21 Sep, 2021 1 commit
  17. 29 Apr, 2021 1 commit
  18. 26 Apr, 2021 2 commits
  19. 21 Apr, 2021 1 commit
  20. 29 Mar, 2021 1 commit
  21. 22 Mar, 2021 1 commit
    • feat(wandb): logging and configuration improvements (#10826) · 125ccead
      Boris Dayma authored
      * feat: ensure unique artifact id
      
      * feat: allow manual init
      
      * fix: simplify reinit logic
      
      * fix: no dropped value + immediate commits
      
      * fix: wandb use in sagemaker
      
* docs: improve documentation and formatting
      
      * fix: typos
      
      * docs: improve formatting
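The changes above live inside the Trainer's Weights & Biases callback; from a user's perspective the integration is driven by a few environment variables and TrainingArguments fields. A minimal configuration sketch, with the project and run names as placeholder assumptions:

```python
import os
from transformers import TrainingArguments

# The W&B integration reads its project from the environment;
# "my-project" and "my-run" below are placeholder names.
os.environ["WANDB_PROJECT"] = "my-project"
os.environ["WANDB_LOG_MODEL"] = "true"  # also upload the trained model as a W&B artifact

training_args = TrainingArguments(
    output_dir="out",
    report_to="wandb",   # route Trainer logs to Weights & Biases
    run_name="my-run",   # displayed as the run name in the W&B UI
    logging_steps=50,
)
```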
  22. 19 Mar, 2021 1 commit
  23. 18 Mar, 2021 1 commit
    • [examples/seq2seq/README.md] fix t5 examples (#10734) · 9352b515
      Stas Bekman authored
      * [examples/seq2seq] fix t5 examples
      
      This PR:
* fixes the T5 examples to include `--source_prefix` - it is **not** optional: without it BLEU scores are roughly 10x worse (`27.6849` with the prefix vs. `2.374` without; see the sketch after this commit message)
      * added a normal translation example w/o the peculiarities of MBart and T5
      * reduces the default max samples to 50 so it's much faster to test quickly
      
      summarization seems to be broken for t5 score-wise: https://github.com/huggingface/transformers/issues/10733
      
      @sgugger
      
      * specify explicitly the t5 models requiring the special handling
      
      * one more
      
      * update the t5 summarization example to use cnn_dailymail
      
* move `max_*_samples` into the top-level README.md
      
      * better wording
      
      * better wording
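The reason `--source_prefix` is mandatory is that T5 was pre-trained with explicit task prefixes, so the translation example script has to prepend one to every source sentence. A small sketch of the same idea outside the script; the checkpoint and the input sentence are arbitrary choices:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Equivalent of passing --source_prefix "translate English to Romanian: " to the
# translation example script: the prefix tells T5 which task to perform,
# and dropping it is what tanks the BLEU score mentioned above.
prefix = "translate English to Romanian: "
inputs = tokenizer(prefix + "The cat sits on the mat.", return_tensors="pt")
outputs = model.generate(**inputs, max_length=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```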
  24. 17 Mar, 2021 1 commit
  25. 15 Mar, 2021 1 commit
  26. 09 Feb, 2021 1 commit
  27. 23 Jan, 2021 1 commit
  28. 21 Dec, 2020 1 commit
  29. 18 Dec, 2020 1 commit
  30. 16 Dec, 2020 1 commit
  31. 11 Dec, 2020 1 commit
  32. 08 Dec, 2020 1 commit
  33. 07 Dec, 2020 1 commit
  34. 09 Nov, 2020 2 commits
  35. 05 Nov, 2020 1 commit
  36. 30 Oct, 2020 1 commit
  37. 27 Oct, 2020 2 commits