1. 12 Dec, 2021 1 commit
  2. 06 Dec, 2021 2 commits
  3. 22 Nov, 2021 1 commit
  4. 30 Sep, 2021 1 commit
    • [examples/flax] use Repository API for push_to_hub (#13672) · 7db2a79b
      Suraj Patil authored
      * use Repository for push_to_hub
      
      * update readme
      
      * update other flax scripts
      
      * update readme
      
      * update qa example
      
      * fix push_to_hub call
      
      * fix typo
      
      * fix more typos
      
      * update readme
      
      * use absolute path to get repo name
      
      * fix glue script
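One of the fixes in the commit above derives the Hub repo name from the absolute output path, so a relative path like `.` resolves to the real directory name. A minimal sketch of that pattern (the function name `repo_name_from_output_dir` is illustrative, not the script's actual helper):

```python
from pathlib import Path

def repo_name_from_output_dir(output_dir: str) -> str:
    """Derive a Hub repo name from the last component of the
    absolute output path. Resolving to an absolute path first
    matters because Path(".").name is the empty string, while
    Path(".").absolute().name is the current directory's name."""
    return Path(output_dir).absolute().name

print(repo_name_from_output_dir("/tmp/my-flax-model"))  # my-flax-model
```

In the actual scripts this name is then passed to the `huggingface_hub` Repository machinery when cloning/pushing.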
  5. 28 Aug, 2021 1 commit
  6. 27 Aug, 2021 1 commit
  7. 09 Aug, 2021 1 commit
  8. 27 Jul, 2021 1 commit
  9. 20 Jul, 2021 1 commit
  10. 09 Jul, 2021 1 commit
  11. 08 Jul, 2021 1 commit
  12. 07 Jul, 2021 1 commit
  13. 06 Jul, 2021 1 commit
  14. 05 Jul, 2021 2 commits
  15. 29 Jun, 2021 1 commit
  16. 28 Jun, 2021 1 commit
  17. 25 Jun, 2021 1 commit
  18. 11 Jun, 2021 1 commit
    • Flax CLM script (#12023) · 15b498f3
      Suraj Patil authored
      * first draft
      
      * max_seq_length => block_size
      
      * fix arg names
      
      * fix typos
      
      * fix loss calculation
      
      * add max examples, fix train/eval steps, metrics
      
      * optimizer mask
      
      * fix perplexity, metric logging
      
      * fix logging
      
      * data_collator => data_loader
      
      * refactor loss_fn
      
      * support single GPU
      
      * pass distributed to write_metric
      
      * fix jitting
      
      * fix single device training
      
      * fix single device metrics
      
      * close inner progress bars once finished
      
      * add overwrite_cache arg
      
      * fix dataset caching issue
      
      * add more logs
      
      * few small fixes
      
      * address Nicholas's suggestions
      
      * fix docstr
      
      * address Patrick's suggestions
      
      * make flake happy
      
      * pass new_dropout_rng to apply_gradients
      
      * reset train metrics after every epoch
      
      * remove distributed logic, small fixes
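Among the fixes listed above is the perplexity metric. For a causal LM, perplexity is simply the exponential of the mean per-token cross-entropy loss; a minimal plain-Python sketch of that relationship (the cap value is an illustrative assumption, used only to guard against overflow when the loss is still large early in training, not the script's actual code):

```python
import math

def perplexity(mean_loss: float, cap: float = 1e4) -> float:
    """Perplexity of a causal LM: exp(mean cross-entropy loss).
    The cap is a hypothetical guard so exp() of a large early-training
    loss does not overflow into a meaningless metric value."""
    return min(math.exp(mean_loss), cap)

print(perplexity(2.0))   # ~7.389, i.e. exp(2.0)
print(perplexity(50.0))  # hits the cap: 10000.0
```

A usage note: in the training loop this is computed from the epoch-mean loss, which is why the "reset train metrics after every epoch" fix above matters; stale accumulated losses would skew the reported perplexity.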
  19. 08 Jun, 2021 1 commit
  20. 25 May, 2021 3 commits
  21. 12 May, 2021 2 commits
  22. 11 May, 2021 1 commit
  23. 29 Apr, 2021 1 commit
  24. 26 Apr, 2021 1 commit
  25. 23 Apr, 2021 1 commit
  26. 21 Apr, 2021 1 commit
  27. 13 Apr, 2021 1 commit
  28. 09 Apr, 2021 1 commit
  29. 08 Apr, 2021 1 commit
  30. 07 Apr, 2021 2 commits
  31. 06 Apr, 2021 2 commits
  32. 31 Mar, 2021 1 commit
  33. 16 Mar, 2021 1 commit