1. 06 Dec, 2021 1 commit
    • Add Flax example tests (#14599) · c5bd732a
      Suraj Patil authored
      * add test for glue
      
      * add tests for clm
      
      * fix clm test
      
      * add summarization tests
      
      * more tests
      
      * fix a few tests
      
      * add test for t5 mlm
      
      * fix t5 mlm test
      
      * fix tests for multi device
      
      * cleanup
      
      * ci job
      
      * fix metric file name
      
      * make t5 more robust
  2. 09 Nov, 2021 1 commit
  3. 30 Sep, 2021 1 commit
    • [examples/flax] use Repository API for push_to_hub (#13672) · 7db2a79b (a usage sketch follows this entry)
      Suraj Patil authored
      * use Repository for push_to_hub
      
      * update readme
      
      * update other flax scripts
      
      * update readme
      
      * update qa example
      
      * fix push_to_hub call
      
      * fix typo
      
      * fix more typos
      
      * update readme
      
      * use absolute path to get repo name
      
      * fix glue script
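      A minimal sketch, assuming the `Repository` class from `huggingface_hub`, of how a training script can push its output directory to the Hub; the directory name, repo id, and commit message are illustrative placeholders, not taken from the Flax scripts.

      ```python
      # Sketch: push a local output directory to the Hub with the Repository API.
      # output_dir and repo_id are hypothetical placeholders.
      from huggingface_hub import Repository

      output_dir = "./flax-glue-output"      # local directory the training script writes to
      repo_id = "username/flax-glue-model"   # target model repo on the Hub

      # Clone the Hub repo into (or reuse) the local output directory.
      repo = Repository(output_dir, clone_from=repo_id)

      # ... the training loop saves checkpoints and logs into output_dir ...

      # Commit everything currently in output_dir and push it, without blocking training.
      repo.push_to_hub(commit_message="End of training", blocking=False)
      ```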
  4. 02 Aug, 2021 1 commit
  5. 28 Jun, 2021 1 commit
  6. 25 Jun, 2021 1 commit
  7. 15 Jun, 2021 1 commit
  8. 14 Jun, 2021 1 commit
  9. 03 Jun, 2021 1 commit
    • Fix weight decay masking in `run_flax_glue.py` (#11964) · 4674061b
      Nicholas Vadivelu authored
      
      
      * Fix weight decay masking in `run_flax_glue.py`
      
      Issues with the previous implementation (a sketch of the corrected masking follows this entry):
      - The `dict` returned by `traverse_util.flatten_dict` has keys that are tuples of strings, not single strings with the path components joined by periods.
      - `optax.masked` applies the transformation wherever the mask is True, so the masks were inverted.
      - Flax's LayerNorm calls its scale parameter `scale`, not `weight`.
      
      * Fix formatting with black
      
      * adapt results
      Co-authored-by: Patrick von Platen <patrick@huggingface.co>
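      A minimal, self-contained sketch of the corrected masking logic described in this commit message; the toy parameter tree, the `optax.masked`/`add_decayed_weights` combination, and the hyperparameters are illustrative rather than the exact code in `run_flax_glue.py`.

      ```python
      # Sketch: weight-decay mask over a Flax-style parameter tree.
      import jax.numpy as jnp
      import optax
      from flax import traverse_util

      def decay_mask_fn(params):
          # flatten_dict keys are tuples of path components,
          # e.g. ("LayerNorm", "scale"), not period-joined strings.
          flat_params = traverse_util.flatten_dict(params)
          # True means "apply weight decay"; biases and LayerNorm scales are excluded.
          # Flax's LayerNorm names its affine parameter "scale", not "weight".
          flat_mask = {
              path: path[-1] != "bias" and path[-2:] != ("LayerNorm", "scale")
              for path in flat_params
          }
          return traverse_util.unflatten_dict(flat_mask)

      # Toy parameter tree standing in for a real model's params.
      params = {
          "dense": {"kernel": jnp.ones((4, 4)), "bias": jnp.zeros((4,))},
          "LayerNorm": {"scale": jnp.ones((4,)), "bias": jnp.zeros((4,))},
      }

      # optax.masked applies the inner transformation wherever the mask is True,
      # so the mask must be True exactly for the parameters that should be decayed.
      tx = optax.chain(
          optax.masked(optax.add_decayed_weights(1e-2), decay_mask_fn),
          optax.sgd(learning_rate=1e-3),
      )
      opt_state = tx.init(params)
      ```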
  10. 31 May, 2021 1 commit
    • Remove redundant `nn.log_softmax` in `run_flax_glue.py` (#11920) · 1ab147d6
      Nicholas Vadivelu authored
      * Remove redundant `nn.log_softmax` in `run_flax_glue.py`
      
      `optax.softmax_cross_entropy` expects unnormalized logits and already applies `log_softmax` internally, so the extra call should not be needed here. Since `log_softmax` is idempotent, the redundant call made no mathematical difference (see the sketch after this entry).
      
      * Remove unused 'flax.linen' import
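      A small sketch of the equivalence the commit message relies on, using `jax.nn.log_softmax` in place of the `nn.log_softmax` referenced above; the logits and labels are illustrative values.

      ```python
      # Sketch: optax.softmax_cross_entropy already log-softmaxes its logits input,
      # and log_softmax is idempotent, so pre-normalizing changes nothing.
      import jax
      import jax.numpy as jnp
      import optax

      logits = jnp.array([[2.0, -1.0, 0.5]])          # raw, unnormalized model outputs
      labels = jax.nn.one_hot(jnp.array([0]), num_classes=3)

      loss_direct = optax.softmax_cross_entropy(logits, labels)
      loss_prenormalized = optax.softmax_cross_entropy(jax.nn.log_softmax(logits), labels)

      # Both values are identical, so the explicit log_softmax call can be dropped.
      assert jnp.allclose(loss_direct, loss_prenormalized)
      ```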
  11. 21 May, 2021 2 commits
  12. 19 May, 2021 1 commit
  13. 17 May, 2021 1 commit
  14. 14 May, 2021 2 commits
  15. 12 May, 2021 1 commit
  16. 11 May, 2021 1 commit