1. 12 Feb, 2021 1 commit
  2. 05 Feb, 2021 1 commit
  3. 03 Feb, 2021 1 commit
  4. 02 Feb, 2021 1 commit
  5. 27 Jan, 2021 1 commit
  6. 20 Jan, 2021 1 commit
  7. 11 Jan, 2021 1 commit
  8. 08 Jan, 2021 3 commits
  9. 05 Jan, 2021 1 commit
      [fix] Flaky tests (#283) · 79365ee6
      Benjamin Lefaudeux authored
      * adding the pytest-timeout plugin to properly root out hanging tests
      * removing redundant code; a slightly more reasonable timeout; works on a single CUDA device
      * finding the root cause of some of the CPU hangs: RPC init
      * propagating all the RPC-init test changes to the pipe and model-parallel tests
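      A minimal config sketch of what adding the pytest-timeout plugin can look like; the values and method below are illustrative, not the ones this PR actually chose:

      ```ini
      ; pytest.ini — hypothetical settings, not the PR's actual values
      [pytest]
      timeout = 60          ; fail any test that runs longer than 60 s
      timeout_method = thread
      ```

      With a global timeout in place, a hanging test fails loudly instead of stalling the whole CI run, which is what makes the underlying hangs (here, RPC init) visible and debuggable.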
  10. 29 Dec, 2020 1 commit
  11. 22 Dec, 2020 1 commit
      [OSS] Balance the trainable params only (#262) · c386e937
      Benjamin Lefaudeux authored
      * fix, one liner
      
      * adjust so that frozen trunks still get spread across ranks, even if this should have little consequence
      
      * removing dead code; hopeful unit-test fix
      
      * now with some linting
      
      * adding a proper unit test case
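      A hedged sketch of the idea behind this change: greedy bin-packing of parameters across ranks, where only trainable parameters count toward a shard's load, so frozen parameters are still spread but do not skew the balance of optimizer work. All names here are illustrative, not fairscale's actual API:

      ```python
      # Hypothetical sketch of trainable-only shard balancing (not fairscale's code).
      # Each param is a dict with "numel" (element count) and "trainable" (bool).
      def partition_trainable(params, world_size):
          shards = [[] for _ in range(world_size)]
          loads = [0] * world_size
          # Largest-first greedy packing tends to give a better balance.
          for p in sorted(params, key=lambda q: q["numel"], reverse=True):
              rank = loads.index(min(loads))  # currently lightest shard
              shards[rank].append(p)
              if p["trainable"]:
                  # Frozen params are assigned but do not add to the load,
                  # so they cannot unbalance the trainable work per rank.
                  loads[rank] += p["numel"]
          return shards, loads
      ```

      With this scheme a large frozen trunk lands on some rank without "filling it up", so the trainable parameters still split evenly.
      
      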
  12. 06 Dec, 2020 1 commit
  13. 16 Nov, 2020 1 commit
  14. 06 Nov, 2020 1 commit
  15. 14 Oct, 2020 2 commits
  16. 08 Oct, 2020 1 commit
  17. 15 Sep, 2020 2 commits
  18. 09 Sep, 2020 1 commit
      [feat] OSS flatten state dict (#65) · 4f597233
      Benjamin Lefaudeux authored
      Changes the structure of the returned state dict with respect to param_groups, making it closer to what a vanilla optimizer would return: the groups are un-sharded on save and sharded again on load.
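      The un-shard-on-save / re-shard-on-load round trip described above can be sketched as follows; this is a minimal illustration assuming per-rank lists of param-group dicts, with hypothetical helper names rather than fairscale's actual implementation:

      ```python
      # Illustrative sketch only: flatten per-rank param_groups into one list
      # (what a vanilla optimizer's state_dict would expose), and split back.
      def unshard_param_groups(sharded):
          """sharded: list with one entry per rank, each a list of group dicts."""
          flat = []
          for rank_groups in sharded:
              flat.extend(rank_groups)
          return flat

      def reshard_param_groups(flat, partition_sizes):
          """Split a flat param_group list back into per-rank lists on load."""
          out, i = [], 0
          for n in partition_sizes:
              out.append(flat[i:i + n])
              i += n
          return out
      ```

      The point of the round trip is that a checkpoint saved by the sharded optimizer looks like a plain optimizer's checkpoint to any downstream consumer.
      
      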
  19. 08 Sep, 2020 1 commit
      [feat] OSS: Sync all attributes (#67) · 5a268b25
      Benjamin Lefaudeux authored
      Make sure that all attributes (not just the LR) stay in sync between OSS.param_groups and the actual wrapped optimizer. Some frameworks make it possible to alter any attribute on a scheduled basis, which can prove useful depending on the optimizer, so the keys need to be supported generically (not just "lr"). Previously these adjustments were silently not propagated; this fixes that.
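      A minimal sketch of the generic-key sync described above, assuming dict-shaped param groups as in PyTorch optimizers; the function name is hypothetical and the real fairscale code may differ:

      ```python
      # Hedged sketch: propagate every hyperparameter key (not just "lr")
      # from the wrapper's param_groups to the wrapped optimizer's groups.
      def sync_param_groups(source_groups, target_groups):
          for src, dst in zip(source_groups, target_groups):
              # "params" holds the tensors themselves and must not be overwritten;
              # everything else (lr, momentum, weight_decay, ...) is copied over.
              dst.update({k: v for k, v in src.items() if k != "params"})
      ```

      Copying all keys except "params" is what makes scheduler tweaks to momentum, weight decay, or any custom attribute reach the wrapped optimizer instead of being silently dropped.
      
      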
  20. 03 Sep, 2020 1 commit
      [fix] OSS pytorch-compliant state dict (#61) · 1d1d15ea
      Benjamin Lefaudeux authored
      * Aligning the optimizer state dict with what PyTorch expects
      
      * Adding a check on the dict keys, ensure that `state` and `param_groups` are there
      
      * after installing the pinned isort and black versions, a one-liner to please the linter
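      The key check mentioned above can be sketched as a tiny validator; this is an assumed shape (a PyTorch optimizer state_dict exposes "state" and "param_groups"), and the function name is illustrative:

      ```python
      # Minimal sketch of the compliance check: a PyTorch-style optimizer
      # state_dict must contain both "state" and "param_groups".
      def check_optimizer_state_dict(state_dict):
          missing = {"state", "param_groups"} - state_dict.keys()
          if missing:
              raise KeyError(f"optimizer state_dict missing keys: {sorted(missing)}")
          return True
      ```

      Failing fast on a malformed state dict is cheaper than debugging a silent mismatch after `load_state_dict`.
      
      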
  21. 28 Aug, 2020 1 commit
  22. 27 Aug, 2020 3 commits
  23. 20 Aug, 2020 1 commit
  24. 14 Aug, 2020 1 commit
  25. 13 Aug, 2020 1 commit
  26. 08 Aug, 2020 1 commit
  27. 31 Jul, 2020 1 commit
  28. 08 Jul, 2020 1 commit