  1. 06 Dec, 2022 1 commit
  2. 26 Aug, 2022 1 commit
  3. 08 Aug, 2022 1 commit
  4. 14 Dec, 2021 1 commit
  5. 21 Jan, 2021 1 commit
  6. 15 Jan, 2021 1 commit
  7. 31 Dec, 2020 1 commit
  8. 21 May, 2020 1 commit
  9. 20 May, 2020 2 commits
  10. 19 May, 2020 1 commit
  11. 15 May, 2020 2 commits
  12. 13 May, 2020 1 commit
  13. 22 Apr, 2020 1 commit
    • Fix LARC with mixed precision (#793) · 2ec84ebd
      Vinicius Reis authored
      The LARC optimizer wraps an underlying optimizer and then needs to be passed
      to amp.initialize for mixed precision. Three different crashes occurred in
      this situation; this change fixes all of them and adds a unit test.
      
      I don't know whether the 'LARC' in sys.modules check ever worked; in my setup,
      the entry in sys.modules is 'apex.parallel.LARC'. Checking whether the variable
      is defined seems more reliable.
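      For reference, a minimal sketch of the usage pattern this commit exercises
      (the model size, learning rate, and opt_level are illustrative, not the
      repository's actual test code):

          import torch
          from apex import amp
          from apex.parallel.LARC import LARC

          model = torch.nn.Linear(16, 4).cuda()

          # LARC wraps a regular optimizer; the wrapped optimizer is then passed
          # to amp.initialize exactly like a plain optimizer would be.
          base_optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
          optimizer = LARC(base_optimizer)

          model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

          inputs = torch.randn(8, 16).cuda()
          loss = model(inputs).sum()
          with amp.scale_loss(loss, optimizer) as scaled_loss:
              scaled_loss.backward()
          optimizer.step()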
  14. 27 Feb, 2020 1 commit
  15. 03 Oct, 2019 1 commit
  16. 27 Aug, 2019 1 commit
    • Enable Checkpointing (#420) · dec4fdd6
      ptrblck authored
      * add state_dict, load_state_dict
      
      * add test_restoring, test_loss_scale_decrease
      
      * disable amp outputs for checkpoint tests
      
      * add test for amp.state_dict, cleanup
      
      * add state_dict patch, add test
      
      * fixed testing, cleanup
      
      * add readme for checkpointing
      
      * add docs to source/amp
      
      * add review changes to doc
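      For reference, the checkpointing pattern this commit introduces saves and
      restores amp's state (the loss scalers) alongside the model and optimizer;
      the file name and opt_level below are placeholders, a minimal sketch rather
      than the exact documented snippet:

          import torch
          from apex import amp

          model = torch.nn.Linear(16, 4).cuda()
          optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
          model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

          # ... train for a while, then save model, optimizer, and amp state together.
          checkpoint = {
              "model": model.state_dict(),
              "optimizer": optimizer.state_dict(),
              "amp": amp.state_dict(),
          }
          torch.save(checkpoint, "checkpoint.pt")

          # To restore, call amp.initialize (with the same opt_level) before
          # loading the saved amp state.
          checkpoint = torch.load("checkpoint.pt")
          model.load_state_dict(checkpoint["model"])
          optimizer.load_state_dict(checkpoint["optimizer"])
          amp.load_state_dict(checkpoint["amp"])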
  17. 03 Jul, 2019 1 commit
  18. 31 May, 2019 1 commit
  19. 27 May, 2019 2 commits
  20. 16 May, 2019 1 commit
  21. 02 May, 2019 1 commit
  22. 10 Apr, 2019 1 commit
  23. 04 Apr, 2019 1 commit
    • WIP: Handle arbitrary combinations of optimizers/models/losses (#232) · 3f87614f
      mcarilli authored
      * Refactor to allow more flexible treatment of multiple optimizers/models/losses
      
      * Adding _process_optimizers.py
      
      * Created L0 tests (now passing).
      
      * fix: minor print typo (#234)
      
      * make L1 results easier to read
      
      * L0 multiple model/optimizer/loss test fleshed out
      
      * Adding test that master params remain synced across distributed processes
      
      * Docstring updates
      
      * Docstring updates
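      For reference, a rough sketch of the multiple-model/optimizer/loss interface
      this refactor enables (the models, optimizers, and hyperparameters are
      made-up examples in the style of apex's advanced-use documentation):

          import torch
          from apex import amp

          model0 = torch.nn.Linear(16, 4).cuda()
          model1 = torch.nn.Linear(16, 4).cuda()
          optimizer0 = torch.optim.SGD(model0.parameters(), lr=0.1)
          optimizer1 = torch.optim.Adam(model1.parameters(), lr=1e-3)

          # amp.initialize accepts lists of models and optimizers; num_losses
          # tells amp to maintain a separate loss scale per loss.
          [model0, model1], [optimizer0, optimizer1] = amp.initialize(
              [model0, model1], [optimizer0, optimizer1],
              opt_level="O1", num_losses=2)

          inputs = torch.randn(8, 16).cuda()
          loss0 = model0(inputs).sum()
          loss1 = model1(inputs).sum()

          # Each backward pass names its loss via loss_id so the matching
          # loss scaler is used.
          with amp.scale_loss(loss0, optimizer0, loss_id=0) as scaled_loss:
              scaled_loss.backward()
          with amp.scale_loss(loss1, optimizer1, loss_id=1) as scaled_loss:
              scaled_loss.backward()

          optimizer0.step()
          optimizer1.step()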
  24. 19 Mar, 2019 1 commit
  25. 10 Mar, 2019 1 commit
  26. 26 Feb, 2019 1 commit