"llama/llama.cpp/vscode:/vscode.git/clone" did not exist on "ad035ad595295d0026a5a94f8180962bbf0fa935"
  1. 31 Mar, 2020 1 commit
  2. 27 Feb, 2020 1 commit
  3. 03 Oct, 2019 1 commit
  4. 03 Sep, 2019 1 commit
    • Fix issues in fused_adam (#469) · 7fa74925
      Deyu Fu authored
      * move import of amp_C to __init__()
      
      * make fp16/32 separate lists to support mixed param types, disable double test
      
      * make zero_grad consistent between adam/novograd/lamb
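The "separate fp16/32 lists" change above amounts to bucketing a parameter group by dtype before handing it to the fused kernel (the commit also moves the amp_C import into __init__ so the CUDA extension is only required once the optimizer is actually constructed). A minimal sketch of the bucketing pattern; the helper name split_params_by_dtype is hypothetical, not apex code:

```python
import torch

def split_params_by_dtype(params):
    """Hypothetical helper illustrating the 'separate fp16/32 lists'
    change: bucket a parameter group by dtype so a mixed-precision
    optimizer can process homogeneous tensor lists."""
    fp16, fp32 = [], []
    for p in params:
        if p.dtype == torch.float16:
            fp16.append(p)
        elif p.dtype == torch.float32:
            fp32.append(p)
        else:
            raise TypeError(f"unsupported param dtype: {p.dtype}")
    return fp16, fp32
```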
  5. 27 Aug, 2019 1 commit
    • Enable Checkpointing (#420) · dec4fdd6
      ptrblck authored
      * add state_dict, load_state_dict
      
      * add test_restoring, test_loss_scale_decrease
      
      * disable amp outputs for checkpoint tests
      
      * add test for amp.state_dict, cleanup
      
      * add state_dict patch, add test
      
      * fixed testing, cleanup
      
      * add readme for checkpointing
      
      * add docs to source/amp
      
      * add review changes to doc
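The checkpointing API this commit introduces saves and restores amp's loss-scaler state alongside the model and optimizer. A minimal sketch of the pattern the accompanying README documents; the Linear model, SGD optimizer, and O1 opt level here are placeholder choices:

```python
import torch
from apex import amp  # requires NVIDIA apex

model = torch.nn.Linear(4, 2).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

# Save amp state (loss scale etc.) alongside model and optimizer state.
checkpoint = {
    "model": model.state_dict(),
    "optimizer": optimizer.state_dict(),
    "amp": amp.state_dict(),
}
torch.save(checkpoint, "amp_checkpoint.pt")

# To restore, amp.initialize must already have been called (as above),
# then all three state dicts are loaded back.
checkpoint = torch.load("amp_checkpoint.pt")
model.load_state_dict(checkpoint["model"])
optimizer.load_state_dict(checkpoint["optimizer"])
amp.load_state_dict(checkpoint["amp"])
```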
  6. 17 Aug, 2019 1 commit
  7. 15 Aug, 2019 1 commit
  8. 13 Aug, 2019 2 commits
  9. 12 Aug, 2019 1 commit
  10. 08 Aug, 2019 1 commit
  11. 06 Aug, 2019 1 commit
    • Clean up layer norm tests (#418) · 3ef01fae
      ngimel authored
      * Bug fix for non-affine layer-norm + add backward unit test
      
      * clean up tests and add tests for a large batch
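The non-affine fix plus backward test described above can be illustrated by checking apex's fused layer norm against the PyTorch reference with elementwise_affine=False, forward and backward, on a large batch. A sketch under those assumptions; the shapes and tolerances are illustrative, not the test suite's actual values:

```python
import torch
from apex.normalization import FusedLayerNorm  # requires NVIDIA apex

def check_nonaffine_layernorm(batch=65536, hidden=512):
    torch.manual_seed(0)
    x_ref = torch.randn(batch, hidden, device="cuda", requires_grad=True)
    x_fused = x_ref.detach().clone().requires_grad_(True)

    ref = torch.nn.LayerNorm(hidden, elementwise_affine=False).cuda()
    fused = FusedLayerNorm(hidden, elementwise_affine=False).cuda()

    # Forward outputs should match the reference implementation.
    y_ref, y_fused = ref(x_ref), fused(x_fused)
    torch.testing.assert_close(y_fused, y_ref, rtol=1e-3, atol=1e-3)

    # Backward: feed both the same upstream gradient, compare input grads.
    grad = torch.randn_like(y_ref)
    y_ref.backward(grad)
    y_fused.backward(grad)
    torch.testing.assert_close(x_fused.grad, x_ref.grad, rtol=1e-3, atol=1e-3)
```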
  12. 03 Jul, 2019 1 commit
  13. 31 May, 2019 1 commit
  14. 27 May, 2019 2 commits
  15. 16 May, 2019 1 commit
  16. 02 May, 2019 1 commit
  17. 10 Apr, 2019 3 commits
  18. 04 Apr, 2019 1 commit
    • WIP: Handle arbitrary combinations of optimizers/models/losses (#232) · 3f87614f
      mcarilli authored
      * Refactor to allow more flexible treatment of multiple optimizers/models/losses
      
      * Adding _process_optimizers.py
      
      * Created L0 tests (now passing).
      
      * fix: minor print typo (#234)
      
      * make L1 results easier to read
      
      * L0 multiple model/optimizer/loss test fleshed out
      
      * Adding test that master params remain synced across distributed processes
      
      * Docstring updates
      
      * Docstring updates
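This refactor is what enables amp's documented multi-model/multi-optimizer/multi-loss usage: lists of models and optimizers pass through amp.initialize together, and each backward pass is wrapped in amp.scale_loss with its own loss_id so every loss keeps an independent loss scale. A condensed sketch; the two Linear models and the shapes are placeholders:

```python
import torch
from apex import amp  # requires NVIDIA apex

model0 = torch.nn.Linear(8, 8).cuda()
model1 = torch.nn.Linear(8, 8).cuda()
opt0 = torch.optim.SGD(model0.parameters(), lr=0.1)
opt1 = torch.optim.SGD(model1.parameters(), lr=0.1)

# Models and optimizers are initialized together; num_losses gives each
# loss its own dynamic loss scaler.
[model0, model1], [opt0, opt1] = amp.initialize(
    [model0, model1], [opt0, opt1], opt_level="O1", num_losses=2)

x = torch.randn(4, 8, device="cuda")
loss0 = model0(x).float().mean()
loss1 = model1(x).float().mean()

# Each backward names the optimizer whose params it touches and its loss_id.
with amp.scale_loss(loss0, opt0, loss_id=0) as scaled_loss:
    scaled_loss.backward()
opt0.step()

with amp.scale_loss(loss1, opt1, loss_id=1) as scaled_loss:
    scaled_loss.backward()
opt1.step()
```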
  19. 19 Mar, 2019 1 commit
  20. 13 Mar, 2019 1 commit
  21. 10 Mar, 2019 1 commit
  22. 26 Feb, 2019 1 commit