1. 08 Aug, 2019 1 commit
  2. 01 Aug, 2019 1 commit
  3. 12 Jul, 2019 1 commit
  4. 03 Jul, 2019 2 commits
  5. 28 Jun, 2019 1 commit
  6. 14 Jun, 2019 1 commit
  7. 11 Jun, 2019 1 commit
  8. 31 May, 2019 2 commits
  9. 26 Apr, 2019 1 commit
    • Replace type().ScalarType() with scalar_type() (#272) · 855808f3
      ptrblck authored
      * change .type().ScalarType() to .scalar_type() + at::ScalarType::X to at::kX
      
      * revert scalar_type() to type() for AT_DISPATCH_FLOATING_TYPES_AND_HALF
      
      * revert scalar_type() to type() in AT_DISPATCH_FLOATING_TYPES
      
      * revert scalar_type() to type() for AT_DISPATCH_FLOATING_TYPES_AND_HALF in welford.cu
      
      * revert scalar_type() to type() in layer_norm_cuda_kernel.cu
      
      * revert at::kType to at::ScalarType::Type
      
      * use DISPATCH_FLOAT_AND_HALF to get rid of warnings
      
      * add dispatch mechanisms for double+float and double+float+half
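For readers unfamiliar with the dispatch macros this commit touches, here is a rough Python analogue of what an AT_DISPATCH-style macro does in C++: select the kernel instantiation matching a scalar type and reject everything else. All names below are illustrative; this is not the real ATen API.

```python
# Rough Python analogue of an AT_DISPATCH-style macro: pick the kernel
# instantiation that matches a scalar type, error out on the rest.
# Everything here is illustrative; it is not the real ATen API.

def scale_kernel(xs, alpha):
    # Stand-in for the templated kernel body shared by all dtypes.
    return [x * alpha for x in xs]

# The set of scalar types this "macro" handles.
SUPPORTED = {"float": scale_kernel, "half": scale_kernel}

def dispatch_float_and_half(dtype, *args):
    # Like a DISPATCH_FLOAT_AND_HALF helper: restricting the dtype set
    # up front avoids unhandled-scalar-type warnings at the call site.
    try:
        kernel = SUPPORTED[dtype]
    except KeyError:
        raise TypeError(f"unsupported scalar type: {dtype}")
    return kernel(*args)

result = dispatch_float_and_half("half", [1.0, 2.0], 2.0)
```

The double+float and double+float+half variants the last bullet mentions would just be further SUPPORTED tables over larger dtype sets.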
  10. 10 Apr, 2019 2 commits
  11. 09 Apr, 2019 1 commit
  12. 08 Apr, 2019 1 commit
  13. 04 Apr, 2019 1 commit
    • WIP: Handle arbitrary combinations of optimizers/models/losses (#232) · 3f87614f
      mcarilli authored
      * Refactor to allow more flexible treatment of multiple optimizers/models/losses
      
      * Adding _process_optimizers.py
      
      * Created L0 tests (now passing).
      
      * fix: minor print typo (#234)
      
      * make L1 results easier to read
      
      * L0 multiple model/optimizer/loss test fleshed out
      
      * Adding test that master params remain synced across distributed processes
      
      * Docstring updates
      
      * Docstring updates
  14. 21 Mar, 2019 2 commits
  15. 19 Mar, 2019 2 commits
  16. 15 Mar, 2019 1 commit
  17. 12 Mar, 2019 1 commit
  18. 10 Mar, 2019 2 commits
  19. 03 Mar, 2019 1 commit
  20. 28 Feb, 2019 1 commit
  21. 24 Feb, 2019 1 commit
  22. 22 Feb, 2019 1 commit
  23. 19 Feb, 2019 1 commit
  24. 13 Feb, 2019 1 commit
  25. 11 Feb, 2019 1 commit
  26. 08 Feb, 2019 1 commit
  27. 06 Feb, 2019 2 commits
  28. 05 Feb, 2019 1 commit
  29. 04 Feb, 2019 1 commit
  30. 01 Feb, 2019 1 commit
  31. 18 Jan, 2019 1 commit
  32. 15 Jan, 2019 1 commit
    • [sync BN nhwc] · 443fa76e
      Jie authored
      Added a kernel to support sync BN for channels-last (NHWC) tensors.
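The channels-last (NHWC) layout this commit targets stores the channel dimension innermost in memory, so all channels of one pixel sit contiguously. A quick illustration with NumPy standing in for the tensor library (shapes are made up):

```python
import numpy as np

# NCHW: channels-first, the default contiguous layout.
nchw = np.zeros((2, 3, 4, 5), dtype=np.float32)  # N, C, H, W

# NHWC: channels-last; once contiguous, the C dimension has the
# smallest stride, i.e. the channels of one pixel are adjacent.
nhwc = np.ascontiguousarray(nchw.transpose(0, 2, 3, 1))  # N, H, W, C

print(nchw.strides)  # (240, 80, 20, 4): W is innermost
print(nhwc.strides)  # (240, 60, 12, 4): C is innermost
```

A sync BN kernel reduces over N, H, and W per channel, so the two layouts need different memory-access patterns to stay coalesced, which is why a separate kernel is added.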
  33. 06 Nov, 2018 1 commit
    • [syncBN] · ee67e56a
      Jie authored
      Adjusted kernel launch config for better performance; removed warp divergence in the Welford warp reduction.
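The Welford reduction mentioned in both syncBN commits computes mean and variance in one pass, and partial results can be merged pairwise, which is what a warp reduction does across lanes. A minimal pure-Python sketch of the serial update and the pairwise merge (the Chan et al. formulation); variable names are ours, not the kernel's:

```python
def welford(values):
    """One-pass mean/variance: returns (count, mean, M2); var = M2/count."""
    count, mean, m2 = 0, 0.0, 0.0
    for x in values:
        count += 1
        delta = x - mean
        mean += delta / count
        m2 += delta * (x - mean)
    return count, mean, m2

def welford_merge(a, b):
    """Combine two partial aggregates, as each warp-reduction step would."""
    na, ma, m2a = a
    nb, mb, m2b = b
    n = na + nb
    delta = mb - ma
    mean = ma + delta * nb / n
    m2 = m2a + m2b + delta * delta * na * nb / n
    return n, mean, m2

# Splitting the data and merging matches the single-pass result.
full = welford([1.0, 2.0, 3.0, 4.0])
merged = welford_merge(welford([1.0, 2.0]), welford([3.0, 4.0]))
```

In the CUDA kernel the merge runs over shuffled lane values; keeping every lane on the same code path through that merge is what removing the divergence refers to.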