1. 01 Mar, 2022 1 commit
  2. 04 Feb, 2022 1 commit
  3. 03 Feb, 2022 1 commit
  4. 01 Feb, 2022 2 commits
  5. 31 Jan, 2022 1 commit
  6. 29 Jan, 2022 1 commit
  7. 27 Jan, 2022 1 commit
  8. 26 Jan, 2022 1 commit
  9. 25 Jan, 2022 1 commit
  10. 24 Jan, 2022 2 commits
  11. 22 Jan, 2022 1 commit
  12. 11 Jan, 2022 2 commits
  13. 10 Jan, 2022 1 commit
  14. 08 Jan, 2022 1 commit
  15. 07 Jan, 2022 1 commit
  16. 17 Dec, 2021 3 commits
  17. 10 Dec, 2021 1 commit
  18. 22 Nov, 2021 1 commit
  19. 11 Nov, 2021 1 commit
    • persistent fused layer norm · a2fdcdf0
      Sangkug Lym authored
      Fix the guard to fall back to the baseline fused layer norm kernel.
      
      Persistent ln: move the guard for supported hidden sizes to the layer norm module.
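The guard described in this commit can be sketched as a simple dispatch on hidden size: use the persistent kernel when the size is supported, otherwise fall back to the baseline fused kernel. The supported-size set below is illustrative only, not the actual list the kernel accepts.

```python
# Hedged sketch of the fallback guard described in the commit above.
# SUPPORTED_PERSISTENT_SIZES is a hypothetical placeholder, not the
# real set of hidden sizes the persistent kernel supports.
SUPPORTED_PERSISTENT_SIZES = {1024, 1536, 2048, 3072, 4096}

def select_layernorm_kernel(hidden_size):
    """Return which fused layer-norm kernel to use for this hidden size."""
    if hidden_size in SUPPORTED_PERSISTENT_SIZES:
        return "persistent"
    # Guard: unsupported sizes fall back to the baseline fused kernel.
    return "baseline"
```

Moving this check into the layer norm module (as the commit does) keeps the decision in one place instead of at each call site.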
  20. 29 Oct, 2021 1 commit
  21. 12 Oct, 2021 1 commit
  22. 08 Oct, 2021 1 commit
  23. 06 Oct, 2021 1 commit
  24. 01 Oct, 2021 1 commit
  25. 29 Sep, 2021 1 commit
  26. 27 Sep, 2021 1 commit
  27. 20 Sep, 2021 1 commit
  28. 27 Aug, 2021 1 commit
  29. 21 Aug, 2021 4 commits
  30. 19 Aug, 2021 2 commits
  31. 30 Jul, 2021 1 commit
    • Support for pipeline parallelism in T5 model · 46c74b4c
      Deepak Narayanan authored
      - Accumulate encoder hidden state gradient to handle the skip connection
      - Correctly compute the number of layers in the encoder / decoder for the T5 model
      - Ensure weights are initialized the same way in embeddings
      - Synchronize embedding gradients across encoder and decoder for the T5 model
      - Support checkpoint loading and saving
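The embedding-gradient synchronization bullet above can be sketched in miniature. In the real model this would be a `torch.distributed.all_reduce` over the shared embedding weight gradients held by the encoder and decoder pipeline stages; here plain Python lists stand in for tensors, and the function name is a hypothetical placeholder.

```python
# Hedged sketch: tying embedding gradients across two pipeline stages
# that each hold a copy of the shared embedding. Summing the copies and
# handing the result back to both stages mimics an all-reduce, so both
# stages apply the identical weight update.
def sync_embedding_grads(encoder_grad, decoder_grad):
    """Combine the two gradient copies so both stages update identically."""
    summed = [e + d for e, d in zip(encoder_grad, decoder_grad)]
    return summed, list(summed)

enc, dec = [0.1, -0.2], [0.3, 0.05]
enc_synced, dec_synced = sync_embedding_grads(enc, dec)
```

Without this synchronization, the encoder's and decoder's copies of the embedding would drift apart after the first optimizer step.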