  2. 11 Nov, 2021 1 commit
    • persistent fused layer norm · a2fdcdf0
      Sangkug Lym authored
      Fix the guard so it falls back to the baseline fused layer norm kernel.
      
      Persistent LN: move the guard for supported hidden sizes into the layer norm module.
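The guard this commit moves can be sketched as a simple dispatch on hidden size: the persistent kernel handles only a fixed set of hidden sizes, and everything else falls back to the baseline fused kernel. This is a minimal sketch; the function name `select_layer_norm_kernel` and the set of supported sizes are illustrative assumptions, not the actual Megatron-LM/apex implementation.

```python
# Illustrative set of hidden sizes the persistent kernel handles
# (hypothetical values, not the real list from the fused kernel).
PERSISTENT_LN_SUPPORTED_HIDDEN_SIZES = {1024, 1536, 2048, 3072, 4096}

def select_layer_norm_kernel(hidden_size):
    """Pick a layer-norm kernel for the given hidden size.

    The persistent kernel only supports specific hidden sizes; any other
    size must fall back to the baseline fused layer norm kernel.
    """
    if hidden_size in PERSISTENT_LN_SUPPORTED_HIDDEN_SIZES:
        return "persistent"
    return "baseline"
```

Placing this check inside the layer norm module (rather than at the call site) keeps callers from having to know which sizes the persistent kernel supports.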
  11. 30 Jul, 2021 1 commit
    • Support for pipeline parallelism in T5 model · 46c74b4c
      Deepak Narayanan authored
      - Accumulate encoder hidden state gradient to handle the skip connection
      - Correctly compute the number of layers in the encoder / decoder for the T5 model
      - Ensure weights are initialized the same way in embeddings
      - Synchronize embedding gradients across encoder and decoder for the T5 model
      - Support checkpoint loading and saving
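The second bullet above, computing the number of layers per pipeline stage separately for the encoder and decoder, can be sketched as follows. The helper names (`layers_per_stage`, `t5_stage_layers`) and the scheme of assigning stages before a split rank to the encoder and the rest to the decoder are assumptions for illustration, not the commit's actual code.

```python
def layers_per_stage(num_layers, num_stages):
    """Evenly split transformer layers across pipeline stages.

    Requires the layer count to divide evenly by the stage count.
    """
    assert num_layers % num_stages == 0, "layers must divide evenly across stages"
    return num_layers // num_stages

def t5_stage_layers(encoder_layers, decoder_layers, num_stages, split_rank):
    """Hypothetical helper: stages [0, split_rank) hold encoder layers,
    stages [split_rank, num_stages) hold decoder layers.

    Returns (encoder layers per encoder stage, decoder layers per decoder stage).
    """
    enc_per_stage = layers_per_stage(encoder_layers, split_rank)
    dec_per_stage = layers_per_stage(decoder_layers, num_stages - split_rank)
    return enc_per_stage, dec_per_stage
```

For example, a 12-layer encoder and 12-layer decoder over 4 stages with a split at rank 2 gives 6 encoder layers on each of the first two stages and 6 decoder layers on each of the last two.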
  15. 14 May, 2021 1 commit
    • Update arguments checks. · 8044c7b4
      Jared Casper authored
      The hidden_size % attention_heads == 0 check is already handled above when deriving kv_channels.
      
      Add a check for the decoder sequence length.
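The validation logic this commit describes can be sketched roughly as below: deriving kv_channels from hidden_size and the attention head count implies the divisibility check, and the decoder sequence length is required separately. The function name `validate_args` and its signature are hypothetical, not Megatron-LM's actual argument parser.

```python
def validate_args(hidden_size, num_attention_heads, kv_channels=None,
                  decoder_seq_length=None, is_encoder_decoder=False):
    """Hypothetical sketch of the argument checks the commit describes.

    Deriving kv_channels subsumes the hidden_size % num_attention_heads == 0
    check, so no separate divisibility assertion is needed later.
    """
    if kv_channels is None:
        assert hidden_size % num_attention_heads == 0, (
            "hidden_size must be divisible by num_attention_heads")
        kv_channels = hidden_size // num_attention_heads
    if is_encoder_decoder:
        # The new check: encoder-decoder models need a decoder sequence length.
        assert decoder_seq_length is not None, (
            "decoder_seq_length must be set for encoder-decoder models")
    return kv_channels
```

When kv_channels is passed explicitly, the divisibility constraint no longer applies, which is why the check lives in the derivation branch rather than standing alone.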