1. 18 Mar, 2022 3 commits
  2. 14 Mar, 2022 1 commit
  3. 18 Feb, 2022 1 commit
  4. 16 Feb, 2022 1 commit
    • gradient accumulation fusion · 83b1e42f
      Sangkug Lym authored
      remove redundant linear layer class definition
      
      add fuse_gradient_accumulation attribute to weights for simple targeting
      
      reflect feedback and clean up the code
      
      arg change
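      The commit above tags weight tensors with an attribute so the fused
      backward path can find them. A minimal, hypothetical sketch of that
      tagging pattern follows; the helper name and layer shapes are invented,
      and only the fuse_gradient_accumulation attribute name is taken from
      the commit message.

      ```python
      from torch import nn

      def tag_weights_for_grad_accum_fusion(module: nn.Module) -> None:
          """Hypothetical helper: mark weight parameters so downstream
          code can target them for fused gradient accumulation with a
          simple getattr check."""
          for name, param in module.named_parameters():
              if name.endswith("weight"):
                  # Tensors accept ad-hoc Python attributes, so the flag
                  # travels with the parameter itself.
                  param.fuse_gradient_accumulation = True

      # Downstream code would branch on the flag, e.g. to accumulate the
      # weight gradient into a preallocated buffer instead of param.grad.
      layer = nn.Linear(1024, 4096)
      tag_weights_for_grad_accum_fusion(layer)
      assert getattr(layer.weight, "fuse_gradient_accumulation", False)
      assert not getattr(layer.bias, "fuse_gradient_accumulation", False)
      ```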
  5. 15 Feb, 2022 4 commits
  6. 11 Feb, 2022 1 commit
  7. 10 Feb, 2022 1 commit
  8. 04 Feb, 2022 1 commit
  9. 01 Feb, 2022 3 commits
  10. 28 Jan, 2022 1 commit
  11. 27 Jan, 2022 1 commit
  12. 26 Jan, 2022 1 commit
  13. 25 Jan, 2022 1 commit
  14. 24 Jan, 2022 3 commits
  15. 22 Jan, 2022 1 commit
  16. 19 Jan, 2022 1 commit
  17. 13 Jan, 2022 1 commit
  18. 12 Jan, 2022 1 commit
  19. 04 Jan, 2022 1 commit
  20. 17 Dec, 2021 1 commit
  21. 24 Nov, 2021 1 commit
  22. 23 Nov, 2021 1 commit
  23. 22 Nov, 2021 1 commit
  24. 11 Nov, 2021 1 commit
    • persistent fused layer norm · a2fdcdf0
      Sangkug Lym authored
      fix the guard to fall back to the baseline fused layer norm kernel
      
      Persistent LN: move the guard for supported hidden sizes to the layer norm module
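      To illustrate where the commit moves the guard, here is a minimal,
      hypothetical sketch: the hidden-size set is an invented illustrative
      subset, and nn.LayerNorm stands in for both the persistent and the
      baseline fused kernels.

      ```python
      from torch import nn

      # Illustrative subset of supported sizes; the authoritative list
      # belongs with the persistent kernel itself.
      PERSISTENT_LN_HIDDEN_SIZES = {1024, 1536, 2048, 3072, 4096, 8192}

      class GuardedLayerNorm(nn.Module):
          """Hypothetical sketch: the layer norm module owns the guard,
          choosing the persistent kernel only for supported hidden sizes
          and otherwise falling back to the baseline fused kernel."""

          def __init__(self, hidden_size: int, eps: float = 1e-5):
              super().__init__()
              self.use_persistent = hidden_size in PERSISTENT_LN_HIDDEN_SIZES
              # Stand-in for both CUDA kernels; the real module would bind
              # the persistent or baseline implementation via this flag.
              self.norm = nn.LayerNorm(hidden_size, eps=eps)

          def forward(self, x):
              return self.norm(x)

      # A hidden size outside the set silently takes the baseline path.
      assert GuardedLayerNorm(1024).use_persistent
      assert not GuardedLayerNorm(1000).use_persistent
      ```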
  25. 10 Oct, 2021 1 commit
  26. 02 Sep, 2021 3 commits
  27. 23 Aug, 2021 1 commit
  28. 21 Aug, 2021 2 commits