  1. 30 Mar, 2022 1 commit
  2. 09 Mar, 2022 1 commit
  3. 07 Mar, 2022 2 commits
  4. 03 Mar, 2022 2 commits
  5. 02 Mar, 2022 1 commit
  6. 19 Feb, 2022 1 commit
  7. 18 Feb, 2022 1 commit
  8. 17 Feb, 2022 2 commits
  9. 16 Feb, 2022 1 commit
    • gradient accumulation fusion · 83b1e42f
      Sangkug Lym authored

      Remove a redundant linear layer class definition.

      Add a fuse_gradient_accumulation attribute to weights for simple targeting.

      Reflect review feedback and clean up the code.

      Argument change.
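The "simple targeting" idea from this commit message can be sketched in plain Python: tag the relevant weights with a `fuse_gradient_accumulation` attribute, then later code selects exactly those parameters by checking the flag. This is a minimal illustrative sketch, not the actual code from the commit; the `Param` class, the helper names, and the layer layout are assumptions.

```python
class Param:
    """Minimal stand-in for a framework parameter tensor (illustrative)."""
    def __init__(self, name):
        self.name = name
        # Attribute named in the commit message; defaults to "not fused".
        self.fuse_gradient_accumulation = False

def make_linear_weights(n_layers):
    """Create weights/biases and mark only the weights for fused accumulation."""
    params = []
    for i in range(n_layers):
        w = Param(f"layer{i}.weight")
        w.fuse_gradient_accumulation = True  # tag for simple targeting
        b = Param(f"layer{i}.bias")          # biases are left untagged
        params.extend([w, b])
    return params

def fused_targets(params):
    """Select exactly the parameters tagged for gradient accumulation fusion."""
    return [p for p in params if getattr(p, "fuse_gradient_accumulation", False)]

params = make_linear_weights(2)
print([p.name for p in fused_targets(params)])  # only the tagged weights
```

The attribute-on-the-parameter approach keeps the selection logic trivial: any later pass (for example, one installing fused gradient-accumulation hooks) only has to check one flag instead of pattern-matching on layer types or names.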
  10. 08 Feb, 2022 1 commit
  11. 04 Feb, 2022 1 commit
  12. 01 Feb, 2022 1 commit
  13. 31 Jan, 2022 1 commit
  14. 25 Jan, 2022 1 commit
  15. 24 Jan, 2022 3 commits
  16. 12 Jan, 2022 1 commit
  17. 11 Jan, 2022 3 commits
  18. 10 Jan, 2022 1 commit
  19. 08 Jan, 2022 1 commit
  20. 07 Jan, 2022 1 commit
  21. 17 Dec, 2021 2 commits
  22. 22 Nov, 2021 1 commit
  23. 05 Nov, 2021 1 commit
  24. 03 Sep, 2021 1 commit
  25. 02 Sep, 2021 3 commits
  26. 31 Aug, 2021 1 commit
  27. 19 Aug, 2021 3 commits
  28. 16 Aug, 2021 1 commit
    • Destroy more groups in `destroy_model_parallel` · eddf7593
      eqy authored

      Some tests expect a clean model-parallel slate and complain if a previous test left state behind; this change clears more of the variables those tests check.