- 18 Jan, 2022 1 commit
Frank Lee authored
- 13 Dec, 2021 1 commit
Frank Lee authored
- 10 Dec, 2021 1 commit
Frank Lee authored
- 18 Nov, 2021 1 commit
Frank Lee authored
* Add gradient accumulation, fix lr scheduler
* fix FP16 optimizer and adapted torch amp with tensor parallel (#18)
  * fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes
  * fixed trainer
  * Revert "fixed trainer" — this reverts commit 2e0b0b76990e8d4e337add483d878c0f61cf5097
* improved consistency between trainer, engine and schedule (#23)

Co-authored-by: 1SAA <c2h214748@gmail.com>
Co-authored-by: ver217 <lhx0217@gmail.com>
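The commit above mentions gradient accumulation. As a minimal illustrative sketch of that general technique (not this repository's actual implementation, and all names below are hypothetical): gradients from several micro-batches are averaged before a single optimizer step, emulating a larger effective batch size.

```python
def run_accumulation(micro_batch_grads, accum_steps):
    """Return the effective gradient applied at each optimizer step,
    averaging over each group of `accum_steps` micro-batches.

    Hypothetical helper for illustration only; in a real training loop
    the accumulation happens implicitly via repeated loss.backward()
    calls before optimizer.step().
    """
    applied = []
    acc = 0.0
    for i, grad in enumerate(micro_batch_grads, start=1):
        acc += grad / accum_steps   # scale so the sum becomes a mean
        if i % accum_steps == 0:
            applied.append(acc)     # this is where optimizer.step() would run
            acc = 0.0               # and optimizer.zero_grad()
    return applied

# With accum_steps=2, four micro-batch gradients yield two optimizer steps:
# run_accumulation([1.0, 3.0, 2.0, 6.0], 2) → [2.0, 4.0]
```

Scaling each micro-batch gradient by `1/accum_steps` keeps the update magnitude comparable to a single large batch, which matters when the learning-rate schedule (also touched by this commit) was tuned for the larger batch size.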
- 03 Nov, 2021 1 commit
binmakeswell authored
- 29 Oct, 2021 1 commit
ver217 authored
- 28 Oct, 2021 2 commits