- 29 Nov, 2021 1 commit
  binmakeswell authored
- 18 Nov, 2021 2 commits
  ver217 authored
  Frank Lee authored
    * Add gradient accumulation, fix lr scheduler
    * Fix FP16 optimizer and adapt torch amp to tensor parallel (#18)
      * fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes
      * fixed trainer
      * Revert "fixed trainer"
        This reverts commit 2e0b0b76990e8d4e337add483d878c0f61cf5097.
    * Improve consistency between trainer, engine and schedule (#23)
    Co-authored-by: 1SAA <c2h214748@gmail.com>
    Co-authored-by: ver217 <lhx0217@gmail.com>
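The "Add gradient accumulation" commit above refers to a standard training technique: gradients from several micro-batches are summed and averaged before a single optimizer step, emulating a larger effective batch size. A minimal, dependency-free sketch of the idea follows; the names `grad_fn`, `train`, and `accum_steps` are illustrative only and are not Colossal-AI's actual API.

```python
# Gradient accumulation sketch: sum micro-batch gradients, then apply
# one optimizer (plain SGD) step every `accum_steps` micro-batches.
# This emulates a batch size of accum_steps * micro_batch_size.

def grad_fn(w, batch):
    # Gradient of the mean squared error 0.5*(w*x - y)^2 over a micro-batch
    # of (x, y) pairs, for a scalar weight w.
    return sum((w * x - y) * x for x, y in batch) / len(batch)

def train(w, micro_batches, lr=0.1, accum_steps=2):
    accum = 0.0
    for i, batch in enumerate(micro_batches, start=1):
        accum += grad_fn(w, batch)           # accumulate, do not step yet
        if i % accum_steps == 0:
            w -= lr * (accum / accum_steps)  # one step for the large batch
            accum = 0.0                      # reset the accumulation window
    return w
```

Stepping once on the averaged accumulated gradient is mathematically equivalent (for SGD) to a single step on the concatenated large batch, which is why the technique pairs naturally with the lr-scheduler fix mentioned in the same commit.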
- 15 Nov, 2021 2 commits
- 04 Nov, 2021 1 commit
  ver217 authored
    made some modifications to the documents
- 03 Nov, 2021 2 commits
  binmakeswell authored
  Frank Lee authored
    added Chinese documents and fixed some typos in English documents
- 02 Nov, 2021 2 commits
- 01 Nov, 2021 1 commit
  ver217 authored
- 29 Oct, 2021 1 commit
  ver217 authored
- 28 Oct, 2021 4 commits