- 08 Apr, 2022 1 commit
  ver217 authored
- 16 Mar, 2022 1 commit
  Jiarui Fang authored
  * better logger using rich
  * remove deepspeed in zero requirements
- 15 Feb, 2022 1 commit
  ver217 authored
- 28 Jan, 2022 1 commit
  BoxiangW authored
  * Update GitHub action and pre-commit settings
  * Update GitHub action and pre-commit settings (#198)
- 18 Nov, 2021 1 commit
  Frank Lee authored
  * Add gradient accumulation, fix lr scheduler
  * fix FP16 optimizer and adapted torch amp with tensor parallel (#18)
  * fixed bugs in compatibility between torch amp and tensor parallel and performed some minor fixes
  * fixed trainer
  * Revert "fixed trainer". This reverts commit 2e0b0b76990e8d4e337add483d878c0f61cf5097.
  * improved consistency between trainer, engine and schedule (#23)
  Co-authored-by: 1SAA <c2h214748@gmail.com>
  Co-authored-by: ver217 <lhx0217@gmail.com>
- 28 Oct, 2021 1 commit
  zbian authored