- 14 Aug, 2020 5 commits
  - Jun Ru Anderson authored
    Add support for mixed-precision (half-precision params, full-precision gradients) and memory-efficient (half-precision params and half-precision gradients) training with Adam (a minimal sketch of the master-weights pattern follows this group).
    Co-authored-by: Jun Ru Anderson <andersonic@fb.com>
  - Benjamin Lefaudeux authored
    * hotfix a half-cooked optimizer state restoration: the global shared state also needs to be restored
    * [cleanup] get 100% coverage on oss.py (#38)
      authored-by: Mandeep Singh Baines <msb@fb.com>
    * better unit testing: check that the .param_groups attribute is properly in sync with the loaded state (a round-trip sketch follows this group)
    Co-authored-by: msbaines <35972327+msbaines@users.noreply.github.com>
  - msbaines authored
    authored-by: Mandeep Singh Baines <msb@fb.com>
  - msbaines authored
    * Set baseline coverage to 94%
  - msbaines authored
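
The mixed-precision mode in the first commit of the group above keeps the parameters in half precision while the optimizer works on full-precision copies; the memory-efficient mode keeps the gradients in half precision as well. A minimal sketch of the underlying master-weights pattern, using the stock `torch.optim.Adam` rather than the optimizer added in that commit (all names below are illustrative):

```python
import torch

# Illustrative sketch only (not fairscale's API): FP16 parameters with FP32
# master copies that a stock Adam actually updates.
params_fp16 = [torch.randn(4, 4, dtype=torch.float16, requires_grad=True)]
params_fp32 = [p.detach().float().clone().requires_grad_(True) for p in params_fp16]
opt = torch.optim.Adam(params_fp32, lr=1e-3)

# Forward/backward produce half-precision gradients on the FP16 leaves.
loss = (params_fp16[0].float() ** 2).sum()
loss.backward()

with torch.no_grad():
    for p16, p32 in zip(params_fp16, params_fp32):
        p32.grad = p16.grad.float()   # full-precision gradients for the Adam update
    opt.step()
    for p16, p32 in zip(params_fp16, params_fp32):
        p16.copy_(p32.half())         # write the updated weights back as FP16
        p16.grad = None
# The memory-efficient variant would skip the FP32 gradient copy and let the
# optimizer consume the FP16 gradients directly.
```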
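
The state-restoration hotfix in the second commit of that group comes down to the usual `state_dict()` / `load_state_dict()` round trip; the `.param_groups` check it adds can be illustrated with a plain `torch.optim.SGD` standing in for `OSS` (this is not the test added in the commit):

```python
import torch

# Round-trip sanity check with a stock optimizer (not fairscale.optim.OSS):
# save the state, drift a hyper-parameter, restore, and confirm that
# .param_groups is back in sync with the loaded state.
params = [torch.nn.Parameter(torch.randn(3))]
opt = torch.optim.SGD(params, lr=0.1, momentum=0.9)

params[0].grad = torch.randn(3)
opt.step()                               # populate the momentum buffer

saved = opt.state_dict()                 # {'state': ..., 'param_groups': ...}
opt.param_groups[0]["lr"] = 0.5          # local change not reflected in `saved`

opt.load_state_dict(saved)
assert opt.param_groups[0]["lr"] == 0.1  # restored alongside the per-param state
```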

- 13 Aug, 2020 5 commits
  - msbaines authored
  - Jun Ru Anderson authored
  - msbaines authored
  - Jun Ru Anderson authored
    Co-authored-by: Jun Ru Anderson <andersonic@fb.com>
  - Benjamin Lefaudeux authored
    Aligning OSS state dict with `https://pytorch.org/docs/stable/_modules/torch/optim/optimizer.html#Optimizer` (#31) (the standard layout is sketched after this group)
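
For reference, the layout being aligned with in the last commit above is the standard `torch.optim.Optimizer` one: a dict with a `state` entry (per-parameter buffers keyed by index) and a `param_groups` entry. Inspecting it with a stock optimizer rather than `OSS`:

```python
import torch

# The standard torch.optim.Optimizer state-dict layout that OSS is aligned with:
# {'state': {param_index: {...buffers...}}, 'param_groups': [{...options...}]}
params = [torch.nn.Parameter(torch.randn(2, 2))]
opt = torch.optim.Adam(params, lr=1e-3)

params[0].grad = torch.randn(2, 2)
opt.step()                               # create the per-parameter Adam buffers

sd = opt.state_dict()
print(sorted(sd.keys()))                 # ['param_groups', 'state']
print(sorted(sd["state"][0].keys()))     # ['exp_avg', 'exp_avg_sq', 'step']
print(sd["param_groups"][0]["lr"])       # 0.001
```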

- 08 Aug, 2020 1 commit
  - Min Xu authored
    Co-authored-by: Min Xu <m1n@fb.com>

- 06 Aug, 2020 2 commits
  - Min Xu authored
    Co-authored-by: Min Xu <m1n@fb.com>
  - Min Xu authored
    Co-authored-by: Min Xu <m1n@fb.com>

- 01 Aug, 2020 1 commit
  - msbaines authored

- 31 Jul, 2020 11 commits
  - msbaines authored
  - msbaines authored
  - Benjamin Lefaudeux authored
  - Jun Ru Anderson authored
    Add FusedAdam, update the benchmark, and add tests (a hedged usage sketch follows this group).
    Co-authored-by: Jun Ru Anderson <andersonic@fb.com>
  - Min Xu authored
  - msbaines authored
  - Tom Birch authored
  - msbaines authored
  - Jun Ru Anderson authored
    Co-authored-by: Jun Ru Anderson <andersonic@fb.com>
  - Jun Ru Anderson authored
  - Jun Ru Anderson authored
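
A fused Adam is normally a drop-in replacement for `torch.optim.Adam`, with the parameter update performed in a single fused CUDA kernel. A hedged usage sketch for the FusedAdam commit above; the import path is an assumption, and the stock optimizer is used as a fallback where the fused one (CUDA-only) is unavailable:

```python
import torch

# Hedged sketch: the fairscale import path below is an assumption, and the
# fused kernel needs CUDA, so fall back to the stock optimizer otherwise.
try:
    from fairscale.optim import Adam as FusedAdam  # assumed import path
    assert torch.cuda.is_available()
    device, opt_cls = "cuda", FusedAdam
except (ImportError, AssertionError):
    device, opt_cls = "cpu", torch.optim.Adam

model = torch.nn.Linear(8, 2).to(device)
opt = opt_cls(model.parameters(), lr=1e-3, betas=(0.9, 0.999), eps=1e-8)

loss = model(torch.randn(4, 8, device=device)).sum()
loss.backward()
opt.step()
opt.zero_grad()
```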

- 08 Jul, 2020 1 commit
  - Mandeep Singh Baines authored