- 08 Sep, 2020 1 commit
Benjamin Lefaudeux authored
Make sure that all attributes (not just the learning rate) stay in sync between OSS.param_groups and the actual wrapped optimizer. Some frameworks make it possible to alter any param_group attribute on a schedule, which can be useful depending on the optimizer, so the keys need to be supported generically (not just "lr"). Not syncing these attributes was a silent failure mode, since the scheduled adjustments were simply never propagated; this fixes that.
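A minimal sketch of the idea, assuming a wrapper that keeps its own `param_groups` alongside the sharded optimizer's (the function and argument names below are hypothetical, not the actual fairscale API):

```python
# Hypothetical sketch: copy every scheduled attribute (lr, momentum, weight_decay, ...)
# from the wrapper's param_groups into the wrapped optimizer's param_groups,
# instead of only propagating "lr".
def sync_param_groups(wrapper_groups, local_groups):
    for wrapper_group, local_group in zip(wrapper_groups, local_groups):
        for key, value in wrapper_group.items():
            if key != "params":  # the parameter lists are sharded, leave them alone
                local_group[key] = value
```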
- 03 Sep, 2020 1 commit
Benjamin Lefaudeux authored
* Aligning the optimizer state dict with what PyTorch expects
* Adding a check on the dict keys, ensuring that `state` and `param_groups` are present
* After installing the pinned isort, black and friends, a one-liner to please the linter
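A small sketch of what such a key check could look like (the function name is made up for illustration):

```python
# Hypothetical sketch: reject a state dict that does not follow the
# {"state": ..., "param_groups": ...} layout that torch.optim.Optimizer uses.
def validate_state_dict(state_dict):
    for key in ("state", "param_groups"):
        if key not in state_dict:
            raise KeyError(f"optimizer state dict is missing the '{key}' key")
```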
- 28 Aug, 2020 1 commit
msbaines authored
* [fix] optim/oss: work correctly with LRScheduler. Sync the lr before every step and before consolidating the state.
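Roughly, the fix amounts to pulling the scheduler-updated lr from the wrapper's param_groups into the wrapped optimizer before it is used; a hedged sketch (the class and method names below are placeholders, not the real OSS code):

```python
import copy

class ShardedOptimizerSketch:
    def __init__(self, wrapped_optimizer):
        self.optim = wrapped_optimizer
        # The wrapper exposes its own param_groups; an LRScheduler attached to
        # the wrapper mutates these, not the wrapped optimizer's.
        self.param_groups = copy.deepcopy(wrapped_optimizer.param_groups)

    def _sync_lr(self):
        for wrapper_group, local_group in zip(self.param_groups, self.optim.param_groups):
            local_group["lr"] = wrapper_group["lr"]

    def step(self, closure=None):
        self._sync_lr()  # the scheduler may have changed the lr since the last step
        return self.optim.step(closure=closure)
```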
- 27 Aug, 2020 3 commits
- 20 Aug, 2020 1 commit
Benjamin Lefaudeux authored
* Move the restored param groups to the original device
* Add a corresponding test
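A possible sketch of the device fixup, assuming the state was loaded onto the CPU first (the helper name is hypothetical):

```python
import torch

# Hypothetical sketch: after loading a checkpoint, move every tensor in the
# restored per-parameter state back to the device the parameters live on.
def state_to_device(state, device):
    for param_state in state.values():
        for key, value in param_state.items():
            if torch.is_tensor(value):
                param_state[key] = value.to(device)
```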
- 14 Aug, 2020 1 commit
Benjamin Lefaudeux authored
* Hotfix a half-cooked optimizer state restoration: the global shared state also needs to be restored
* [cleanup] get 100% coverage on oss.py (#38), authored-by: Mandeep Singh Baines <msb@fb.com>
* Better unit testing: check that the .param_groups attribute is properly in sync with the loaded state
Co-authored-by: msbaines <35972327+msbaines@users.noreply.github.com>
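A test along the lines described above could look roughly like this (a hypothetical helper, not the actual unit test):

```python
# Hypothetical sketch: after load_state_dict(), the live param_groups should
# mirror the attributes stored in the loaded state dict.
def assert_param_groups_in_sync(optimizer, loaded_state_dict):
    for loaded_group, live_group in zip(loaded_state_dict["param_groups"], optimizer.param_groups):
        for key in loaded_group:
            if key != "params":
                assert live_group[key] == loaded_group[key]
```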
- 13 Aug, 2020 1 commit
Benjamin Lefaudeux authored
Aligning OSS state dict with `https://pytorch.org/docs/stable/_modules/torch/optim/optimizer.html#Optimizer` (#31)
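For reference, the layout that torch.optim.Optimizer.state_dict() returns, which the OSS state dict is being aligned with (plain PyTorch, shown here only as an illustration):

```python
import torch

model = torch.nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

sd = opt.state_dict()
# The standard PyTorch layout: per-parameter state plus hyperparameter groups.
assert set(sd.keys()) == {"state", "param_groups"}
```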
- 08 Aug, 2020 1 commit
Min Xu authored
Co-authored-by: Min Xu <m1n@fb.com>
- 31 Jul, 2020 1 commit
Benjamin Lefaudeux authored
- 08 Jul, 2020 1 commit
Mandeep Singh Baines authored