[PyTorch] Use same API in optimizer `zero_grad` as PyTorch optimizers (#1466)
Use same API in optimizer `zero_grad` as PyTorch optimizers

Signed-off-by: Tim Moon <tmoon@nvidia.com>
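The change aligns the custom optimizer's `zero_grad` signature with `torch.optim.Optimizer.zero_grad(set_to_none=True)`. A minimal sketch of what matching that API looks like, using a hypothetical `CustomOptimizer` and a plain-Python `Param` stand-in for a tensor with a `.grad` attribute (no actual PyTorch dependency):

```python
# Sketch of an optimizer exposing the same `zero_grad` signature as
# `torch.optim.Optimizer.zero_grad(set_to_none=True)`. `Param` and
# `CustomOptimizer` are illustrative stand-ins, not the real classes.

class Param:
    def __init__(self, grad=None):
        self.grad = grad

class CustomOptimizer:
    def __init__(self, params):
        self.params = list(params)

    def zero_grad(self, set_to_none=True):
        # PyTorch semantics: by default gradients are set to None
        # (saves memory and skips a kernel launch); with
        # set_to_none=False they are zeroed in place instead.
        for p in self.params:
            if p.grad is None:
                continue
            if set_to_none:
                p.grad = None
            else:
                p.grad = [0.0] * len(p.grad)

params = [Param(grad=[1.0, 2.0]), Param(grad=[3.0])]
opt = CustomOptimizer(params)
opt.zero_grad()                    # default: grads become None
print(params[0].grad, params[1].grad)

params2 = [Param(grad=[1.0, 2.0])]
opt2 = CustomOptimizer(params2)
opt2.zero_grad(set_to_none=False)  # grads zeroed in place
print(params2[0].grad)
```

Keeping the `set_to_none` keyword (and its `True` default) lets code that loops over heterogeneous optimizers call `opt.zero_grad(set_to_none=...)` uniformly.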