Unverified commit 3f4c9fd0 authored by Wang Xinjiang, committed by GitHub

Fix possible bugs caused by dumping grad_norm to the logger in PyTorch 1.5 (#349)

parent f28a7c7e
@@ -23,6 +23,6 @@ class OptimizerHook(Hook):
             grad_norm = self.clip_grads(runner.model.parameters())
             if grad_norm is not None:
                 # Add grad norm to the logger
-                runner.log_buffer.update({'grad_norm': grad_norm},
+                runner.log_buffer.update({'grad_norm': float(grad_norm)},
                                          runner.outputs['num_samples'])
         runner.optimizer.step()
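
For context, the cast matters because PyTorch 1.5 changed `torch.nn.utils.clip_grad_norm_` to return a zero-dim `torch.Tensor` rather than a Python float, so storing the raw return value puts a tensor into the log buffer where plain numbers are expected. Below is a minimal standalone sketch of that behaviour change and the fix; the toy model and dummy loss are illustrative only and are not part of this commit.

```python
# Sketch (not from this commit): shows why float() is needed when
# logging the return value of clip_grad_norm_ on PyTorch >= 1.5.
import torch
from torch.nn.utils import clip_grad_norm_

# Hypothetical toy model and loss, just to produce some gradients.
model = torch.nn.Linear(4, 2)
loss = model(torch.randn(8, 4)).sum()
loss.backward()

grad_norm = clip_grad_norm_(model.parameters(), max_norm=1.0)
print(type(grad_norm))         # <class 'torch.Tensor'> on PyTorch >= 1.5
print(type(float(grad_norm)))  # <class 'float'>: safe to log and average
```

For a zero-dim tensor, `float(grad_norm)` is equivalent to `grad_norm.item()`: it detaches the logged value from the tensor machinery so downstream averaging and serialization see an ordinary Python float.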