"git@developer.sourcefind.cn:OpenDAS/pytorch3d.git" did not exist on "5218f45c2c3b254d7a82515b24cbac6902296b8f"
Commit 1506551f authored by Peizhao Zhang, committed by Facebook GitHub Bot

print grad scaler as part of the metric.

Summary:
Pull Request resolved: https://github.com/facebookresearch/d2go/pull/501

X-link: https://github.com/facebookresearch/detectron2/pull/4851

Print the grad scaler as part of the metrics.
* Controlled by the flag `SOLVER.AMP.LOG_GRAD_SCALER`
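
With the flag added to the default config, it would be enabled in a d2go YAML config roughly like this (a sketch; the surrounding AMP keys are taken from the diff below, the rest of the config is assumed):

```yaml
SOLVER:
  AMP:
    ENABLED: True
    PRECISION: "float16"
    LOG_GRAD_SCALER: True
```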

Reviewed By: tax313

Differential Revision: D43585363

fbshipit-source-id: 495b37ff524c47e515cea0b3c677ee81b34ad4ca
parent 25049cdf
...@@ -81,6 +81,8 @@ def _add_detectron2go_runner_default_cfg(_C: CN) -> None:
     assert not _C.SOLVER.AMP.ENABLED
     # AMP precision is used by both D2 and lightning backend. Can be "float16" or "bfloat16".
     _C.SOLVER.AMP.PRECISION = "float16"
+    # log the grad scaler to the output
+    _C.SOLVER.AMP.LOG_GRAD_SCALER = False
     # Betas are used in the AdamW optimizer
     _C.SOLVER.BETAS = (0.9, 0.999)
...
...@@ -551,6 +551,7 @@ class Detectron2GoRunner(D2GoDataAPIMixIn, BaseRunner):
                 precision=parse_precision_from_string(
                     cfg.SOLVER.AMP.PRECISION, lightning=False
                 ),
+                log_grad_scaler=cfg.SOLVER.AMP.LOG_GRAD_SCALER,
             )
         else:
             trainer = SimpleTrainer(
...
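
The idea behind the flag can be sketched without the real trainer: after each optimizer step, the AMP grad scaler's current loss scale is recorded into the per-step metrics. The sketch below uses a toy stand-in for `torch.cuda.amp.GradScaler` (whose real API does expose `get_scale()` and `update()`); the names `ToyGradScaler`, `run_step`, and the metric key `amp_grad_scale` are hypothetical, not from the commit.

```python
class ToyGradScaler:
    """Toy stand-in for torch.cuda.amp.GradScaler (hypothetical, for illustration).

    Implements dynamic loss scaling: back off the scale when an overflow
    (inf/nan gradient) is found, and grow it after a streak of finite steps.
    """

    def __init__(self, init_scale=2.0 ** 16, growth_factor=2.0,
                 backoff_factor=0.5, growth_interval=2000):
        self._scale = init_scale
        self._growth_factor = growth_factor
        self._backoff_factor = backoff_factor
        self._growth_interval = growth_interval
        self._good_steps = 0  # consecutive steps without overflow

    def get_scale(self):
        """Return the current loss scale (mirrors GradScaler.get_scale)."""
        return self._scale

    def update(self, found_inf):
        """Shrink the scale on overflow; grow it after growth_interval good steps."""
        if found_inf:
            self._scale *= self._backoff_factor
            self._good_steps = 0
        else:
            self._good_steps += 1
            if self._good_steps == self._growth_interval:
                self._scale *= self._growth_factor
                self._good_steps = 0


def run_step(scaler, metrics, log_grad_scaler, found_inf=False):
    """One training step: after updating the scaler, optionally record its
    scale as a metric, gated by a LOG_GRAD_SCALER-style flag."""
    scaler.update(found_inf)
    if log_grad_scaler:
        metrics["amp_grad_scale"] = scaler.get_scale()


metrics = {}
scaler = ToyGradScaler(init_scale=1024.0, growth_interval=2)
run_step(scaler, metrics, log_grad_scaler=True, found_inf=True)
print(metrics["amp_grad_scale"])  # 512.0 after one overflow step (1024 * 0.5)
```

Logging this value is cheap and useful for debugging AMP training: a scale that keeps collapsing toward zero signals persistent gradient overflows, which a loss curve alone may not reveal.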