chenpangpang / transformers

Commit 2f21497d
fixing param.grad is None in fp16 examples

Authored Nov 20, 2018 by thomwolf
Parent: da73925f
Showing 2 changed files with 4 additions and 2 deletions:

examples/run_classifier.py   +2 -1
examples/run_squad.py        +2 -1
examples/run_classifier.py

@@ -555,7 +555,8 @@ def main():
                     if args.fp16 and args.loss_scale != 1.0:
                         # scale down gradients for fp16 training
                         for param in model.parameters():
-                            param.grad.data = param.grad.data / args.loss_scale
+                            if param.grad is not None:
+                                param.grad.data = param.grad.data / args.loss_scale
                     is_nan = set_optimizer_params_grad(param_optimizer, model.named_parameters(), test_nan=True)
                     if is_nan:
                         logger.info("FP16 TRAINING: Nan in gradients, reducing loss scaling")
examples/run_squad.py

@@ -898,7 +898,8 @@ def main():
                     if args.fp16 and args.loss_scale != 1.0:
                         # scale down gradients for fp16 training
                         for param in model.parameters():
-                            param.grad.data = param.grad.data / args.loss_scale
+                            if param.grad is not None:
+                                param.grad.data = param.grad.data / args.loss_scale
                     is_nan = set_optimizer_params_grad(param_optimizer, model.named_parameters(), test_nan=True)
                     if is_nan:
                         logger.info("FP16 TRAINING: Nan in gradients, reducing loss scaling")
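
Why the guard is needed: param.grad stays None for any parameter that never received a gradient (for example a layer that did not take part in the backward pass), so dividing it unconditionally crashes with an AttributeError. Below is a minimal standalone sketch of the pattern the commit introduces; the toy model and the loss-scale value are illustrative assumptions, not code from the example scripts.

import torch

# Hypothetical toy module: the "unused" branch never takes part in the
# forward pass, so its parameters still have grad == None after backward().
model = torch.nn.ModuleDict({
    "used": torch.nn.Linear(4, 1),
    "unused": torch.nn.Linear(4, 1),
})
loss_scale = 128.0  # illustrative value, standing in for the --loss_scale flag

x = torch.randn(8, 4)
loss = model["used"](x).sum() * loss_scale  # scale the loss up before backward
loss.backward()

# Scale the gradients back down, skipping parameters whose grad is None;
# without the guard, accessing param.grad.data on the unused parameters
# raises: AttributeError: 'NoneType' object has no attribute 'data'.
for param in model.parameters():
    if param.grad is not None:
        param.grad.data = param.grad.data / loss_scale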