OpenDAS / Megatron-LM

Commit 27bc1338, authored Aug 15, 2022 by mshoeybi
fixed grad scaler warning so it only prints it for fp16
parent d2073912
Showing 1 changed file with 1 addition and 1 deletion:
megatron/optimizer/optimizer.py (+1, -1)
megatron/optimizer/optimizer.py (view file @ 27bc1338):

@@ -679,7 +679,7 @@ class Float16OptimizerWithFloat16Params(MixedPrecisionOptimizer):
         self.optimizer.load_state_dict(state_dict[optimizer_key])
 
         # Grad scaler.
-        if 'grad_scaler' not in state_dict:
+        if self.fp16 and 'grad_scaler' not in state_dict:
             print_rank_0('***WARNING*** found an old checkpoint, will not '
                          'load grad scaler ...')
         else:
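
For context: the grad scaler belongs to fp16 loss scaling, so checkpoints from non-fp16 runs typically carry no 'grad_scaler' entry, and the old warning fired spuriously for them. Below is a minimal, self-contained sketch of the new guard's behavior; the class OptimizerSketch and the local print_rank_0 stand-in are hypothetical names for illustration, not the Megatron-LM implementation.

def print_rank_0(message):
    # Stand-in for Megatron's rank-0-only logger.
    print(message)

class OptimizerSketch:
    # Hypothetical class mirroring only the guard logic from this commit.
    def __init__(self, fp16, grad_scaler=None):
        self.fp16 = fp16
        self.grad_scaler = grad_scaler

    def load_state_dict(self, state_dict):
        # After commit 27bc1338: warn about a missing grad scaler only
        # when running in fp16, where the scaler actually matters.
        if self.fp16 and 'grad_scaler' not in state_dict:
            print_rank_0('***WARNING*** found an old checkpoint, will not '
                         'load grad scaler ...')
        elif 'grad_scaler' in state_dict and self.grad_scaler is not None:
            self.grad_scaler.load_state_dict(state_dict['grad_scaler'])

OptimizerSketch(fp16=False).load_state_dict({})  # bf16/fp32 run: silent now
OptimizerSketch(fp16=True).load_state_dict({})   # fp16 run: still warns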