OpenDAS / Megatron-LM · Commits

Commit a13cbe1e, authored Jan 05, 2021 by Jared Casper
Parent: 7381754e

    Use new api to get loss scale when finetuning.
Showing 1 changed file with 2 additions and 1 deletion.

tasks/finetune_utils.py (+2, -1)
@@ -186,7 +186,8 @@ def _train(model, optimizer, lr_scheduler, forward_step,
         # Logging.
         report_memory_flag = training_log(losses_dict, losses_dict_sum,
                                           optimizer.param_groups[0]['lr'],
-                                          iteration, optimizer.loss_scale,
+                                          iteration,
+                                          optimizer.get_loss_scale().item(),
                                           report_memory_flag, skipped_iter)

         # Autoresume
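The change swaps a direct attribute read (optimizer.loss_scale) for the get_loss_scale() accessor, which returns a tensor, hence the trailing .item() to hand training_log a plain Python float. A minimal sketch of the distinction, using a hypothetical optimizer wrapper standing in for Megatron's actual fp16 optimizer (the class name and internals below are assumptions for illustration, not Megatron's implementation):

import torch

class Float16OptimizerSketch:
    """Hypothetical stand-in for Megatron's fp16 optimizer wrapper."""

    def __init__(self):
        # Keep the loss scale as a tensor so it can live on-device and
        # be updated in place by a dynamic loss scaler.
        self._scale = torch.tensor([65536.0])

    def get_loss_scale(self):
        # New-style accessor: returns the scale tensor itself.
        return self._scale

opt = Float16OptimizerSketch()

# Old call site passed an attribute directly (optimizer.loss_scale);
# the new call site fetches the tensor and converts it to a float so
# the logging code receives a plain Python number:
loss_scale_for_logging = opt.get_loss_scale().item()
print(loss_scale_for_logging)  # 65536.0

Going through an accessor and calling .item() at the call site presumably keeps the wrapper free to store the scale as an on-device tensor internally while loggers still get a host-side float; that reading is inferred from the diff, not stated in the commit message.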