Commit 512337f5 authored by Mohammad Shoeybi's avatar Mohammad Shoeybi

Merge branch 'optimize_refactor_finetune_fix' into 'optimize_refactor'

Use the new API to get the loss scale when finetuning.

See merge request ADLR/megatron-lm!201
parents 7381754e a13cbe1e
@@ -186,7 +186,8 @@ def _train(model, optimizer, lr_scheduler, forward_step,
         # Logging.
         report_memory_flag = training_log(losses_dict, losses_dict_sum,
                                           optimizer.param_groups[0]['lr'],
-                                          iteration, optimizer.loss_scale,
+                                          iteration,
+                                          optimizer.get_loss_scale().item(),
                                           report_memory_flag, skipped_iter)
         # Autoresume
...
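For context, a minimal sketch of the call-site change, using hypothetical stand-in classes rather than Megatron-LM's actual optimizer: instead of reading a plain `loss_scale` attribute, the caller now invokes a `get_loss_scale()` accessor that returns a tensor-like object, and converts it to a Python float with `.item()` before passing it to logging.

```python
# Minimal sketch of the API change (hypothetical classes, not
# Megatron-LM's implementation). The mixed-precision optimizer keeps
# its loss scale in a tensor-like container, so callers fetch it via
# get_loss_scale() and unwrap the scalar with .item().

class _ScaleTensor:
    """Stand-in for a one-element tensor holding the loss scale."""

    def __init__(self, value):
        self._value = value

    def item(self):
        # Mirrors torch.Tensor.item(): return the value as a Python float.
        return float(self._value)


class FP16Optimizer:
    """Toy optimizer wrapper exposing the new-style accessor."""

    def __init__(self, initial_scale=2.0 ** 16):
        self._scale = _ScaleTensor(initial_scale)

    def get_loss_scale(self):
        # New API: return the scale container rather than a bare float,
        # matching the optimizer.get_loss_scale().item() call in the diff.
        return self._scale


opt = FP16Optimizer()
loss_scale = opt.get_loss_scale().item()
print(loss_scale)  # 65536.0
```

This pattern lets the optimizer keep the scale on-device and update it in place, while log-side code still receives a plain float.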