wuxk1 / Megatron-LM · Commits

Commit e2a4d426
authored Dec 02, 2020 by mohammad

found a bug in consumed tokens initialization

parent 75bd9b54
Showing 1 changed file with 4 additions and 2 deletions
megatron/checkpointing.py

...
@@ -220,8 +220,10 @@ def load_checkpoint(model, optimizer, lr_scheduler, load_arg='load'):
     if 'args' in state_dict:
         checkpoint_args = state_dict['args']
         check_checkpoint_args(checkpoint_args)
-        args.consumed_train_samples = getattr(args, 'consumed_train_samples', 0)
-        args.consumed_valid_samples = getattr(args, 'consumed_valid_samples', 0)
+        args.consumed_train_samples = getattr(checkpoint_args,
+                                              'consumed_train_samples', 0)
+        args.consumed_valid_samples = getattr(checkpoint_args,
+                                              'consumed_valid_samples', 0)
     else:
         print_rank_0('could not find arguments in the checkpoint ...')
...
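
The fix points both getattr calls at checkpoint_args, the arguments namespace saved inside the checkpoint, instead of args, the freshly parsed arguments of the resuming run. A minimal sketch of the difference, using argparse.Namespace stand-ins with hypothetical counter values (the real objects come from Megatron's argument parser and the loaded state_dict):

# `args` models the new, resuming run: its counters start at 0.
# `checkpoint_args` models the namespace that was saved inside the checkpoint.
from argparse import Namespace

args = Namespace(consumed_train_samples=0, consumed_valid_samples=0)
checkpoint_args = Namespace(consumed_train_samples=1_000_000,
                            consumed_valid_samples=10_000)

# Before the fix: reads the resuming run's own zeroed counter back into itself.
args.consumed_train_samples = getattr(args, 'consumed_train_samples', 0)
print(args.consumed_train_samples)   # 0 -- training progress is lost

# After the fix: restores the counter recorded when the checkpoint was saved.
args.consumed_train_samples = getattr(checkpoint_args, 'consumed_train_samples', 0)
print(args.consumed_train_samples)   # 1000000 -- training progress restored

The getattr default of 0 presumably keeps checkpoints written before these counters existed loadable; in that case the sample counters simply start from zero again.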