OpenDAS / Megatron-LM · Commits

Commit 36c3a1c1
Authored Dec 08, 2022 by MaximumEntropy
Committed by Jimmy Zhang on Apr 03, 2023
Parent: 82c7ba57

Fix checkpointing

Signed-off-by: MaximumEntropy <sandeep.subramanian.1@umontreal.ca>
Changes: 1 changed file with 2 additions and 2 deletions

megatron/model/language_model.py  +2 -2
megatron/model/language_model.py @ 36c3a1c1
...
@@ -541,8 +541,8 @@ class TransformerLanguageModel(MegatronModule):
                                                               keep_vars=keep_vars)
         if self.untie_embeddings_and_output_weights:
             state_dict_[self._output_layer_key] \
-                = self.output_layer.state_dict_for_save_checkpoint(prefix=prefix, keep_vars=keep_vars)
+                = self.output_layer.state_dict(prefix=prefix, keep_vars=keep_vars)
         if self.add_decoder:
             state_dict_[self._decoder_key] \
                 = self.decoder.state_dict_for_save_checkpoint(prefix=prefix,
...
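The removed line called state_dict_for_save_checkpoint on self.output_layer; the added line calls the plain state_dict instead. Presumably the untied output layer is an ordinary linear-style module that does not implement the Megatron-specific checkpoint helper, so the old call would fail when saving a checkpoint. A minimal, self-contained sketch of that failure mode, using a vanilla torch.nn.Linear as an illustrative stand-in for the untied output layer (not the actual Megatron-LM module):

import torch

# Stand-in for self.output_layer: a plain module exposes nn.Module.state_dict
# but has no Megatron-specific state_dict_for_save_checkpoint helper.
output_layer = torch.nn.Linear(16, 32, bias=False)

try:
    output_layer.state_dict_for_save_checkpoint()   # pre-fix call
except AttributeError as err:
    print("pre-fix call fails:", err)

state = output_layer.state_dict()                   # post-fix call
print("post-fix call succeeds, keys:", list(state.keys()))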