chenpangpang / transformers · Commits

Commit 19067711 (unverified)
Authored Nov 04, 2022 by Sourab Mangrulkar; committed by GitHub on Nov 04, 2022.
Parent: 3502c202

fix `tokenizer_type` to avoid error when loading checkpoint back (#20062)

Showing 1 changed file with 1 addition and 1 deletion.
src/transformers/models/megatron_gpt2/checkpoint_reshaping_and_interoperability.py (+1, −1)

@@ -648,7 +648,7 @@ def convert_checkpoint_from_transformers_to_megatron(args):
         "data_parallel_size": args.target_data_parallel_size,
         "make_vocab_size_divisible_by": args.make_vocab_size_divisible_by,
         "rank": 0,
-        "tokenizer_type": None,
+        "tokenizer_type": "GPT2BPETokenizer",
     }
     if config.activation_function == "gelu":
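To see why the one-line change matters, here is a minimal sketch of the failure mode, assuming a hypothetical loader-side `build_tokenizer` helper that dispatches on the `tokenizer_type` field of the saved Megatron args (this is an illustration, not Megatron-LM's actual code):

```python
# Hypothetical sketch: the converter writes an args dict into the Megatron
# checkpoint; a loader that dispatches on "tokenizer_type" has nothing to
# select when the field is None, which is the kind of error this commit fixes.
megatron_args = {
    "data_parallel_size": 1,               # illustrative values
    "make_vocab_size_divisible_by": 128,
    "rank": 0,
    "tokenizer_type": "GPT2BPETokenizer",  # was None before this commit
}

def build_tokenizer(args):
    """Hypothetical dispatch on tokenizer_type when loading a checkpoint back."""
    tokenizer_type = args.get("tokenizer_type")
    if tokenizer_type is None:
        raise ValueError("tokenizer_type must name a tokenizer class")
    return tokenizer_type

print(build_tokenizer(megatron_args))  # succeeds with the fixed value
```

With the pre-fix value of `None`, the same lookup raises instead of selecting a tokenizer, so the converted checkpoint could not be loaded back.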