Commit 080e42d0 (unverified)
[megatron-bert-uncased-345m] fix conversion (#16639)

Authored Apr 07, 2022 by Stas Bekman; committed by GitHub on Apr 07, 2022
Parent: 09a272b0
Showing 1 changed file with 4 additions and 0 deletions.

src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py (+4, -0)
@@ -300,6 +300,10 @@ def main():
     if args.config_file == "":
         # Default config of megatron-bert 345m
         config = MegatronBertConfig()
+
+        # different megatron-bert-*-345m models have different vocab sizes, so override the default
+        # config (which is for megatron-bert-cased-345m) with the actual vocab dimension
+        config.vocab_size = input_state_dict["model"]["lm_head"]["bias"].numel()
     else:
         config = MegatronBertConfig.from_json_file(args.config_file)
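Why the fix works: in a Megatron-LM BERT checkpoint the lm_head bias holds one element per vocabulary token, so counting its elements with numel() recovers the vocab size of whichever 345m variant produced the checkpoint, instead of trusting the cased-model default baked into MegatronBertConfig. A minimal sketch of the idea (not part of the commit; the in-memory state dict and the 30592/29056 vocab sizes are illustrative stand-ins for the uncased and cased models):

import torch
from transformers import MegatronBertConfig

# Hypothetical stand-in for a loaded Megatron checkpoint: assume the
# uncased 345m model ships a vocab padded to 30592 entries, so its
# lm_head bias tensor has 30592 elements.
input_state_dict = {"model": {"lm_head": {"bias": torch.zeros(30592)}}}

# The default config targets megatron-bert-cased-345m, whose vocab is
# smaller, so loading uncased weights against it would fail on shape.
config = MegatronBertConfig()
print("default vocab_size:  ", config.vocab_size)

# The committed fix: derive the true vocab dimension from the checkpoint.
config.vocab_size = input_state_dict["model"]["lm_head"]["bias"].numel()
print("corrected vocab_size:", config.vocab_size)  # -> 30592

With the corrected vocab_size, the embedding and output-head tensors written by the conversion script match the shapes the HuggingFace model expects, so the converted checkpoint loads cleanly for any of the 345m variants.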