chenpangpang / transformers · Commits · 68f77303
"git@developer.sourcefind.cn:chenpangpang/ComfyUI.git" did not exist on "1ddf512fdc69bf8dfb51eb858d3e5ba069570791"
Commit 68f77303, authored Dec 09, 2018 by thomwolf
fixing Adam weights skip in TF convert script
parent a2b6918a
Showing 1 changed file with 1 addition and 1 deletion (+1, -1)
pytorch_pretrained_bert/convert_tf_checkpoint_to_pytorch.py
@@ -50,7 +50,7 @@ def convert_tf_checkpoint_to_pytorch(tf_checkpoint_path, bert_config_file, pytor
         name = name.split('/')
         # adam_v and adam_m are variables used in AdamWeightDecayOptimizer to calculated m and v
         # which are not required for using pretrained model
-        if name[-1] in ["adam_v", "adam_m"]:
+        if any(n in ["adam_v", "adam_m"] for n in name):
             print("Skipping {}".format("/".join(name)))
             continue
         pointer = model
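The practical difference between the old and new checks can be sketched as follows. The old condition only skipped a TF variable when its final path component was an Adam accumulator name; the new one skips the variable if any component matches. This is a minimal standalone illustration; the sample variable names below are hypothetical, not taken from a real BERT checkpoint.

```python
ADAM_VARS = ["adam_v", "adam_m"]

def old_check(name):
    # Pre-commit behavior: only matches when the *last* path
    # component is an Adam accumulator name.
    return name[-1] in ADAM_VARS

def new_check(name):
    # Post-commit behavior: matches when *any* path component
    # is an Adam accumulator name.
    return any(n in ADAM_VARS for n in name)

# Accumulator name as the final component: both checks skip it.
name = "bert/encoder/kernel/adam_m".split("/")
print(old_check(name), new_check(name))  # True True

# Accumulator name mid-path: only the new check skips it.
name = "bert/adam_m/kernel".split("/")
print(old_check(name), new_check(name))  # False True
```

Both checks leave ordinary weight variables (no `adam_v`/`adam_m` component) untouched, so the conversion of the pretrained weights themselves is unaffected.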