chenpangpang/transformers · Commit 88cc26dc (Unverified)
Authored Feb 25, 2021 by Lysandre Debut; committed by GitHub, Feb 25, 2021
Ignore unexpected weights from PT conversion (#10397)
Parent: 63645b3b
Showing 1 changed file with 5 additions and 1 deletion.
src/transformers/models/bert/modeling_tf_bert.py (+5, -1)
@@ -919,7 +919,11 @@ Bert Model with two heads on top as done during the pretraining:
 )
 class TFBertForPreTraining(TFBertPreTrainedModel, TFBertPreTrainingLoss):
     # names with a '.' represents the authorized unexpected/missing layers when a TF model is loaded from a PT model
-    _keys_to_ignore_on_load_unexpected = [r"cls.predictions.decoder.weight"]
+    _keys_to_ignore_on_load_unexpected = [
+        r"position_ids",
+        r"cls.predictions.decoder.weight",
+        r"cls.predictions.decoder.bias",
+    ]
 
     def __init__(self, config: BertConfig, *inputs, **kwargs):
         super().__init__(config, *inputs, **kwargs)
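Each entry in _keys_to_ignore_on_load_unexpected is treated as a regular-expression pattern; an unexpected weight is dropped from the load-time warning when any pattern matches its name. A minimal sketch of that filtering, assuming the re.search-based matching the library uses for these lists (the variable names below are illustrative, not the library's internals):

import re

# Patterns from the class attribute after this commit.
patterns = [
    r"position_ids",
    r"cls.predictions.decoder.weight",
    r"cls.predictions.decoder.bias",
]

# Hypothetical names reported when converting a PT checkpoint to TF.
unexpected_keys = ["position_ids", "cls.predictions.decoder.bias", "some.other.weight"]

# Keep only the names no pattern matches; those are still warned about.
still_warned = [k for k in unexpected_keys if not any(re.search(p, k) for p in patterns)]
print(still_warned)  # ['some.other.weight']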
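In practice, the change means that converting a PyTorch BERT checkpoint on the fly no longer warns about these three weights. A minimal usage sketch, assuming a transformers version that includes this commit; bert-base-uncased is only an example checkpoint:

from transformers import TFBertForPreTraining

# from_pt=True converts a PyTorch checkpoint to TensorFlow at load time.
# With this commit, position_ids and the cls.predictions.decoder weight/bias
# are no longer reported as unexpected during the conversion.
model = TFBertForPreTraining.from_pretrained("bert-base-uncased", from_pt=True)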