chenpangpang / transformers · Commits

Commit e0f867a9, authored Aug 31, 2019 by LysandreJik

XLNet bias fix on resize embeddings (cf #1124)

Parent: d7a4c325
Changes: 1 changed file with 8 additions and 0 deletions

pytorch_transformers/modeling_utils.py (+8, −0)
@@ -327,6 +327,14 @@ class PreTrainedModel(nn.Module):
         else:
             first_module.weight = second_module.weight
+            if hasattr(first_module, 'bias'):
+                first_module.bias.data = torch.nn.functional.pad(
+                    first_module.bias.data,
+                    (0,
+                     first_module.weight.shape[0] - first_module.bias.shape[0]),
+                    'constant',
+                    0
+                )

     def resize_token_embeddings(self, new_num_tokens=None):
         """ Resize input token embeddings matrix of the model if new_num_tokens != config.vocab_size.
             Take care of tying weights embeddings afterwards if the model class has a `tie_weights()` method.
...
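To illustrate what the patched code does, here is a standalone sketch of the same `torch.nn.functional.pad` call on a bias vector after its weight matrix has grown. The layer sizes (5 old rows, 8 new rows, hidden size 16) are hypothetical, chosen only to show that the new bias entries come out as zeros.

```python
import torch

# Hypothetical sizes: an output projection whose vocabulary grew from 5 to 8.
old_bias = torch.arange(5, dtype=torch.float)   # bias of the old layer: [0, 1, 2, 3, 4]
new_weight = torch.zeros(8, 16)                 # resized weight with 8 output rows

# The same call the commit adds: right-pad the bias with zeros so its
# length matches the new number of output rows in the weight matrix.
new_bias = torch.nn.functional.pad(
    old_bias,
    (0, new_weight.shape[0] - old_bias.shape[0]),
    'constant',
    0,
)

print(new_bias.shape)     # torch.Size([8])
print(new_bias[5:])       # the three newly added entries are zeros
```

Without this padding, tying a resized embedding to an output layer that carries a bias (as XLNet's does) would leave the bias at its old length, causing a shape mismatch at the next forward pass.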