chenpangpang / transformers

Commit a6a6d9e6
Authored Sep 12, 2019 by Ikuya Yamada; committed by LysandreJik on Sep 27, 2019
fix padding_idx of RoBERTa model
Parent: d8b641c8
Showing 1 changed file with 3 additions and 0 deletions.
transformers/modeling_roberta.py (+3, -0)
@@ -43,6 +43,9 @@ class RobertaEmbeddings(BertEmbeddings):

     def __init__(self, config):
         super(RobertaEmbeddings, self).__init__(config)
+        self.padding_idx = 1
+        self.word_embeddings = nn.Embedding(config.vocab_size, config.hidden_size, padding_idx=self.padding_idx)
+        self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size, padding_idx=self.padding_idx)

     def forward(self, input_ids, token_type_ids=None, position_ids=None):
         seq_length = input_ids.size(1)
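For context (not part of the commit): in PyTorch, nn.Embedding treats the row at padding_idx as a constant zero vector, initializing it to zeros and never updating it, so padded positions contribute nothing to the model. RoBERTa's vocabulary places <pad> at index 1 rather than BERT's index 0, which is why the word and position embeddings are rebuilt here with padding_idx=self.padding_idx, overriding the layers the parent BertEmbeddings constructed with its own padding index. A minimal sketch of that nn.Embedding behavior (illustrative only; the variable names below are my own):

    # Illustrative sketch, not from the commit: the effect of padding_idx
    # in torch.nn.Embedding, which the change above relies on.
    import torch
    import torch.nn as nn

    pad_idx = 1  # RoBERTa's <pad> token id; BERT-style models use 0
    emb = nn.Embedding(num_embeddings=10, embedding_dim=4, padding_idx=pad_idx)

    # The row for the pad token starts out as all zeros...
    print(emb.weight.data[pad_idx])   # tensor([0., 0., 0., 0.])

    # ...and receives no gradient, so it stays zero during training.
    out = emb(torch.tensor([[0, 1, 2]]))
    out.sum().backward()
    print(emb.weight.grad[pad_idx])   # tensor([0., 0., 0., 0.])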