chenpangpang / transformers / Commits / 83081701

Commit 83081701 authored Aug 15, 2019 by LysandreJik

Warning for RoBERTa sequences encoded without special tokens.

parent 572dcfd1
Showing 1 changed file with 7 additions and 0 deletions.
pytorch_transformers/modeling_roberta.py

@@ -165,6 +165,13 @@ class RobertaModel(BertModel):
         self.embeddings = RobertaEmbeddings(config)
         self.apply(self.init_weights)
 
+    def forward(self, input_ids, token_type_ids=None, attention_mask=None, position_ids=None,
+                head_mask=None):
+        if input_ids[:, 0].sum().item() != 0:
+            logger.warning("A sequence with no special tokens has been passed to the RoBERTa model. "
+                           "This model requires special tokens in order to work. "
+                           "Please specify add_special_tokens=True in your encoding.")
+        return super(RobertaModel, self).forward(input_ids, token_type_ids, attention_mask, position_ids, head_mask)
 
 @add_start_docstrings("""RoBERTa Model with a `language modeling` head on top. """,
     ROBERTA_START_DOCSTRING, ROBERTA_INPUTS_DOCSTRING)
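The check added in this commit relies on RoBERTa's `<s>` (BOS) special token having id 0: a correctly encoded batch therefore has a 0 in the first column of every row, so a nonzero column sum signals that special tokens were likely omitted. A minimal plain-Python sketch of that logic (`missing_special_tokens` is a hypothetical helper and the token ids are illustrative, not the library's tensor code):

```python
def missing_special_tokens(batch_input_ids):
    """Return True when some sequence in the batch does not start with
    the <s> token (id 0), i.e. the first-column sum is nonzero.
    Plain-list stand-in for the tensor check input_ids[:, 0].sum().item() != 0."""
    return sum(row[0] for row in batch_input_ids) != 0

# Batch encoded with special tokens: every row begins with <s> (id 0).
with_specials = [[0, 31414, 232, 2],   # illustrative ids
                 [0, 8274, 1437, 2]]

# Batch encoded without special tokens: rows begin with ordinary token ids.
without_specials = [[31414, 232],
                    [8274, 1437]]

print(missing_special_tokens(with_specials))     # False: no warning needed
print(missing_special_tokens(without_specials))  # True: warning would fire
```

Note the heuristic can only detect the missing-BOS case; it says nothing about a missing `</s>` at the end of a sequence, which is why the warning asks for `add_special_tokens=True` at encoding time rather than attempting a repair.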