chenpangpang / transformers · Commits

Commit 5dbf36bd (unverified)
Authored Mar 14, 2022 by Yih-Dar, committed by GitHub on Mar 14, 2022

Fix ProphetNetTokenizer (#16082)

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>

Parent: 923c35b5
Changes: 1 changed file with 5 additions and 0 deletions.

src/transformers/models/prophetnet/tokenization_prophetnet.py (+5, -0)
src/transformers/models/prophetnet/tokenization_prophetnet.py

@@ -102,6 +102,11 @@ class ProphetNetTokenizer(PreTrainedTokenizer):
     pretrained_init_configuration = PRETRAINED_INIT_CONFIGURATION
     max_model_input_sizes = PRETRAINED_POSITIONAL_EMBEDDINGS_SIZES
 
+    # first name has to correspond to main model input name
+    # to make sure `tokenizer.pad(...)` works correctly
+    # `ProphetNet` doesn't have `token_type_ids` as argument.
+    model_input_names: List[str] = ["input_ids", "attention_mask"]
+
     def __init__(
         self,
         vocab_file,
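The added `model_input_names` attribute tells `tokenizer.pad(...)` which key holds the main model input and leaves out `token_type_ids`, which ProphetNet's forward pass does not accept. Below is a minimal sketch of the behavior this commit fixes; it is not part of the commit, and the checkpoint name and PyTorch output format are assumptions for illustration.

# Sketch: how `model_input_names` affects `ProphetNetTokenizer.pad(...)`.
# Assumes the "microsoft/prophetnet-large-uncased" checkpoint and torch are available.
from transformers import ProphetNetTokenizer

tokenizer = ProphetNetTokenizer.from_pretrained("microsoft/prophetnet-large-uncased")

# Encode two sentences of different lengths, without padding yet.
features = [tokenizer(text) for text in ["hello world", "a somewhat longer example sentence"]]

# `pad` uses the first entry of `model_input_names` ("input_ids") to find the
# sequences to pad and pads "attention_mask" alongside them; no `token_type_ids`
# key is expected, matching ProphetNet's forward signature.
batch = tokenizer.pad(features, padding=True, return_tensors="pt")
print(batch["input_ids"].shape, batch["attention_mask"].shape)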