chenpangpang / transformers, commit e5101c2e
Commit e5101c2e (unverified), authored Mar 17, 2022 by Dayyan Smith, committed by GitHub on Mar 17, 2022.

Fix typo (#16208)

Parent: 25b8f9a8
Showing 1 changed file with 2 additions and 2 deletions:

src/transformers/data/data_collator.py (+2, -2)
@@ -221,7 +221,7 @@ class DataCollatorWithPadding:
     among:
     - `True` or `'longest'`: Pad to the longest sequence in the batch (or no padding if only a single sequence
-      if provided).
+      is provided).
     - `'max_length'`: Pad to a maximum length specified with the argument `max_length` or to the maximum
       acceptable input length for the model if that argument is not provided.
     - `False` or `'do_not_pad'` (default): No padding (i.e., can output a batch with sequences of different
@@ -273,7 +273,7 @@ class DataCollatorForTokenClassification(DataCollatorMixin):
     among:
     - `True` or `'longest'`: Pad to the longest sequence in the batch (or no padding if only a single sequence
-      if provided).
+      is provided).
    - `'max_length'`: Pad to a maximum length specified with the argument `max_length` or to the maximum
      acceptable input length for the model if that argument is not provided.
    - `False` or `'do_not_pad'` (default): No padding (i.e., can output a batch with sequences of different
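Both hunks fix the same "if provided" / "is provided" typo in the docstring for the `padding` argument. As a rough illustration of the three padding strategies that docstring describes, here is a minimal pure-Python stand-in; the function name `pad_batch` and the `pad_id` parameter are hypothetical, not part of the `transformers` API, which handles this through the tokenizer's `pad` method:

```python
def pad_batch(sequences, padding=True, max_length=None, pad_id=0):
    """Sketch of the padding strategies described in the docstring above.

    - True or 'longest': pad to the longest sequence in the batch
      (or no padding if only a single sequence is provided).
    - 'max_length': pad to the length given by `max_length`.
    - False or 'do_not_pad': return the sequences unchanged.
    """
    if padding in (False, "do_not_pad"):
        return [list(s) for s in sequences]
    if padding in (True, "longest"):
        # Longest sequence in the batch sets the target length.
        target = max(len(s) for s in sequences)
    elif padding == "max_length":
        if max_length is None:
            raise ValueError("max_length must be set for 'max_length' padding")
        target = max_length
    else:
        raise ValueError(f"unknown padding strategy: {padding!r}")
    # Right-pad each sequence with pad_id up to the target length.
    return [list(s) + [pad_id] * (target - len(s)) for s in sequences]
```

For example, `pad_batch([[1, 2, 3], [4]])` pads the second sequence to length 3, while a batch containing a single sequence comes back unpadded, which is exactly the parenthetical the commit's typo fix clarifies.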