chenpangpang / transformers · Commits

Commit 389bdba6 (unverified)
Authored May 19, 2023 by joaoareis; committed by GitHub on May 19, 2023

Fix PretrainedConfig `min_length` docstring (#23471)

Parent: b455ad0a
Showing 1 changed file with 1 addition and 1 deletion (+1 / -1)
src/transformers/configuration_utils.py

@@ -119,7 +119,7 @@ class PretrainedConfig(PushToHubMixin):
         max_length (`int`, *optional*, defaults to 20):
             Maximum length that will be used by default in the `generate` method of the model.
-        min_length (`int`, *optional*, defaults to 10):
+        min_length (`int`, *optional*, defaults to 0):
             Minimum length that will be used by default in the `generate` method of the model.
         do_sample (`bool`, *optional*, defaults to `False`):
             Flag that will be used by default in the `generate` method of the model. Whether or not to use sampling ;
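For context, a minimal sketch of how the generation defaults described in this docstring can be checked at runtime. It assumes a `transformers` release from around the time of this commit and uses `BertConfig` as an arbitrary concrete `PretrainedConfig` subclass; it is illustrative only and not part of the commit.

```python
# Illustrative sketch (not part of this commit): inspecting the generation
# defaults that the corrected docstring describes.
from transformers import BertConfig  # any PretrainedConfig subclass works here

config = BertConfig()

print(config.max_length)  # 20    -- matches "defaults to 20" in the docstring
print(config.min_length)  # 0     -- the value the docstring now correctly states
print(config.do_sample)   # False -- sampling is off by default in `generate`
```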