GitLab: chenpangpang/transformers · Commits

Commit f86ed231, authored Dec 21, 2019 by thomwolf
Parent: cfa03805

    update doc
Changes: 1 changed file with 10 additions and 0 deletions

transformers/modeling_utils.py (+10, -0)
...
@@ -529,6 +529,16 @@ class PreTrainedModel(nn.Module):
                The cumulative probability of the highest-probability vocabulary tokens to keep for nucleus sampling. Must be between 0 and 1. Defaults to 1.
            **repetition_penalty**: (`optional`) float
                The parameter for repetition penalty. Between 1.0 and +infinity. 1.0 means no penalty. Defaults to 1.
            **bos_token_id**: (`optional`) int
                Beginning of sentence token if no prompt is provided. Defaults to 0.
            **eos_token_ids**: (`optional`) int or list of int
                End of sequence token or list of tokens to stop the generation. Defaults to 0.
            **length_penalty**: (`optional`) float
                Exponential penalty to the length. Defaults to 1.
            **num_return_sequences**: (`optional`) int
                The number of independently computed returned sequences for each element in the batch. Defaults to 1.
        """
        # We cannot generate if the model does not have a LM head
...
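The documented parameters control how `generate` samples tokens. As a rough, self-contained illustration of two of them (not the library's actual implementation), nucleus (top-p) filtering and repetition penalty can be sketched in plain Python; `top_p_filter` and `apply_repetition_penalty` are hypothetical helper names chosen for this example:

```python
import math

def top_p_filter(logits, top_p=1.0):
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p (nucleus sampling); mask the rest to -inf."""
    # Softmax over the logits (shift by max for numerical stability).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Walk tokens in order of descending probability, accumulating mass.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep, cum = set(), 0.0
    for i in order:
        keep.add(i)
        cum += probs[i]
        if cum >= top_p:
            break
    return [x if i in keep else float("-inf") for i, x in enumerate(logits)]

def apply_repetition_penalty(logits, generated, penalty=1.0):
    """Discourage already-generated tokens: divide positive logits
    (multiply negative ones) by `penalty`; 1.0 means no penalty."""
    out = list(logits)
    for tok in set(generated):
        out[tok] = out[tok] / penalty if out[tok] > 0 else out[tok] * penalty
    return out

# Example: with top_p=0.6 only the single most probable token survives here.
filtered = top_p_filter([2.0, 1.0, 0.0, -1.0], top_p=0.6)
```

With `top_p=1.0` every token is kept, matching the documented default of "no filtering"; likewise `penalty=1.0` leaves logits untouched, matching "1.0 means no penalty".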