chenpangpang / transformers · Commits
"docs/source/ja/model_doc/auto.md" did not exist on "57882177becb85560f1ff931abb1b0b75d67e70d"
Commit e769ca3d (unverified): Added paper links in logitprocess.py (#25482)
Authored Aug 21, 2023 by Pranith Pashikanti; committed by GitHub on Aug 21, 2023
Parent: 5c67682b
Changes: 1 changed file with 4 additions and 0 deletions.
src/transformers/generation/logits_process.py (+4, -0)
```diff
@@ -1333,6 +1333,8 @@ class WhisperTimeStampLogitsProcessor(LogitsProcessor):
     Whisper specific Processor. This processor can be used to force a list of tokens. The processor will set their log
     probs to `inf` so that they are sampled at their corresponding index.
 
+    See [the paper](https://arxiv.org/abs/2212.04356) for more information.
+
     Args:
         generate_config (`GenerateConfig`):
             The generate config used to generate the output. The following parameters are required:
```
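The docstring above describes the forcing mechanic in general terms: pick the token that must appear at a given position and make every competing logit unreachable. Below is a minimal, hedged sketch of a custom `LogitsProcessor` built on that idea; the class name `ForceTokenAtStepProcessor` and its `step_to_token` argument are illustrative assumptions, not the library's `WhisperTimeStampLogitsProcessor`, which layers Whisper-specific timestamp rules on top of the same trick.

```python
import torch
from transformers import LogitsProcessor


class ForceTokenAtStepProcessor(LogitsProcessor):
    """Toy sketch (not the library class): force a chosen token id at a chosen
    step by setting every other logit to -inf, so greedy search or sampling can
    only select the forced token at that position."""

    def __init__(self, step_to_token):
        # Maps current sequence length (prompt + generated so far) -> forced token id,
        # e.g. {12: 50257} forces token id 50257 once the running sequence reaches 12 tokens.
        self.step_to_token = step_to_token

    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor) -> torch.FloatTensor:
        step = input_ids.shape[-1]
        if step in self.step_to_token:
            forced = self.step_to_token[step]
            masked = torch.full_like(scores, float("-inf"))
            masked[:, forced] = scores[:, forced]  # keep only the forced token reachable
            return masked
        return scores
```

A processor like this can be passed to `generate()` via `logits_processor=LogitsProcessorList([...])`; for Whisper itself, the timestamp processor is wired in automatically when timestamps are requested during generation.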
```diff
@@ -1399,6 +1401,8 @@ class ClassifierFreeGuidanceLogitsProcessor(LogitsProcessor):
     correspond to the unconditional logits (predicted from an empty or 'null' prompt). The processor computes a
     weighted average across the conditional and unconditional logits, parameterised by the `guidance_scale`.
 
+    See [the paper](https://arxiv.org/abs/2306.05284) for more information.
+
     Args:
         guidance_scale (float):
             The guidance scale for classifier free guidance (CFG). CFG is enabled by setting `guidance_scale > 1`.
```
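For context on the weighted average mentioned in this docstring, here is a small sketch of the usual classifier-free guidance combination rule, under the batch layout the docstring assumes (conditional logits in the first half of the batch, unconditional logits in the second half). The helper name `cfg_combine` is a made-up illustration, not the library API.

```python
import torch


def cfg_combine(scores: torch.FloatTensor, guidance_scale: float) -> torch.FloatTensor:
    """Blend conditional and unconditional logits for classifier-free guidance.
    Assumes the first half of the batch holds conditional scores and the second
    half holds unconditional ('null' prompt) scores."""
    cond, uncond = scores.chunk(2, dim=0)
    # guidance_scale == 1 reduces to the conditional logits alone; values > 1
    # push the distribution further from the unconditional one, i.e. more
    # strongly toward the prompt, usually at some cost in sample quality.
    return uncond + guidance_scale * (cond - uncond)
```

In the library, this combination is handled by `ClassifierFreeGuidanceLogitsProcessor(guidance_scale)`; models such as MusicGen (the linked paper) enable it when `generate` is called with `guidance_scale > 1`.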