Project: norm / vllm

Commit f1f6cc10 (unverified), authored Jan 24, 2024 by Federico Galatolo, committed by GitHub on Jan 24, 2024

Added `include_stop_str_in_output` and `length_penalty` parameters to OpenAI API (#2562)
Parent: 3209b490

Showing 1 changed file with 8 additions and 0 deletions:

vllm/entrypoints/openai/protocol.py (+8, -0)
@@ -78,6 +78,8 @@ class ChatCompletionRequest(BaseModel):
     echo: Optional[bool] = False
     repetition_penalty: Optional[float] = 1.0
     min_p: Optional[float] = 0.0
+    include_stop_str_in_output: Optional[bool] = False
+    length_penalty: Optional[float] = 1.0
 
     def to_sampling_params(self) -> SamplingParams:
         return SamplingParams(
@@ -97,6 +99,8 @@ class ChatCompletionRequest(BaseModel):
             use_beam_search=self.use_beam_search,
             skip_special_tokens=self.skip_special_tokens,
             spaces_between_special_tokens=self.spaces_between_special_tokens,
+            include_stop_str_in_output=self.include_stop_str_in_output,
+            length_penalty=self.length_penalty,
         )
@@ -127,6 +131,8 @@ class CompletionRequest(BaseModel):
     spaces_between_special_tokens: Optional[bool] = True
     repetition_penalty: Optional[float] = 1.0
     min_p: Optional[float] = 0.0
+    include_stop_str_in_output: Optional[bool] = False
+    length_penalty: Optional[float] = 1.0
 
     def to_sampling_params(self):
         echo_without_generation = self.echo and self.max_tokens == 0
@@ -150,6 +156,8 @@ class CompletionRequest(BaseModel):
             prompt_logprobs=self.logprobs if self.echo else None,
             skip_special_tokens=self.skip_special_tokens,
             spaces_between_special_tokens=(self.spaces_between_special_tokens),
+            include_stop_str_in_output=self.include_stop_str_in_output,
+            length_penalty=self.length_penalty,
         )
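The fields are likewise accepted on the `/v1/completions` route via `CompletionRequest`. A sketch with plain `requests` (again not part of this commit; the URL and model name are placeholders), enabling beam search so that `length_penalty` actually takes effect:

# Hypothetical usage against a locally running vLLM OpenAI-compatible server.
import requests

payload = {
    "model": "meta-llama/Llama-2-7b-hf",  # placeholder model name
    "prompt": "The capital of France is",
    "max_tokens": 32,
    "stop": ["."],
    "include_stop_str_in_output": True,  # new: keep the matched stop string
    "use_beam_search": True,             # length_penalty is used by beam search
    "best_of": 4,
    "temperature": 0.0,                  # beam search requires temperature 0
    "length_penalty": 0.8,               # new: length penalty for beam re-ranking
}

resp = requests.post("http://localhost:8000/v1/completions", json=payload)
print(resp.json()["choices"][0]["text"])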