norm / vllm · Commits

Commit 27ca23dc (unverified)
Authored Mar 02, 2024 by Seonghyeon; committed via GitHub on Mar 01, 2024
Parent: 54d35447

Remove exclude_unset in streaming response (#3143)
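
For context on what the removed flag did: in Pydantic v2, model_dump_json(exclude_unset=True) omits every field that was never explicitly assigned, so optional fields such as finish_reason vanished from intermediate stream chunks. Without the flag they are serialized with their defaults (null), which is presumably what OpenAI-style stream parsers expect to find in each chunk. A minimal sketch of the difference, using a stand-in Choice model rather than vLLM's real CompletionResponseStreamChoice:

    from typing import Optional

    from pydantic import BaseModel


    class Choice(BaseModel):
        """Stand-in for a streaming choice; not vLLM's actual model."""
        text: str
        finish_reason: Optional[str] = None


    # finish_reason is left at its default, so Pydantic counts it as "unset".
    choice = Choice(text="Hel")

    print(choice.model_dump_json(exclude_unset=True))  # {"text":"Hel"}
    print(choice.model_dump_json())  # {"text":"Hel","finish_reason":null}

With exclude_unset=True the finish_reason key was absent from every chunk until the final one; after this change it is present as an explicit null throughout the stream.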
Showing 1 changed file with 3 additions and 3 deletions:

    vllm/entrypoints/openai/serving_completion.py  (+3, -3)
vllm/entrypoints/openai/serving_completion.py @ 27ca23dc
@@ -96,7 +96,7 @@ async def completion_stream_generator(
                         logprobs=logprobs,
                         finish_reason=finish_reason,
                     )
-                ]).model_dump_json(exclude_unset=True)
+                ]).model_dump_json()
                 yield f"data: {response_json}\n\n"
                 if output.finish_reason is not None:
                     # return final usage
@@ -121,7 +121,7 @@ async def completion_stream_generator(
                     )
                 ],
                 usage=final_usage,
-            ).model_dump_json(exclude_unset=True)
+            ).model_dump_json()
             yield f"data: {response_json}\n\n"
     yield "data: [DONE]\n\n"
@@ -306,7 +306,7 @@ class OpenAIServingCompletion(OpenAIServing):
                     request, prompt=prompt)
             generators.append(
-                self.engine.generate(None,
+                self.engine.generate(prompt,
                                      sampling_params,
                                      f"{request_id}-{i}",
                                      prompt_token_ids=input_ids,
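
The third hunk is a separate one-line fix in the same file: engine.generate now receives the prompt text instead of None. Presumably the pre-tokenized prompt_token_ids still drive generation and the prompt string just accompanies the request; a sketch of the resulting call, with the surrounding names (self.engine, sampling_params, input_ids) taken on trust from the hunk above:

    # Sketch only; mirrors the post-commit code path shown in the diff.
    generators.append(
        self.engine.generate(
            prompt,                      # previously None: the raw prompt text
            sampling_params,
            f"{request_id}-{i}",         # one request id per prompt in the batch
            prompt_token_ids=input_ids,  # ids were already validated/tokenized above
        ))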