norm / vllm · Commits · 0ffded81

Unverified commit 0ffded81, authored Jul 03, 2023 by Zhuohan Li, committed by GitHub on Jul 03, 2023

[Fix] Better error message for batched prompts (#342)

Parent: 0bd2a573
Showing 1 changed file with 7 additions and 1 deletion:

vllm/entrypoints/openai/api_server.py (+7, -1)
@@ -358,7 +358,13 @@ async def create_completion(raw_request: Request):
     model_name = request.model
     request_id = f"cmpl-{random_uuid()}"
     if isinstance(request.prompt, list):
-        assert len(request.prompt) == 1
+        if len(request.prompt) == 0:
+            return create_error_response(HTTPStatus.BAD_REQUEST,
+                                         "please provide at least one prompt")
+        if len(request.prompt) > 1:
+            return create_error_response(HTTPStatus.BAD_REQUEST,
+                                         "multiple prompts in a batch is not "
+                                         "currently supported")
         prompt = request.prompt[0]
     else:
         prompt = request.prompt
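
For context, here is a minimal sketch (not part of the commit) of how the new check surfaces to a client of the OpenAI-compatible completions endpoint served by api_server.py. The server URL (localhost:8000) and the model name ("facebook/opt-125m") are illustrative placeholders, not values taken from this commit.

# Assumes a vLLM OpenAI-compatible server is already running locally and
# serving the placeholder model "facebook/opt-125m".
import requests

resp = requests.post(
    "http://localhost:8000/v1/completions",
    json={
        "model": "facebook/opt-125m",
        "prompt": ["first prompt", "second prompt"],  # a batched (list) prompt
        "max_tokens": 16,
    },
)

# With this commit, a list prompt containing more than one entry now returns
# HTTP 400 with the message "multiple prompts in a batch is not currently
# supported", instead of tripping the old `assert len(request.prompt) == 1`.
print(resp.status_code)
print(resp.json())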