gaoqiong / lm-evaluation-harness

Commit 38c8d02f (unverified)
Authored Jan 24, 2024 by Baber Abbasi; committed via GitHub on Jan 24, 2024
Parent: 081deb8b

modified default gen_kwargs to work better with CLI; changed prompt_logprobs=1 (#1345)
Showing 1 changed file with 3 additions and 3 deletions.
lm_eval/models/vllm_causallms.py

```diff
@@ -175,7 +175,7 @@ class VLLM(LM):
             sampling_params = SamplingParams(max_tokens=max_tokens, stop=stop, **kwargs)
         else:
             sampling_params = SamplingParams(
-                temperature=0, prompt_logprobs=2, max_tokens=1
+                temperature=0, prompt_logprobs=1, max_tokens=1
             )
         if self.data_parallel_size > 1:
             requests = [list(x) for x in divide(requests, self.data_parallel_size)]
```
```diff
@@ -436,8 +436,8 @@ class VLLM(LM):
     @staticmethod
     def modify_gen_kwargs(kwargs: dict) -> dict:
         # sampling_params
-        do_sample = kwargs.pop("do_sample", False)
-        if do_sample is not True:
+        do_sample = kwargs.pop("do_sample", None)
+        if do_sample is False or "temperature" not in kwargs:
             kwargs["temperature"] = 0.0
         # hf defaults
         kwargs["skip_special_tokens"] = kwargs.get("skip_special_tokens", False)
```
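For reference, a minimal runnable sketch of the new `modify_gen_kwargs` behavior as reconstructed from the hunk above. The trailing `return kwargs` is an assumption (the hunk ends before the function does, and the real method may set further defaults below this point):

```python
def modify_gen_kwargs(kwargs: dict) -> dict:
    # Post-change behavior: greedy decoding (temperature=0.0) is the default
    # whenever do_sample is explicitly False OR no temperature was supplied.
    do_sample = kwargs.pop("do_sample", None)
    if do_sample is False or "temperature" not in kwargs:
        kwargs["temperature"] = 0.0
    # hf defaults
    kwargs["skip_special_tokens"] = kwargs.get("skip_special_tokens", False)
    return kwargs  # assumed; not shown in the hunk
```

Under the old code, `do_sample` defaulted to `False`, so any CLI invocation that set `temperature` without also passing `do_sample=True` had its temperature silently overwritten with `0.0`; the new check only forces greedy decoding when sampling is explicitly disabled or no temperature is given.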