gaoqiong / lm-evaluation-harness

Commit 96ea9f54
Authored Jul 27, 2023 by Benjamin Fattori

skip recomputing batch size if it is maximal

Parent: 3168fc00
Showing 1 changed file with 4 additions and 0 deletions.
lm_eval/models/huggingface.py
@@ -596,6 +596,10 @@ class HFLM(LM):
         sched = pos // int(n_reordered_requests / self.batch_schedule)
         if sched in self.batch_sizes:
             return self.batch_sizes[sched]
+        if (len(self.batch_sizes) > 1) and (self.batch_sizes[sched - 1] == self.max_batch_size):
+            # if previous batch size is already maximal, skip recomputation
+            self.batch_sizes[sched] = self.max_batch_size
+            return self.batch_sizes[sched]
         print(
             f"Passed argument batch_size = auto:{self.batch_schedule}. Detecting largest batch size"
         )
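For context, this change touches the auto batch-size scheduler: detected batch sizes are cached per schedule bucket, and with this commit a bucket whose predecessor already reached max_batch_size reuses that value instead of re-running the detection probe. Below is a minimal standalone sketch of that caching pattern; the BatchScheduler class and the detect_batch_size() probe are hypothetical stand-ins for illustration, not the harness's actual HFLM API.

    class BatchScheduler:
        def __init__(self, batch_schedule=4, max_batch_size=64):
            self.batch_schedule = batch_schedule  # number of points at which the batch size is re-detected
            self.max_batch_size = max_batch_size  # hard upper bound on the batch size
            self.batch_sizes = {}                 # cache: schedule bucket -> detected batch size

        def detect_batch_size(self, sched):
            # Hypothetical stand-in for the expensive probe that searches for the
            # largest batch size that still fits; here it just grows with the bucket.
            return min(self.max_batch_size, 32 * (sched + 1))

        def batch_scheduler(self, pos, n_reordered_requests):
            # Map the current position to one of `batch_schedule` buckets.
            sched = pos // int(n_reordered_requests / self.batch_schedule)
            if sched in self.batch_sizes:
                return self.batch_sizes[sched]
            # The behavior added by this commit: if the previous bucket already hit
            # the maximum, reuse it instead of probing again.
            if (len(self.batch_sizes) > 1) and (self.batch_sizes[sched - 1] == self.max_batch_size):
                self.batch_sizes[sched] = self.max_batch_size
                return self.batch_sizes[sched]
            self.batch_sizes[sched] = self.detect_batch_size(sched)
            return self.batch_sizes[sched]

    # Example: once a bucket reaches max_batch_size, later buckets skip detection.
    scheduler = BatchScheduler(batch_schedule=4, max_batch_size=64)
    sizes = [scheduler.batch_scheduler(pos, n_reordered_requests=100) for pos in range(100)]

The rationale for the early return is that batch-size detection is expensive, so once the configured ceiling has been reached there is nothing larger left to discover for later buckets.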