gaoqiong / lm-evaluation-harness · Commits

Commit f8771d26, authored May 30, 2024 by Konrad

cache key update

Parent: 899a544a

Showing 1 changed file with 4 additions and 0 deletions:

lm_eval/api/task.py (+4, -0)
@@ -384,6 +384,10 @@ class Task(abc.ABC):
         og_limit = limit
         cache_key = f"requests-{self._config.task}-{self.config.num_fewshot}shot-rank{rank}-world_size{world_size}"
+        cache_key += "-chat_template" if apply_chat_template else ""
+        cache_key += "-fewshot_as_multiturn" if fewshot_as_multiturn else ""
+        if lm is not None and hasattr(lm, "tokenizer"):
+            cache_key += f"-{lm.tokenizer.name_or_path.replace('/', '__')}"
         cached_instances = load_from_cache(file_name=cache_key)
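
For illustration, the snippet below sketches what the extended cache key looks like for one hypothetical configuration. The task name, shot count, rank/world size, flag values, and tokenizer path are assumed example stand-ins, not values taken from the commit:

# Minimal sketch of the cache key this change produces (hypothetical values).
task = "hellaswag"                                  # stands in for self._config.task
num_fewshot = 5                                     # stands in for self.config.num_fewshot
rank, world_size = 0, 1
apply_chat_template = True
fewshot_as_multiturn = False
tokenizer_name = "meta-llama/Meta-Llama-3-8B-Instruct"  # stands in for lm.tokenizer.name_or_path

cache_key = f"requests-{task}-{num_fewshot}shot-rank{rank}-world_size{world_size}"
cache_key += "-chat_template" if apply_chat_template else ""
cache_key += "-fewshot_as_multiturn" if fewshot_as_multiturn else ""
cache_key += f"-{tokenizer_name.replace('/', '__')}"
print(cache_key)
# requests-hellaswag-5shot-rank0-world_size1-chat_template-meta-llama__Meta-Llama-3-8B-Instruct

In effect, the commit makes cached request files specific to the chat-template and multiturn-fewshot settings and to the tokenizer in use, so runs with different prompt formatting no longer reuse each other's cached requests.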