gaoqiong / lm-evaluation-harness · Commits

Commit 0f63d4f5 (unverified), authored Jun 25, 2025 by Baber Abbasi, committed by GitHub on Jun 25, 2025

remove system message if `TemplateError` (#3076)

Parent: 532909c0

Showing 1 changed file with 22 additions and 8 deletions:

lm_eval/models/vllm_causallms.py (+22, −8)
@@ -10,6 +10,7 @@ from queue import Empty
 from time import sleep
 from typing import TYPE_CHECKING, Dict, List, Literal, Optional, Tuple, Union

+import jinja2
 from more_itertools import distribute
 from packaging.version import parse as parse_version
 from tqdm import tqdm
@@ -300,14 +301,27 @@ class VLLM(TemplateLM):
         """
         Method to apply a chat template to a list of chat history between user and model.
         """
-        chat_templated = self.tokenizer.apply_chat_template(
-            chat_history,
-            tokenize=False,
-            add_generation_prompt=add_generation_prompt,
-            continue_final_message=not add_generation_prompt,
-            chat_template=self.hf_chat_template,
-            enable_thinking=self.enable_thinking,
-        )
+        try:
+            chat_templated = self.tokenizer.apply_chat_template(
+                chat_history,
+                tokenize=False,
+                add_generation_prompt=add_generation_prompt,
+                continue_final_message=not add_generation_prompt,
+                chat_template=self.hf_chat_template,
+                enable_thinking=self.enable_thinking,
+            )
+        except jinja2.exceptions.TemplateError:
+            eval_logger.warning(
+                "Failed to apply chat template. removing the system role in chat history."
+            )
+            chat_templated = self.tokenizer.apply_chat_template(
+                [msg for msg in chat_history if msg["role"] != "system"],
+                tokenize=False,
+                add_generation_prompt=add_generation_prompt,
+                continue_final_message=not add_generation_prompt,
+                chat_template=self.hf_chat_template,
+                enable_thinking=self.enable_thinking,
+            )

         return chat_templated
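The shape of this change is a generic retry pattern: some chat templates raise a `jinja2.exceptions.TemplateError` when the conversation contains a system message the template does not support, so the commit catches that error and retries with system messages filtered out. The following is a minimal self-contained sketch of that pattern with hypothetical stand-ins (`TemplateError`, `apply_chat_template`, and `render` are illustrative names, not the harness's API; the real code calls `tokenizer.apply_chat_template` and catches the jinja2 exception):

```python
# Sketch of the fallback pattern in this commit, using hypothetical
# stand-ins rather than a real tokenizer.

class TemplateError(Exception):
    """Stand-in for jinja2.exceptions.TemplateError."""

def apply_chat_template(chat_history):
    # Mimics a chat template that rejects the system role outright,
    # as some instruct-model templates do.
    if any(msg["role"] == "system" for msg in chat_history):
        raise TemplateError("System role not supported")
    return "".join(f"<|{m['role']}|>{m['content']}" for m in chat_history)

def render(chat_history):
    try:
        return apply_chat_template(chat_history)
    except TemplateError:
        # Fallback, as in the commit: drop system messages and retry.
        return apply_chat_template(
            [msg for msg in chat_history if msg["role"] != "system"]
        )

history = [
    {"role": "system", "content": "Be brief."},
    {"role": "user", "content": "Hi"},
]
print(render(history))  # -> <|user|>Hi
```

Note the trade-off the real commit accepts: when the fallback fires, the system prompt is silently dropped from the evaluation context, which is why the code logs a warning first.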