gaoqiong / lm-evaluation-harness / Commits / 3e27f9dd

Commit 3e27f9dd, authored Jan 06, 2024 by daniel-furman
Parent: 53c68db4

first stab at wrap_chat_template, formatting in function

Showing 1 changed file with 3 additions and 1 deletion:

lm_eval/models/huggingface.py (+3, -1)
The scraped side-by-side view duplicated every line; the two hunks reconstruct as follows (shown as context lines, since the scrape does not preserve which of the +3/-1 lines were added or removed):

```diff
@@ -664,10 +664,10 @@ class HFLM(LM):
         return self.tokenizer.decode(tokens, skip_special_tokens=True)

     def tok_wrap_chat_template(
         self, requests: List[Instance], system: bool = False
     ) -> List[Instance]:
         new_reqs = []
         for req in requests:
             context, continuation = req.args[0].strip(), req.args[1].strip()
             if system:
                 chat = [
                     {"role": "system", "content": system},
@@ -684,11 +684,13 @@ class HFLM(LM):
                 tokenize=False,
                 add_generation_prompt=True,
             )
             rfind_continuation = single_tokenized_conversation.rfind(continuation)
             context = single_tokenized_conversation[:rfind_continuation]
             continuation = single_tokenized_conversation[rfind_continuation:]
             req.args = (context, continuation)
             new_reqs.append(req)
         return new_reqs
```
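The core trick in the second hunk is splitting a chat-templated string back into a (context, continuation) pair: `str.rfind` locates the last occurrence of the continuation text, and the string is cut there, so all template wrapper tokens before it stay attached to the context. A minimal standalone sketch of that split, with an illustrative `split_templated` helper and made-up template markers (neither is the harness's actual API):

```python
def split_templated(templated: str, continuation: str) -> tuple[str, str]:
    """Split a templated conversation string at the last occurrence of
    the continuation, mirroring the rfind-based split in the diff."""
    # rfind returns the index of the LAST occurrence, so any earlier
    # appearance of the same text inside the prompt is left in the context.
    cut = templated.rfind(continuation)
    if cut == -1:
        # Continuation not found verbatim (e.g. the template altered it);
        # fall back to treating the whole string as context.
        return templated, ""
    return templated[:cut], templated[cut:]


# Hypothetical template markers for illustration only.
templated = "<|user|>\nWhat is 2+2?\n<|assistant|>\n4"
context, continuation = split_templated(templated, "4")
# context keeps the role markers; continuation is the answer span.
```

One caveat this sketch shares with the diff: because the cut uses `rfind` on the raw string, any template tokens emitted *after* the continuation (e.g. an end-of-turn marker) would end up inside the continuation slice rather than the context.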