gaoqiong / lm-evaluation-harness — Commit 2976f69b (unverified)

Use max_length in AutoSeq2SeqLM (#551)

Authored Jun 05, 2023 by gakada, committed by GitHub on Jun 05, 2023
Parent: 92929bd2
1 changed file, 1 addition and 10 deletions: lm_eval/models/huggingface.py (+1, −10)
```diff
@@ -351,7 +351,7 @@ class HuggingFaceAutoLM(BaseLM):
         """Return the maximum sequence length of the model.
         NOTE: Different model configurations have different max sequence length
         attribute names.
-            - n_positions: (CTRLConfig)
+            - n_positions: (CTRLConfig, T5Config)
             - max_position_embeddings: (BartConfig, RoFormerConfig)
             - n_ctx: (GPT2Config)
         NOTE: For relative position encoded models you should specify the max
@@ -543,15 +543,6 @@ class AutoSeq2SeqLM(HuggingFaceAutoLM):
     AUTO_MODEL_CLASS = transformers.AutoModelForSeq2SeqLM
     AUTO_PEFT_CLASS = peft.PeftModel
 
-    @property
-    def max_length(self) -> int:
-        """Return the maximum sequence length of the model.
-        TODO: Currently only works for relative position encoded Seq2Seq models.
-        """
-        if self._max_length is not None:
-            return self._max_length
-        return self._DEFAULT_MAX_LENGTH
-
     def loglikelihood(
         self, requests: List[Tuple[str, str]]
     ) -> List[Tuple[float, bool]]:
```
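The deleted override above made `AutoSeq2SeqLM` ignore the config-derived length and always fall back to an explicit value or a default; removing it lets the class inherit the base class's `max_length` lookup, whose docstring now also lists `T5Config`'s `n_positions`. A minimal sketch of that kind of attribute lookup, under stated assumptions: `resolve_max_length` and `_DEFAULT_MAX_LENGTH` are hypothetical names for illustration, and `SimpleNamespace` stands in for a real `transformers` config object.

```python
# Sketch (not the harness's actual code) of resolving a model's maximum
# sequence length from whichever attribute its config class exposes.
from types import SimpleNamespace

# Hypothetical fallback used when no override or config attribute is found.
_DEFAULT_MAX_LENGTH = 2048


def resolve_max_length(config, override=None):
    """Return an explicit override if given, else the first matching
    config attribute, else a default."""
    if override is not None:
        return override
    # Different model configurations name the limit differently
    # (per the docstring in the diff above).
    for attr in ("n_positions", "max_position_embeddings", "n_ctx"):
        if hasattr(config, attr):
            return getattr(config, attr)
    return _DEFAULT_MAX_LENGTH


# T5-style configs expose n_positions, so a seq2seq model's limit is
# picked up from the config instead of silently hitting the fallback.
print(resolve_max_length(SimpleNamespace(n_positions=512)))
print(resolve_max_length(SimpleNamespace(max_position_embeddings=1024)))
print(resolve_max_length(SimpleNamespace()))
```

The design point of the commit is exactly this: with the override gone, seq2seq models with absolute position encodings get their true limit from the config rather than an unconditional default.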