chenpangpang/transformers · Commit df5abae8 (unverified)
Authored Jun 03, 2024 by miivanov90, committed via GitHub on Jun 03, 2024
Parent: 924c46d4

Set greater_is_better to False if metric_for_best_model ends with "loss" (#31142)

* update to not(endswith(loss))
* ruff formatting
Showing 1 changed file with 3 additions and 3 deletions (+3, -3).
src/transformers/training_args.py
@@ -464,8 +464,8 @@ class TrainingArguments:
             Use in conjunction with `load_best_model_at_end` and `metric_for_best_model` to specify if better models
             should have a greater metric or not. Will default to:

-            - `True` if `metric_for_best_model` is set to a value that isn't `"loss"` or `"eval_loss"`.
-            - `False` if `metric_for_best_model` is not set, or set to `"loss"` or `"eval_loss"`.
+            - `True` if `metric_for_best_model` is set to a value that doesn't end in `"loss"`.
+            - `False` if `metric_for_best_model` is not set, or set to a value that ends in `"loss"`.
         ignore_data_skip (`bool`, *optional*, defaults to `False`):
             When resuming training, whether or not to skip the epochs and batches to get the data loading at the same
             stage as in the previous training. If set to `True`, the training will begin faster (as that skipping step
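The docstring change above mirrors the code change in the next hunk. As a rough illustration of the new defaulting rule (the helper name and the custom metric names below are hypothetical, used only for this sketch):

```python
# Sketch of the rule this commit documents: any metric whose name ends in
# "loss" is treated as lower-is-better, so greater_is_better defaults to False.
def default_greater_is_better(metric_for_best_model: str) -> bool:
    return not metric_for_best_model.endswith("loss")

print(default_greater_is_better("eval_loss"))            # False (same as before)
print(default_greater_is_better("eval_my_custom_loss"))  # False (previously defaulted to True)
print(default_greater_is_better("eval_f1"))              # True
```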
@@ -1592,7 +1592,7 @@ class TrainingArguments:
         ) and self.metric_for_best_model is None:
             self.metric_for_best_model = "loss"
         if self.greater_is_better is None and self.metric_for_best_model is not None:
-            self.greater_is_better = self.metric_for_best_model not in ["loss", "eval_loss"]
+            self.greater_is_better = not (self.metric_for_best_model.endswith("loss"))
         if self.run_name is None:
             self.run_name = self.output_dir
         if self.framework == "pt" and is_torch_available():
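For reference, a minimal sketch of how the new default surfaces through `TrainingArguments` (assumes a transformers version that includes this commit; the metric names are illustrative, and `output_dir` is a throwaway path):

```python
from transformers import TrainingArguments

# A metric name ending in "loss" now defaults greater_is_better to False,
# even if it is not exactly "loss" or "eval_loss".
args = TrainingArguments(output_dir="tmp_out", metric_for_best_model="eval_my_custom_loss")
print(args.greater_is_better)  # False with this change; True before it

# Metric names that do not end in "loss" still default to True.
args = TrainingArguments(output_dir="tmp_out", metric_for_best_model="eval_f1")
print(args.greater_is_better)  # True
```

The switch from an explicit `["loss", "eval_loss"]` check to `endswith("loss")` means custom loss-style metrics no longer need to be enumerated by name to get a lower-is-better default.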