chenpangpang / transformers / Commits / 4c18ddb5

Unverified commit 4c18ddb5, authored Feb 16, 2024 by Sourab Mangrulkar, committed by GitHub on Feb 16, 2024.

`auto_find_batch_size` isn't yet supported with DeepSpeed/FSDP. Raise error accordingly. (#29058)

Update trainer.py

Parent: b2628086
Showing 1 changed file with 5 additions and 0 deletions.
src/transformers/trainer.py
```diff
@@ -4136,6 +4136,11 @@ class Trainer:
             wrapper = "DeepSpeed" if self.is_deepspeed_enabled else "FSDP"
             raise ValueError(f"{wrapper} can't be used with `save_only_model` along with `load_best_model_at_end`.")
 
+        # `auto_find_batch_size` isn't yet supported with DeepSpeed/FSDP
+        if (self.is_deepspeed_enabled or self.is_fsdp_enabled) and self.args.auto_find_batch_size:
+            wrapper = "DeepSpeed" if self.is_deepspeed_enabled else "FSDP"
+            raise NotImplementedError(f"`{wrapper}` doesn't support `auto_find_batch_size`.")
+
     def propagate_args_to_deepspeed(self, auto_find_batch_size=False):
         """
         Sets values in the deepspeed plugin based on the Trainer args
         ...
```
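The added guard can be exercised in isolation. Below is a minimal sketch of the same check, assuming hypothetical `DummyTrainer`/`DummyArgs` stand-ins (not part of transformers) that expose only the three attributes the real `Trainer` consults: `is_deepspeed_enabled`, `is_fsdp_enabled`, and `args.auto_find_batch_size`.

```python
from dataclasses import dataclass


# Hypothetical stand-ins for transformers' TrainingArguments and Trainer,
# reduced to the attributes this commit's check reads.
@dataclass
class DummyArgs:
    auto_find_batch_size: bool = False


@dataclass
class DummyTrainer:
    is_deepspeed_enabled: bool = False
    is_fsdp_enabled: bool = False
    args: DummyArgs = None


def check_auto_find_batch_size(trainer):
    # Mirrors the guard added in this commit: when DeepSpeed or FSDP wraps
    # the model, the batch-size auto-finder can't restart the loop with a
    # smaller batch, so fail fast with a clear error instead of crashing
    # mid-training.
    if (trainer.is_deepspeed_enabled or trainer.is_fsdp_enabled) and trainer.args.auto_find_batch_size:
        wrapper = "DeepSpeed" if trainer.is_deepspeed_enabled else "FSDP"
        raise NotImplementedError(f"`{wrapper}` doesn't support `auto_find_batch_size`.")


# Plain (unwrapped) training with auto_find_batch_size: the check passes.
check_auto_find_batch_size(DummyTrainer(args=DummyArgs(auto_find_batch_size=True)))

# FSDP + auto_find_batch_size: the check raises NotImplementedError.
try:
    check_auto_find_batch_size(
        DummyTrainer(is_fsdp_enabled=True, args=DummyArgs(auto_find_batch_size=True))
    )
except NotImplementedError as e:
    print(e)  # → `FSDP` doesn't support `auto_find_batch_size`.
```

Raising eagerly here is a fail-fast choice: the incompatibility would otherwise surface only after the first out-of-memory retry, deep inside the training loop.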