Unverified Commit 43814483 authored by Allan Jie, committed by GitHub

Raise error and suggestion when using custom optimizer with Fairscale or Deepspeed (#16786)

* optimizer issues related to saving

* remove the "optimizer saving" option

* reformat using make style
parent b4ddd267
@@ -397,6 +397,13 @@ class Trainer:
                 "Passing a `model_init` is incompatible with providing the `optimizers` argument. "
                 "You should subclass `Trainer` and override the `create_optimizer_and_scheduler` method."
             )
+        if (self.sharded_ddp is not None or args.deepspeed) and (
+            self.optimizer is not None or self.lr_scheduler is not None
+        ):
+            raise RuntimeError(
+                "Passing `optimizers` is not allowed if Fairscale or Deepspeed is enabled. "
+                "You should subclass `Trainer` and override the `create_optimizer_and_scheduler` method."
+            )
         default_callbacks = DEFAULT_CALLBACKS + get_reporting_integration_callbacks(self.args.report_to)
         callbacks = default_callbacks if callbacks is None else default_callbacks + callbacks
         self.callback_handler = CallbackHandler(
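The guard added by this commit can be sketched in isolation as a small standalone function. This is a hypothetical mirror of the check for illustration only (the function name `check_optimizer_args` and its flat parameters are assumptions, not part of the `Trainer` API); in the real code the condition reads `self.sharded_ddp` / `args.deepspeed` inside `Trainer.__init__`.

```python
def check_optimizer_args(sharded_ddp, deepspeed, optimizer=None, lr_scheduler=None):
    """Hypothetical standalone version of the new Trainer guard.

    Fairscale (sharded DDP) and DeepSpeed create and manage their own
    optimizer/scheduler, so a user-supplied one would be silently ignored
    or mishandled; instead, fail loudly and point to the supported path.
    """
    if (sharded_ddp is not None or deepspeed) and (
        optimizer is not None or lr_scheduler is not None
    ):
        raise RuntimeError(
            "Passing `optimizers` is not allowed if Fairscale or Deepspeed is enabled. "
            "You should subclass `Trainer` and override the `create_optimizer_and_scheduler` method."
        )
```

The supported alternative named in the error message is to subclass `Trainer` and override `create_optimizer_and_scheduler`, so that optimizer construction happens after the distributed backend has been set up.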