"torchvision/git@developer.sourcefind.cn:OpenDAS/vision.git" did not exist on "5a9e21032c32877f8203eeee15801a266bb171f0"
Commit ec2e8fff authored by Ananth Subramaniam, committed by Facebook GitHub Bot

Move checkpoint callback to callbacks trainer argument

Summary:
`checkpoint_callback` is being phased out. Initially, it was a special way to configure checkpoints, but it makes more sense for those callbacks to be included in the general `callbacks` trainer argument. In 1.2.X, `checkpoint_callback` is expected to be a boolean value only.
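
A minimal sketch of the migration (the `ModelCheckpoint` arguments below are illustrative placeholders, not taken from this diff):

```python
import pytorch_lightning as pl
from pytorch_lightning.callbacks import ModelCheckpoint

# Illustrative callback; dirpath/save_top_k are placeholder values.
checkpoint_callback = ModelCheckpoint(dirpath="checkpoints/", save_top_k=1)

# Before (deprecated): trainer = pl.Trainer(checkpoint_callback=checkpoint_callback)
# After: pass the instance through the general `callbacks` argument.
trainer = pl.Trainer(callbacks=[checkpoint_callback])
```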

If `checkpoint_callback=False` **and** an instance of `ModelCheckpoint` is passed in the trainer's `callbacks` argument, Lightning raises a [misconfiguration error](https://github.com/PyTorchLightning/pytorch-lightning/blob/2f6ce1ae7fff34d16d3707571f6a9a7b0fb0c50a/pytorch_lightning/trainer/connectors/callback_connector.py#L66-L70).
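
For reference, a sketch of the conflicting configuration (assuming the 1.2.x behavior described above; `dirpath` is a placeholder):

```python
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import ModelCheckpoint

# checkpoint_callback=False disables checkpointing, so also supplying a
# ModelCheckpoint via `callbacks` is contradictory and raises a
# MisconfigurationException when the Trainer is constructed.
Trainer(
    checkpoint_callback=False,
    callbacks=[ModelCheckpoint(dirpath="checkpoints/")],
)
```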

Reviewed By: newstzpz

Differential Revision: D27139315

fbshipit-source-id: 07ad5ea520583a2e46a9cb2a938f98968265c932
parent 4bfa571d
@@ -44,7 +44,7 @@ class TestLightningTask(unittest.TestCase):
             "max_steps": 1,
             "limit_train_batches": 1,
             "num_sanity_val_steps": 0,
-            "checkpoint_callback": checkpoint_callback,
+            "callbacks": [checkpoint_callback],
         }
         trainer = pl.Trainer(**params)
         with EventStorage() as storage: