Synchronize PyTorchLightning/pytorch-lightning (revision 7b283e3c@master) to github/third-party/PyTorchLightning/pytorch-lightning
Summary:
# Manual
- Remove FIXMEs in `model_checkpoint.py`, `parameter_monitor.py`, `test_quantization.py`, and `speed_monitor.py` now that `Trainer` is properly annotated.
- Update `test_quantization.py` to use `trainer.train_loop.global_step` instead of `trainer.global_step`, which is now read-only.
- Update `loop_callback.py` to read `batch_idx` from `train_loop`, since it is no longer available directly on the `Trainer`.
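The migration pattern behind the last two items can be sketched as below. These stand-in classes are hypothetical and only mirror the shape of the change (a read-only `Trainer.global_step` property backed by mutable loop state on `train_loop`); they are not the real pytorch-lightning implementation.

```python
class TrainLoop:
    """Stand-in for the trainer's train loop, which owns the mutable loop state."""

    def __init__(self):
        self.global_step = 0
        self.batch_idx = 0


class Trainer:
    """Stand-in Trainer: exposes a read-only view of the loop's state."""

    def __init__(self):
        self.train_loop = TrainLoop()

    @property
    def global_step(self):
        # Read-only: no setter, so `trainer.global_step = n` raises
        # AttributeError. Writes must go through trainer.train_loop.
        return self.train_loop.global_step


trainer = Trainer()

# Old style (now fails): trainer.global_step = 5
# New style: mutate the loop state directly.
trainer.train_loop.global_step = 5
assert trainer.global_step == 5

# Callbacks likewise read batch_idx from the train loop, not the Trainer.
batch_idx = trainer.train_loop.batch_idx
```

This matches the direction of the "Mark certain Trainer APIs as protected" commit below: loop state is owned by the loop object, and the `Trainer` surface only reads it.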
# Automatic
### New commit log messages
7b283e3c Bugfix/Multiple dataloaders (#7433)
d7c44cc6 Docs: sync chlog 1.3.1 (#7478)
fdf50a5e Mark certain Trainer APIs as protected (#7420)
ad9118f0 remove trainer hidden state | sanity refactor [1 / n] (#7437)
4a1134db Log epoch metrics before firing the `on_evaluation_end` hook (#7272)
b65ae794 Automatically check `DataModule.has_{setup,teardown,prepare_data}` [2/2] (#7238)
8660d8cf [pre-commit.ci] pre-commit autoupdate (#7475)
f6fe715e Fix Sphinx argument deprecation (#7464)
Reviewed By: shuyingsunshine21
Differential Revision: D28353491
fbshipit-source-id: 98b87d99e2f09b47b07270858fcbdb5d5299730b