# Tutorial 5: Customize Runtime Settings

## Customize optimization settings

### Customize optimizer supported by PyTorch

We already support all of the optimizers implemented by PyTorch, and the only modification needed is to change the `optimizer` field of config files.
For example, if you want to use `Adam` (note that the performance could drop a lot), the modification could be as follows.

```python
optimizer = dict(type='Adam', lr=0.0003, weight_decay=0.0001)
```

To modify the learning rate of the model, the users only need to modify the `lr` in the config of the optimizer. The users can directly set arguments following the [API doc](https://pytorch.org/docs/stable/optim.html?highlight=optim#module-torch.optim) of PyTorch.
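
Any optimizer shipped with PyTorch can be configured the same way. For instance, a hypothetical `AdamW` setup with non-default betas (the values here are purely illustrative, not recommendations):

```python
# Illustrative values only; tune lr/betas/weight_decay for your own model.
optimizer = dict(type='AdamW', lr=1e-4, betas=(0.9, 0.999), weight_decay=0.01)
```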

### Customize self-implemented optimizer

#### 1. Define a new optimizer

A customized optimizer could be defined as follows.

Assume you want to add an optimizer named `MyOptimizer`, which has arguments `a`, `b`, and `c`.
You need to create a new directory named `mmdet3d/core/optimizer`.
And then implement the new optimizer in a file, e.g., in `mmdet3d/core/optimizer/my_optimizer.py`:

```python
from mmcv.runner.optimizer import OPTIMIZERS
from torch.optim import Optimizer


@OPTIMIZERS.register_module()
class MyOptimizer(Optimizer):

    def __init__(self, a, b, c):
        pass
```
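
The `@OPTIMIZERS.register_module()` decorator records the class in a registry keyed by the class name, so that the string `'MyOptimizer'` in a config can later be resolved back to the class. A minimal, self-contained sketch of that mechanism (a toy stand-in, not MMCV's actual implementation):

```python
class Registry:
    """Toy registry mapping class names to classes, mimicking mmcv.Registry."""

    def __init__(self):
        self._module_dict = {}

    def register_module(self):
        def _register(cls):
            # Record the class under its own name and return it unchanged.
            self._module_dict[cls.__name__] = cls
            return cls
        return _register

    def get(self, name):
        return self._module_dict[name]


OPTIMIZERS = Registry()


@OPTIMIZERS.register_module()
class MyOptimizer:
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c
```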

#### 2. Add the optimizer to registry

The module defined above must first be imported into the main namespace so that the registry can find it. There are two options to achieve this.

- Add `mmdet3d/core/optimizer/__init__.py` to import it.

  The newly defined module should be imported in `mmdet3d/core/optimizer/__init__.py` so that the registry will
  find the new module and add it:

```python
from .my_optimizer import MyOptimizer

__all__ = ['MyOptimizer']
```

You also need to import `optimizer` in `mmdet3d/core/__init__.py` by adding:

```python
from .optimizer import *
```

Or use `custom_imports` in the config to manually import it:

```python
custom_imports = dict(imports=['mmdet3d.core.optimizer.my_optimizer'], allow_failed_imports=False)
```

The module `mmdet3d.core.optimizer.my_optimizer` will be imported at the beginning of the program and the class `MyOptimizer` is then automatically registered.
Note that only the package containing the class `MyOptimizer` should be imported.
`mmdet3d.core.optimizer.my_optimizer.MyOptimizer` **cannot** be imported directly.

In fact, users can use a totally different file directory structure with this importing method, as long as the module root can be located in `PYTHONPATH`.

#### 3. Specify the optimizer in the config file

Then you can use `MyOptimizer` in the `optimizer` field of config files.
In the configs, the optimizers are defined by the field `optimizer` like the following:

```python
optimizer = dict(type='SGD', lr=0.02, momentum=0.9, weight_decay=0.0001)
```

To use your own optimizer, the field can be changed to

```python
optimizer = dict(type='MyOptimizer', a=a_value, b=b_value, c=c_value)
```
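
Under the hood, the builder pops the `type` key from the config and instantiates the registered class with the remaining keys as keyword arguments, roughly like `mmcv.build_from_cfg`. A simplified, self-contained sketch (using a plain dict in place of the real registry):

```python
def build_from_cfg(cfg, registry):
    """Simplified sketch of mmcv.build_from_cfg: 'type' selects the class,
    the remaining keys become constructor kwargs."""
    args = dict(cfg)            # copy so the original config stays untouched
    obj_type = args.pop('type')
    obj_cls = registry[obj_type]
    return obj_cls(**args)


class MyOptimizer:
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c


OPTIMIZERS = {'MyOptimizer': MyOptimizer}
optimizer = build_from_cfg(dict(type='MyOptimizer', a=1, b=2, c=3), OPTIMIZERS)
```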

### Customize optimizer constructor

Some models may have some parameter-specific settings for optimization, e.g. weight decay for BatchNorm layers.
The users can tune those fine-grained parameters through customizing optimizer constructor.

```python
from mmcv.utils import build_from_cfg

from mmcv.runner.optimizer import OPTIMIZER_BUILDERS, OPTIMIZERS
from mmdet.utils import get_root_logger
from .my_optimizer import MyOptimizer


@OPTIMIZER_BUILDERS.register_module()
class MyOptimizerConstructor(object):

    def __init__(self, optimizer_cfg, paramwise_cfg=None):
        pass

    def __call__(self, model):
        # build the optimizer from the model's parameters here
        return my_optimizer
```
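
For example, a constructor might assign zero weight decay to bias and normalization parameters. The grouping logic can be sketched in plain Python as follows; the naming convention used here to detect bias/norm parameters is only an assumption for illustration:

```python
def group_params(named_params, base_lr=0.02, weight_decay=1e-4):
    """Split (name, param) pairs into decay / no-decay groups, as a
    paramwise optimizer constructor might do for bias and norm params."""
    decay, no_decay = [], []
    for name, param in named_params:
        # Hypothetical convention: biases and BN params get no weight decay.
        if name.endswith('.bias') or '.bn.' in name:
            no_decay.append(param)
        else:
            decay.append(param)
    return [
        dict(params=decay, lr=base_lr, weight_decay=weight_decay),
        dict(params=no_decay, lr=base_lr, weight_decay=0.0),
    ]
```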

The default optimizer constructor is implemented [here](https://github.com/open-mmlab/mmcv/blob/v1.3.7/mmcv/runner/optimizer/default_constructor.py#L11), which could also serve as a template for a new optimizer constructor.

### Additional settings

Tricks not implemented by the optimizer should be implemented through the optimizer constructor (e.g., parameter-wise learning rates) or hooks. We list some common settings that could stabilize or accelerate training. Feel free to create a PR or an issue for more settings.

- __Use gradient clip to stabilize training__:

  Some models need gradient clipping to stabilize the training process. An example is as below:

  ```python
  optimizer_config = dict(
      _delete_=True, grad_clip=dict(max_norm=35, norm_type=2))
  ```

  If your config inherits the base config which already sets the `optimizer_config`, you might need `_delete_=True` to override the unnecessary settings in the base config. See the [config documentation](https://mmdetection.readthedocs.io/en/latest/tutorials/config.html) for more details.

- __Use momentum schedule to accelerate model convergence__:

  We support the momentum scheduler to modify the model's momentum according to the learning rate, which could make the model converge faster.
  The momentum scheduler is usually used together with the LR scheduler; for example, the following config is used in 3D detection to accelerate convergence.
  For more details, please refer to the implementation of [CyclicLrUpdater](https://github.com/open-mmlab/mmcv/blob/v1.3.7/mmcv/runner/hooks/lr_updater.py#L358) and [CyclicMomentumUpdater](https://github.com/open-mmlab/mmcv/blob/v1.3.7/mmcv/runner/hooks/momentum_updater.py#L225).

  ```python
  lr_config = dict(
      policy='cyclic',
      target_ratio=(10, 1e-4),
      cyclic_times=1,
      step_ratio_up=0.4,
  )
  momentum_config = dict(
      policy='cyclic',
      target_ratio=(0.85 / 0.95, 1),
      cyclic_times=1,
      step_ratio_up=0.4,
  )
  ```
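
Conceptually, the cyclic policy ramps the value from its base toward `base * target_ratio[0]` during the first `step_ratio_up` fraction of the cycle, then anneals it toward `base * target_ratio[1]`. A simplified linear sketch of that interpolation (the real MMCV updaters support several annealing functions, so this is an approximation):

```python
def cyclic_value(progress, base, target_ratio, step_ratio_up=0.4):
    """progress in [0, 1] over one cycle; linear ramp up, then linear down."""
    up_target = base * target_ratio[0]      # peak value
    down_target = base * target_ratio[1]    # final value
    if progress < step_ratio_up:
        frac = progress / step_ratio_up
        return base + (up_target - base) * frac
    frac = (progress - step_ratio_up) / (1 - step_ratio_up)
    return up_target + (down_target - up_target) * frac
```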

## Customize training schedules

By default we use a step learning rate with the 1x schedule; this calls [`StepLrUpdaterHook`](https://github.com/open-mmlab/mmcv/blob/v1.3.7/mmcv/runner/hooks/lr_updater.py#L167) in MMCV.
We support many other learning rate schedules [here](https://github.com/open-mmlab/mmcv/blob/v1.3.7/mmcv/runner/hooks/lr_updater.py), such as the `CosineAnnealing` and `Poly` schedules. Here are some examples:

- Poly schedule:

  ```python
  lr_config = dict(policy='poly', power=0.9, min_lr=1e-4, by_epoch=False)
  ```

- CosineAnnealing schedule:

  ```python
  lr_config = dict(
      policy='CosineAnnealing',
      warmup='linear',
      warmup_iters=1000,
      warmup_ratio=1.0 / 10,
      min_lr_ratio=1e-5)
  ```

## Customize workflow

Workflow is a list of (phase, epochs) to specify the running order and epochs.
By default it is set to be

```python
workflow = [('train', 1)]
```

which means running 1 epoch for training.
Sometimes users may want to check some metrics (e.g. loss, accuracy) of the model on the validation set.
In such a case, we can set the workflow as

```python
[('train', 1), ('val', 1)]
```

so that 1 epoch for training and 1 epoch for validation will be run iteratively.
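
Conceptually, the runner cycles through the workflow list until the target number of training epochs has been reached. A toy sketch of that control flow (not the actual MMCV runner):

```python
def run(workflow, max_epochs):
    """Toy sketch of an epoch-based runner's workflow loop: cycle through
    (phase, epochs) pairs until max_epochs training epochs have run."""
    log, train_epochs = [], 0
    while train_epochs < max_epochs:
        for phase, epochs in workflow:
            for _ in range(epochs):
                log.append(phase)
                if phase == 'train':
                    train_epochs += 1
    return log
```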

**Note**:

1. The parameters of the model will not be updated during a val epoch.
2. Keyword `max_epochs` in `runner` in the config only controls the number of training epochs and will not affect the validation workflow.
3. Workflows `[('train', 1), ('val', 1)]` and `[('train', 1)]` will not change the behavior of `EvalHook`, because `EvalHook` is called by `after_train_epoch` and the validation workflow only affects hooks that are called through `after_val_epoch`. Therefore, the only difference between `[('train', 1), ('val', 1)]` and `[('train', 1)]` is that the runner will calculate losses on the validation set after each training epoch.

## Customize hooks

### Customize self-implemented hooks

#### 1. Implement a new hook

There are some occasions when the users might need to implement a new hook. MMDetection supports customized hooks in training (#3395) since v2.3.0. Thus the users could implement a hook directly in mmdet or their mmdet-based codebases and use the hook by only modifying the config in training.
Before v2.3.0, the users needed to modify the code to get the hook registered before training starts.
Here we give an example of creating a new hook in mmdet3d and using it in training.

```python
from mmcv.runner import HOOKS, Hook


@HOOKS.register_module()
class MyHook(Hook):

    def __init__(self, a, b):
        pass

    def before_run(self, runner):
        pass

    def after_run(self, runner):
        pass

    def before_epoch(self, runner):
        pass

    def after_epoch(self, runner):
        pass

    def before_iter(self, runner):
        pass

    def after_iter(self, runner):
        pass
```

Depending on the functionality of the hook, the users need to specify what the hook will do at each stage of the training in `before_run`, `after_run`, `before_epoch`, `after_epoch`, `before_iter`, and `after_iter`.
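
To make the dispatch order concrete, here is a toy, self-contained sketch: the `Hook` base class and the runner loop are stubbed out in plain Python (the real ones live in `mmcv.runner`), and a hypothetical `CountIterHook` counts iterations:

```python
class Hook:
    """Stub of mmcv's Hook base class: every stage is a no-op by default."""
    def before_run(self, runner): pass
    def after_run(self, runner): pass
    def before_epoch(self, runner): pass
    def after_epoch(self, runner): pass
    def before_iter(self, runner): pass
    def after_iter(self, runner): pass


class CountIterHook(Hook):
    """Hypothetical hook that counts iterations across the whole run."""
    def __init__(self):
        self.num_iters = 0

    def after_iter(self, runner):
        self.num_iters += 1


def toy_run(hooks, epochs, iters_per_epoch):
    """Minimal dispatch loop showing when each hook stage fires."""
    runner = None  # stand-in for the real runner object
    for h in hooks:
        h.before_run(runner)
    for _ in range(epochs):
        for h in hooks:
            h.before_epoch(runner)
        for _ in range(iters_per_epoch):
            for h in hooks:
                h.before_iter(runner)
            for h in hooks:
                h.after_iter(runner)
        for h in hooks:
            h.after_epoch(runner)
    for h in hooks:
        h.after_run(runner)
```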

#### 2. Register the new hook

Then we need to make `MyHook` imported. Assuming the hook is in `mmdet3d/core/utils/my_hook.py`, there are two ways to do that:

- Modify `mmdet3d/core/utils/__init__.py` to import it.

  The newly defined module should be imported in `mmdet3d/core/utils/__init__.py` so that the registry will
  find the new module and add it:

```python
from .my_hook import MyHook

__all__ = [..., 'MyHook']
```

Or use `custom_imports` in the config to manually import it:

```python
custom_imports = dict(imports=['mmdet3d.core.utils.my_hook'], allow_failed_imports=False)
```

#### 3. Modify the config

```python
custom_hooks = [
    dict(type='MyHook', a=a_value, b=b_value)
]
```

You can also set the priority of the hook by setting the key `priority` to `'NORMAL'` or `'HIGHEST'` as below:

```python
custom_hooks = [
    dict(type='MyHook', a=a_value, b=b_value, priority='NORMAL')
]
```

By default the hook's priority is set as `NORMAL` during registration.
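
Priorities control the order in which hooks fire at each stage: a lower numeric level runs earlier. A small sketch of that ordering, using a subset of the numeric levels MMCV assigns to the named priorities (e.g. `HIGHEST` = 0, `NORMAL` = 50, `VERY_LOW` = 90):

```python
# Numeric levels as defined by mmcv.runner.Priority (subset).
PRIORITY = {'HIGHEST': 0, 'HIGH': 30, 'NORMAL': 50, 'LOW': 70, 'VERY_LOW': 90}


def sort_hooks(hooks):
    """Hooks fire in ascending priority value at each stage."""
    return sorted(hooks, key=lambda h: PRIORITY[h['priority']])
```

For example, a `VERY_LOW` logger hook always runs after a `NORMAL` custom hook at the same stage.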

### Use hooks implemented in MMCV

If the hook is already implemented in MMCV, you can directly modify the config to use the hook, e.g., MMCV's `EMAHook`:

```python
custom_hooks = [
    dict(type='EMAHook', priority='NORMAL')
]
```

### Modify default runtime hooks

There are some common hooks that are not registered through `custom_hooks`; they are

- log_config
- checkpoint_config
- evaluation
- lr_config
- optimizer_config
- momentum_config

Among those hooks, only the logger hook has `VERY_LOW` priority; the others' priority is `NORMAL`.
The above-mentioned tutorials already cover how to modify `optimizer_config`, `momentum_config`, and `lr_config`.
Here we reveal what we can do with `log_config`, `checkpoint_config`, and `evaluation`.

#### Checkpoint config

The MMCV runner will use `checkpoint_config` to initialize [`CheckpointHook`](https://github.com/open-mmlab/mmcv/blob/v1.3.7/mmcv/runner/hooks/checkpoint.py#L9).

```python
checkpoint_config = dict(interval=1)
```

The users could set `max_keep_ckpts` to save only a small number of checkpoints, or decide whether to store the state dict of the optimizer by `save_optimizer`. More details of the arguments are [here](https://mmcv.readthedocs.io/en/latest/api.html#mmcv.runner.CheckpointHook).
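
For instance, with `max_keep_ckpts=3` only the three most recent checkpoints survive; the pruning logic can be sketched as follows (a simplified stand-in for what `CheckpointHook` does on disk):

```python
def prune_checkpoints(saved, max_keep_ckpts):
    """Return (kept, removed): keep only the most recent checkpoints,
    as CheckpointHook does when max_keep_ckpts is set."""
    if max_keep_ckpts <= 0:        # non-positive means keep everything
        return saved, []
    return saved[-max_keep_ckpts:], saved[:-max_keep_ckpts]
```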

#### Log config

The `log_config` wraps multiple logger hooks and enables setting intervals. Currently MMCV supports `WandbLoggerHook`, `MlflowLoggerHook`, and `TensorboardLoggerHook`.
The detailed usages can be found in the [docs](https://mmcv.readthedocs.io/en/latest/api.html#mmcv.runner.LoggerHook).

```python
log_config = dict(
    interval=50,
    hooks=[
        dict(type='TextLoggerHook'),
        dict(type='TensorboardLoggerHook')
    ])
```

#### Evaluation config

The config of `evaluation` will be used to initialize the [`EvalHook`](https://github.com/open-mmlab/mmdetection/blob/v2.13.0/mmdet/core/evaluation/eval_hooks.py#L9).
Except for the key `interval`, other arguments such as `metric` will be passed to `dataset.evaluate()`.

```python
evaluation = dict(interval=1, metric='bbox')
```
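
Conceptually, the `interval` key just gates how often evaluation runs; after each training epoch the hook checks whether the (1-based) epoch index is divisible by the interval:

```python
def should_evaluate(epoch, interval):
    """True on epochs where an interval-based EvalHook would run
    (simplified sketch; epoch is 1-based)."""
    return epoch % interval == 0
```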