# Customize Runtime Settings
## Customize hooks
### Step 1: Implement a new hook
MMEngine provides commonly used [hooks](https://github.com/open-mmlab/mmengine/blob/main/docs/en/tutorials/hook.md) for training and testing.
When these do not meet a project's needs, users can implement their own by following the example below.
For example, if a hyper-parameter of the model needs to change during training, we can implement a new hook for it:

```python
# Copyright (c) OpenMMLab. All rights reserved.
from typing import Optional, Sequence

from mmengine.hooks import Hook
from mmengine.model import is_model_wrapper

from mmseg.registry import HOOKS


@HOOKS.register_module()
class NewHook(Hook):
    """Docstring for NewHook."""

    def __init__(self, a: int, b: int) -> None:
        self.a = a
        self.b = b

    def before_train_iter(self,
                          runner,
                          batch_idx: int,
                          data_batch: Optional[Sequence[dict]] = None) -> None:
        cur_iter = runner.iter
        model = runner.model
        # unwrap the model when it is in a wrapper, e.g. DistributedDataParallel
        if is_model_wrapper(model):
            model = model.module
        model.hyper_parameter = self.a * cur_iter + self.b
```
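Besides `before_train_iter`, the base `Hook` class exposes other hook points such as `before_run`, `after_train_iter` and `after_train_epoch`; override whichever matches the stage where the custom logic should run.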
### Step 2: Import a new hook
The module defined above needs to be imported into the main namespace first so that it can be registered.
Assuming `NewHook` is implemented in `mmseg/engine/hooks/new_hook.py`, there are two ways to import it:

- Import it by modifying `mmseg/engine/hooks/__init__.py`.
  New modules should be imported in `mmseg/engine/hooks/__init__.py` so that they can be found and added by the registry.

  ```python
  from .new_hook import NewHook
  __all__ = [..., 'NewHook']
  ```
- Import it manually by `custom_imports` in the config file.

  ```python
  custom_imports = dict(imports=['mmseg.engine.hooks.new_hook'], allow_failed_imports=False)
  ```
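The `custom_imports` approach is handy when mmsegmentation is used as a third-party dependency, since it registers the new hook without modifying the installed package.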
### Step 3: Modify config file
Users can set and use customized hooks in training and testing by following the methods below.
The execution priority of hooks registered at the same site in `Runner` is described [here](https://github.com/open-mmlab/mmengine/blob/main/docs/en/tutorials/hook.md#built-in-hooks).
The default priority of a customized hook is `NORMAL`.

```python
custom_hooks = [
    dict(type='NewHook', a=a_value, b=b_value, priority='ABOVE_NORMAL')
]
```
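MMEngine defines named priority levels such as `HIGHEST`, `VERY_HIGH`, `HIGH`, `ABOVE_NORMAL`, `NORMAL`, `BELOW_NORMAL`, `LOW`, `VERY_LOW` and `LOWEST`; a hook with higher priority is executed earlier at each hook point.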
## Customize optimizer
### Step 1: Implement a new optimizer
We recommend implementing the customized optimizer in `mmseg/engine/optimizers/my_optimizer.py`. Here is an example of a new optimizer `MyOptimizer`, which has arguments `a`, `b` and `c`:

```python
from mmseg.registry import OPTIMIZERS
from torch.optim import Optimizer


@OPTIMIZERS.register_module()
class MyOptimizer(Optimizer):

    def __init__(self, a, b, c):
        ...
```
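The skeleton above only shows the registration pattern. For reference, below is a minimal runnable sketch of what a complete optimizer might look like, implementing a plain SGD-style update; `MySGD` and its extra scaling factor `b` are hypothetical names used only for illustration:

```python
import torch
from torch.optim import Optimizer

from mmseg.registry import OPTIMIZERS


@OPTIMIZERS.register_module()
class MySGD(Optimizer):
    """Plain SGD-style optimizer with a hypothetical scaling factor ``b``."""

    def __init__(self, params, lr=0.01, b=1.0):
        defaults = dict(lr=lr, b=b)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group['params']:
                if p.grad is None:
                    continue
                # scale the gradient by ``b`` and take a plain SGD step
                p.add_(p.grad, alpha=-group['lr'] * group['b'])
        return loss
```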
### Step 2: Import a new optimizer
The module defined above needs to be imported into the main namespace first so that it can be registered.
Assuming `MyOptimizer` is implemented in `mmseg/engine/optimizers/my_optimizer.py`, there are two ways to import it:

- Import it by modifying `mmseg/engine/optimizers/__init__.py`.
  New modules should be imported in `mmseg/engine/optimizers/__init__.py` so that they can be found and added by the registry.

  ```python
  from .my_optimizer import MyOptimizer
  ```
- Import it manually by `custom_imports` in the config file.

  ```python
  custom_imports = dict(imports=['mmseg.engine.optimizers.my_optimizer'], allow_failed_imports=False)
  ```
### Step 3: Modify config file
Then modify the `optimizer` field of `optim_wrapper` in the config file. To use the customized `MyOptimizer`, it can be modified as:

```python
optim_wrapper = dict(type='OptimWrapper',
optimizer=dict(type='MyOptimizer',
a=a_value, b=b_value, c=c_value),
clip_grad=None)
```
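`clip_grad=None` disables gradient clipping; it can instead be set to a dict such as `clip_grad=dict(max_norm=35, norm_type=2)`, whose options are forwarded to `torch.nn.utils.clip_grad_norm_`.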
## Customize optimizer constructor
### Step 1: Implement a new optimizer constructor
The optimizer constructor is used to create the optimizer and optimizer wrapper for model training. It supports advanced behaviors such as specifying different learning rates and weight decay for different model layers.
Here is an example of a customized optimizer constructor.

```python
from mmengine.optim import DefaultOptimWrapperConstructor
from mmseg.registry import OPTIM_WRAPPER_CONSTRUCTORS


@OPTIM_WRAPPER_CONSTRUCTORS.register_module()
class LearningRateDecayOptimizerConstructor(DefaultOptimWrapperConstructor):

    def __init__(self, optim_wrapper_cfg, paramwise_cfg=None):
        ...

    def __call__(self, model):
        ...
        return my_optim_wrapper
```
The default optimizer constructor is implemented [here](https://github.com/open-mmlab/mmengine/blob/main/mmengine/optim/optimizer/default_constructor.py#L19).
It can also be used as the base class for new optimizer constructors.
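As an illustration, here is a minimal sketch of a constructor that assigns the backbone a smaller learning rate than the rest of the model. It assumes the model has a `backbone` submodule and that a non-empty `paramwise_cfg` is given (otherwise `DefaultOptimWrapperConstructor` does not call `add_params`); `MyOptimizerConstructor` and the `0.1` factor are hypothetical:

```python
from mmengine.optim import DefaultOptimWrapperConstructor
from mmseg.registry import OPTIM_WRAPPER_CONSTRUCTORS


@OPTIM_WRAPPER_CONSTRUCTORS.register_module()
class MyOptimizerConstructor(DefaultOptimWrapperConstructor):

    def add_params(self, params, module, **kwargs):
        # collect parameters into two groups: backbone and the rest
        backbone_params = []
        other_params = []
        for name, param in module.named_parameters():
            if not param.requires_grad:
                continue
            if name.startswith('backbone'):
                backbone_params.append(param)
            else:
                other_params.append(param)
        # the backbone group uses a tenth of the base learning rate
        params.append(dict(params=backbone_params, lr=self.base_lr * 0.1))
        params.append(dict(params=other_params))
```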
### Step 2: Import a new optimizer constructor
The module defined above needs to be imported into the main namespace first so that it can be registered.
Assuming `MyOptimizerConstructor` is implemented in `mmseg/engine/optimizers/my_optimizer_constructor.py`, there are two ways to import it:

- Import it by modifying `mmseg/engine/optimizers/__init__.py`.
  New modules should be imported in `mmseg/engine/optimizers/__init__.py` so that they can be found and added by the registry.

  ```python
  from .my_optimizer_constructor import MyOptimizerConstructor
  ```
- Import it manually by `custom_imports` in the config file.

  ```python
  custom_imports = dict(imports=['mmseg.engine.optimizers.my_optimizer_constructor'], allow_failed_imports=False)
  ```
### Step 3: Modify config file
Then modify the `constructor` field of `optim_wrapper` in the config file. To use the customized `MyOptimizerConstructor`, it can be modified as:

```python
optim_wrapper = dict(type='OptimWrapper',
constructor='MyOptimizerConstructor',
clip_grad=None)
```
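A `paramwise_cfg` dict can also be passed in `optim_wrapper` alongside `constructor`; its format is defined by the constructor that consumes it. For example, with the default constructor, `custom_keys` with an `lr_mult` can scale the backbone learning rate (the values below are illustrative):

```python
optim_wrapper = dict(
    type='OptimWrapper',
    optimizer=dict(type='SGD', lr=0.01, momentum=0.9, weight_decay=0.0005),
    paramwise_cfg=dict(custom_keys={'backbone': dict(lr_mult=0.1)}))
```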