Migration from MMClassification 0.x
We introduce some modifications in MMClassification 1.x, and some of them are BC-breaking. To migrate your projects from MMClassification 0.x smoothly, please read this tutorial.
New dependencies
MMClassification 1.x depends on some new packages. You can prepare a new clean environment and install again according to the install tutorial, or install the below packages manually.
- MMEngine: MMEngine is the core of the OpenMMLab 2.0 architecture, and we have split many components unrelated to computer vision from MMCV into MMEngine.
- MMCV: The computer vision package of OpenMMLab. This is not a new dependency, but you need to upgrade it to version 2.0.0rc1 or above.
- rich: A terminal formatting package, and we use it to beautify some outputs in the terminal.
Configuration files
In MMClassification 1.x, we refactored the structure of configuration files, and the original config files are no longer usable.
In this section, we will introduce all changes to the configuration files, and we assume you are already familiar with the config files.
Model settings
No changes in the `model.backbone`, `model.neck` and `model.head` fields.

Changes in `model.train_cfg`:
- `BatchMixup` is renamed to `Mixup`.
- `BatchCutMix` is renamed to `CutMix`.
- `BatchResizeMix` is renamed to `ResizeMix`.
- The `prob` argument is removed from all augmentation settings. You can use the `probs` field in `train_cfg` to specify the probability of each augmentation; if there is no `probs` field, one augmentation is chosen randomly with equal probability.
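As a rough sketch of how the `train_cfg` field might change (the augmentation types and hyper-parameters below are illustrative, not taken from any particular config):

```python
# MMClassification 0.x
train_cfg = dict(augments=[
    dict(type='BatchMixup', alpha=0.8, num_classes=1000, prob=0.5),
    dict(type='BatchCutMix', alpha=1.0, num_classes=1000, prob=0.5),
])

# MMClassification 1.x
train_cfg = dict(augments=[
    dict(type='Mixup', alpha=0.8),
    dict(type='CutMix', alpha=1.0),
])
```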
Data settings
Changes in `data`:
- The original `data` field is split into `train_dataloader`, `val_dataloader` and `test_dataloader`. This allows us to configure them in a fine-grained way. For example, you can specify different samplers and batch sizes during training and test.
- The `samples_per_gpu` is renamed to `batch_size`.
- The `workers_per_gpu` is renamed to `num_workers`.
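A minimal sketch of the change, assuming an ImageNet-style config in which `train_pipeline` and `test_pipeline` are defined elsewhere in the file (paths and values are illustrative):

```python
# MMClassification 0.x
data = dict(
    samples_per_gpu=32,
    workers_per_gpu=2,
    train=dict(type='ImageNet', data_prefix='data/imagenet/train', pipeline=train_pipeline),
    val=dict(type='ImageNet', data_prefix='data/imagenet/val',
             ann_file='data/imagenet/meta/val.txt', pipeline=test_pipeline),
)

# MMClassification 1.x
train_dataloader = dict(
    batch_size=32,
    num_workers=2,
    dataset=dict(type='ImageNet', data_root='data/imagenet',
                 data_prefix='train', pipeline=train_pipeline),
    sampler=dict(type='DefaultSampler', shuffle=True),
)
val_dataloader = dict(
    batch_size=32,
    num_workers=2,
    dataset=dict(type='ImageNet', data_root='data/imagenet',
                 data_prefix='val', ann_file='meta/val.txt', pipeline=test_pipeline),
    sampler=dict(type='DefaultSampler', shuffle=False),
)
test_dataloader = val_dataloader
```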
Changes in `pipeline`:
- The original formatting transforms `ToTensor`, `ImageToTensor` and `Collect` are combined as `PackInputs`.
- We don't recommend doing `Normalize` in the dataset pipeline. Please remove it from pipelines and set it in the `data_preprocessor` field instead.
- The argument `flip_prob` in `RandomFlip` is renamed to `prob`.
- The argument `size` in `RandomCrop` is renamed to `crop_size`.
- The argument `size` in `RandomResizedCrop` is renamed to `scale`.
- The argument `size` in `Resize` is renamed to `scale`. And `Resize` won't support sizes like `(256, -1)`, please use `ResizeEdge` to replace it.
- The argument `policies` in `AutoAugment` and `RandAugment` supports using a string to specify preset policies. `AutoAugment` supports "imagenet" and `RandAugment` supports "timm_increasing".
- `RandomResizedCrop` and `CenterCrop` won't support `efficientnet_style`, please use `EfficientNetRandomCrop` and `EfficientNetCenterCrop` to replace them.
We have moved some work of data transforms, such as normalization, to the data preprocessor. See [the documentation](mmpretrain.models.utils.data_preprocessor) for more details.
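A sketch of a typical training pipeline before and after the migration (the normalization values and crop size are illustrative):

```python
# MMClassification 0.x
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='RandomResizedCrop', size=224),
    dict(type='RandomFlip', flip_prob=0.5, direction='horizontal'),
    dict(type='Normalize', **img_norm_cfg),
    dict(type='ImageToTensor', keys=['img']),
    dict(type='ToTensor', keys=['gt_label']),
    dict(type='Collect', keys=['img', 'gt_label']),
]

# MMClassification 1.x
data_preprocessor = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='RandomResizedCrop', scale=224),
    dict(type='RandomFlip', prob=0.5, direction='horizontal'),
    dict(type='PackInputs'),
]
```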
Changes in `evaluation`:
- The `evaluation` field is split into `val_evaluator` and `test_evaluator`, and it no longer supports the `interval` and `save_best` arguments. The `interval` is moved to `train_cfg.val_interval` (see the schedule settings) and the `save_best` is moved to `default_hooks.checkpoint.save_best` (see the runtime settings).
- The 'accuracy' metric is renamed to `Accuracy`.
- The 'precision', 'recall', 'f1-score' and 'support' metrics are combined as `SingleLabelMetric`, and use the `items` argument to specify which metrics to calculate.
- The 'mAP' metric is renamed to `AveragePrecision`.
- The 'CP', 'CR', 'CF1', 'OP', 'OR' and 'OF1' metrics are combined as `MultiLabelMetric`, and use the `items` and `average` arguments to specify which metrics to calculate.
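Rough sketches of single-label and multi-label evaluation settings before and after the migration (the metric choices and intervals are illustrative):

```python
# MMClassification 0.x, single-label
evaluation = dict(interval=1, metric='accuracy', metric_options=dict(topk=(1, 5)))

# MMClassification 1.x, single-label
val_evaluator = dict(type='Accuracy', topk=(1, 5))
test_evaluator = val_evaluator

# MMClassification 0.x, multi-label
evaluation = dict(interval=1, metric=['mAP', 'CP', 'CR', 'CF1', 'OP', 'OR', 'OF1'])

# MMClassification 1.x, multi-label
val_evaluator = [
    dict(type='AveragePrecision'),
    # class-wise averaging corresponds to the old CP / CR / CF1
    dict(type='MultiLabelMetric', items=['precision', 'recall', 'f1-score'], average='macro'),
    # overall averaging corresponds to the old OP / OR / OF1
    dict(type='MultiLabelMetric', items=['precision', 'recall', 'f1-score'], average='micro'),
]
test_evaluator = val_evaluator
```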
Schedule settings
Changes in `optimizer` and `optimizer_config`:
- Now we use the `optim_wrapper` field to specify all configurations about the optimization process, and the `optimizer` is a sub field of `optim_wrapper` now.
- `paramwise_cfg` is also a sub field of `optim_wrapper`, instead of `optimizer`.
- `optimizer_config` is removed now, and all of its configurations are moved to `optim_wrapper`.
- `grad_clip` is renamed to `clip_grad`.
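A sketch of how an optimizer setting might be migrated (the optimizer type and hyper-parameters are illustrative):

```python
# MMClassification 0.x
optimizer = dict(
    type='AdamW',
    lr=0.0015,
    weight_decay=0.3,
    paramwise_cfg=dict(norm_decay_mult=0.0, bias_decay_mult=0.0),
)
optimizer_config = dict(grad_clip=dict(max_norm=1.0))

# MMClassification 1.x
optim_wrapper = dict(
    optimizer=dict(type='AdamW', lr=0.0015, weight_decay=0.3),
    paramwise_cfg=dict(norm_decay_mult=0.0, bias_decay_mult=0.0),
    clip_grad=dict(max_norm=1.0),
)
```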
Changes in `lr_config`:
- The `lr_config` field is removed and we use the new `param_scheduler` to replace it.
- The `warmup` related arguments are removed, since we use a combination of schedulers to implement this functionality.
The new scheduler combination mechanism is very flexible, and you can use it to design many kinds of learning rate / momentum curves. See {external+mmengine:doc}`the tutorial <tutorials/param_scheduler>` for more details.
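A sketch of a cosine annealing schedule with linear warm-up before and after the migration, assuming 100 training epochs in total (durations and factors are illustrative):

```python
# MMClassification 0.x
lr_config = dict(
    policy='CosineAnnealing',
    min_lr=0,
    warmup='linear',
    warmup_iters=5,
    warmup_ratio=0.01,
    warmup_by_epoch=True,
)

# MMClassification 1.x
param_scheduler = [
    # linear warm-up during the first 5 epochs
    dict(type='LinearLR', start_factor=0.01, by_epoch=True, begin=0, end=5, convert_to_iter_based=True),
    # cosine annealing for the remaining 95 epochs
    dict(type='CosineAnnealingLR', T_max=95, eta_min=0, by_epoch=True, begin=5, end=100),
]
```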
Changes in `runner`:
Most configurations in the original `runner` field are moved to `train_cfg`, `val_cfg` and `test_cfg`, which configure the loops of training, validation and test.
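A minimal sketch of the change, assuming epoch-based training of 100 epochs with validation every epoch:

```python
# MMClassification 0.x
runner = dict(type='EpochBasedRunner', max_epochs=100)

# MMClassification 1.x
train_cfg = dict(by_epoch=True, max_epochs=100, val_interval=1)
val_cfg = dict()
test_cfg = dict()
```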
In fact, in OpenMMLab 2.0, we introduced `Loop` to control the behaviors in training, validation and test. And the functionalities of `Runner` are also changed. You can find more details in {external+mmengine:doc}`the MMEngine tutorials <design/runner>`.
Runtime settings
Changes in `checkpoint_config` and `log_config`:
The `checkpoint_config` is moved to `default_hooks.checkpoint` and the `log_config` is moved to `default_hooks.logger`. And we move many hooks settings from the script code to the `default_hooks` field in the runtime configuration.
```python
default_hooks = dict(
    # record the time of every iteration.
    timer=dict(type='IterTimerHook'),
    # print log every 100 iterations.
    logger=dict(type='LoggerHook', interval=100),
    # enable the parameter scheduler.
    param_scheduler=dict(type='ParamSchedulerHook'),
    # save checkpoint per epoch, and automatically save the best checkpoint.
    checkpoint=dict(type='CheckpointHook', interval=1, save_best='auto'),
    # set sampler seed in distributed environment.
    sampler_seed=dict(type='DistSamplerSeedHook'),
    # validation results visualization, set True to enable it.
    visualization=dict(type='VisualizationHook', enable=False),
)
```
In addition, we split the original logger into logger and visualizer. The logger is used to record information and the visualizer is used to show the logs in different backends, like the terminal, TensorBoard and WandB.
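A sketch of how a logging setting might be migrated, assuming the old config logged to the terminal and TensorBoard (the interval is illustrative):

```python
# MMClassification 0.x
log_config = dict(
    interval=100,
    hooks=[
        dict(type='TextLoggerHook'),
        dict(type='TensorboardLoggerHook'),
    ],
)

# MMClassification 1.x
default_hooks = dict(
    logger=dict(type='LoggerHook', interval=100),
)
visualizer = dict(
    type='UniversalVisualizer',
    vis_backends=[
        dict(type='LocalVisBackend'),
        dict(type='TensorboardVisBackend'),
    ],
)
```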
Changes in `load_from` and `resume_from`:
- The `resume_from` is removed. And we use `resume` and `load_from` to replace it.
  - If `resume=True` and `load_from` is not None, resume training from the checkpoint in `load_from`.
  - If `resume=True` and `load_from` is None, try to resume from the latest checkpoint in the work directory.
  - If `resume=False` and `load_from` is not None, only load the checkpoint, not resume training.
  - If `resume=False` and `load_from` is None, do not load nor resume.
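For example, to resume an interrupted run from the latest checkpoint in the work directory, a minimal setting would be:

```python
# MMClassification 1.x
load_from = None
resume = True  # resume from the latest checkpoint in the work directory, if any
```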
Changes in `dist_params`: The `dist_params` field is a sub field of `env_cfg` now. And there are some new configurations in the `env_cfg`.
```python
env_cfg = dict(
    # whether to enable cudnn benchmark
    cudnn_benchmark=False,
    # set multi-process parameters
    mp_cfg=dict(mp_start_method='fork', opencv_num_threads=0),
    # set distributed parameters
    dist_cfg=dict(backend='nccl'),
)
```
Changes in `workflow`: `workflow` related functionalities are removed.
New field `visualizer`: The visualizer is a new design in the OpenMMLab 2.0 architecture. We use a visualizer instance in the runner to handle results & log visualization and save them to different backends. See the {external+mmengine:doc}`MMEngine tutorial <advanced_tutorials/visualization>` for more details.
```python
visualizer = dict(
    type='UniversalVisualizer',
    vis_backends=[
        dict(type='LocalVisBackend'),
        # Uncomment the below line to save the log and visualization results to TensorBoard.
        # dict(type='TensorboardVisBackend')
    ]
)
```
New field `default_scope`: The start point to search modules for all registries. The `default_scope` in MMClassification is `mmpretrain`. See {external+mmengine:doc}`the registry tutorial <advanced_tutorials/registry>` for more details.
Packages
mmpretrain.apis
The documentation can be found here.
Function | Changes |
---|---|
`init_model` | No changes |
`inference_model` | No changes. But we recommend using `mmpretrain.ImageClassificationInferencer` instead. |
`train_model` | Removed, use `runner.train` to train. |
`multi_gpu_test` | Removed, use `runner.test` to test. |
`single_gpu_test` | Removed, use `runner.test` to test. |
`show_result_pyplot` | Removed, use `mmpretrain.ImageClassificationInferencer` to perform inference and show the result. |
`set_random_seed` | Removed, use `mmengine.runner.set_random_seed`. |
`init_random_seed` | Removed, use `mmengine.dist.sync_random_seed`. |
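A minimal sketch of using the inferencer that replaces the removed test / visualization helpers (the model name and image path are placeholders):

```python
from mmpretrain import ImageClassificationInferencer

# 'resnet50_8xb32_in1k' is a placeholder metafile name; a local config file and
# checkpoint path can also be passed instead.
inferencer = ImageClassificationInferencer('resnet50_8xb32_in1k')

# The call returns a list of result dicts, one per input image.
results = inferencer('path/to/image.jpg')
print(results[0]['pred_class'], results[0]['pred_score'])
```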
mmpretrain.core
The `mmpretrain.core` package is renamed to `mmpretrain.engine`.
Sub package | Changes |
---|---|
`evaluation` | Removed, use the metrics in `mmpretrain.evaluation`. |
`hook` | Moved to `mmpretrain.engine.hooks` |
`optimizers` | Moved to `mmpretrain.engine.optimizers` |
`utils` | Removed, the distributed environment related functions can be found in the `mmengine.dist` package. |
`visualization` | Removed, the related functionalities are implemented in `mmengine.visualization.Visualizer`. |
The `MMClsWandbHook` in the `hooks` package is waiting for implementation.
The `CosineAnnealingCooldownLrUpdaterHook` in the `hooks` package is removed, and we support this functionality by the combination of parameter schedulers, see the tutorial.
mmpretrain.datasets
The documentation can be found here.
Dataset class | Changes |
---|---|
`CustomDataset` | Add `data_root` argument as the common prefix of `data_prefix` and `ann_file`. |
`ImageNet` | Same as `CustomDataset`. |
`ImageNet21k` | Same as `CustomDataset`. |
`CIFAR10` & `CIFAR100` | The `test_mode` argument is a required argument now. |
`MNIST` & `FashionMNIST` | The `test_mode` argument is a required argument now. |
`VOC` | Requires `data_root`, `image_set_path` and `test_mode` now. |
`CUB` | Requires `data_root` and `test_mode` now. |
The `mmpretrain.datasets.pipelines` is renamed to `mmpretrain.datasets.transforms`.
Transform class | Changes |
---|---|
`LoadImageFromFile` | Removed, use `mmcv.transforms.LoadImageFromFile`. |
`RandomFlip` | Removed, use `mmcv.transforms.RandomFlip`. The argument `flip_prob` is renamed to `prob`. |
`RandomCrop` | The argument `size` is renamed to `crop_size`. |
`RandomResizedCrop` | The argument `size` is renamed to `scale`. The argument `scale` is renamed to `crop_ratio_range`. Won't support `efficientnet_style`, use `EfficientNetRandomCrop`. |
`CenterCrop` | Removed, use `mmcv.transforms.CenterCrop`. Won't support `efficientnet_style`, use `EfficientNetCenterCrop`. |
`Resize` | Removed, use `mmcv.transforms.Resize`. The argument `size` is renamed to `scale`. Won't support sizes like `(256, -1)`, use `ResizeEdge`. |
`AutoAugment` & `RandAugment` | The argument `policies` supports using a string to specify preset policies. |
`Compose` | Removed, use `mmcv.transforms.Compose`. |
mmpretrain.models
The documentation can be found here. The interface of all backbones, necks and losses didn't change.
Changes in `ImageClassifier`:
Method of classifiers | Changes |
---|---|
`extract_feat` | No changes |
`forward` | Now only accepts three arguments: `inputs`, `data_samples` and `mode`. See the documentation for more details. |
`forward_train` | Replaced by `loss`. |
`simple_test` | Replaced by `predict`. |
`train_step` | The `optimizer` argument is replaced by `optim_wrapper` and it accepts `OptimWrapper`. |
`val_step` | The original `val_step` is the same as `train_step`, now it calls `predict`. |
`test_step` | New method, and it's the same as `val_step`. |
Changes in heads:
Method of heads | Changes |
---|---|
`pre_logits` | No changes |
`forward_train` | Replaced by `loss`. |
`simple_test` | Replaced by `predict`. |
`loss` | It accepts `data_samples` instead of `gt_labels` to calculate loss. The `data_samples` is a list of ClsDataSample. |
`forward` | New method, and it returns the output of the classification head without any post-processing like softmax or sigmoid. |
mmpretrain.utils
Function | Changes |
---|---|
`collect_env` | No changes |
`get_root_logger` | Removed, use `mmengine.logging.MMLogger.get_current_instance`. |
`load_json_log` | The output format changed. |
`setup_multi_processes` | Removed, use `mmengine.utils.dl_utils.set_multi_processing`. |
`wrap_non_distributed_model` | Removed, we auto wrap the model in the runner. |
`wrap_distributed_model` | Removed, we auto wrap the model in the runner. |
`auto_select_device` | Removed, we auto select the device in the runner. |
Other changes
- We moved the definition of all registries in different packages to the `mmpretrain.registry` package.