# MIM: MIM Installs OpenMMLab Packages

MIM provides a unified interface for launching and installing OpenMMLab projects and their extensions, and for managing the OpenMMLab model zoo.

## Major Features

- **Package Management**

  You can use MIM to manage OpenMMLab codebases, e.g., install or uninstall them conveniently.

- **Model Management**

  You can use MIM to manage the OpenMMLab model zoo, e.g., download checkpoints by name or search for checkpoints that meet specific criteria.

- **Unified Entrypoint for Scripts**

  You can execute any script provided by all OpenMMLab codebases with unified commands. Training, testing, and inference become easier than ever. Besides, you can use the `gridsearch` command for vanilla hyper-parameter search.

## Changelog

v0.1.1 was released on 13/6/2021.

## Customization

You can use `.mimrc` for customization. Currently, we support customizing the default values of each sub-command. Please refer to [customization.md](docs/en/customization.md) for details.

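As a purely hypothetical illustration of the idea (the actual section and option names are documented in [customization.md](docs/en/customization.md)), a `.mimrc` could pin a default value for one sub-command per INI section:

```ini
; Hypothetical example only; consult docs/en/customization.md for the real keys.
; Each section corresponds to a sub-command, and each entry overrides one of
; that sub-command's default option values.
[install]
; assumed option name, for illustration
timeout = 60
```
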
## Build custom projects with MIM

We provide some examples of how to build custom projects based on OpenMMLab codebases and MIM in [MIM-Example](https://github.com/open-mmlab/mim-example). Without worrying about copying code and scripts from existing codebases, users can focus on developing new components while MIM helps integrate and run the new project.

## Installation

Please refer to [installation.md](docs/en/installation.md) for installation instructions.

## Command

<details>
<summary>1. install</summary>

- command

```bash
# install the latest version of mmcv-full
> mim install mmcv-full  # wheel
# install version 1.5.0
> mim install mmcv-full==1.5.0
# install the latest version of mmcls
> mim install mmcls
# install the master branch
> mim install git+https://github.com/open-mmlab/mmclassification.git
# install a local repo
> git clone https://github.com/open-mmlab/mmclassification.git
> cd mmclassification
> mim install .
# install an extension based on OpenMMLab
> mim install git+https://github.com/xxx/mmcls-project.git
```

- api

```python
from mim import install

# install mmcv-full
install('mmcv-full')

# installing mmcls will automatically install mmcv if it is not installed
install('mmcls')

# install an extension based on OpenMMLab
install('git+https://github.com/xxx/mmcls-project.git')
```

</details>

<details>
<summary>2. uninstall</summary>

- command

```bash
# uninstall mmcv
> mim uninstall mmcv-full
# uninstall mmcls
> mim uninstall mmcls
```

- api

```python
from mim import uninstall

# uninstall mmcv
uninstall('mmcv-full')

# uninstall mmcls
uninstall('mmcls')
```

</details>

<details>
<summary>3. list</summary>

- command

```bash
> mim list
> mim list --all
```

- api

```python
from mim import list_package

list_package()
list_package(True)
```

</details>

<details>
<summary>4. search</summary>

- command

```bash
> mim search mmcls
> mim search mmcls==0.23.0 --remote
> mim search mmcls --config resnet18_8xb16_cifar10
> mim search mmcls --model resnet
> mim search mmcls --dataset cifar-10
> mim search mmcls --valid-field
> mim search mmcls --condition 'batch_size>45,epochs>100'
> mim search mmcls --condition 'batch_size>45 epochs>100'
> mim search mmcls --condition '128<batch_size<=256'
> mim search mmcls --sort batch_size epochs
> mim search mmcls --field epochs batch_size weight
> mim search mmcls --exclude-field weight paper
```

- api

```python
from mim import get_model_info

get_model_info('mmcls')
get_model_info('mmcls==0.23.0', local=False)
get_model_info('mmcls', models=['resnet'])
get_model_info('mmcls', training_datasets=['cifar-10'])
get_model_info('mmcls', filter_conditions='batch_size>45,epochs>100')
get_model_info('mmcls', filter_conditions='batch_size>45 epochs>100')
get_model_info('mmcls', filter_conditions='128<batch_size<=256')
get_model_info('mmcls', sorted_fields=['batch_size', 'epochs'])
get_model_info('mmcls', shown_fields=['epochs', 'batch_size', 'weight'])
```

</details>

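To make the `--condition` syntax above concrete, here is a minimal sketch (not mim's actual implementation; the function name and metadata keys are illustrative) of how comma- or space-separated clauses with chained comparisons could be evaluated against a checkpoint's metadata:

```python
import re

def matches(metadata, condition):
    """Return True if `metadata` (a dict of numeric fields) satisfies every
    clause in `condition`. Comma- or space-separated clauses are ANDed,
    mirroring the `--condition` examples above. Illustrative sketch only."""
    for clause in re.split(r'[,\s]+', condition.strip()):
        # Split into alternating operands and comparison operators so that
        # chained comparisons such as 128<batch_size<=256 also work.
        parts = re.split(r'(<=|>=|==|<|>)', clause)
        ok = True
        for i in range(0, len(parts) - 2, 2):
            lhs, op, rhs = parts[i], parts[i + 1], parts[i + 2]
            # numeric literals stay numbers; anything else is a metadata field
            lv = float(lhs) if lhs.replace('.', '', 1).isdigit() else metadata[lhs]
            rv = float(rhs) if rhs.replace('.', '', 1).isdigit() else metadata[rhs]
            ok &= {'<': lv < rv, '<=': lv <= rv, '>': lv > rv,
                   '>=': lv >= rv, '==': lv == rv}[op]
        if not ok:
            return False
    return True
```
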
<details>
<summary>5. download</summary>

- command

```bash
> mim download mmcls --config resnet18_8xb16_cifar10
> mim download mmcls --config resnet18_8xb16_cifar10 --dest .
```

- api

```python
from mim import download

download('mmcls', ['resnet18_8xb16_cifar10'])
download('mmcls', ['resnet18_8xb16_cifar10'], dest_root='.')
```

</details>

<details>
<summary>6. train</summary>

- command

```bash
# Train models on a single server with CPU by setting `gpus` to 0 and
# 'launcher' to 'none' (if applicable). The training script of the
# corresponding codebase will fail if it doesn't support CPU training.
> mim train mmcls resnet101_b16x8_cifar10.py --work-dir tmp --gpus 0
# Train models on a single server with one GPU
> mim train mmcls resnet101_b16x8_cifar10.py --work-dir tmp --gpus 1
# Train models on a single server with 4 GPUs and pytorch distributed
> mim train mmcls resnet101_b16x8_cifar10.py --work-dir tmp --gpus 4 \
    --launcher pytorch
# Train models on a slurm HPC with one 8-GPU node
> mim train mmcls resnet101_b16x8_cifar10.py --launcher slurm --gpus 8 \
    --gpus-per-node 8 --partition partition_name --work-dir tmp
# Print help messages of sub-command train
> mim train -h
# Print help messages of sub-command train and the training script of mmcls
> mim train mmcls -h
```

- api

```python
from mim import train

train(repo='mmcls', config='resnet18_8xb16_cifar10.py', gpus=0,
      other_args=('--work-dir', 'tmp'))
train(repo='mmcls', config='resnet18_8xb16_cifar10.py', gpus=1,
      other_args=('--work-dir', 'tmp'))
train(repo='mmcls', config='resnet18_8xb16_cifar10.py', gpus=4,
      launcher='pytorch', other_args=('--work-dir', 'tmp'))
train(repo='mmcls', config='resnet18_8xb16_cifar10.py', gpus=8,
      launcher='slurm', gpus_per_node=8, partition='partition_name',
      other_args=('--work-dir', 'tmp'))
```

</details>

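As a rough mental model of what the `train` calls above do (mim's real launch logic lives in its source and differs; the script path and launcher flags below are assumptions for illustration), the arguments map onto a command line roughly like this:

```python
# Illustrative sketch only: shows how `train`-style arguments could be
# assembled into a command line. Not mim's actual implementation.
def build_train_cmd(repo, config, gpus, launcher='none', other_args=()):
    script = f'{repo}/tools/train.py'  # assumed location of the training script
    if launcher == 'pytorch':
        # one worker process per GPU via an (assumed) distributed launcher
        cmd = ['python', '-m', 'torch.distributed.launch',
               f'--nproc_per_node={gpus}', script]
    else:
        cmd = ['python', script]
    return cmd + [config] + list(other_args)
```
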
<details>
<summary>7. test</summary>

- command

```bash
# Test models on a single server with 1 GPU, report accuracy
> mim test mmcls resnet101_b16x8_cifar10.py --checkpoint \
    tmp/epoch_3.pth --gpus 1 --metrics accuracy
# Test models on a single server with 1 GPU, save predictions
> mim test mmcls resnet101_b16x8_cifar10.py --checkpoint \
    tmp/epoch_3.pth --gpus 1 --out tmp.pkl
# Test models on a single server with 4 GPUs, pytorch distributed,
# report accuracy
> mim test mmcls resnet101_b16x8_cifar10.py --checkpoint \
    tmp/epoch_3.pth --gpus 4 --launcher pytorch --metrics accuracy
# Test models on a slurm HPC with one 8-GPU node, report accuracy
> mim test mmcls resnet101_b16x8_cifar10.py --checkpoint \
    tmp/epoch_3.pth --gpus 8 --metrics accuracy --partition \
    partition_name --gpus-per-node 8 --launcher slurm
# Print help messages of sub-command test
> mim test -h
# Print help messages of sub-command test and the testing script of mmcls
> mim test mmcls -h
```

- api

```python
from mim import test

test(repo='mmcls', config='resnet101_b16x8_cifar10.py',
     checkpoint='tmp/epoch_3.pth', gpus=1, other_args=('--metrics', 'accuracy'))
test(repo='mmcls', config='resnet101_b16x8_cifar10.py',
     checkpoint='tmp/epoch_3.pth', gpus=1, other_args=('--out', 'tmp.pkl'))
test(repo='mmcls', config='resnet101_b16x8_cifar10.py',
     checkpoint='tmp/epoch_3.pth', gpus=4, launcher='pytorch',
     other_args=('--metrics', 'accuracy'))
test(repo='mmcls', config='resnet101_b16x8_cifar10.py',
     checkpoint='tmp/epoch_3.pth', gpus=8, partition='partition_name',
     launcher='slurm', gpus_per_node=8, other_args=('--metrics', 'accuracy'))
```

</details>

<details>
<summary>8. run</summary>

- command

```bash
# Get the Flops of a model
> mim run mmcls get_flops resnet101_b16x8_cifar10.py
# Publish a model
> mim run mmcls publish_model input.pth output.pth
# Train models on a slurm HPC with one GPU
> srun -p partition --gres=gpu:1 mim run mmcls train \
    resnet101_b16x8_cifar10.py --work-dir tmp
# Test models on a slurm HPC with one GPU, report accuracy
> srun -p partition --gres=gpu:1 mim run mmcls test \
    resnet101_b16x8_cifar10.py tmp/epoch_3.pth --metrics accuracy
# Print help messages of sub-command run
> mim run -h
# Print help messages of sub-command run, list all available scripts in
# codebase mmcls
> mim run mmcls -h
# Print help messages of sub-command run, print the help message of
# training script in mmcls
> mim run mmcls train -h
```

- api

```python
from mim import run

run(repo='mmcls', command='get_flops',
    other_args=('resnet101_b16x8_cifar10.py',))
run(repo='mmcls', command='publish_model',
    other_args=('input.pth', 'output.pth'))
run(repo='mmcls', command='train',
    other_args=('resnet101_b16x8_cifar10.py', '--work-dir', 'tmp'))
run(repo='mmcls', command='test',
    other_args=('resnet101_b16x8_cifar10.py', 'tmp/epoch_3.pth',
                '--metrics', 'accuracy'))
```

</details>

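The core idea behind `run`, resolving a command name to a script shipped with the installed codebase and then executing it, can be sketched as follows; the directory layout and function name are assumptions, not mim's actual code:

```python
# Illustrative sketch of the unified-entrypoint idea behind `mim run`:
# map a command name such as 'get_flops' to a script file somewhere under
# the codebase's tools directory. Layout and names are assumptions.
from pathlib import Path

def resolve_script(tools_dir, command):
    """Return the first script named `<command>.py` found under tools_dir."""
    hits = sorted(Path(tools_dir).rglob(f'{command}.py'))
    if not hits:
        raise FileNotFoundError(f'no script named {command}.py under {tools_dir}')
    return hits[0]
```
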
<details>
<summary>9. gridsearch</summary>

- command

```bash
# Parameter search on a single server with CPU by setting `gpus` to 0 and
# 'launcher' to 'none' (if applicable). The training script of the
# corresponding codebase will fail if it doesn't support CPU training.
> mim gridsearch mmcls resnet101_b16x8_cifar10.py --work-dir tmp --gpus 0 \
    --search-args '--optimizer.lr 1e-2 1e-3'
# Parameter search on a single server with one GPU, search learning
# rate
> mim gridsearch mmcls resnet101_b16x8_cifar10.py --work-dir tmp --gpus 1 \
    --search-args '--optimizer.lr 1e-2 1e-3'
# Parameter search on a single server with one GPU, search
# weight_decay
> mim gridsearch mmcls resnet101_b16x8_cifar10.py --work-dir tmp --gpus 1 \
    --search-args '--optimizer.weight_decay 1e-3 1e-4'
# Parameter search on a single server with one GPU, search learning
# rate and weight_decay
> mim gridsearch mmcls resnet101_b16x8_cifar10.py --work-dir tmp --gpus 1 \
    --search-args '--optimizer.lr 1e-2 1e-3 --optimizer.weight_decay 1e-3 \
    1e-4'
# Parameter search on a slurm HPC with one 8-GPU node, search learning
# rate and weight_decay
> mim gridsearch mmcls resnet101_b16x8_cifar10.py --work-dir tmp --gpus 8 \
    --partition partition_name --gpus-per-node 8 --launcher slurm \
    --search-args '--optimizer.lr 1e-2 1e-3 --optimizer.weight_decay 1e-3 \
    1e-4'
# Parameter search on a slurm HPC with one 8-GPU node, search learning
# rate and weight_decay, max parallel jobs is 2
> mim gridsearch mmcls resnet101_b16x8_cifar10.py --work-dir tmp --gpus 8 \
    --partition partition_name --gpus-per-node 8 --launcher slurm \
    --max-jobs 2 --search-args '--optimizer.lr 1e-2 1e-3 \
    --optimizer.weight_decay 1e-3 1e-4'
# Print the help message of sub-command gridsearch
> mim gridsearch -h
# Print the help message of sub-command gridsearch and the help message of
# the training script of codebase mmcls
> mim gridsearch mmcls -h
```

- api

```python
from mim import gridsearch

gridsearch(repo='mmcls', config='resnet101_b16x8_cifar10.py', gpus=0,
           search_args='--optimizer.lr 1e-2 1e-3',
           other_args=('--work-dir', 'tmp'))
gridsearch(repo='mmcls', config='resnet101_b16x8_cifar10.py', gpus=1,
           search_args='--optimizer.lr 1e-2 1e-3',
           other_args=('--work-dir', 'tmp'))
gridsearch(repo='mmcls', config='resnet101_b16x8_cifar10.py', gpus=1,
           search_args='--optimizer.weight_decay 1e-3 1e-4',
           other_args=('--work-dir', 'tmp'))
gridsearch(repo='mmcls', config='resnet101_b16x8_cifar10.py', gpus=1,
           search_args='--optimizer.lr 1e-2 1e-3 --optimizer.weight_decay'
                       ' 1e-3 1e-4',
           other_args=('--work-dir', 'tmp'))
gridsearch(repo='mmcls', config='resnet101_b16x8_cifar10.py', gpus=8,
           partition='partition_name', gpus_per_node=8, launcher='slurm',
           search_args='--optimizer.lr 1e-2 1e-3 --optimizer.weight_decay'
                       ' 1e-3 1e-4',
           other_args=('--work-dir', 'tmp'))
gridsearch(repo='mmcls', config='resnet101_b16x8_cifar10.py', gpus=8,
           partition='partition_name', gpus_per_node=8, launcher='slurm',
           max_workers=2,
           search_args='--optimizer.lr 1e-2 1e-3 --optimizer.weight_decay'
                       ' 1e-3 1e-4',
           other_args=('--work-dir', 'tmp'))
```

</details>

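Conceptually, the vanilla search that `gridsearch` performs expands `--search-args` into the Cartesian product of all value combinations and launches one training job per combination. A minimal sketch of that expansion (not mim's actual parser; the helper name is ours):

```python
from itertools import product

def expand_search_args(search_args):
    """Turn e.g. '--optimizer.lr 1e-2 1e-3 --optimizer.weight_decay 1e-3 1e-4'
    into one dict per configuration to train. Illustrative only; does not
    handle values that themselves start with '--'."""
    grid = {}
    key = None
    for tok in search_args.split():
        if tok.startswith('--'):
            key = tok[2:]      # e.g. 'optimizer.lr'
            grid[key] = []
        else:
            grid[key].append(tok)
    keys = list(grid)
    # Cartesian product: every combination of one value per searched option.
    return [dict(zip(keys, combo)) for combo in product(*grid.values())]
```
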
## Contributing

We appreciate all contributions to improve MIM. Please refer to [CONTRIBUTING.md](https://github.com/open-mmlab/mmcv/blob/master/CONTRIBUTING.md) for the contributing guidelines.

## License

This project is released under the [Apache 2.0 license](LICENSE).

## Projects in OpenMMLab

- [MMEngine](https://github.com/open-mmlab/mmengine): OpenMMLab foundational library for training deep learning models.
- [MMCV](https://github.com/open-mmlab/mmcv): OpenMMLab foundational library for computer vision.
- [MMEval](https://github.com/open-mmlab/mmeval): A unified evaluation library for multiple machine learning libraries.
- [MMPreTrain](https://github.com/open-mmlab/mmpretrain): OpenMMLab pre-training toolbox and benchmark.
- [MMagic](https://github.com/open-mmlab/mmagic): Open**MM**Lab **A**dvanced, **G**enerative and **I**ntelligent **C**reation toolbox.
- [MMDetection](https://github.com/open-mmlab/mmdetection): OpenMMLab detection toolbox and benchmark.
- [MMYOLO](https://github.com/open-mmlab/mmyolo): OpenMMLab YOLO series toolbox and benchmark.
- [MMDetection3D](https://github.com/open-mmlab/mmdetection3d): OpenMMLab's next-generation platform for general 3D object detection.
- [MMRotate](https://github.com/open-mmlab/mmrotate): OpenMMLab rotated object detection toolbox and benchmark.
- [MMTracking](https://github.com/open-mmlab/mmtracking): OpenMMLab video perception toolbox and benchmark.
- [MMPose](https://github.com/open-mmlab/mmpose): OpenMMLab pose estimation toolbox and benchmark.
- [MMSegmentation](https://github.com/open-mmlab/mmsegmentation): OpenMMLab semantic segmentation toolbox and benchmark.
- [MMOCR](https://github.com/open-mmlab/mmocr): OpenMMLab text detection, recognition, and understanding toolbox.
- [MMHuman3D](https://github.com/open-mmlab/mmhuman3d): OpenMMLab 3D human parametric model toolbox and benchmark.
- [MMSelfSup](https://github.com/open-mmlab/mmselfsup): OpenMMLab self-supervised learning toolbox and benchmark.
- [MMFewShot](https://github.com/open-mmlab/mmfewshot): OpenMMLab fewshot learning toolbox and benchmark.
- [MMAction2](https://github.com/open-mmlab/mmaction2): OpenMMLab's next-generation action understanding toolbox and benchmark.
- [MMFlow](https://github.com/open-mmlab/mmflow): OpenMMLab optical flow toolbox and benchmark.
- [MMDeploy](https://github.com/open-mmlab/mmdeploy): OpenMMLab model deployment framework.
- [MMRazor](https://github.com/open-mmlab/mmrazor): OpenMMLab model compression toolbox and benchmark.
- [Playground](https://github.com/open-mmlab/playground): A central hub for gathering and showcasing amazing projects built upon OpenMMLab.