Add model_zoo docs
ImageNet has multiple versions, but the most commonly used one is [ILSVRC 2012](http://www.image-net.org/challenges/LSVRC/2012/). It can be accessed with the following steps.
1. Register an account and log in to the [download page](http://www.image-net.org/download-images).
2. Find the download links for ILSVRC2012 and download the following two files:
- ILSVRC2012_img_train.tar (~138GB)
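After downloading, the training archive needs to be unpacked; it bundles one inner tar of JPEGs per WordNet id. Below is a minimal Python sketch of that step (not an official script; the target directory `data/imagenet/train` is an assumed layout):

```python
# Sketch: unpack ILSVRC2012_img_train.tar, which contains one inner tar per class.
import os
import tarfile

train_tar = 'ILSVRC2012_img_train.tar'  # downloaded in step 2
train_dir = 'data/imagenet/train'       # assumed target layout
os.makedirs(train_dir, exist_ok=True)

with tarfile.open(train_tar) as outer:
    for member in outer:
        wnid = os.path.splitext(member.name)[0]  # e.g. n01440764
        class_dir = os.path.join(train_dir, wnid)
        os.makedirs(class_dir, exist_ok=True)
        # Extract the per-class tar into its own folder of JPEG images.
        with tarfile.open(fileobj=outer.extractfile(member)) as inner:
            inner.extractall(class_dir)
```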
(1) FLOPs are related to the input shape while parameters are not. The default input shape is (1, 3, 224, 224).
(2) Some operators, such as GN and custom operators, are not counted into FLOPs. Refer to [`mmcv.cnn.get_model_complexity_info()`](https://github.com/open-mmlab/mmcv/blob/master/mmcv/cnn/utils/flops_counter.py) for details.
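For reference, here is a minimal sketch of calling the counter directly on a model; it is only an illustration (torchvision's `resnet50` is used as a stand-in for any `nn.Module`):

```python
# Sketch: measure FLOPs/params of an arbitrary nn.Module with mmcv's counter.
from mmcv.cnn import get_model_complexity_info
from torchvision.models import resnet50

model = resnet50()
model.eval()

# input_shape excludes the batch dimension, so (3, 224, 224) corresponds to the
# default (1, 3, 224, 224) input mentioned above.
flops, params = get_model_complexity_info(
    model, (3, 224, 224), as_strings=True, print_per_layer_stat=False)
print(f'FLOPs: {flops}  Params: {params}')
```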
### Publish a model
Before you upload a model to AWS, you may want to (1) convert model weights to CPU tensors, (2) delete the optimizer states, and (3) compute the hash of the checkpoint file and append the hash id to the filename.
```shell
python tools/publish_model.py ${INPUT_FILENAME} ${OUTPUT_FILENAME}
```
E.g.,
```shell
python tools/publish_model.py work_dirs/resnet50/latest.pth imagenet_resnet50_20200708.pth
```
The final output filename will be `imagenet_resnet50_20200708-{hash id}.pth`.
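For reference, the sketch below shows what these three steps amount to in plain PyTorch; it is an illustration only, not the actual `tools/publish_model.py`, and the `'optimizer'` key is an assumption about the checkpoint format:

```python
# Sketch of the publish steps: CPU weights, no optimizer state, hashed filename.
import hashlib
import os

import torch

def publish(in_file, out_file):
    ckpt = torch.load(in_file, map_location='cpu')  # (1) load weights as CPU tensors
    ckpt.pop('optimizer', None)                     # (2) drop optimizer states, if any
    torch.save(ckpt, out_file)

    with open(out_file, 'rb') as f:                 # (3) hash the file and rename
        sha = hashlib.sha256(f.read()).hexdigest()
    final_file = out_file.replace('.pth', f'-{sha[:8]}.pth')
    os.rename(out_file, final_file)
    return final_file

# publish('work_dirs/resnet50/latest.pth', 'imagenet_resnet50_20200708.pth')
# -> 'imagenet_resnet50_20200708-{hash id}.pth'
```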
## Tutorials
Currently, we provide four tutorials for users to [finetune models](tutorials/finetune.md), [add new dataset](tutorials/new_dataset.md), [design data pipeline](tutorials/data_pipeline.md) and [add new modules](tutorials/new_modules.md).
# Model Zoo
## ImageNet
ImageNet has multiple versions, but the most commonly used one is [ILSVRC 2012](http://www.image-net.org/challenges/LSVRC/2012/).
The ResNet family models below are trained with standard data augmentation, i.e., RandomResizedCrop, RandomHorizontalFlip and Normalize. A configuration sketch of this pipeline is given after the table.

| Model | Params (M) | FLOPs (G) | Top-1 (%) | Top-5 (%) | Download | Log |
|:---------------------:|:---------:|:--------:|:---------:|:---------:|:--------:|:--------:|
| ResNet-18 | 11.69 | 1.82 | 70.07 | 89.44 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet18_batch256_20200708-34ab8f90.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet18_batch256_20200708-34ab8f90.log.json) |
| ResNet-34 | 21.8 | 3.68 | 73.85 | 91.53 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet34_batch256_20200708-32ffb4f7.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet34_batch256_20200708-32ffb4f7.log.json) |
| ResNet-50 | 25.56 | 4.12 | 76.55 | 93.15 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet50_batch256_20200708-cfb998bf.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet50_batch256_20200708-cfb998bf.log.json) |
| ResNet-101 | 44.55 | 7.85 | 78.18 | 94.03 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet101_batch256_20200708-753f3608.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet101_batch256_20200708-753f3608.log.json) |
| ResNet-152 | 60.19 | 11.58 | 78.63 | 94.16 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet152_batch256_20200708-ec25b1f9.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnet152_batch256_20200708-ec25b1f9.log.json) |
| ResNetV1D-50 | 25.58 | 4.36 | 77.4 | 93.66 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnetv1d50_batch256_20200708-1ad0ce94.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnetv1d50_batch256_20200708-1ad0ce94.log.json) |
| ResNetV1D-101 | 44.57 | 8.09 | 78.85 | 94.38 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnetv1d101_batch256_20200708-9cb302ef.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnetv1d101_batch256_20200708-9cb302ef.log.json) |
| ResNetV1D-152 | 60.21 | 11.82 | 79.35 | 94.61 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnetv1d152_batch256_20200708-e79cb6a2.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnetv1d152_batch256_20200708-e79cb6a2.log.json) |
| ResNeXt-32x4d-50 | 25.03 | 4.27 | 77.92 | 93.74 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnext50_32x4d_batch256_20200708-c07adbb7.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnext50_32x4d_batch256_20200708-c07adbb7.log.json) |
| ResNeXt-32x4d-101 | 44.18 | 8.03 | 78.7 | 94.34 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnext101_32x4d_batch256_20200708-87f2d1c9.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnext101_32x4d_batch256_20200708-87f2d1c9.log.json) |
| ResNeXt-32x8d-101 | 88.79 | 16.5 | 79.22 | 94.52 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnext101_32x8d_batch256_20200708-1ec34aa7.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnext101_32x8d_batch256_20200708-1ec34aa7.log.json) |
| ResNeXt-32x4d-152 | 59.95 | 11.8 | 79.06 | 94.47 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnext152_32x4d_batch256_20200708-aab5034c.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/resnext152_32x4d_batch256_20200708-aab5034c.log.json) |
| SE-ResNet-50 | 28.09 | 4.13 | 77.81 | 93.87 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/se-resnet50_batch256_20200708-657b3c36.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/se-resnet50_batch256_20200708-657b3c36.log.json) |
| SE-ResNet-101 | 49.33 | 7.86 | 78.36 | 94.08 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/se-resnet101_batch256_20200708-038a4d04.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/se-resnet101_batch256_20200708-038a4d04.log.json) |
| ShuffleNet V1 (g=3) | 1.87 | 0.146 | 67.66 | 87.67 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/shufflenet_v1_batch1024_20200708-7a087432.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/shufflenet_v1_batch1024_20200708-7a087432.log.json) |
| MobileNet V2 | 3.5 | 0.319 | 71.86 | 90.42 | [model](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/mobilenet_v2_batch256_20200708-3b2dc3af.pth) | [log](https://openmmlab.oss-accelerate.aliyuncs.com/mmclassification/v0/imagenet/mobilenet_v2_batch256_20200708-3b2dc3af.log.json) |

Models marked with * were converted from other repos; the others were trained by ourselves.
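For readers curious what the "standard data augmentation" mentioned above looks like in practice, here is a configuration sketch of such a training pipeline; the transform names, parameters and normalization values are illustrative assumptions rather than the exact configs used for these checkpoints:

```python
# Sketch of a standard ImageNet training pipeline in config-dict form.
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='RandomResizedCrop', size=224),
    dict(type='RandomFlip', flip_prob=0.5, direction='horizontal'),
    dict(type='Normalize', **img_norm_cfg),
    dict(type='ImageToTensor', keys=['img']),
    dict(type='ToTensor', keys=['gt_label']),
    dict(type='Collect', keys=['img', 'gt_label']),
]
```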