mirror of https://github.com/open-mmlab/mmpretrain.git (synced 2025-06-03 14:59:18 +08:00)
63 lines · 2.2 KiB · YAML
Collections:
  - Name: MixMIM
    Metadata:
      Architecture:
        - Attention Dropout
        - Convolution
        - Dense Connections
        - Dropout
        - GELU
        - Layer Normalization
        - Multi-Head Attention
        - Scaled Dot-Product Attention
        - Tanh Activation
    Paper:
      Title: 'MixMIM: Mixed and Masked Image Modeling for Efficient Visual Representation Learning'
      URL: https://arxiv.org/abs/2205.13137
    README: configs/mixmim/README.md
    Code:
      URL: https://github.com/open-mmlab/mmclassification/blob/dev-1.x/mmcls/models/backbones/mixmim.py
      Version: v1.0.0rc4

Models:
  - Name: mixmim-base_3rdparty_in1k
    Metadata:
      FLOPs: 16352000000
      Parameters: 88344000
      Training Data:
        - ImageNet-1k
    In Collection: MixMIM
    Results:
      - Dataset: ImageNet-1k
        Task: Image Classification
        Metrics:
          Top 1 Accuracy: 84.6
          Top 5 Accuracy: 97.0
    Weights: https://download.openmmlab.com/mmclassification/v0/mixmim/mixmim-base_3rdparty_in1k_20221206-e40e2c8c.pth
    Config: configs/mixmim/benchmarks/mixmim-base_8xb64_in1k.py
    Converted From:
      Code: https://github.com/Sense-X/MixMIM
  - Name: mixmim_mixmim-base_16xb128-coslr-300e_in1k
    In Collection: MixMIM
    Metadata:
      Epochs: 300
      Batch Size: 2048
    Results: null
    Config: configs/mixmim/mixmim_mixmim-base_16xb128-coslr-300e_in1k.py
    Weights: https://download.openmmlab.com/mmselfsup/1.x/mixmim/mixmim-base-p16_16xb128-coslr-300e_in1k/mixmim-base-p16_16xb128-coslr-300e_in1k_20221208-44fe8d2c.pth
    Downstream:
      - mixmim-base_mixmim-pre_8xb128-coslr-100e_in1k
  - Name: mixmim-base_mixmim-pre_8xb128-coslr-100e_in1k
    In Collection: MixMIM
    Metadata:
      Epochs: 100
      Batch Size: 1024
    Results:
      - Task: Image Classification
        Dataset: ImageNet-1k
        Metrics:
          Top 1 Accuracy: 84.63
    Config: configs/mixmim/benchmarks/mixmim-base_8xb128-coslr-100e_in1k.py
    Weights: https://download.openmmlab.com/mmselfsup/1.x/mixmim/mixmim-base-p16_16xb128-coslr-300e_in1k/mixmim-base-p16_ft-8xb128-coslr-100e_in1k/mixmim-base-p16_ft-8xb128-coslr-100e_in1k_20221208-41ecada9.pth