Collections:
  - Name: MLP-Mixer
    Metadata:
      Training Data: ImageNet-1k
      Architecture:
        - MLP
        - Layer Normalization
        - Dropout
    Paper:
      URL: https://arxiv.org/abs/2105.01601
      Title: "MLP-Mixer: An all-MLP Architecture for Vision"
    README: configs/mlp_mixer/README.md
    Code:
      URL: https://github.com/open-mmlab/mmpretrain/blob/v0.18.0/mmcls/models/backbones/mlp_mixer.py
      Version: v0.18.0

Models:
  - Name: mlp-mixer-base-p16_3rdparty_64xb64_in1k
    In Collection: MLP-Mixer
    Config: configs/mlp_mixer/mlp-mixer-base-p16_64xb64_in1k.py
    Metadata:
      FLOPs: 12610000000  # 12.61 G
      Parameters: 59880000  # 59.88 M
    Results:
      - Dataset: ImageNet-1k
        Metrics:
          Top 1 Accuracy: 76.68
          Top 5 Accuracy: 92.25
        Task: Image Classification
    Weights: https://download.openmmlab.com/mmclassification/v0/mlp-mixer/mixer-base-p16_3rdparty_64xb64_in1k_20211124-1377e3e0.pth
    Converted From:
      Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth
      Code: https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/mlp_mixer.py#L70
  - Name: mlp-mixer-large-p16_3rdparty_64xb64_in1k
    In Collection: MLP-Mixer
    Config: configs/mlp_mixer/mlp-mixer-large-p16_64xb64_in1k.py
    Metadata:
      FLOPs: 44570000000  # 44.57 G
      Parameters: 208200000  # 208.2 M
    Results:
      - Dataset: ImageNet-1k
        Metrics:
          Top 1 Accuracy: 72.34
          Top 5 Accuracy: 88.02
        Task: Image Classification
    Weights: https://download.openmmlab.com/mmclassification/v0/mlp-mixer/mixer-large-p16_3rdparty_64xb64_in1k_20211124-5a2519d2.pth
    Converted From:
      Weights: https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth
      Code: https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/mlp_mixer.py#L73
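
# The Config and Weights fields above can be consumed directly by the MMClassification
# Python API. A minimal sketch, assuming mmcls v0.18.0 is installed and the checkpoint
# URL is reachable (the demo image path is an assumption based on the repository layout):
#
#   from mmcls.apis import inference_model, init_model
#
#   config = 'configs/mlp_mixer/mlp-mixer-base-p16_64xb64_in1k.py'
#   checkpoint = ('https://download.openmmlab.com/mmclassification/v0/mlp-mixer/'
#                 'mixer-base-p16_3rdparty_64xb64_in1k_20211124-1377e3e0.pth')
#
#   model = init_model(config, checkpoint, device='cpu')  # downloads and loads the weights
#   result = inference_model(model, 'demo/demo.JPEG')     # single-image classification
#   print(result['pred_class'], result['pred_score'])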