Collections:
  - Name: BEiT
    Metadata:
      Architecture:
        - Attention Dropout
        - Convolution
        - Dense Connections
        - Dropout
        - GELU
        - Layer Normalization
        - Multi-Head Attention
        - Scaled Dot-Product Attention
        - Tanh Activation
    Paper:
      Title: 'BEiT: BERT Pre-Training of Image Transformers'
      URL: https://arxiv.org/abs/2106.08254
    README: configs/beit/README.md
    Code:
      URL: https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/beit.py
      Version: v1.0.0rc4

Models:
  - Name: beit_beit-base-p16_8xb256-amp-coslr-300e_in1k
    Metadata:
      Epochs: 300
      Batch Size: 2048
      FLOPs: 17581219584
      Parameters: 86530984
      Training Data: ImageNet-1k
    In Collection: BEiT
    Results: null
    Weights: https://download.openmmlab.com/mmselfsup/1.x/beit/beit_vit-base-p16_8xb256-amp-coslr-300e_in1k/beit_vit-base-p16_8xb256-amp-coslr-300e_in1k_20221128-ab79e626.pth
    Config: configs/beit/beit_beit-base-p16_8xb256-amp-coslr-300e_in1k.py
    Downstream:
      - beit-base-p16_beit-pre_8xb128-coslr-100e_in1k
  - Name: beit-base-p16_beit-pre_8xb128-coslr-100e_in1k
    Metadata:
      Epochs: 100
      Batch Size: 1024
      FLOPs: 17581219584
      Parameters: 86530984
      Training Data: ImageNet-1k
    In Collection: BEiT
    Results:
      - Task: Image Classification
        Dataset: ImageNet-1k
        Metrics:
          Top 1 Accuracy: 83.1
    Weights: https://download.openmmlab.com/mmselfsup/1.x/beit/beit_vit-base-p16_8xb256-amp-coslr-300e_in1k/vit-base-p16_ft-8xb128-coslr-100e_in1k/vit-base-p16_ft-8xb128-coslr-100e_in1k_20221128-0ca393e9.pth
    Config: configs/beit/benchmarks/beit-base-p16_8xb128-coslr-100e_in1k.py
  - Name: beit-base-p16_beit-in21k-pre_3rdparty_in1k
    Metadata:
      FLOPs: 17581219584
      Parameters: 86530984
      Training Data:
        - ImageNet-21k
        - ImageNet-1k
    In Collection: BEiT
    Results:
      - Dataset: ImageNet-1k
        Task: Image Classification
        Metrics:
          Top 1 Accuracy: 85.28
          Top 5 Accuracy: 97.59
    Weights: https://download.openmmlab.com/mmclassification/v0/beit/beit-base_3rdparty_in1k_20221114-c0a4df23.pth
    Config: configs/beit/benchmarks/beit-base-p16_8xb64_in1k.py
    Converted From:
      Weights: https://conversationhub.blob.core.windows.net/beit-share-public/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth
      Code: https://github.com/microsoft/unilm/tree/master/beit
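A metafile like this is plain YAML, so it can be queried programmatically. The sketch below (an illustration, not part of the metafile; it assumes PyYAML is installed and inlines a fragment of the data above rather than reading the file from disk) lists the models that report benchmark results:

```python
import yaml  # PyYAML; assumed available in the environment

# Inlined fragment of the metafile above, for a self-contained example.
METAFILE = """
Collections:
  - Name: BEiT
Models:
  - Name: beit_beit-base-p16_8xb256-amp-coslr-300e_in1k
    In Collection: BEiT
    Results: null
  - Name: beit-base-p16_beit-pre_8xb128-coslr-100e_in1k
    In Collection: BEiT
    Results:
      - Task: Image Classification
        Dataset: ImageNet-1k
        Metrics:
          Top 1 Accuracy: 83.1
"""

def models_with_results(metafile_text):
    """Return the names of models whose Results field is non-null."""
    data = yaml.safe_load(metafile_text)
    return [m["Name"] for m in data["Models"] if m.get("Results")]

print(models_with_results(METAFILE))
# -> ['beit-base-p16_beit-pre_8xb128-coslr-100e_in1k']
```

Pre-training entries (`Results: null`) are filtered out because they carry no benchmark metrics; only the fine-tuned downstream model survives the query.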