mirror of
https://github.com/open-mmlab/mmpretrain.git
synced 2025-06-03 14:59:18 +08:00
Commit message:

* update base datasets
* update base
* update barlowtwins
* update with new convention
* update
* update
* update
* add schedule
* add densecl
* add eva
* add mae
* add maskfeat
* add milan and mixmim
* add moco
* add swav simclr
* add simmim and simsiam
* refine
* update
* add to model index
* update config inheritance
* fix error in metafile
* Update pre-commit and metafile check script
* update metafile
* fix name error
* Fix classification model name and config name

Co-authored-by: mzr1996 <mzr1996@163.com>
24 lines
656 B
Python
# model settings
model = dict(
    type='MAE',
    backbone=dict(type='MAEViT', arch='b', patch_size=16, mask_ratio=0.75),
    neck=dict(
        type='MAEPretrainDecoder',
        patch_size=16,
        in_chans=3,
        embed_dim=768,
        decoder_embed_dim=512,
        decoder_depth=8,
        decoder_num_heads=16,
        mlp_ratio=4.,
    ),
    head=dict(
        type='MAEPretrainHead',
        norm_pix=True,
        patch_size=16,
        loss=dict(type='MAEReconstructionLoss')),
    init_cfg=[
        dict(type='Xavier', layer='Linear', distribution='uniform'),
        dict(type='Constant', layer='LayerNorm', val=1.0, bias=0.0)
    ])
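As a quick sanity check on the masking setup, the `patch_size=16` and `mask_ratio=0.75` values above determine how many patch tokens the MAE encoder actually sees. The sketch below assumes the usual 224×224 input resolution, which comes from the standard MAE data pipeline rather than from this config fragment:

```python
# Rough arithmetic behind the masking settings above.
# NOTE: img_size=224 is an assumption (the typical MAE pipeline default),
# not something this config file specifies.
img_size = 224        # assumed input resolution
patch_size = 16       # matches backbone/neck patch_size
mask_ratio = 0.75     # matches backbone mask_ratio

num_patches = (img_size // patch_size) ** 2   # 14 x 14 = 196 patch tokens
num_masked = int(num_patches * mask_ratio)    # patches hidden from the encoder
num_visible = num_patches - num_masked        # patches the encoder processes

print(num_patches, num_masked, num_visible)   # 196 147 49
```

With 75% of the 196 patches masked, the ViT-B encoder runs on only 49 tokens, which is the main source of MAE's pretraining speedup; the lighter decoder (`decoder_embed_dim=512`, `decoder_depth=8`) then reconstructs the full patch grid.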