mmpretrain/configs/vision_transformer/vit-large-p32_64xb64_in1k.py
Yixiao Fang · 89000c10eb · 2023-02-23 11:17:16 +08:00
[Refactor] Refactor configs and metafile (#1369)

* update base datasets
* update base
* update barlowtwins
* update with new convention
* update
* update
* update
* add schedule
* add densecl
* add eva
* add mae
* add maskfeat
* add milan and mixmim
* add moco
* add swav simclr
* add simmim and simsiam
* refine
* update
* add to model index
* update config inheritance
* fix error in metafile
* Update pre-commit and metafile check script
* update metafile
* fix name error
* Fix classification model name and config name

Co-authored-by: mzr1996 <mzr1996@163.com>

_base_ = [
    '../_base_/models/vit-large-p32.py',
    '../_base_/datasets/imagenet_bs64_pil_resize_autoaug.py',
    '../_base_/schedules/imagenet_bs4096_AdamW.py',
    '../_base_/default_runtime.py'
]

# model setting
model = dict(
    head=dict(hidden_dim=3072),
    train_cfg=dict(augments=dict(type='Mixup', alpha=0.2)),
)

# schedule setting
optim_wrapper = dict(clip_grad=dict(max_norm=1.0))
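
Below is a minimal sketch, not part of the original file, showing how this config resolves: the files listed in _base_ are loaded and merged first, then the model and optim_wrapper settings defined here are applied on top. It assumes MMEngine is installed and that the path (taken from the file header above) matches the local checkout layout.

# Hedged sketch: inspect the merged config with MMEngine's Config API.
# The path below is an assumption about where this file sits locally.
from mmengine.config import Config

cfg = Config.fromfile(
    'mmpretrain/configs/vision_transformer/vit-large-p32_64xb64_in1k.py')

# Values inherited from the _base_ files are merged first; the overrides
# defined in this file take precedence over them.
print(cfg.model.head.hidden_dim)     # 3072, set by the override in this file
print(cfg.model.train_cfg.augments)  # Mixup augmentation with alpha=0.2
print(cfg.optim_wrapper.clip_grad)   # {'max_norm': 1.0}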