Yixiao Fang | 0b96dcaa67 | 2023-07-28 15:28:29 +08:00
[Enhance] Add init_cfg with type='pretrained' to downstream tasks. (#1717)
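The change above wires pretrained checkpoints into the downstream-task configs through the backbone's `init_cfg`. Below is a minimal sketch of that pattern, assuming MMEngine's `Pretrained` initializer; the backbone settings and checkpoint path are placeholders, not values taken from #1717.

```python
# Minimal sketch of loading pretrained weights via init_cfg (placeholder values).
model = dict(
    type='ImageClassifier',
    backbone=dict(
        type='VisionTransformer',
        arch='base',
        init_cfg=dict(
            type='Pretrained',
            # Placeholder path; point this at the self-supervised checkpoint.
            checkpoint='work_dirs/pretrain/epoch_300.pth',
            # Load only the backbone sub-weights from the full checkpoint.
            prefix='backbone.',
        ),
    ),
)
```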
Yixiao Fang | ae7a7b7560 | 2023-07-05 11:51:12 +08:00
Bump version to 1.0.0 (#1686)
* bump version to 1.0.0
* update
* update
* fix lint
* update
* update
* update changelog
* update
Mashiro | 8afad77a35 | 2023-06-30 11:15:18 +08:00
[Enhance] Update fsdp vit-huge and vit-large config (#1675)
* Update fsdp vit-huge and vit-large config
* Update fsdp vit-huge and vit-large config
* rename
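For context, here is a rough sketch of how an FSDP run can be switched on from a config when training goes through MMEngine's FlexibleRunner; this is not the content of the updated vit-huge/vit-large configs, and the optimizer values are placeholders.

```python
# Rough sketch: enable FSDP sharding through the training strategy field.
strategy = dict(type='FSDPStrategy')  # shard parameters/optimizer states with PyTorch FSDP

# Placeholder optimizer settings; mixed precision via AmpOptimWrapper is optional.
optim_wrapper = dict(
    type='AmpOptimWrapper',
    optimizer=dict(type='AdamW', lr=1e-4, weight_decay=0.05),
)
```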
fanqiNO1 | 658db80089 | 2023-06-29 10:16:27 +08:00
[Enhancement] Support deepspeed with flexible runner (#1673)
* [Feature] Support deepspeed with flexible runner
* [Fix] Reformat with yapf
* [Refactor] Rename configs
* [Fix] Reformat with yapf
* [Refactor] Remove unused keys
* [Refactor] Change the _base_ path
* [Refactor] Reformat
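A hedged sketch of what a DeepSpeed-enabled config fragment can look like with MMEngine's FlexibleRunner; the ZeRO stage, precision, and optimizer values below are illustrative placeholders, not the settings added by #1673.

```python
# Rough sketch: hand training over to DeepSpeed through the strategy field.
strategy = dict(
    type='DeepSpeedStrategy',
    fp16=dict(enabled=True),          # placeholder mixed-precision setting
    zero_optimization=dict(stage=2),  # placeholder ZeRO stage
)

# DeepSpeed manages the optimizer, so the optim_wrapper type changes accordingly.
optim_wrapper = dict(
    type='DeepSpeedOptimWrapper',
    optimizer=dict(type='AdamW', lr=1e-4, weight_decay=0.05),
)
```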
Ma Zerun | dbf3df21a3 | 2023-03-09 11:02:58 +08:00
[Refactor] Use `out_type` to specify ViT-like backbone output. (#1408)
* [Refactor] Use `out_type` to specify ViT-like backbone output.
* Fix ClsBatchNormNeck
* Update mmpretrain/models/necks/mae_neck.py
Co-authored-by: Yixiao Fang <36138628+fangyixiao18@users.noreply.github.com>
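The refactor above replaces per-model output flags with a single `out_type` argument on ViT-like backbones. A small sketch of the resulting usage follows; the backbone settings are placeholders, and the full set of accepted values is documented in the backbone classes rather than in this commit.

```python
# Sketch: choose what the ViT-like backbone returns via out_type.
backbone = dict(
    type='VisionTransformer',
    arch='base',
    out_type='cls_token',  # e.g. the class token; a feature-map style output can be selected instead
)
```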
Ma Zerun | 274a67223e | 2023-03-07 17:30:39 +08:00
[Feature] Implement layer-wise learning rate decay optimizer constructor. (#1399)
* [Feature] Implement layer-wise learning rate decay optimizer constructor.
* Use num_layers instead of max_depth to avoid being misleading
* Add UT
* Update docstring
* Update log info
* update LearningRateDecay configs
Co-authored-by: fangyixiao18 <fangyx18@hotmail.com>
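A hedged sketch of how such a constructor is typically hooked into an optimizer config; the constructor name and the `layer_decay_rate` key follow mmpretrain's conventions but are not quoted from #1399, and the numeric values are placeholders.

```python
# Sketch: apply layer-wise LR decay when fine-tuning a ViT-like backbone.
optim_wrapper = dict(
    optimizer=dict(type='AdamW', lr=1e-3, weight_decay=0.05),
    constructor='LearningRateDecayOptimWrapperConstructor',
    # Earlier (shallower) layers get progressively smaller learning rates.
    paramwise_cfg=dict(layer_decay_rate=0.65),
)
```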
Ma Zerun | a05c79e806 | 2023-03-03 15:01:11 +08:00
[Refactor] Move transforms in mmselfsup to mmpretrain. (#1396)
* [Refactor] Move transforms in mmselfsup to mmpretrain.
* Update transform docs and configs, and register some mmcv transforms in mmpretrain.
* Fix missing transform wrapper.
* update selfsup transforms
* Fix UT
* Fix UT
* update gaussianblur in configs
Co-authored-by: fangyixiao18 <fangyx18@hotmail.com>
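To illustrate where the migrated transforms end up, here is a hedged sketch of a two-view self-supervised pipeline; the transform names match mmpretrain's registry, but the arguments shown are illustrative rather than values from #1396.

```python
# Sketch: a two-view pipeline built from the migrated self-sup transforms.
view_pipeline = [
    dict(type='RandomResizedCrop', scale=224),
    dict(type='GaussianBlur', magnitude_range=(0.1, 2.0), prob=0.5),  # placeholder blur strength
]
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='MultiView', num_views=2, transforms=[view_pipeline]),  # two augmented views per image
    dict(type='PackInputs'),
]
```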
Yixiao Fang | 89000c10eb | 2023-02-23 11:17:16 +08:00
[Refactor] Refactor configs and metafile (#1369)
* update base datasets
* update base
* update barlowtwins
* update with new convention
* update
* update
* update
* add schedule
* add densecl
* add eva
* add mae
* add maskfeat
* add milan and mixmim
* add moco
* add swav simclr
* add simmim and simsiam
* refine
* update
* add to model index
* update config inheritance
* fix error in metafile
* Update pre-commit and metafile check script
* update metafile
* fix name error
* Fix classification model name and config name
Co-authored-by: mzr1996 <mzr1996@163.com>
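The refactor above settles the layout that later configs follow: method configs inherit shared dataset, schedule, and runtime settings through `_base_`. A hedged sketch of that convention; the file names and model stub below are placeholders, not paths introduced by this commit.

```python
# Sketch of the config-inheritance convention (placeholder base files).
_base_ = [
    '../_base_/datasets/imagenet_bs512_mae.py',
    '../_base_/default_runtime.py',
]
# Method-specific settings override or extend the inherited base configs.
model = dict(type='MAE')  # placeholder model stub
```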