Commit Graph

14 Commits (17a886cb5825cd8c26df4e65f7112d404b99fe12)

ZhangYiqin da1da48eb6
[Enhance] Add iTPN support for non-three-channel images (#1735)
* Add channel arguments to mae_head

When trying iTPN pretraining, only images with 3 channels are supported. One of the restrictions comes from MAEHead.

* Transfer other arguments from iTPNHiViT to HiViT

HiViT supports specifying the number of input channels, but the iTPNHiViT class can't pass channel arguments through to it. This is one of the reasons the iTPNHiViT implementation only supports images with 3 channels. (A sketch of the idea follows this entry.)

* Update itpn.py

Fix hint problem
2023-09-04 13:11:16 +08:00
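
The gist of the change above is plumbing a channel argument through the wrapper class so the backbone stops assuming 3-channel input. A minimal sketch of that idea, using hypothetical class names rather than the actual mmpretrain code:

```python
# Minimal sketch (not the actual mmpretrain code) of the idea behind this
# change: the wrapper forwards a channel argument to the backbone instead
# of hard-coding 3-channel input.

class HiViTSketch:
    """Stand-in for a HiViT-style backbone that already accepts in_chans."""

    def __init__(self, embed_dim: int = 512, in_chans: int = 3, **kwargs):
        self.embed_dim = embed_dim
        self.in_chans = in_chans


class iTPNHiViTSketch(HiViTSketch):
    """Stand-in wrapper: pass `in_chans` (and any extra kwargs) through."""

    def __init__(self, embed_dim: int = 512, in_chans: int = 3, **kwargs):
        # Before the fix, the wrapper dropped `in_chans`, so the backbone
        # silently fell back to 3 channels.
        super().__init__(embed_dim=embed_dim, in_chans=in_chans, **kwargs)


backbone = iTPNHiViTSketch(in_chans=1)   # e.g. grayscale pretraining data
print(backbone.in_chans)                 # -> 1
```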
Yuan Liu fa53174fd9
[Feature]: Add MFF (#1725)
* [Feature]: Add MFF

* [Feature]: Add MFF linear probe

* [Feature]: Add ft

* [Fix]: Update docstring

* [Feature]: Update out_indices

* [Feature]: Add prefix to ft

* [Feature]: Add README

* [Feature]: Update readme

* [Feature]: Update README

* [Feature]: Add metafile

* [Feature]: Update README

* [Fix]: Fix lint

* [Feature]: Add UT

* [Feature]: Update paper link
2023-08-08 16:01:07 +08:00
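
MFF adds multi-level feature fusion on top of MAE-style pretraining, which is what the `out_indices` bullet above refers to: features are tapped at several intermediate blocks and fused with learnable weights. A rough, hypothetical sketch of that multi-level idea (not the mmpretrain implementation):

```python
# Rough sketch of the multi-level idea behind MFF (hypothetical code, not the
# mmpretrain implementation): collect features at several block indices and
# fuse them with learnable, softmax-normalized weights.
import torch
import torch.nn as nn


class TinyMultiLevelEncoder(nn.Module):
    def __init__(self, dim=32, depth=6, out_indices=(1, 3, 5)):
        super().__init__()
        self.blocks = nn.ModuleList(nn.Linear(dim, dim) for _ in range(depth))
        self.out_indices = out_indices
        # One fusion weight per tapped layer.
        self.fusion_weights = nn.Parameter(torch.zeros(len(out_indices)))

    def forward(self, x):
        feats = []
        for i, blk in enumerate(self.blocks):
            x = blk(x)
            if i in self.out_indices:
                feats.append(x)
        w = torch.softmax(self.fusion_weights, dim=0)
        return sum(wi * fi for wi, fi in zip(w, feats))


out = TinyMultiLevelEncoder()(torch.randn(2, 32))
print(out.shape)  # torch.Size([2, 32])
```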
fanqiNO1 465b6bdeec
[Refactor] Fix spelling (#1689) 2023-07-13 15:38:58 +08:00
fanqiNO1 7cbfb36c14
[Refactor] Fix spelling (#1681)
* [Refactor] Fix spelling

* [Refactor] Fix spelling

* [Refactor] Fix spelling

* [Refactor] Fix spelling
2023-07-05 11:07:43 +08:00
Yixiao Fang a1cfe888e2
[Feature] Support SparK. (#1531)
* add spark configs

* fix configs

* remove repeat aug

* add module codes

* support layer-wise lr decay for ResNet

* update

* fix lint

* add metafile and readme

* fix lint

* add models and logs

* refactor codes

* fix lint

* update model rst

* update name

* add docstring

* add ut

* fix lint

---------

Co-authored-by: Ma Zerun <mzr1996@163.com>
2023-06-19 11:27:50 +08:00
Yixiao Fang e4c4a81b56
[Feature] Support iTPN and HiViT (#1584)
* hivit added

* Update hivit.py

* Update hivit.py

* Add files via upload

* Update __init__.py

* Add files via upload

* Update __init__.py

* Add files via upload

* Update hivit.py

* Add files via upload

* Add files via upload

* Add files via upload

* Add files via upload

* Update itpn.py

* Add files via upload

* Update __init__.py

* Update mae_hivit-base-p16.py

* Delete mim_itpn-base-p16.py

* Add files via upload

* Update itpn_hivit-base-p16.py

* Update itpn.py

* Update hivit.py

* Update __init__.py

* Update mae.py

* Delete hivit.py

* Update __init__.py

* Delete configs/itpn directory

* Add files via upload

* Add files via upload

* Delete configs/hivit directory

* Add files via upload

* refactor and add metafile and readme

* update clip

* add ut

* update ut

* update

* update docstring

* update model.rst

---------

Co-authored-by: 田运杰 <48153283+sunsmarterjie@users.noreply.github.com>
2023-05-26 12:08:34 +08:00
Yixiao Fang 770eb8e24a
[Fix] Fix ddp bugs caused by `out_type`. (#1570)
* set out_type to be 'raw'

* update test
2023-05-17 17:32:10 +08:00
Yixiao Fang 15cc2a5193
[Fix] Fix clip generator init bug (#1518) 2023-04-25 19:35:09 +08:00
Ma Zerun dbf3df21a3
[Refactor] Use `out_type` to specify ViT-like backbone output. (#1408)
* [Refactor] Use `out_type` to specify ViT-like backbone output (see the sketch after this entry).

* Fix ClsBatchNormNeck

* Update mmpretrain/models/necks/mae_neck.py

---------

Co-authored-by: Yixiao Fang <36138628+fangyixiao18@users.noreply.github.com>
2023-03-09 11:02:58 +08:00
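
The `out_type` switch above (and the earlier DDP fix that sets it to 'raw') boils down to choosing what a ViT-like backbone returns. A hedged illustration of the common options; the option names mirror the ones used in mmpretrain, but the toy function below is a sketch, not the real backbone:

```python
# Hedged illustration of the `out_type` idea: one switch decides what a
# ViT-like backbone returns ('raw', 'cls_token', 'featmap', 'avg_featmap').
import torch


def select_output(tokens: torch.Tensor, out_type: str = 'raw',
                  hw: tuple = (14, 14)) -> torch.Tensor:
    """tokens: (B, 1 + H*W, C) with a leading class token."""
    if out_type == 'raw':
        return tokens                          # everything, untouched
    if out_type == 'cls_token':
        return tokens[:, 0]                    # (B, C)
    patch_tokens = tokens[:, 1:]               # drop the class token
    if out_type == 'featmap':
        b, _, c = patch_tokens.shape
        return patch_tokens.transpose(1, 2).reshape(b, c, *hw)  # (B, C, H, W)
    if out_type == 'avg_featmap':
        return patch_tokens.mean(dim=1)        # (B, C)
    raise ValueError(f'unknown out_type: {out_type}')


x = torch.randn(2, 1 + 14 * 14, 768)
print(select_output(x, 'cls_token').shape)   # torch.Size([2, 768])
print(select_output(x, 'featmap').shape)     # torch.Size([2, 768, 14, 14])
```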
Ma Zerun 274a67223e
[Feature] Implement layer-wise learning rate decay optimizer constructor. (#1399)
* [Feature] Implement layer-wise learning rate decay optimizer constructor.

* Use num_layers instead of max_depth to avoid confusion (see the lr-decay sketch after this entry)

* Add UT

* Update docstring

* Update log info

* update LearningRateDecay configs

---------

Co-authored-by: fangyixiao18 <fangyx18@hotmail.com>
2023-03-07 17:30:39 +08:00
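
The constructor above assigns each parameter group a learning rate that shrinks with distance from the head. A back-of-the-envelope sketch of the scaling rule, using a hypothetical helper rather than the actual optimizer constructor:

```python
# Back-of-the-envelope sketch of layer-wise lr decay (hypothetical helper,
# not the mmpretrain optimizer constructor): shallower layers get smaller
# learning rates, scaled by decay_rate ** (num_layers - layer_id).

def layer_wise_lr(base_lr: float, layer_id: int, num_layers: int,
                  decay_rate: float = 0.65) -> float:
    return base_lr * decay_rate ** (num_layers - layer_id)


num_layers = 12
for layer_id in (0, 6, 12):
    print(layer_id, layer_wise_lr(1e-3, layer_id, num_layers))
# layer 0 (embedding side) gets the smallest lr, layer 12 gets the base lr
```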
Yixiao Fang 08dc8c75d3
[Refactor] Add selfsup algorithms. (#1389)
* remove basehead

* add moco series

* add byol simclr simsiam

* add ut

* update configs

* add simsiam hook

* add and refactor beit

* update ut

* add cae

* update extract_feat

* refactor cae

* add mae

* refactor data preprocessor

* update heads

* add maskfeat

* add milan

* add simmim

* add mixmim

* fix lint

* fix ut

* fix lint

* add eva

* add densecl

* add barlowtwins

* add swav

* fix lint

* update readthedocs rst

* update docs

* update

* Decrease UT memory usage

* Fix docstring

* update DALLEEncoder

* Update model docs

* refactor dalle encoder

* update docstring

* fix ut

* fix config error

* add val_cfg and test_cfg

* refactor clip generator

* fix lint

* pass check

* fix ut

* add lars

* update type of BEiT in configs

* Use MMEngine-style momentum in EMA (see the sketch after this entry).

* apply mmpretrain solarize

---------

Co-authored-by: mzr1996 <mzr1996@163.com>
2023-03-06 16:53:15 +08:00
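
The "MMEngine-style momentum in EMA" bullet above refers to which convention the coefficient follows. A small sketch contrasting the two conventions (hypothetical helpers, not the MMEngine classes):

```python
# Hedged sketch of the two EMA momentum conventions this bullet refers to.
# MMEngine-style: `momentum` weights the *new* value,
#   ema = (1 - momentum) * ema + momentum * current
# "decay"-style: the coefficient weights the *old* value,
#   ema = decay * ema + (1 - decay) * current
# so momentum ~ 1 - decay (e.g. momentum=0.0002 <-> decay=0.9998).

def ema_update_mmengine_style(ema: float, current: float,
                              momentum: float = 0.0002) -> float:
    return (1 - momentum) * ema + momentum * current


def ema_update_decay_style(ema: float, current: float,
                           decay: float = 0.9998) -> float:
    return decay * ema + (1 - decay) * current


# The two conventions give the same update when momentum == 1 - decay.
assert abs(ema_update_mmengine_style(1.0, 0.0) -
           ema_update_decay_style(1.0, 0.0)) < 1e-12
```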
Yixiao Fang c9670173aa
[Refactor] Move and refactor utils from mmselfsup. (#1385)
* add heads

* add losses

* fix

* remove mim head

* add modified backbones and target generators

* fix lint

* fix lint

* add heads

* add losses

* fix

* add data preprocessor from mmselfsup

* add ut for data preprocessor

* add GatherLayer

* add ema

* add batch shuffle

* add misc

* fix lint

* update

* update docstring
2023-02-28 17:04:40 +08:00
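
The GatherLayer utility mentioned above is the usual trick for all-gathering features across ranks while keeping gradients, which contrastive losses need. A hedged sketch of that pattern (hypothetical code, not the mmpretrain implementation):

```python
# Sketch of a GatherLayer-style utility (hypothetical code, not the mmpretrain
# implementation): all-gather tensors across processes while keeping gradients
# flowing back to the local shard, which plain all_gather does not do.
import torch
import torch.distributed as dist


class GatherWithGrad(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        gathered = [torch.zeros_like(x) for _ in range(dist.get_world_size())]
        dist.all_gather(gathered, x)
        return tuple(gathered)

    @staticmethod
    def backward(ctx, *grads):
        # Each rank keeps only the gradient slice that belongs to it.
        return grads[dist.get_rank()]


# Usage (only inside an initialized process group):
#   feats = torch.cat(GatherWithGrad.apply(local_feats), dim=0)
```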
Yixiao Fang e453a45d31
[Refactor] Add self-supervised backbones and target generators. (#1379)
* add heads

* add losses

* fix

* remove mim head

* add modified backbones and target generators

* add unittest

* refactor caevit

* add window_size check

* fix lint

* apply new DataSample

* fix ut error

* update ut

* fix ut

* fix lint

* Update base modules.

---------

Co-authored-by: mzr1996 <mzr1996@163.com>
2023-02-28 15:59:17 +08:00
mzr1996 0979e78573 Rename the package name to `mmpretrain`. 2023-02-17 15:20:55 +08:00