Ezra-Yu
05124dbb71
fix lint
2023-04-06 22:01:11 +08:00
Ezra-Yu
b8cab5c9f7
update readme
2023-04-06 21:56:25 +08:00
Ezra-Yu
3932ddec10
update ckpt path
2023-04-06 21:56:25 +08:00
techmonsterwang
a6c24d104e
update riformer mmpretrain
2023-04-06 21:56:25 +08:00
techmonsterwang
32c258ff19
update riformer mmpretrain
2023-04-06 21:56:25 +08:00
techmonsterwang
0b70c108b0
update riformer mmpretrain
2023-04-06 21:56:25 +08:00
Yixiao Fang
75dceaa78f
[Refactor] Add ln to vit avg_featmap output (#1447)
2023-04-06 11:59:39 +08:00
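Commit #1447 above adds a LayerNorm over ViT's averaged feature-map output. As a point of reference, a minimal pure-Python sketch of what layer normalization computes over a single feature vector (affine scale/shift omitted for brevity; not mmpretrain's actual implementation):

```python
import math

def layer_norm(x, eps=1e-6):
    # Normalize a 1-D feature vector to zero mean and unit variance,
    # as LayerNorm does over the channel dimension.
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [(v - mean) / math.sqrt(var + eps) for v in x]
```

In the commit, this normalization is applied after averaging the patch tokens, so downstream heads see a normalized pooled feature.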
Ma Zerun
b017670e1b
[Improve] Use PyTorch official `scaled_dot_product_attention` to accelerate `MultiheadAttention`. (#1434)
* [Improve] Use PyTorch official `scaled_dot_product_attention` to accelerate `MultiheadAttention`.
* Support `--local-rank` and `--amp` option for new version PyTorch.
* Fix imports and UT.
2023-03-29 15:50:44 +08:00
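The commit above switches `MultiheadAttention` to PyTorch's fused `F.scaled_dot_product_attention` (available since PyTorch 2.0). A pure-Python single-head reference for what that function computes, softmax(QKᵀ/√d)·V, with q, k, v as lists of d-dimensional row vectors (an illustrative sketch, not the fused kernel itself):

```python
import math

def scaled_dot_product_attention(q, k, v):
    # scores[i][j] = <q_i, k_j> / sqrt(d)
    d = len(q[0])
    scores = [[sum(qi[t] * kj[t] for t in range(d)) / math.sqrt(d) for kj in k]
              for qi in q]
    out = []
    for row in scores:
        # numerically stable softmax over the key axis
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        w = [e / z for e in exps]
        # weighted sum of value rows
        out.append([sum(w[j] * v[j][t] for j in range(len(v)))
                    for t in range(len(v[0]))])
    return out
```

The fused PyTorch version produces the same result but dispatches to FlashAttention-style kernels when available, which is where the speedup comes from.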
Ma Zerun
dbf3df21a3
[Refactor] Use `out_type` to specify ViT-like backbone output. (#1408)
* [Refactor] Use `out_type` to specify ViT-like backbone output.
* Fix ClsBatchNormNeck
* Update mmpretrain/models/necks/mae_neck.py
---------
Co-authored-by: Yixiao Fang <36138628+fangyixiao18@users.noreply.github.com>
2023-03-09 11:02:58 +08:00
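Commit #1408 replaces per-backbone output flags with a single `out_type` option. A hedged sketch of the dispatch it implies, assuming the value names mmpretrain documents (`"cls_token"`, `"avg_featmap"`, `"raw"`) and using plain Python lists as a stand-in for tensors:

```python
def reduce_tokens(tokens, out_type):
    # tokens[0] is the class token, tokens[1:] are patch tokens
    # (hypothetical plain-Python stand-in for a (N+1, C) tensor).
    if out_type == "cls_token":
        return tokens[0]
    if out_type == "avg_featmap":
        # mean over the patch tokens, channel by channel
        patches = tokens[1:]
        dim = len(patches[0])
        return [sum(t[d] for t in patches) / len(patches) for d in range(dim)]
    if out_type == "raw":
        return tokens
    raise ValueError(f"unsupported out_type: {out_type!r}")
```

Centralizing this choice in one option is what lets heads and necks stay agnostic to which ViT-like backbone feeds them.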
Ma Zerun
274a67223e
[Feature] Implement layer-wise learning rate decay optimizer constructor. (#1399)
* [Feature] Implement layer-wise learning rate decay optimizer constructor.
* Use num_layers instead of max_depth to avoid misleading
* Add UT
* Update docstring
* Update log info
* update LearningRateDecay configs
---------
Co-authored-by: fangyixiao18 <fangyx18@hotmail.com>
2023-03-07 17:30:39 +08:00
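The idea behind commit #1399's layer-wise LR decay: later (deeper) layers keep a larger learning rate, while early layers are decayed geometrically. A minimal sketch with a hypothetical helper (layer index 0 for the embedding, `num_layers + 1` for the head, matching the `num_layers`-based indexing the commit mentions; not the actual constructor code):

```python
def layer_wise_lr(base_lr, decay_rate, num_layers):
    # lr for layer i = base_lr * decay_rate ** (num_layers + 1 - i),
    # so the head (last index) gets the full base_lr.
    return [base_lr * decay_rate ** (num_layers + 1 - i)
            for i in range(num_layers + 2)]
```

An optimizer constructor then assigns each parameter group the rate for the layer it belongs to.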
Yixiao Fang
08dc8c75d3
[Refactor] Add selfsup algorithms. (#1389)
* remove basehead
* add moco series
* add byol simclr simsiam
* add ut
* update configs
* add simsiam hook
* add and refactor beit
* update ut
* add cae
* update extract_feat
* refactor cae
* add mae
* refactor data preprocessor
* update heads
* add maskfeat
* add milan
* add simmim
* add mixmim
* fix lint
* fix ut
* fix lint
* add eva
* add densecl
* add barlowtwins
* add swav
* fix lint
* update readthedocs rst
* update docs
* update
* Decrease UT memory usage
* Fix docstring
* update DALLEEncoder
* Update model docs
* refactor dalle encoder
* update docstring
* fix ut
* fix config error
* add val_cfg and test_cfg
* refactor clip generator
* fix lint
* pass check
* fix ut
* add lars
* update type of BEiT in configs
* Use MMEngine style momentum in EMA.
* apply mmpretrain solarize
---------
Co-authored-by: mzr1996 <mzr1996@163.com>
2023-03-06 16:53:15 +08:00
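One bullet in commit #1389 switches EMA to MMEngine-style momentum, where `momentum` weights the *new* value (the opposite of the convention that multiplies the old average by `momentum`). A small sketch of that update rule, assuming parameters as plain lists of floats:

```python
def ema_update(avg, new, momentum=0.001):
    # MMEngine-style EMA: avg = (1 - momentum) * avg + momentum * new,
    # i.e. a small momentum means the running average changes slowly.
    return [(1 - momentum) * a + momentum * n for a, n in zip(avg, new)]
```

Getting this convention consistent matters for self-supervised methods like MoCo and BYOL, whose target networks are exactly such EMAs of the online network.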
Yixiao Fang
e453a45d31
[Refactor] Add self-supervised backbones and target generators. (#1379)
* add heads
* add losses
* fix
* remove mim head
* add modified backbones and target generators
* add unittest
* refactor caevit
* add window_size check
* fix lint
* apply new DataSample
* fix ut error
* update ut
* fix ut
* fix lint
* Update base modules.
---------
Co-authored-by: mzr1996 <mzr1996@163.com>
2023-02-28 15:59:17 +08:00
mzr1996
0979e78573
Rename the package name to `mmpretrain`.
2023-02-17 15:20:55 +08:00