Yuan Liu | fa53174fd9 | 2023-08-08 16:01:07 +08:00
[Feature]: Add MFF (#1725)
* [Feature]: Add MFF
* [Feature]: Add MFF linear probing
* [Feature]: Add fine-tuning
* [Fix]: Update docstring
* [Feature]: Update out_indices
* [Feature]: Add prefix to fine-tuning
* [Feature]: Add README
* [Feature]: Update README
* [Feature]: Add metafile
* [Fix]: Fix lint
* [Feature]: Add UT
* [Feature]: Update paper link
Fabien Merceron PRL | db395d35b1 | 2023-07-14 15:43:19 +08:00
[Fix] fix_freeze_without_cls_token_vit (#1693)
fanqiNO1 | 5c43d3ef42 | 2023-07-11 15:49:41 +08:00
[Refactor] BEiT refactor (#1705)
* [Refactor] BEiT refactor
* [Fix] Fix arch zoo
* [Fix] Fix freeze stages
* [Fix] Fix freezing ViT ln2
fanqiNO1 | 7cbfb36c14 | 2023-07-05 11:07:43 +08:00
[Refactor] Fix spelling (#1681)
Peng Lu | 00030e3f7d | 2023-07-03 11:36:44 +08:00
[Fix] Refactor _prepare_pos_embed in ViT to fix a bug in loading old checkpoints (#1679)
Wangbo Zhao(黑色枷锁) | 68758db7a8 | 2023-06-28 17:00:27 +08:00
[Fix] Freeze pre-norm in Vision Transformer (#1672)
Yixiao Fang | 70ff2abbf7 | 2023-06-20 17:37:08 +08:00
[Refactor] Refactor _prepare_pos_embed in ViT (#1656)
* Deal with cls_token
* Update implementation
Co-authored-by: mzr1996 <mzr1996@163.com>
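The cls_token handling refactored in #1656 comes down to resizing only the patch-position part of a checkpoint's position embedding while leaving the cls token's slot untouched. A rough standalone sketch of that idea (NumPy stand-in; the actual mmpretrain code resamples over the 2-D patch grid with bicubic interpolation, not the 1-D linear interpolation used here):

```python
import numpy as np

def resize_pos_embed_1d(pos_embed: np.ndarray, new_len: int) -> np.ndarray:
    """pos_embed: (1 + old_len, C), cls-token embedding first.

    Keeps the cls-token row as-is and linearly resamples the
    remaining patch positions to new_len.
    """
    cls_tok, patch = pos_embed[:1], pos_embed[1:]
    old_len, C = patch.shape
    old_x = np.linspace(0.0, 1.0, old_len)
    new_x = np.linspace(0.0, 1.0, new_len)
    resized = np.stack(
        [np.interp(new_x, old_x, patch[:, c]) for c in range(C)], axis=1
    )
    return np.concatenate([cls_tok, resized], axis=0)
```

The point of splitting off the cls token first is exactly the bug class the PR title names: interpolating the whole `(1 + N, C)` tensor would smear the cls embedding into the patch positions.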
Yixiao Fang | d9e561a09d | 2023-05-05 16:59:37 +08:00
[Feature] Support DINOv2 backbone (#1522)
* Support DINOv2 backbone
* Update metafile and README
* Keep compatibility with use_layer_scale
* Update SwiGLUFFN
* Add deprecation warning
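The SwiGLUFFN touched in #1522 is the gated feed-forward block DINOv2 uses in place of the standard MLP. Its core computation can be sketched as follows (NumPy stand-in with hypothetical weight names, not the actual mmpretrain module):

```python
import numpy as np

def silu(x: np.ndarray) -> np.ndarray:
    # SiLU / swish activation: x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def swiglu_ffn(x, w_gate, w_value, w_out):
    """SwiGLU FFN: out = (silu(x @ w_gate) * (x @ w_value)) @ w_out.

    The silu branch gates the value branch element-wise before the
    output projection, instead of a single activated hidden layer.
    """
    return (silu(x @ w_gate) * (x @ w_value)) @ w_out

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 64))         # 4 tokens, embed dim 64
w_gate = rng.standard_normal((64, 128))  # hidden dim 128
w_value = rng.standard_normal((64, 128))
w_out = rng.standard_normal((128, 64))
y = swiglu_ffn(x, w_gate, w_value, w_out)
```

Compared with a plain `silu(x @ w1) @ w2` MLP, the extra value projection doubles the first-layer parameters, which is why SwiGLU FFNs typically shrink the hidden dimension to keep parameter count comparable.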
Yixiao Fang | 75dceaa78f | 2023-04-06 11:59:39 +08:00
[Refactor] Add LayerNorm to ViT avg_featmap output (#1447)
Ma Zerun | dbf3df21a3 | 2023-03-09 11:02:58 +08:00
[Refactor] Use `out_type` to specify ViT-like backbone output (#1408)
* [Refactor] Use `out_type` to specify ViT-like backbone output
* Fix ClsBatchNormNeck
* Update mmpretrain/models/necks/mae_neck.py
Co-authored-by: Yixiao Fang <36138628+fangyixiao18@users.noreply.github.com>
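The `out_type` switch introduced in #1408 selects how a ViT-like backbone turns its token sequence into an output (the common options are `'raw'`, `'cls_token'`, `'featmap'`, and `'avg_featmap'`). A standalone sketch of that dispatch (NumPy stand-in with a hypothetical function name, not the actual mmpretrain implementation):

```python
import numpy as np

def format_output(tokens: np.ndarray, out_type: str, hw) -> np.ndarray:
    """tokens: (B, 1 + H*W, C) with the cls token first; hw = (H, W)."""
    B, _, C = tokens.shape
    H, W = hw
    patch = tokens[:, 1:]  # patch tokens only, cls token dropped
    if out_type == 'raw':
        return tokens                # (B, 1 + H*W, C), full sequence
    if out_type == 'cls_token':
        return tokens[:, 0]          # (B, C)
    if out_type == 'featmap':
        # fold the patch tokens back into a spatial feature map
        return patch.reshape(B, H, W, C).transpose(0, 3, 1, 2)  # (B, C, H, W)
    if out_type == 'avg_featmap':
        return patch.mean(axis=1)    # (B, C), mean over patch tokens
    raise ValueError(f'unknown out_type {out_type!r}')

x = np.random.rand(2, 1 + 14 * 14, 768)  # batch 2, 14x14 patches, dim 768
```

In mmpretrain's ViT the `avg_featmap` path additionally applies a final LayerNorm before averaging (added in #1447); that detail is omitted from this sketch.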
Ma Zerun | 274a67223e | 2023-03-07 17:30:39 +08:00
[Feature] Implement layer-wise learning rate decay optimizer constructor (#1399)
* [Feature] Implement layer-wise learning rate decay optimizer constructor
* Use num_layers instead of max_depth to avoid confusion
* Add UT
* Update docstring
* Update log info
* Update LearningRateDecay configs
Co-authored-by: fangyixiao18 <fangyx18@hotmail.com>
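Layer-wise learning rate decay, as built in #1399, assigns each depth a learning-rate scale that shrinks geometrically toward the input: deeper layers train faster, the patch embedding slowest. A minimal sketch of the scale schedule (function and variable names hypothetical, not mmpretrain's actual constructor API):

```python
def layer_wise_lr_scales(num_layers: int, decay_rate: float) -> list:
    """Per-depth LR scales for the common BEiT/MAE fine-tuning recipe.

    Depth 0 is the patch embedding, depths 1..num_layers are the
    transformer blocks, and depth num_layers + 1 is the head, which
    keeps the full base learning rate (scale 1.0).
    """
    return [decay_rate ** (num_layers + 1 - i) for i in range(num_layers + 2)]

# e.g. ViT-B: 12 blocks, decay rate 0.65
scales = layer_wise_lr_scales(num_layers=12, decay_rate=0.65)
```

Each parameter group's effective LR is then `base_lr * scales[depth]`; using `num_layers` rather than a `max_depth` name (as the commit notes) makes it explicit that the count refers to transformer blocks, with embedding and head handled as the two extra endpoints.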
mzr1996 | 0979e78573 | 2023-02-17 15:20:55 +08:00
Rename the package to `mmpretrain`.