Yixiao Fang
d9e561a09d
[Feature] Support dinov2 backbone (#1522)
* support dinov2 backbone
* update metafile and readme
* compatible with `use_layer_scale`
* update SwiGLUFFN
* add deprecation warning
* update
2023-05-05 16:59:37 +08:00
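The `SwiGLUFFN` bullet in the DINOv2 commit above refers to the SwiGLU feed-forward block that DINOv2 uses in place of a plain two-layer MLP. The following is a minimal pure-Python sketch of the SwiGLU computation only; the function names, the flat-list tensor layout, and the omission of bias terms are illustrative assumptions, not the repository's actual implementation:

```python
import math

def silu(x):
    # SiLU (a.k.a. swish): x * sigmoid(x), written as x / (1 + e^(-x)).
    return x / (1.0 + math.exp(-x))

def swiglu_ffn(x, w_gate, w_value, w_out):
    """SwiGLU feed-forward sketch for a single token vector `x`.

    Computes out = W_out @ (SiLU(W_gate @ x) * (W_value @ x)),
    where * is element-wise. Weights are lists of row vectors.
    """
    gate = [silu(sum(a * b for a, b in zip(row, x))) for row in w_gate]
    value = [sum(a * b for a, b in zip(row, x)) for row in w_value]
    hidden = [g * v for g, v in zip(gate, value)]          # gated hidden state
    return [sum(a * b for a, b in zip(row, hidden)) for row in w_out]
```

The gating branch is what distinguishes SwiGLU from an ordinary MLP: the value projection is modulated element-wise by a SiLU-activated gate before the output projection.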
Ma Zerun
b017670e1b
[Improve] Use PyTorch official `scaled_dot_product_attention` to accelerate `MultiheadAttention`. (#1434)
* [Improve] Use PyTorch official `scaled_dot_product_attention` to accelerate `MultiheadAttention`.
* Support `--local-rank` and `--amp` option for new version PyTorch.
* Fix imports and UT.
2023-03-29 15:50:44 +08:00
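The commit above swaps a hand-written attention implementation for PyTorch's fused `torch.nn.functional.scaled_dot_product_attention`. As a reference for what that fused kernel computes, here is a minimal pure-Python sketch of the scaled dot-product attention math; the function name and list-of-row-vectors layout are illustrative, not MMPretrain's API:

```python
import math

def scaled_dot_product_attention(q, k, v):
    """Reference sketch: softmax(q @ k^T / sqrt(d)) @ v.

    q, k, v are lists of row vectors (seq_len x dim). The fused PyTorch
    op computes the same result in one kernel, without materializing
    the full attention matrix in the same way.
    """
    dim = len(q[0])
    scale = 1.0 / math.sqrt(dim)
    out = []
    for qi in q:
        # Scaled dot-product scores of this query against every key.
        scores = [sum(a * b for a, b in zip(qi, kj)) * scale for kj in k]
        # Numerically stable softmax over the scores.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Attention-weighted sum of the value rows.
        out.append([sum(w * vj[d] for w, vj in zip(weights, v))
                    for d in range(len(v[0]))])
    return out
```

The speedup in the commit comes from PyTorch dispatching this computation to fused kernels (e.g. memory-efficient or FlashAttention-style backends) instead of running the separate matmul/softmax/matmul steps shown here.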
Yixiao Fang
e453a45d31
[Refactor] Add self-supervised backbones and target generators. (#1379)
* add heads
* add losses
* fix
* remove mim head
* add modified backbones and target generators
* add unittest
* refactor caevit
* add window_size check
* fix lint
* apply new DataSample
* fix ut error
* update ut
* fix ut
* fix lint
* Update base modules.
---------
Co-authored-by: mzr1996 <mzr1996@163.com>
2023-02-28 15:59:17 +08:00
Yixiao Fang
63d9f27fde
[Refactor] Add necks, heads and losses for the self-supervised task. (#1376)
* add necks
* refactor linear neck
* rename simmim neck
* add heads
* add losses
* fix
* add unittest
* update
* update cae
* remove mim head
* update config
2023-02-28 10:05:00 +08:00
mzr1996
0979e78573
Rename the package name to `mmpretrain`.
2023-02-17 15:20:55 +08:00