Commit Graph

8 Commits (17a886cb5825cd8c26df4e65f7112d404b99fe12)

Author SHA1 Message Date
Coobiw ed5924b6fe
[Feature] Implementation of RAM with a Gradio interface. (#1802)
* [CodeCamp2023-584] Support DINO self-supervised learning in project (#1756)

* feat: implement DINO

* chore: delete debug code

* chore: implement pre-commit

* fix: fix imported package

* chore: pre-commit check

* [CodeCamp2023-340] New Version of config Adapting MobileNet Algorithm (#1774)

* add new configs adapting MobileNetV2 and MobileNetV3

* add a base model config for MobileNetV3 and modify all MobileNetV3 training configs to inherit from it

* removed directory _base_/models/mobilenet_v3

* [Feature] Implementation of Zero-Shot CLIP Classifier (#1737) (see the inference sketch after this commit entry)

* zero-shot CLIP

* modify zero-shot clip config

* add in1k_sub_prompt (8 prompts) for improvement

* add some annotation docs

* clip base class & clip_zs sub-class

* some modifications of details after review

* convert to and use mmpretrain-vit

* modify names of some files and directories

* ram init commit

* [Fix] Fix pipeline bug in image retrieval inferencer

* [CodeCamp2023-341] Supplement multimodal dataset documentation - COCO Retrieval

* Update OFA to be compatible with the latest huggingface.

* Update train.py to be compatible with the new config

* Bump version to v1.1.0

* Update __init__.py

---------

Co-authored-by: LALBJ <40877073+LALBJ@users.noreply.github.com>
Co-authored-by: DE009 <57087096+DE009@users.noreply.github.com>
Co-authored-by: mzr1996 <mzr1996@163.com>
Co-authored-by: 飞飞 <102729089+ASHORE1225@users.noreply.github.com>
2023-10-25 16:23:45 +08:00
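The entry above bundles several user-facing additions (RAM with a Gradio demo, a zero-shot CLIP classifier, DINO). As noted in the CLIP item, below is a minimal, hedged sketch of trying one of the newly added models through mmpretrain's generic inference helpers; the `'*clip*'` wildcard and the assumption that the new checkpoints appear in the model zoo are mine, so take the real names from `list_models()`.

```python
# Hedged sketch only -- not code from the PR. It assumes the checkpoints added
# here are registered in mmpretrain's model zoo and can be driven by the
# generic inference helpers.
from mmpretrain import inference_model, list_models

clip_models = list_models('*clip*')  # discover CLIP-related entries, if any
print(clip_models)

if clip_models:
    # 'demo/demo.JPEG' ships with the mmpretrain repository; any image path works.
    result = inference_model(clip_models[0], 'demo/demo.JPEG')
    print(result['pred_class'], result['pred_score'])
```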
John 9b75ce0aa4 only keep one file to set swin transformer v2 model config 2023-09-05 22:16:07 +08:00
John f4d372ba7d only keep one file to set swin transformer model config 2023-09-05 21:26:43 +08:00
John 634852ad61 [CodeCamp2023-338] New Version of config Adapting Swin Transformer Algorithm 2023-08-31 18:15:47 +08:00
mzr1996 bf62497e02 Merge remote-tracking branch 'origin/main' into dev 2023-08-15 11:37:22 +08:00
mstwutao 6474d6befa
[CodeCamp2023-336] New Version of `config` Adapting MAE Algorithm (#1750)
* fix typo MIMHIVIT to MAEHiViT

* fix typo MIMHiViT to MAEHiViT

* [CodeCamp2023-336] New version of config adapting MAE algorithm

* pre-commit check

* Revert soft-link modification

---------

Co-authored-by: mzr1996 <mzr1996@163.com>
2023-08-14 17:20:39 +08:00
Zeyuan 2fb52eefdc
[CodeCamp2023-339] New Version of `config` Adapting Vision Transformer Algorithm (#1727)
* add old config

* add old config

* add old config

* renew vit-base-p16_64xb64_in1k.py

* rename

* finish vit_base_p16_64xb64_in1k_384px.py

* finish vit_base_p32_64xb64_in1k.py and 384px

* finish 4 vit_large*.py

* finish vit_base_p16_32xb128_mae_in1k.py

* add vit_base_p16_4xb544_ipu_in1k.py

* modify data_root

* using  to modify cfg

* pre-commit check

* ignore ipu

* keep other files unchanged

* remove redefinition

* only keep vit_base_p16.py

* move init_cfg into model.update (see the config sketch after this entry)
2023-08-02 10:06:08 +08:00
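Several commits in this graph port algorithms to mmengine's new pure-Python config format. As flagged in the `model.update` item above, here is a minimal sketch of what such a config can look like; it assumes mmengine's `read_base()` mechanism, and apart from `vit_base_p16` (named in the commit) the base modules below are illustrative placeholders.

```python
# Hedged sketch of a new-style (pure-Python) mmpretrain config, meant to be
# loaded by mmengine's config system rather than run directly. Base module
# names other than vit_base_p16 are placeholders.
from mmengine.config import read_base

with read_base():
    # import shared pieces instead of the old string-based `_base_` list
    from .._base_.models.vit_base_p16 import *               # base model definition
    from .._base_.datasets.imagenet_bs64 import *            # placeholder dataset base
    from .._base_.schedules.imagenet_bs4096_adamw import *   # placeholder schedule base
    from .._base_.default_runtime import *                   # placeholder runtime defaults

# Per-experiment tweaks are plain dict updates on the imported symbols,
# e.g. moving init_cfg into model.update as described in the commit above.
model.update(init_cfg=dict(type='Pretrained', checkpoint='/path/to/ckpt.pth'))  # placeholder path
```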
Yixiao Fang aac398a83f
[Feature] Support new configs. (#1639)
* [Feature] Support new configs (#1638)

* add new configs for MAE and SimCLR

* update

* update setup.cfg

* update eva

* update

* update new config

* Add new config

* remove __init__.py

* 1. remove ; 2. remove mmpretrain/configs/_base_/models/convnext

* remove model_wrapper_cfg and add out type

* Add comment for setting default_scope to None

* update if '_base_' order

* update

* revert changes

---------

Co-authored-by: fangyixiao18 <fangyx18@hotmail.com>

* Add a warning at the head of new config files

---------

Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
Co-authored-by: mzr1996 <mzr1996@163.com>
2023-06-16 16:54:45 +08:00