Commit Graph

17 Commits (1a7cebe4b90cf72e730a7066267b75561864e811)

Author SHA1 Message Date
Ma Zerun 1a7cebe4b9
[Refactor] Refactor unittest (#321)
* Refactor unit tests folder structure.

* Remove label smooth and ViT tests in `test_classifiers.py`

* Rename test_utils in dataset to test_dataset_utils

* Split test_models/test_utils/test_utils.py into multiple sub-files.

* Add unit tests of classifiers and heads

* Use patch context manager.

* Add unit test of `is_tracing`, and add a warning in `is_tracing` if the torch version is smaller than 1.6.0
2021-07-08 22:49:05 +08:00
Ma Zerun 71621a5f62
Add `is_tracing` helper function to fix a tracing bug in PyTorch 1.6 (#347) 2021-07-07 11:55:53 +08:00
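The `is_tracing` helper referenced in the two commits above is essentially a version-guarded wrapper around PyTorch's tracing check. A minimal sketch, assuming only the public `torch.jit.is_tracing` / `torch._C._get_tracing_state` APIs (not the repository's exact code):

```python
import warnings

import torch


def is_tracing() -> bool:
    """Return True when the current code is being traced by torch.jit.trace."""
    if hasattr(torch.jit, 'is_tracing'):
        on_trace = torch.jit.is_tracing()
        # Some PyTorch 1.6 builds return the raw tracing state instead of a
        # bool here, so fall back to the low-level check in that case.
        if isinstance(on_trace, bool):
            return on_trace
        return torch._C._get_tracing_state() is not None
    warnings.warn('torch.jit.is_tracing is unavailable before PyTorch 1.6.0; '
                  'assuming the model is not being traced.')
    return False
```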
Ma Zerun 076ee10cac
[Feature] Add swin-transformer model. (#271)
* Add swin transformer archs S, B and L.

* Add SwinTransformer configs

* Add train config files of swin.

* Align init method with original code

* Use nn.Unfold to merge patch

* Change all ConfigDict to dict

* Add init_cfg for all subclasses of BaseModule.

* Use mmcv version init function

* Add Swin README

* Use safer cfg copy method

* Improve docstring and variable name.

* Fix some differences in randaug

Fix BGR bug, align scheduler config.

Fix label smoothing parameter difference.

* Fix missing droppath in attn

* Fix a bug in the relative position table if the window width is not equal to the height.

* Make `PatchMerging` more general, support kernel, stride, padding and
dilation.

* Rename `residual` to `identity` in attention and FFN.

* Add `auto_pad` option to automatically pad the feature map

* Improve docstring.

* Fix bug in ShiftWMSA padding.

* Remove unused `key` and `value` in ShiftWMSA

* Move `PatchMerging` into utils and use common `PatchEmbed`.

* Use latest `LinearClsHead`, train augments and label smooth settings.
And remove original `SwinLinearClsHead`.

* Mark some configs as "Evaluation Only".

* Remove useless comment in config

* 1. Move ShiftWindowMSA and WindowMSA to `utils/attention.py`
2. Add docstrings of each module.
3. Fix some variables' names.
4. Other small improvement.

* Add unit tests of swin-transformer and patchmerging.

* Fix some bugs in unit tests.

* Fix bug of rel_position_index if window is not square.

* Make WindowMSA implicit, and add unit tests.

* Add metafile.yml, update readme and model_zoo.
2021-07-01 09:30:42 +08:00
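The "Use nn.Unfold to merge patch" step above corresponds to the downsampling between Swin stages: every 2x2 group of tokens is gathered and projected to a wider channel dimension. A simplified sketch of such a `PatchMerging` block (argument names and the fixed 2x2 kernel are assumptions; the commit later generalises kernel, stride, padding and dilation):

```python
import torch
import torch.nn as nn


class PatchMerging(nn.Module):
    """Merge 2x2 neighbouring tokens: halve the resolution, widen channels."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        # nn.Unfold gathers every 2x2 window into a single 4*C vector.
        self.sampler = nn.Unfold(kernel_size=2, stride=2)
        self.norm = nn.LayerNorm(4 * in_channels)
        self.reduction = nn.Linear(4 * in_channels, out_channels, bias=False)

    def forward(self, x: torch.Tensor, hw_shape: tuple) -> torch.Tensor:
        B, L, C = x.shape
        H, W = hw_shape
        assert L == H * W, 'input feature has wrong size'
        x = x.transpose(1, 2).reshape(B, C, H, W)
        x = self.sampler(x)       # (B, 4*C, H//2 * W//2)
        x = x.transpose(1, 2)     # (B, H//2 * W//2, 4*C)
        return self.reduction(self.norm(x))


# A 56x56 map with 96 channels becomes 28x28 with 192 channels.
out = PatchMerging(96, 192)(torch.randn(2, 56 * 56, 96), (56, 56))
assert out.shape == (2, 28 * 28, 192)
```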
whcao bee0ac6b56
[Refactor] Modify patchembed (#330)
* add mytrain.py for test

* test before layers

* test attr in layers

* test classifier

* delete mytrain.py

* add patchembed and hybridembed

* add patchembed and hybridembed to __init__

* test patchembed and hybridembed

* fix some comments
2021-06-30 20:48:04 +08:00
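For context, the `PatchEmbed` being reworked here is the standard convolutional patch projection used by ViT-style backbones, while `HybridEmbed` additionally runs a CNN feature extractor in front of the projection. A minimal sketch of the former (default sizes are assumptions):

```python
import torch
import torch.nn as nn


class PatchEmbed(nn.Module):
    """Split an image into non-overlapping patches and project each one."""

    def __init__(self, img_size=224, patch_size=16, in_channels=3,
                 embed_dims=768):
        super().__init__()
        self.num_patches = (img_size // patch_size) ** 2
        # A conv with kernel == stride == patch_size acts as unfold + linear.
        self.projection = nn.Conv2d(
            in_channels, embed_dims, kernel_size=patch_size, stride=patch_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.projection(x)               # (B, embed_dims, H/ps, W/ps)
        return x.flatten(2).transpose(1, 2)  # (B, num_patches, embed_dims)


tokens = PatchEmbed()(torch.randn(1, 3, 224, 224))
assert tokens.shape == (1, 196, 768)
```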
Ma Zerun 65410b05ad
Fix Mobilenetv3 structure and add pretrained model (#291)
* Refactor Mobilenetv3 structure and add ConvClsHead.

* Change model's name from 'MobileNetv3' to 'MobileNetV3'

* Modify configs for MobileNetV3 on CIFAR10.

And add MobileNetV3 configs for imagenet

* Fix activation setting bugs in MobileNetV3.

And remove bias in SELayer.

* Modify unittest

* Remove useless config and file.

* Fix mobilenetv3-large arch setting

* Add dropout option in ConvClsHead

* Fix MobilenetV3 structure according to torchvision version.

1. Remove the `with_expand_conv` option in InvertedResidual; it should be decided by the channels.

2. Revert the activation function; it should come before the SE layer.

* Format code.

* Rename MobilenetV3 arch "big" to "large".

* Add mobilenetv3_small torchvision training recipe

* Modify the default `out_indices` of MobilenetV3; it now changes according to `arch` if not specified.

* Add MobilenetV3 large config.

* Add mobilenetv3 README

* Modify InvertedResidual unit test.

* Refactor ConvClsHead to StackedLinearClsHead, and add unit tests.

* Add unit test for `simple_test` of `StackedLinearClsHead`.

* Fix typo

Co-authored-by: Yidi Shao <ydshao@smail.nju.edu.cn>
2021-06-27 23:19:36 +08:00
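Two of the fixes above concern the squeeze-and-excitation block: the bias is dropped from its convs and the activation is applied before it. A rough sketch of such a bias-free `SELayer` (the reduction ratio and gate activation are assumptions, not the repository's exact settings):

```python
import torch
import torch.nn as nn


class SELayer(nn.Module):
    """Squeeze-and-excitation gate as used inside MobileNetV3 blocks."""

    def __init__(self, channels: int, ratio: int = 4):
        super().__init__()
        self.global_pool = nn.AdaptiveAvgPool2d(1)
        # Bias-free 1x1 convs, per the "remove bias in SELayer" note above.
        self.conv1 = nn.Conv2d(channels, channels // ratio, 1, bias=False)
        self.conv2 = nn.Conv2d(channels // ratio, channels, 1, bias=False)
        self.act1 = nn.ReLU(inplace=True)
        self.act2 = nn.Hardsigmoid()  # MobileNetV3 uses a hard-sigmoid gate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scale = self.global_pool(x)
        scale = self.act1(self.conv1(scale))
        scale = self.act2(self.conv2(scale))
        return x * scale
```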
whcao 3a08db9182
[Feature] Add augments to models/utils (#278)
* add mytrain.py for test

* test before layers

* test attr in layers

* test classifier

* delete mytrain.py

* add rand_bbox_minmax, rand_bbox and cutmix_bbox_and_lam to BaseCutMixLayer

* add mixup_prob to BatchMixupLayer

* add cutmixup

* add cutmixup to __init__

* test classifier with cutmixup

* delete some comments

* set mixup_prob default to 1.0

* add cutmixup to classifier

* use cutmixup

* use cutmixup

* fix bugs

* test cutmixup

* move mixup and cutmix to augment

* inherit from BaseAugment

* add BaseAugment

* inherit from BaseAugment

* rename identity.py

* add @

* build augment

* register module

* rename to augment.py

* delete cutmixup.py

* do not inherit from BaseAugment

* add augments

* use augments in classifier

* prob default to 1.0

* add comments

* use augments

* use augments

* assert that the sum of augmentation probabilities equals 1

* augmentation probabilities equal to 1

* calculate Identity prob

* replace xxx with self.xxx

* add comments

* sync with augments

* for BC-breaking

* delete useless comments in mixup.py
2021-06-20 09:44:51 +08:00
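The last few commits above describe how the merged `Augments` wrapper chooses between mixup, cutmix and identity: each augment declares a probability, the probabilities must not sum to more than 1, and the remainder is assigned to an identity pass-through. A sketch of that dispatch logic (the `prob` attribute and registry-based construction are assumptions):

```python
import numpy as np


class Augments:
    """Pick one batch augmentation per call according to its probability."""

    def __init__(self, augments):
        self.augments = list(augments)
        self.probs = [aug.prob for aug in self.augments]
        leftover = 1.0 - sum(self.probs)
        assert leftover >= 0, \
            'The sum of augmentation probabilities must not exceed 1.'
        if leftover > 0:
            # Identity augment: keep the batch unchanged with the leftover
            # probability, so the probabilities sum to exactly 1.
            self.augments.append(None)
            self.probs.append(leftover)

    def __call__(self, imgs, labels):
        idx = np.random.choice(len(self.augments), p=self.probs)
        aug = self.augments[idx]
        if aug is None:
            return imgs, labels
        return aug(imgs, labels)
```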
Miao Zheng 4ca21c7d03
[WIP] Refactoring weights initialization (#270)
* [WIP] Refactoring weights initialization

* fix lint and constant init cfg

* fix pretrained bug

* fix typo

* fix isort

* revise model utils
2021-06-10 10:54:34 +08:00
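After this refactor, weight initialization is driven by mmcv-style `init_cfg` entries on `BaseModule` subclasses instead of ad-hoc `init_weights` code. A hedged illustration of what such configs can look like (the layer names and values are examples, not this repository's defaults):

```python
# Layer-wise initializers attached to a module via init_cfg.
backbone_init_cfg = [
    dict(type='Kaiming', layer='Conv2d'),
    dict(type='Constant', layer=['_BatchNorm', 'GroupNorm'], val=1),
]

# Loading pretrained weights also becomes an init_cfg entry rather than a
# separate `pretrained=` argument (the checkpoint path is a placeholder).
pretrained_init_cfg = dict(type='Pretrained', checkpoint='path/to/weights.pth')
```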
whcao affb39fe07
[Feature] Add ViT (#214)
* add imagenet bs 4096

* add vit_base_patch16_224_finetune

* add vit_base_patch16_224_pretrain

* add vit_base_patch16_384_finetune

* add vit_base_patch16_384_finetune

* add vit_b_p16_224_finetune_imagenet

* add vit_b_p16_224_pretrain_imagenet

* add vit_b_p16_384_finetune_imagenet

* add vit

* add vit

* add vit head

* vit unittest

* keep up with ClsHead

* test vit

* add flag to determine whether to calculate acc during training

* Changes related to mmcv1.3.0

* change checkpoint saving interval to 10

* add label smooth

* restore default_runtime.py

* docformatter

* docformatter

* delete 2 lines of comments

* delete configs/_base_/schedules/imagenet_bs4096.py

* add configs/_base_/schedules/imagenet_bs2048_AdamW.py

* rename imagenet_bs4096.py to imagenet_bs2048_AdamW.py

* add helpers.py

* test vit hybrid backbone

* fix HybridEmbed

* use to_2tuple instead
2021-04-16 19:22:41 +08:00
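The `to_2tuple` helper adopted at the end of this commit is a small utility that lets values such as `img_size` and `patch_size` be given either as a single int or as an explicit (height, width) pair. A sketch under that assumption:

```python
import collections.abc
from itertools import repeat


def to_2tuple(x):
    """Expand a scalar into an (x, x) pair; pass real iterables through."""
    if isinstance(x, collections.abc.Iterable) and not isinstance(x, str):
        return tuple(x)
    return tuple(repeat(x, 2))


assert to_2tuple(224) == (224, 224)
assert to_2tuple((224, 384)) == (224, 384)
```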
LXXXXR 7d618e6606
[Fix] Fix version (#209)
* fix version

* add projects in openmmlab

* minor fix

* empty

* add mmocr

* empty

* empty

* fix linting
2021-04-16 19:07:17 +08:00
whcao 1cde6f6e65
[Feature] Add cutmix option (#198)
* Add cutmix option

* fix code style

* add some annotations

* add annotation about custom_hooks

* check constraint of alpha > 0

* add test cutmix

* fix code style

* add cutmix to configs/models

* add cutmix to configs/resnet

* flake8

* empty
2021-04-14 21:27:42 +08:00
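CutMix pastes a random rectangle from one image in the batch onto another and mixes the labels by the pasted area; the `alpha > 0` constraint checked above is required because the mixing ratio is drawn from a Beta(alpha, alpha) distribution. A compact sketch (function names are illustrative, not the repository's API):

```python
import numpy as np
import torch


def rand_bbox(img_shape, lam):
    """Sample a box covering roughly (1 - lam) of the image area."""
    H, W = img_shape[-2:]
    cut_ratio = np.sqrt(1.0 - lam)
    cut_h, cut_w = int(H * cut_ratio), int(W * cut_ratio)
    cy, cx = np.random.randint(H), np.random.randint(W)
    y1, y2 = np.clip(cy - cut_h // 2, 0, H), np.clip(cy + cut_h // 2, 0, H)
    x1, x2 = np.clip(cx - cut_w // 2, 0, W), np.clip(cx + cut_w // 2, 0, W)
    return y1, y2, x1, x2


def cutmix(imgs: torch.Tensor, alpha: float = 1.0):
    assert alpha > 0, 'alpha must be positive to draw from Beta(alpha, alpha)'
    lam = float(np.random.beta(alpha, alpha))
    index = torch.randperm(imgs.size(0))
    y1, y2, x1, x2 = rand_bbox(imgs.shape, lam)
    imgs[:, :, y1:y2, x1:x2] = imgs[index, :, y1:y2, x1:x2]
    # Re-compute lam from the actual pasted area after clipping.
    lam = 1 - (y2 - y1) * (x2 - x1) / float(imgs.size(-1) * imgs.size(-2))
    return imgs, index, lam
```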
mzr1996 b7b520881f
Update CONTRIBUTING.md according to mmcv (#210)
* Update CONTRIBUTING.md according to mmcv

* Docstring formatting by docformatter

* Update openmmlab website.
2021-04-14 21:22:37 +08:00
ftbabi bdd6b01ae7
[Feature] Add "mixup" from Bag of Tricks (#160)
* Add mixup option

* Modify the structure of mixup and add configs

* Clean configs

* Add test for mixup and SoftCrossEntropyLoss

* Add simple test for ImageClassifier

* Fix bug in test_losses.py

* Add assertion in CrossEntropyLoss
2021-02-25 14:06:58 +08:00
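The "mixup" trick blends pairs of images and their one-hot labels with a Beta-distributed weight, which is why the commit also adds a SoftCrossEntropyLoss to consume the resulting soft targets. A minimal sketch (the repository wraps this logic in a batch-level layer rather than a free function):

```python
import numpy as np
import torch
import torch.nn.functional as F


def batch_mixup(imgs: torch.Tensor, labels: torch.Tensor,
                num_classes: int, alpha: float = 0.2):
    """Blend each image/label with a randomly chosen partner in the batch."""
    lam = float(np.random.beta(alpha, alpha))
    one_hot = F.one_hot(labels, num_classes).float()
    index = torch.randperm(imgs.size(0))
    mixed_imgs = lam * imgs + (1 - lam) * imgs[index]
    mixed_labels = lam * one_hot + (1 - lam) * one_hot[index]
    return mixed_imgs, mixed_labels
```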
LXXXXR 63f38988eb
[Fix] Fix optional issues in docstring (#138)
* fix optional issue in docstring

* revised according to comments

* add optional
2021-01-14 11:09:08 +08:00
louzan 03b75789c6 Dev mobilenetv3 2020-06-30 15:50:36 +08:00
chenkai 02e11cc1f3 Refactoring for ResNet family 2020-06-25 11:57:50 +08:00
yangmingmin f729a60f87 Dev se resnet 2020-06-17 14:20:20 +08:00
lixiaojie 29975930f9 Dev/backbone utils 2020-06-15 16:42:15 +08:00