XiaotongLu
0b24276158
[Feature] Add DMCP and fix the deploy pipeline of NAS algorithms ( #406 )
* Copybook
* Newly created copy PR
* Newly created copy PR
* update op_counters
* update subnet/commit/FLOPsCounter
* update docs/UT
* update docs/UT
* add setter for current_mask
* replace current_mask with activated_tensor_channel
* update subnet training
* fix ci
* fix ci
* fix ci
* fix readme.md
* fix readme.md
* update
* fix expression
* fix CI
* fix UT
* fix ci
* fix arch YAMLs
* fix yapf
* revise mmcv version<=2.0.0rc3
* fix build.yaml
* Rollback mmdet to v3.0.0rc5
* Rollback mmdet to v3.0.0rc5
* Rollback mmseg to v1.0.0rc4
* remove search_groups in mutator
* revert env change
* update usage of sub_model
* fix UT
* fix bignas config
* fix UT for dcff & registry
* update UT & channel_mutator
* fix test_channel_mutator
* fix UT
* fix bug for load dcffnet
* update nas config
* update nas config
* fix api in evolution_search_loop
* update evolu_search_loop
* fix metric_predictor
* update url
* fix a0 fine_grained
* fix missing key in subnet export
* fix ofa yaml
* fix lint
* fix comments
* add autoformer cfg
* update readme
* update supernet link
* fix sub_model configs
* update subnet inference readme
* fix lint
* fix lint
* Update autoformer_subnet_8xb256_in1k.py
* update test.py to support args.checkpoint as none
* update DARTS readme
* update readme
---------
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: aptsunny <aptsunny@tongji.edu.cn>
Co-authored-by: sunyue1 <sunyue1@sensetime.com>
Co-authored-by: aptsunny <36404164+aptsunny@users.noreply.github.com>
Co-authored-by: wang shiguang <xiaohu_wyyx@163.com>
2023-03-02 18:22:20 +08:00
LKJacky
7acc046678
Add GroupFisher pruning algorithm. ( #459 )
* init
* support expand dwconv
* add tools
* init
* add import
* add configs
* add ut and fix bug
* update
* update finetune config
* update impl imports
* add deploy configs and result
* add _train_step
* delta_type -> normalization_type
* change img link
* add prune to config
* add json dump when GroupFisherSubModel init
* update prune config
* update finetune config
* update deploy config
* update prune config
* update readme
* mutable_cfg -> fix_subnet
* update readme
* impl -> implementations
* update script.sh
* rm gen_fake_cfg
* add Implementation to readme
* update docstring
* add finetune_lr to config
* update readme
* fix error in config
* update links
* update configs
* refine
* fix spell error
* add test to readme
* update README
* update readme
* update readme
* update cite format
* fix for ci
* update to pass ci
* update readme
---------
Co-authored-by: liukai <your_email@abc.example>
Co-authored-by: Your Name <you@example.com>
2023-02-20 14:29:42 +08:00
Yang Gao
f6d68dc73c
[Fix] Fix commands in README to adapt to branch 1.x ( #400 )
* update commands in README for 1.x
* fix commands
Co-authored-by: gaoyang07 <1546308416@qq.com>
2022-12-16 20:54:21 +08:00
Yang Gao
42e8de73af
[Improvement] Adapt OFA series with SearchableMobileNetV3 ( #385 )
* fix mutable bug in AttentiveMobileNetV3
* remove unnecessary code
* update ATTENTIVE_SUBNET_A0-A6.yaml with optimized names
* unify the sampling usage in sandwich_rule-based NAS
* use alias to export subnet
* update OFA configs
* fix attr bug
* fix comments
* update convert_supernet2subnet.py
* correct the way to dump DerivedMutable
* fix convert index bug
* update OFA configs & models
* fix dynamic2static
* generalize convert_ofa_ckpt.py
* update input_resizer
* update README.md
* fix ut
* update export_fix_subnet
* update _dynamic_to_static
* update fix_subnet UT & minor fix bugs
* fix ut
* add new autoaug compared to attentivenas
* clean
* fix act
* fix act_cfg
* update fix_subnet
* fix lint
* add docstring
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: aptsunny <aptsunny@tongji.edu.cn>
2022-12-15 22:19:55 +08:00
LKJacky
f886821ba1
Add get_prune_config and a demo config_pruning ( #389 )
* update tools and test
* add demo
* disable test doc
* add switch for test tools and test_doc
* fix bug
* update doc
* update tools name
* mv get_channel_units
Co-authored-by: liukai <your_email@abc.example>
2022-12-13 10:56:29 +08:00
Xianpan Zhou
c6168cb02a
[Feature] Add tools to convert distill ckpt to student-only ckpt. ( #381 )
* [Feature] Add tools to convert distill ckpt to student-only ckpt.
* fix bug.
* add --model-only to only save model.
* Make changes according to PR review.
2022-12-08 15:32:36 +08:00
qiufeng
b0b3fbdb49
[Feature] Add BigNAS algorithm ( #219 )
* add calibrate-bn-statistics
* add test calibrate-bn-statistics
* fix mixins
* fix mixins
* fix mixin tests
* remove slimmable channel mutable and refactor dynamic op
* refactor dynamic batch norm
* add progressive dynamic conv2d
* add center crop dynamic conv2d
* refactor dynamic directory
* refactor dynamic sequential
* rename length to depth in dynamic sequential
* add test for derived mutable
* refactor dynamic op
* refactor api of dynamic op
* add derive mutable mixin
* add bignas algorithm
* refactor bignas structure
* add input resizer
* add input resizer to bignas
* move input resizer from algorithm into classifier
* remove components
* add attentive mobilenet
* delete json file
* nearly align inference accuracy with gml (gap less than 0.2)
* move mutate separated in bignas mobilenet backbone
* add zero_init_residual
* add set_dropout
* set dropout in bignas algorithm
* fix registry
* add subnet yaml and nearly align inference accuracy with gml
* add rsb config for bignas
* remove base in config
* add gml bignas config
* convert to iter based
* bignas forward and backward fly
* fix merge conflict
* fix dynamicseq bug
* fix bug and refactor bignas
* arrange configs of bignas
* fix typo
* refactor attentive_mobilenet
* fix channel mismatch due to registration of DerivedMutable
* update bignas & fix se channel mismatch
* add AutoAugmentV2 & remove unnecessary configs
* fix lint
* recover channel assertion in channel unit
* fix a group bug
* fix comments
* add docstring
* add norm in dynamic_embed
* fix search loop & other minor changes
* fix se expansion
* minor change
* add ut for bignas & attentive_mobilenet
* fix ut
* update bignas readme
* rm unnecessary UT & supplement get_placeholder
* fix lint
* fix ut
* add subnet deployment in downstream tasks.
* minor change
* update ofa backbone
* minor fix
* Continued improvements of searchable backbone
* minor change
* drop ratio in backbone
* fix comments
* fix ci test
* fix test
* add dynamic shortcut UT
* modify strategy to fit bignas
* fix test
* fix bug in neck
* fix error
* fix error
* fix yaml
* save subnet ckpt
* merge autoslim_val/test_loop into subnet_val_loop
* move calibrate_bn_mixin to utils
* fix bugs and add docstring
* clean code
* fix register bug
* clean code
* update
Co-authored-by: wangshiguang <wangshiguang@sensetime.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: aptsunny <aptsunny@tongji.edu.cn>
Co-authored-by: sunyue1 <sunyue1@sensetime.com>
2022-12-07 11:28:10 +08:00
Xianpan Zhou
8fe54c9f64
[Fix] Fix bug in mmrazor visualization: mismatched argument between definition and use. ( #356 )
fix bug in mmrazor visualization: mismatched argument between definition and use.
Co-authored-by: Xianpan Zhou <32625100+PanDaMeow@users.noreply.github.com>
2022-12-01 22:38:39 +08:00
whcao
1e8f886523
[Feature] Feature map visualization ( #293 )
* WIP: vis
* WIP: add visualization
* WIP: add visualization hook
* WIP: support razor visualizer
* WIP
* WIP: wrap draw_featmap
* support feature map visualization
* add a demo image for visualization
* fix typos
* change eps to 1e-6
* add pytest for visualization
* fix vis hook
* fix arguments' name
* fix img path
* support draw inference results
* add visualization doc
* fix figure url
* move files
Co-authored-by: weihan cao <HIT-cwh>
2022-10-26 13:26:20 +08:00
PJDong
dd51ab8ca0
[Feature] Support unroll with MMDDP in darts algorithm ( #210 )
* support unroll in darts
* fix bugs in optimizer; add docstring
* update darts algorithm [untested]
* modify autograd.grad to optim_wrapper.backward
* add amp in train.py; support constructor
* rename mmcls.data to mmcls.structures
* modify darts algo to support apex [not done]
* fix code spell in diff_mutable_module
* modify optim_context of dartsddp
* add testcase for dartsddp
* fix bugs of apex in dartsddp
* standardized the unittest of darts
* adapt new data_preprocessor
* fix ut bugs
* remove unnecessary code
Co-authored-by: gaoyang07 <1546308416@qq.com>
2022-10-14 17:41:11 +08:00
LKJacky
b4b7e2432a
merge pruning into dev-1.x ( #312 )
* add ChannelGroup (#250 )
* rebase new dev-1.x
* modification for adding config_template
* add docstring to channel_group.py
* add docstring to mutable_channel_group.py
* rm channel_group_cfg from Graph2ChannelGroups
* change choice type of SequentialChannelGroup from float to int
* add a warning about group-wise conv
* restore __init__ of dynamic op
* in_channel_mutable -> mutable_in_channel
* rm abstractproperty
* add a comment about VT
* rm registry for ChannelGroup
* MUTABLECHANNELGROUP -> ChannelGroupType
* refine docstring of IndexDict
* update docstring
* update docstring
* is_prunable -> is_mutable
* update docstring
* fix error in pre-commit
* update unittest
* add return type
* unify init_xxx api
* add unittest about init of MutableChannelGroup
* update according to reviews
* sequential_channel_group -> sequential_mutable_channel_group
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Add BaseChannelMutator and refactor Autoslim (#289 )
* add BaseChannelMutator
* add autoslim
* tmp
* make SequentialMutableChannelGroup accept both num and ratio as choice, and support divisor
* update OneShotMutableChannelGroup
* pass supernet training of autoslim
* refine autoslim
* fix bug in OneShotMutableChannelGroup
* refactor make_divisible
* fix spell error: channl -> channel
* init_using_backward_tracer -> init_from_backward_tracer
init_using_fx_tracer -> init_from_fx_tracer
* refine SequentialMutableChannelGroup
* let mutator support models with dynamicop
* support define search space in model
* tracer_cfg -> parse_cfg
* refine
* using -> from
* update docstring
* update docstring
Co-authored-by: liukai <liukai@pjlab.org.cn>
* refactor slimmable and add l1-norm (#291 )
* refactor slimmable and add l1-norm
* make l1-norm support convnd
* update get_channel_groups
* add l1-norm_resnet34_8xb32_in1k.py
* add pretrained to resnet34-l1
* remove old channel mutator
* BaseChannelMutator -> ChannelMutator
* update according to reviews
* add readme to l1-norm
* MBV2_slimmable -> MBV2_slimmable_config
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Clean old codes. (#296 )
* remove old dynamic ops
* move dynamic ops
* clean old mutable_channels
* rm OneShotMutableChannel
* rm MutableChannel
* refine
* refine
* use SquentialMutableChannel to replace OneshotMutableChannel
* refactor dynamicops folder
* let SquentialMutableChannel support float
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Add channel-flow (#301 )
* base_channel_mutator -> channel_mutator
* init
* update docstring
* allow omitting redundant configs for channel
* add register_mutable_channel_to_a_module to MutableChannelContainer
* update according to reviews 1
* update according to reviews 2
* update according to reviews 3
* remove old docstring
* fix error
* using->from
* update according to reviews
* support self-defined input channel number
* update docstring
* chanenl -> channel_elem
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: jacky <jacky@xx.com>
* Rename: ChannelGroup -> ChannelUnit (#302 )
* refine repr of MutableChannelGroup
* rename folder name
* ChannelGroup -> ChannelUnit
* filename in units folder
* channel_group -> channel_unit
* groups -> units
* group -> unit
* update
* get_mutable_channel_groups -> get_mutable_channel_units
* fix bug
* refine docstring
* fix ci
* fix bug in tracer
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Merge dev-1.x to pruning (#311 )
* [feature] CONTRASTIVE REPRESENTATION DISTILLATION with dataset wrapper (#281 )
* init
* TD: CRDLoss
* complete UT
* fix docstrings
* fix ci
* update
* fix CI
* DONE
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* add UT: CRD_ClsDataset
* init
* TODO: UT test formatting.
* init
* crd dataset wrapper
* update docstring
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
* [Improvement] Update estimator with api revision (#277 )
* update estimator usage and fix bugs
* refactor api of estimator & add inner check methods
* fix docstrings
* update search loop and config
* fix lint
* update unittest
* decouple mmdet dependency and fix lint
Co-authored-by: humu789 <humu@pjlab.org.cn>
* [Fix] Fix tracer (#273 )
* test image_classifier_loss_calculator
* fix backward tracer
* update SingleStageDetectorPseudoLoss
* merge
* [Feature] Add Dsnas Algorithm (#226 )
* [tmp] Update Dsnas
* [tmp] refactor arch_loss & flops_loss
* Update Dsnas & MMRAZOR_EVALUATOR:
1. finalized compute_loss & handle_grads in algorithm;
2. add MMRAZOR_EVALUATOR;
3. fix bugs.
* Update lr scheduler & fix a bug:
1. update param_scheduler & lr_scheduler for dsnas;
2. fix a bug of switching to finetune stage.
* remove old evaluators
* remove old evaluators
* update param_scheduler config
* merge dev-1.x into gy/estimator
* add flops_loss in Dsnas using ResourcesEstimator
* get resources before mutator.prepare_from_supernet
* delete unnecessary broadcast api from gml
* broadcast spec_modules_resources when estimating
* update early fix mechanism for Dsnas
* fix merge
* update units in estimator
* minor change
* fix data_preprocessor api
* add flops_loss_coef
* remove DsnasOptimWrapper
* fix bn eps and data_preprocessor
* fix bn weight decay bug
* add betas for mutator optimizer
* set diff_rank_seed=True for dsnas
* fix start_factor of lr when warm up
* remove .module in non-ddp mode
* add GlobalAveragePoolingWithDropout
* add UT for dsnas
* remove unnecessary channel adjustment for shufflenetv2
* update supernet configs
* delete unnecessary dropout
* delete unnecessary part with minor change on dsnas
* minor change on the flag of search stage
* update README and subnet configs
* add UT for OneHotMutableOP
* [Feature] Update train (#279 )
* support auto resume
* add enable auto_scale_lr in train.py
* support '--amp' option
* [Fix] Fix darts metafile (#278 )
fix darts metafile
* fix ci (#284 )
* fix ci for circle ci
* fix bug in test_metafiles
* add pr_stage_test for github ci
* add multiple version
* fix ut
* fix lint
* Temporarily skip dataset UT
* update github ci
* add github lint ci
* install wheel
* remove timm from requirements
* install wheel when test on windows
* fix error
* fix bug
* remove github windows ci
* fix device error of arch_params when DsnasDDP
* fix CRD dataset ut
* fix scope error
* rm test_cuda in workflows of github
* [Doc] fix typos in en/usr_guides
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
Co-authored-by: P.Huang <37200926+FreakieHuang@users.noreply.github.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: Yang Gao <Gary1546308416AL@gmail.com>
Co-authored-by: humu789 <humu@pjlab.org.cn>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
* Refine pruning branch (#307 )
* [feature] CONTRASTIVE REPRESENTATION DISTILLATION with dataset wrapper (#281 )
* init
* TD: CRDLoss
* complete UT
* fix docstrings
* fix ci
* update
* fix CI
* DONE
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* add UT: CRD_ClsDataset
* init
* TODO: UT test formatting.
* init
* crd dataset wrapper
* update docstring
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
* [Improvement] Update estimator with api revision (#277 )
* update estimator usage and fix bugs
* refactor api of estimator & add inner check methods
* fix docstrings
* update search loop and config
* fix lint
* update unittest
* decouple mmdet dependency and fix lint
Co-authored-by: humu789 <humu@pjlab.org.cn>
* [Fix] Fix tracer (#273 )
* test image_classifier_loss_calculator
* fix backward tracer
* update SingleStageDetectorPseudoLoss
* merge
* [Feature] Add Dsnas Algorithm (#226 )
* [tmp] Update Dsnas
* [tmp] refactor arch_loss & flops_loss
* Update Dsnas & MMRAZOR_EVALUATOR:
1. finalized compute_loss & handle_grads in algorithm;
2. add MMRAZOR_EVALUATOR;
3. fix bugs.
* Update lr scheduler & fix a bug:
1. update param_scheduler & lr_scheduler for dsnas;
2. fix a bug of switching to finetune stage.
* remove old evaluators
* remove old evaluators
* update param_scheduler config
* merge dev-1.x into gy/estimator
* add flops_loss in Dsnas using ResourcesEstimator
* get resources before mutator.prepare_from_supernet
* delete unnecessary broadcast api from gml
* broadcast spec_modules_resources when estimating
* update early fix mechanism for Dsnas
* fix merge
* update units in estimator
* minor change
* fix data_preprocessor api
* add flops_loss_coef
* remove DsnasOptimWrapper
* fix bn eps and data_preprocessor
* fix bn weight decay bug
* add betas for mutator optimizer
* set diff_rank_seed=True for dsnas
* fix start_factor of lr when warm up
* remove .module in non-ddp mode
* add GlobalAveragePoolingWithDropout
* add UT for dsnas
* remove unnecessary channel adjustment for shufflenetv2
* update supernet configs
* delete unnecessary dropout
* delete unnecessary part with minor change on dsnas
* minor change on the flag of search stage
* update README and subnet configs
* add UT for OneHotMutableOP
* [Feature] Update train (#279 )
* support auto resume
* add enable auto_scale_lr in train.py
* support '--amp' option
* [Fix] Fix darts metafile (#278 )
fix darts metafile
* fix ci (#284 )
* fix ci for circle ci
* fix bug in test_metafiles
* add pr_stage_test for github ci
* add multiple version
* fix ut
* fix lint
* Temporarily skip dataset UT
* update github ci
* add github lint ci
* install wheel
* remove timm from requirements
* install wheel when test on windows
* fix error
* fix bug
* remove github windows ci
* fix device error of arch_params when DsnasDDP
* fix CRD dataset ut
* fix scope error
* rm test_cuda in workflows of github
* [Doc] fix typos in en/usr_guides
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
* fix bug when python=3.6
* fix lint
* fix bug when test using cpu only
* refine ci
* fix error in ci
* try ci
* update repr of Channel
* fix error
* mv init_from_predefined_model to MutableChannelUnit
* move tests
* update SquentialMutableChannel
* update l1 mutable channel unit
* add OneShotMutableChannel
* candidate_mode -> choice_mode
* update docstring
* change ci
Co-authored-by: P.Huang <37200926+FreakieHuang@users.noreply.github.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: Yang Gao <Gary1546308416AL@gmail.com>
Co-authored-by: humu789 <humu@pjlab.org.cn>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: jacky <jacky@xx.com>
Co-authored-by: P.Huang <37200926+FreakieHuang@users.noreply.github.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: Yang Gao <Gary1546308416AL@gmail.com>
Co-authored-by: humu789 <humu@pjlab.org.cn>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
2022-10-10 17:30:25 +08:00
whcao
ef39c51bb9
[Feature] Update train ( #279 )
* support auto resume
* add enable auto_scale_lr in train.py
* support '--amp' option
2022-10-08 10:31:04 +08:00
pppppM
8a249fd98d
[CI] Add circle ci ( #257 )
* copy .circleci from mmdet
* adapt mmrazor
* change the min docstring coverage
* fix typos
* update publish model script
* update circle ci config
2022-08-30 20:20:10 +08:00
wang shiguang
ba71abf357
[fix] fix mmcv mmengine ( #242 )
* align_with_mmcv_and_mmengine
* fix_mmcv.fileio
2022-08-24 09:58:11 +08:00
qiufeng
6987511e6b
[Fix] Fix mmcls import error ( #206 )
* fix mmcls import error
* fix __init__.py
2022-07-26 15:33:11 +08:00
pppppM
ae205ac0c6
Refactor darts ( #204 )
* add separate optim wrapper
* refactor darts related modules
* refactor darts algorithm
* fix some bugs
* update darts related modules
* update unittest
* update darts configs
2022-07-25 09:52:39 +08:00
wutongshenqiu
c6a2d482fd
refactor autoslim config
2022-07-15 23:05:10 +08:00
pppppM
5bf1eca4e4
Add benchmark tools & Reorganize configs
2022-07-15 23:05:10 +08:00
qiufeng
5ddfed5040
[Feature] Add slimmable algorithm
2022-07-15 23:05:07 +08:00
PJDong
6c920c88ee
Align SPOS and DetNAS to MMRazor2.0
2022-07-15 23:04:38 +08:00
PJDong
332f49ac6f
Support SubnetMixin and add Razor Registry Build Function
2022-07-15 23:04:13 +08:00
pppppM
dee5352f92
[CI] Add mypy and mdformat
2022-07-15 23:04:10 +08:00
pppppM
590bfa448c
[Refactor] Refactor SPOS & DetNAS interface
2022-07-15 23:02:57 +08:00
pppppM
baa8c8614e
[Deprecated] Clean up code that will be deprecated in OpenMMLab 2.0
2022-07-15 23:02:37 +08:00
pppppM
2dad24044d
Bump version to 0.3.1 ( #155 )
* [Enhance] Add extra dataloader settings in configs (#141 )
* [Docs] fix md link failure in docs (#142 )
* [Docs] update Cream readme
* delete 'readme.md' in model_zoo.md
* fix md link failure in docs
* [Docs] add myst_parser to extensions in conf.py
* [Docs] delete the deprecated recommonmark
* [Docs] delete recommandmark from conf.py
* [Docs] fix md link failure and lint failure
* [Fix] Fix seed error in mmseg/train_seg.py and typos in train.md (#152 )
* [Docs] update Cream readme
* delete 'readme.md' in model_zoo.md
* fix cwd docs and fix seed in #151
* delete readme of cream
* [Enhancement] Support broadcast_object_list on multiple machines & support Searcher running on a single GPU (#153 )
* broadcast_object_list support multi-machines
* add userwarning
* [Fix] Fix configs (#149 )
* fix configs
* fix spos configs
* fix readme
* replace the official mutable_cfg with the mutable_cfg searched by ourselves
* update https prefix
Co-authored-by: pppppM <gjf_mail@126.com>
* [BUG] Support pruning models containing GroupNorm or InstanceNorm. (#144 )
* support GN and IN
* test pruner
* limit pytorch version
* fix pytest
* throw an error when tracing groupnorm with torch version under 1.6.0
Co-authored-by: caoweihan <caoweihan@sensetime.com>
* Bump version to 0.3.1
Co-authored-by: qiufeng <44188071+wutongshenqiu@users.noreply.github.com>
Co-authored-by: PJDong <1115957667@qq.com>
Co-authored-by: humu789 <88702197+humu789@users.noreply.github.com>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: caoweihan <caoweihan@sensetime.com>
2022-05-05 01:02:45 +08:00
pppppM
49f1bee45b
Bump version to v0.3.0 ( #135 )
* [Feature] Add function to meet mmdeploy support (#102 )
* add init_model function for mmdeploy
* fix lint
* add unittest for init_xxx_model
* fix lint
* mv test_inference.py to test_apis directory
* [Feature] Add function to meet mmdeploy support (#102 )
* add init_model function for mmdeploy
* fix lint
* add unittest for init_xxx_model
* fix lint
* mv test_inference.py to test_apis directory
* [Refactor] Delete redundant `set_random_seed` function (#104 )
* refactor set_random_seed
* add unittests
* fix unittests error
* fix lint
* avoid bc breaking
* [Feature] Add diff seeds to diff ranks and set torch seed in worker_init_fn (#113 )
* add init_random_seed
* Set diff seed to diff workers
* [Feature] Add multi machine dist_train (#114 )
* support multi nodes
* update training doc
* fix lints
* remove fixed seed
* fix ddp wrapper registry (#128 )
* [Docs] Add brief installation steps in README(_zh-CN).md (#121 )
* Add brief installation
* add brief installation ref to mmediting pr#816
Co-authored-by: caoweihan <caoweihan@sensetime.com>
* [BUG]Fix bugs in pruner (#126 )
* fix bugs in pruner when pruning models with shared modules
* pruner can trace models with dilation conv2d
* fix deploy_subnet
* fix add_pruning_attrs
* fix bugs in modify_forward
* fix lint
* fix StructurePruner
* test tracing models with shared modules
Co-authored-by: caoweihan <caoweihan@sensetime.com>
* [Docs]Add some more details to docs (#133 )
* add docs for dataset
* add cfg-options for distillation
* fix docs
Co-authored-by: caoweihan <caoweihan@sensetime.com>
* reset norm running status after prepare_from_supernet (#81 )
* [Improvement] Sync train api (#115 )
Co-authored-by: caoweihan <caoweihan@sensetime.com>
* [Feature] Support Relational Knowledge Distillation (#127 )
* add rkd
* add rkd pytest
* add rkd configs
* fix readme
* fix rkd
* split rkd loss to distance-wise and angle-wise losses
* rename rkd losses
* add rkd metafile
* add rkd related links
* rename rkd metafile and add to model index
* delete cifar100
Co-authored-by: caoweihan <caoweihan@sensetime.com>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: qiufeng <44188071+wutongshenqiu@users.noreply.github.com>
Co-authored-by: wutongshenqiu <690364065@qq.com>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: caoweihan <caoweihan@sensetime.com>
2022-04-02 19:30:50 +08:00
whcao
81e0e3452a
[Feature] Resume from the latest checkpoint automatically. ( #61 )
* support auto-resume
* support auto-resume
* support auto-resume
* support auto-resume
Co-authored-by: pppppM <67539920+pppppM@users.noreply.github.com>
2022-03-08 11:25:19 +08:00
qiufeng
91415b92a5
[Enhancement] Add distributed scripts ( #105 )
* add dist scripts
* add PYTHONPATH
2022-03-07 22:07:13 +08:00
qiufeng
3b6423d39b
fix slurm search shell scripts ( #90 )
2022-02-17 14:51:57 +08:00
qiufeng
64ccbc03fd
[Enhance] Add setup multi-processes for all tasks ( #59 )
* Add setup-multi-processes for all tasks
* Add setup-multi-processes for all tasks
* Add test for setup-multi-processes
2022-01-26 19:17:55 +08:00
qiufeng
8b6c084e40
[Fix] Fix bug in non-distributed training/testing for all tasks ( #63 )
* Fix bug in non-distributed training/testing for all tasks
* Fix add warning infos
2022-01-26 19:16:29 +08:00
qiufeng
9596379bb3
[Fix] Fix `show_result` error during test ( #53 )
* move from algorithm to model
* rename model to algorithm
2022-01-19 19:32:37 +08:00
pppppM
cb5cb6da05
Base Framework
2021-12-23 03:09:46 +08:00