LKJacky
c8e14e5489
fix bug in placeholder (#395)
* fix bug in placeholder
* remove redundant comment
Co-authored-by: liukai <your_email@abc.example>
2022-12-13 10:56:13 +08:00
LKJacky
1c03a07350
Enhance the Abilities of the Tracer for Pruning. (#371)
* tmp
* add new mmdet models
* add docstring
* pass test and pre-commit
* rm razor tracer
* update fx tracer; it can now automatically wrap methods and functions.
* update tracer passed models
* add warning for torch <1.12.0
fix bug for python3.6
update placeholder to support placeholder.XXX
* fix bug
* update docs
* fix lint
* fix parse_cfg in configs
* restore mutablechannel
* test iterative prune algorithm when using dist
* add get_model_from_path to MMModelLibrary
* add mm models to DefaultModelLibrary
* add uts
* fix bug
* fix bug
* add uts
* add uts
* add uts
* add uts
* fix bug
* restore ite_prune_algorithm
* update doc
* PruneTracer -> ChannelAnalyzer
* prune_tracer -> channel_analyzer
* add test for fxtracer
* fix bug
* fix bug
* PruneTracer -> ChannelAnalyzer
refine
* CustomFxTracer -> MMFxTracer
* fix bug when test with torch<1.12
* update print log
* fix lint
* rm useless code
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: jacky <jacky@xx.com>
Co-authored-by: Your Name <you@example.com>
Co-authored-by: liukai <your_email@abc.example>
2022-12-08 15:59:27 +08:00
LKJacky
b4b7e2432a
merge pruning into dev-1.x (#312)
* add ChannelGroup (#250 )
* rebase new dev-1.x
* modification for adding config_template
* add docstring to channel_group.py
* add docstring to mutable_channel_group.py
* rm channel_group_cfg from Graph2ChannelGroups
* change choice type of SequentialChannelGroup from float to int
* add a warning about group-wise conv
* restore __init__ of dynamic op
* in_channel_mutable -> mutable_in_channel
* rm abstractproperty
* add a comment about VT
* rm registry for ChannelGroup
* MUTABLECHANNELGROUP -> ChannelGroupType
* refine docstring of IndexDict
* update docstring
* update docstring
* is_prunable -> is_mutable
* update docstring
* fix error in pre-commit
* update unittest
* add return type
* unify init_xxx api
* add unittest about init of MutableChannelGroup
* update according to reviews
* sequential_channel_group -> sequential_mutable_channel_group
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Add BaseChannelMutator and refactor Autoslim (#289 )
* add BaseChannelMutator
* add autoslim
* tmp
* make SequentialMutableChannelGroup accept both num and ratio as choice, and support divisor
* update OneShotMutableChannelGroup
* pass supernet training of autoslim
* refine autoslim
* fix bug in OneShotMutableChannelGroup
* refactor make_divisible
* fix spell error: channl -> channel
* init_using_backward_tracer -> init_from_backward_tracer
init_using_fx_tracer -> init_from_fx_tracer
* refine SequentialMutableChannelGroup
* let mutator support models with dynamicop
* support define search space in model
* tracer_cfg -> parse_cfg
* refine
* using -> from
* update docstring
* update docstring
Co-authored-by: liukai <liukai@pjlab.org.cn>
* refactor slimmable and add l1-norm (#291 )
* refactor slimmable and add l1-norm
* make l1-norm support convnd
* update get_channel_groups
* add l1-norm_resnet34_8xb32_in1k.py
* add pretrained to resnet34-l1
* remove old channel mutator
* BaseChannelMutator -> ChannelMutator
* update according to reviews
* add readme to l1-norm
* MBV2_slimmable -> MBV2_slimmable_config
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Clean old codes. (#296 )
* remove old dynamic ops
* move dynamic ops
* clean old mutable_channels
* rm OneShotMutableChannel
* rm MutableChannel
* refine
* refine
* use SquentialMutableChannel to replace OneShotMutableChannel
* refactor dynamicops folder
* let SquentialMutableChannel support float
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Add channel-flow (#301 )
* base_channel_mutator -> channel_mutator
* init
* update docstring
* allow omitting redundant configs for channel
* add register_mutable_channel_to_a_module to MutableChannelContainer
* update according to reviews 1
* update according to reviews 2
* update according to reviews 3
* remove old docstring
* fix error
* using->from
* update according to reviews
* support self-define input channel number
* update docstring
* chanenl -> channel_elem
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: jacky <jacky@xx.com>
* Rename: ChannelGroup -> ChannelUnit (#302 )
* refine repr of MutableChannelGroup
* rename folder name
* ChannelGroup -> ChannelUnit
* filename in units folder
* channel_group -> channel_unit
* groups -> units
* group -> unit
* update
* get_mutable_channel_groups -> get_mutable_channel_units
* fix bug
* refine docstring
* fix ci
* fix bug in tracer
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Merge dev-1.x to pruning (#311 )
* [feature] CONTRASTIVE REPRESENTATION DISTILLATION with dataset wrapper (#281 )
* init
* TD: CRDLoss
* complete UT
* fix docstrings
* fix ci
* update
* fix CI
* DONE
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* add UT: CRD_ClsDataset
* init
* TODO: UT test formatting.
* init
* crd dataset wrapper
* update docstring
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
* [Improvement] Update estimator with api revision (#277 )
* update estimator usage and fix bugs
* refactor api of estimator & add inner check methods
* fix docstrings
* update search loop and config
* fix lint
* update unittest
* decouple mmdet dependency and fix lint
Co-authored-by: humu789 <humu@pjlab.org.cn>
* [Fix] Fix tracer (#273 )
* test image_classifier_loss_calculator
* fix backward tracer
* update SingleStageDetectorPseudoLoss
* merge
* [Feature] Add Dsnas Algorithm (#226 )
* [tmp] Update Dsnas
* [tmp] refactor arch_loss & flops_loss
* Update Dsnas & MMRAZOR_EVALUATOR:
1. finalized compute_loss & handle_grads in algorithm;
2. add MMRAZOR_EVALUATOR;
3. fix bugs.
* Update lr scheduler & fix a bug:
1. update param_scheduler & lr_scheduler for dsnas;
2. fix a bug of switching to finetune stage.
* remove old evaluators
* remove old evaluators
* update param_scheduler config
* merge dev-1.x into gy/estimator
* add flops_loss in Dsnas using ResourcesEstimator
* get resources before mutator.prepare_from_supernet
* delete unnecessary broadcast api from gml
* broadcast spec_modules_resources when estimating
* update early fix mechanism for Dsnas
* fix merge
* update units in estimator
* minor change
* fix data_preprocessor api
* add flops_loss_coef
* remove DsnasOptimWrapper
* fix bn eps and data_preprocessor
* fix bn weight decay bug
* add betas for mutator optimizer
* set diff_rank_seed=True for dsnas
* fix start_factor of lr when warm up
* remove .module in non-ddp mode
* add GlobalAveragePoolingWithDropout
* add UT for dsnas
* remove unnecessary channel adjustment for shufflenetv2
* update supernet configs
* delete unnecessary dropout
* delete unnecessary part with minor change on dsnas
* minor change on the flag of search stage
* update README and subnet configs
* add UT for OneHotMutableOP
* [Feature] Update train (#279 )
* support auto resume
* add enable auto_scale_lr in train.py
* support '--amp' option
* [Fix] Fix darts metafile (#278 )
fix darts metafile
* fix ci (#284 )
* fix ci for circle ci
* fix bug in test_metafiles
* add pr_stage_test for github ci
* add multiple version
* fix ut
* fix lint
* Temporarily skip dataset UT
* update github ci
* add github lint ci
* install wheel
* remove timm from requirements
* install wheel when test on windows
* fix error
* fix bug
* remove github windows ci
* fix device error of arch_params when DsnasDDP
* fix CRD dataset ut
* fix scope error
* rm test_cuda in workflows of github
* [Doc] fix typos in en/usr_guides
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
Co-authored-by: P.Huang <37200926+FreakieHuang@users.noreply.github.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: Yang Gao <Gary1546308416AL@gmail.com>
Co-authored-by: humu789 <humu@pjlab.org.cn>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
* Refine pruning branch (#307 )
* [feature] CONTRASTIVE REPRESENTATION DISTILLATION with dataset wrapper (#281 )
* init
* TD: CRDLoss
* complete UT
* fix docstrings
* fix ci
* update
* fix CI
* DONE
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* add UT: CRD_ClsDataset
* init
* TODO: UT test formatting.
* init
* crd dataset wrapper
* update docstring
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
* [Improvement] Update estimator with api revision (#277 )
* update estimator usage and fix bugs
* refactor api of estimator & add inner check methods
* fix docstrings
* update search loop and config
* fix lint
* update unittest
* decouple mmdet dependency and fix lint
Co-authored-by: humu789 <humu@pjlab.org.cn>
* [Fix] Fix tracer (#273 )
* test image_classifier_loss_calculator
* fix backward tracer
* update SingleStageDetectorPseudoLoss
* merge
* [Feature] Add Dsnas Algorithm (#226 )
* [tmp] Update Dsnas
* [tmp] refactor arch_loss & flops_loss
* Update Dsnas & MMRAZOR_EVALUATOR:
1. finalized compute_loss & handle_grads in algorithm;
2. add MMRAZOR_EVALUATOR;
3. fix bugs.
* Update lr scheduler & fix a bug:
1. update param_scheduler & lr_scheduler for dsnas;
2. fix a bug of switching to finetune stage.
* remove old evaluators
* remove old evaluators
* update param_scheduler config
* merge dev-1.x into gy/estimator
* add flops_loss in Dsnas using ResourcesEstimator
* get resources before mutator.prepare_from_supernet
* delete unnecessary broadcast api from gml
* broadcast spec_modules_resources when estimating
* update early fix mechanism for Dsnas
* fix merge
* update units in estimator
* minor change
* fix data_preprocessor api
* add flops_loss_coef
* remove DsnasOptimWrapper
* fix bn eps and data_preprocessor
* fix bn weight decay bug
* add betas for mutator optimizer
* set diff_rank_seed=True for dsnas
* fix start_factor of lr when warm up
* remove .module in non-ddp mode
* add GlobalAveragePoolingWithDropout
* add UT for dsnas
* remove unnecessary channel adjustment for shufflenetv2
* update supernet configs
* delete unnecessary dropout
* delete unnecessary part with minor change on dsnas
* minor change on the flag of search stage
* update README and subnet configs
* add UT for OneHotMutableOP
* [Feature] Update train (#279 )
* support auto resume
* add enable auto_scale_lr in train.py
* support '--amp' option
* [Fix] Fix darts metafile (#278 )
fix darts metafile
* fix ci (#284 )
* fix ci for circle ci
* fix bug in test_metafiles
* add pr_stage_test for github ci
* add multiple version
* fix ut
* fix lint
* Temporarily skip dataset UT
* update github ci
* add github lint ci
* install wheel
* remove timm from requirements
* install wheel when test on windows
* fix error
* fix bug
* remove github windows ci
* fix device error of arch_params when DsnasDDP
* fix CRD dataset ut
* fix scope error
* rm test_cuda in workflows of github
* [Doc] fix typos in en/usr_guides
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
* fix bug when python=3.6
* fix lint
* fix bug when test using cpu only
* refine ci
* fix error in ci
* try ci
* update repr of Channel
* fix error
* mv init_from_predefined_model to MutableChannelUnit
* move tests
* update SquentialMutableChannel
* update l1 mutable channel unit
* add OneShotMutableChannel
* candidate_mode -> choice_mode
* update docstring
* change ci
Co-authored-by: P.Huang <37200926+FreakieHuang@users.noreply.github.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: Yang Gao <Gary1546308416AL@gmail.com>
Co-authored-by: humu789 <humu@pjlab.org.cn>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: jacky <jacky@xx.com>
Co-authored-by: P.Huang <37200926+FreakieHuang@users.noreply.github.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: Yang Gao <Gary1546308416AL@gmail.com>
Co-authored-by: humu789 <humu@pjlab.org.cn>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
2022-10-10 17:30:25 +08:00
pppppM
baa8c8614e
[Deprecated] Clean up code that will be deprecated in OpenMMLab 2.0
2022-07-15 23:02:37 +08:00
whcao
81e0e3452a
[Feature] Resume from the latest checkpoint automatically. (#61)
* support auto-resume
* support auto-resume
* support auto-resume
* support auto-resume
Co-authored-by: pppppM <67539920+pppppM@users.noreply.github.com>
2022-03-08 11:25:19 +08:00
qiufeng
64ccbc03fd
[Enhance] Add setup multi-processes for all tasks (#59)
* Add setup-multi-processes for all tasks
* Add setup-multi-processes for all tasks
* Add test for setup-multi-processes
2022-01-26 19:17:55 +08:00