LKJacky
7acc046678
Add GroupFisher pruning algorithm. ( #459 )
...
* init
* support expand dwconv
* add tools
* init
* add import
* add configs
* add ut and fix bug
* update
* update finetune config
* update impl imports
* add deploy configs and result
* add _train_step
* detla_type -> normalization_type
* change img link
* add prune to config
* add json dump when GroupFisherSubModel init
* update prune config
* update finetune config
* update deploy config
* update prune config
* update readme
* mutable_cfg -> fix_subnet
* update readme
* impl -> implementations
* update script.sh
* rm gen_fake_cfg
* add Implementation to readme
* update docstring
* add finetune_lr to config
* update readme
* fix error in config
* update links
* update configs
* refine
* fix spell error
* add test to readme
* update README
* update readme
* update readme
* update cite format
* fix for ci
* update to pass ci
* update readme
---------
Co-authored-by: liukai <your_email@abc.example>
Co-authored-by: Your Name <you@example.com>
2023-02-20 14:29:42 +08:00
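For context on the GroupFisher commits above: the criterion ranks channels (or coupled channel groups) by an accumulated Fisher-style importance score. A minimal, self-contained sketch of that idea; the function name and list-based layout are illustrative and not mmrazor's actual API:

```python
# Hedged sketch of the Fisher importance criterion for channel pruning:
# a channel's saliency is approximated by the squared product of its
# activation and the loss gradient w.r.t. that activation, accumulated
# over samples. Low-scoring channels are pruned first.

def fisher_importance(activations, gradients):
    """activations/gradients: per-sample lists of per-channel values."""
    num_channels = len(activations[0])
    scores = [0.0] * num_channels
    for act, grad in zip(activations, gradients):
        for c in range(num_channels):
            scores[c] += (act[c] * grad[c]) ** 2
    return scores

# Channel 0 carries far more signal than channel 1 in this toy batch.
print(fisher_importance([[1.0, 0.1], [2.0, 0.2]],
                        [[0.5, 0.5], [0.5, 0.5]]))
```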
Yue Sun
18754f3599
[Improvement] Update searchable model ( #438 )
...
* bugfix search save_subnet
* update link
* clean
---------
Co-authored-by: aptsunny <aptsunny@tongji.edu.cn>
2023-02-17 17:26:41 +08:00
Yang Gao
a27952dbb1
[Improvement] Update NasMutator to build search_space in NAS ( #426 )
...
* update space_mixin
* update NAS algorithms with SpaceMixin
* update pruning algorithms with SpaceMixin
* fix ut
* fix comments
* revert _load_fix_subnet_by_mutator
* fix dcff test
* add ut for registry
* update autoslim_greedy_search
* fix repeat-mutables bug
* fix slice_weight in export_fix_subnet
* Update NasMutator:
1. unify mutators for NAS algorithms as the NasMutator;
2. regard ChannelMutator as pruning-specified;
3. remove value_mutators & module_mutators;
4. set GroupMixin only for NAS;
5. revert all changes in ChannelMutator.
* update NAS algorithms using NasMutator
* update channel mutator
* update one_shot_channel_mutator
* fix comments
* update UT for NasMutator
* fix isort version
* fix comments
---------
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: liukai <your_email@abc.example>
2023-02-01 22:51:38 +08:00
LKJacky
b750375f73
fix bug when use get_channel_unit.py ( #432 )
...
fix
Co-authored-by: liukai <your_email@abc.example>
2023-02-01 14:38:29 +08:00
Yue Sun
705da2272b
update bignas cfg ( #412 )
...
* check attentivenas training
* update ckpt link
* update supernet log
Co-authored-by: aptsunny <aptsunny@tongji.edu.cn>
2023-01-06 16:35:42 +08:00
whcao
7e4e2ea977
[Fix] Fix metafile ( #422 )
...
* fix ckpt path in metafile and readme
* fix darts file path
* fix docstring in ConfigurableDistiller
* fix darts
* fix error
* add darts of mmrazor version
* delete py36
Co-authored-by: liukai <your_email@abc.example>
2023-01-05 19:31:51 +08:00
whcao
1c47009b1f
[Feature] Add greedy search for AutoSlim ( #336 )
...
* WIP: add greedysearch
* fix greedy search and add bn_training_mode to autoslim
* fix cfg files
* fix autoslim configs
* fix bugs when converting dynamic bn to static bn
* change to test loop
* refactor greedy search
* rebase and fix greedysearch
* fix lint
* fix and delete useless codes
* fix pytest
* fix pytest and add bn_training_mode
* fix lint
* add reference to AutoSlimGreedySearchLoop's docstring
* sort candidate_choices
* fix save subnet
* delete useless codes in channel container
* change files' name: convert greedy_search_loop to autoslim_greedy_search_loop
2023-01-03 21:12:04 +08:00
LKJacky
15768fd3e9
update l1 config ( #405 )
...
* add l1 config
* update l1 config
Co-authored-by: jacky <jacky@xx.com>
2023-01-03 17:29:46 +08:00
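The l1 configs above drive L1-norm filter pruning. A rough, hypothetical sketch of the criterion (not mmrazor's implementation): rank each filter by the L1 norm of its weights and prune the smallest first.

```python
def l1_filter_ranking(filters):
    """Return filter indices in ascending order of L1 norm.

    `filters` is a list of flattened weight lists; the lowest-norm
    filters are the first candidates for pruning.
    """
    norms = [sum(abs(w) for w in f) for f in filters]
    return sorted(range(len(filters)), key=lambda i: norms[i])

filters = [[0.5, -0.5], [0.1, 0.1], [1.0, 1.0]]
print(l1_filter_ranking(filters))  # [1, 0, 2]: lowest-norm filter first
```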
Xianpan Zhou
5ebf839a30
[CodeCamp #122 ] Support KD algorithm MGD for detection. ( #377 )
...
* [Feature] Support KD algorithm MGD for detection.
* use connector to beautify mgd.
* fix typo, add unittest.
* fix mgd loss unittest.
* fix mgd connector unittest.
* add model pth and log file.
* add mAP.
2023-01-03 17:21:42 +08:00
Yang Gao
f6d68dc73c
[Fix] Fix commands in README to adapt branch 1.x ( #400 )
...
* update commands in README for 1.x
* fix commands
Co-authored-by: gaoyang07 <1546308416@qq.com>
2022-12-16 20:54:21 +08:00
zengyi
82e9549dff
[Fix]Dcff Deploy Revision ( #383 )
...
* dcff deploy revision
* tempsave
* update fix_subnet
* update mutator load
* export/load_fix_subnet revision for mutator
* update fix_subnet with dev-1.x
* update comments
* update docs
* update registry
2022-12-16 20:53:30 +08:00
Yang Gao
42e8de73af
[Improvement] Adapt OFA series with SearchableMobileNetV3 ( #385 )
...
* fix mutable bug in AttentiveMobileNetV3
* remove unness code
* update ATTENTIVE_SUBNET_A0-A6.yaml with optimized names
* unify the sampling usage in sandwich_rule-based NAS
* use alias to export subnet
* update OFA configs
* fix attr bug
* fix comments
* update convert_supernet2subnet.py
* correct the way to dump DerivedMutable
* fix convert index bug
* update OFA configs & models
* fix dynamic2static
* generalize convert_ofa_ckpt.py
* update input_resizer
* update README.md
* fix ut
* update export_fix_subnet
* update _dynamic_to_static
* update fix_subnet UT & minor fix bugs
* fix ut
* add new autoaug compared to attentivenas
* clean
* fix act
* fix act_cfg
* update fix_subnet
* fix lint
* add docstring
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: aptsunny <aptsunny@tongji.edu.cn>
2022-12-15 22:19:55 +08:00
LKJacky
f886821ba1
Add get_prune_config and a demo config_pruning ( #389 )
...
* update tools and test
* add demo
* disable test doc
* add switch for test tools and test_doc
* fix bug
* update doc
* update tools name
* mv get_channel_units
Co-authored-by: liukai <your_email@abc.example>
2022-12-13 10:56:29 +08:00
LKJacky
1c03a07350
Enhance the Abilities of the Tracer for Pruning. ( #371 )
...
* tmp
* add new mmdet models
* add docstring
* pass test and pre-commit
* rm razor tracer
* update fx tracer, now it can automatically wrap methods and functions.
* update tracer passed models
* add warning for torch <1.12.0
fix bug for python3.6
update placeholder to support placeholder.XXX
* fix bug
* update docs
* fix lint
* fix parse_cfg in configs
* restore mutablechannel
* test ite prune algorithm when using dist
* add get_model_from_path to MMModelLibrary

* add mm models to DefaultModelLibrary
* add uts
* fix bug
* fix bug
* add uts
* add uts
* add uts
* add uts
* fix bug
* restore ite_prune_algorithm
* update doc
* PruneTracer -> ChannelAnalyzer
* prune_tracer -> channel_analyzer
* add test for fxtracer
* fix bug
* fix bug
* PruneTracer -> ChannelAnalyzer
refine
* CustomFxTracer -> MMFxTracer
* fix bug when test with torch<1.12
* update print log
* fix lint
* rm unuseful code
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: jacky <jacky@xx.com>
Co-authored-by: Your Name <you@example.com>
Co-authored-by: liukai <your_email@abc.example>
2022-12-08 15:59:27 +08:00
whcao
79f1e9a6ca
[Bug] Fix ckpt ( #372 )
...
fix ckpt
2022-12-08 11:52:23 +08:00
qiufeng
b0b3fbdb49
[Feature] Add BigNAS algorithm ( #219 )
...
* add calibrate-bn-statistics
* add test calibrate-bn-statistics
* fix mixins
* fix mixins
* fix mixin tests
* remove slimmable channel mutable and refactor dynamic op
* refact dynamic batch norm
* add progressive dynamic conv2d
* add center crop dynamic conv2d
* refactor dynamic directory
* refactor dynamic sequential
* rename length to depth in dynamic sequential
* add test for derived mutable
* refactor dynamic op
* refactor api of dynamic op
* add derive mutable mixin
* addbignas algorithm
* refactor bignas structure
* add input resizer
* add input resizer to bignas
* move input resizer from algorithm into classifier
* remove compnents
* add attentive mobilenet
* delete json file
* nearly (within 0.2) align inference accuracy with gml
* move mutate separated in bignas mobilenet backbone
* add zero_init_residual
* add set_dropout
* set dropout in bignas algorithm
* fix registry
* add subnet yaml and nearly align inference accuracy with gml
* add rsb config for bignas
* remove base in config
* add gml bignas config
* convert to iter based
* bignas forward and backward fly
* fix merge conflict
* fix dynamicseq bug
* fix bug and refactor bignas
* arrange configs of bignas
* fix typo
* refactor attentive_mobilenet
* fix channel mismatch due to registration of DerivedMutable
* update bignas & fix se channel mismatch
* add AutoAugmentV2 & remove unness configs
* fix lint
* recover channel assertion in channel unit
* fix a group bug
* fix comments
* add docstring
* add norm in dynamic_embed
* fix search loop & other minor changes
* fix se expansion
* minor change
* add ut for bignas & attentive_mobilenet
* fix ut
* update bignas readme
* rm unness ut & supplement get_placeholder
* fix lint
* fix ut
* add subnet deployment in downstream tasks.
* minor change
* update ofa backbone
* minor fix
* Continued improvements of searchable backbone
* minor change
* drop ratio in backbone
* fix comments
* fix ci test
* fix test
* add dynamic shortcut UT
* modify strategy to fit bignas
* fix test
* fix bug in neck
* fix error
* fix error
* fix yaml
* save subnet ckpt
* merge autoslim_val/test_loop into subnet_val_loop
* move calibrate_bn_mixin to utils
* fix bugs and add docstring
* clean code
* fix register bug
* clean code
* update
Co-authored-by: wangshiguang <wangshiguang@sensetime.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: aptsunny <aptsunny@tongji.edu.cn>
Co-authored-by: sunyue1 <sunyue1@sensetime.com>
2022-12-07 11:28:10 +08:00
P.Huang
3b6ef31158
[FIX] Fix wrn configs ( #368 )
...
* fix wrn configs
* fix wrn configs
* update online wrn model weight
2022-12-05 17:34:16 +08:00
LKJacky
b1db8f4999
fix bug in benchmark_test ( #364 )
...
fix bug in configs
Co-authored-by: Your Name <you@example.com>
2022-12-05 10:59:50 +08:00
zhongyu zhang
bbb58f1a5c
[Fix] Fix configs of wrn models and ofd. ( #361 )
...
* 1. Revise the configs of wrn22, wrn24, and wrn40. 2. Revise the data_preprocessor of ofd_backbone_resnet50_resnet18_8xb16_cifar10
* 1. Add README for vanilla-wrn.
* 1. Revise readme of wrn
Co-authored-by: zhangzhongyu <zhangzhongyu@pjlab.org.cn>
2022-11-30 23:41:49 +08:00
zengyi
76c3773e83
[Feature] Add DCFF ( #295 )
...
* add ChannelGroup (#250 )
* rebase new dev-1.x
* modification for adding config_template
* add docstring to channel_group.py
* add docstring to mutable_channel_group.py
* rm channel_group_cfg from Graph2ChannelGroups
* change choice type of SequentialChannelGroup from float to int
* add a warning about group-wise conv
* restore __init__ of dynamic op
* in_channel_mutable -> mutable_in_channel
* rm abstractproperty
* add a comment about VT
* rm registry for ChannelGroup
* MUTABLECHANNELGROUP -> ChannelGroupType
* refine docstring of IndexDict
* update docstring
* update docstring
* is_prunable -> is_mutable
* update docstring
* fix error in pre-commit
* update unittest
* add return type
* unify init_xxx api
* add unittest about init of MutableChannelGroup
* update according to reviews
* sequential_channel_group -> sequential_mutable_channel_group
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Add BaseChannelMutator and refactor Autoslim (#289 )
* add BaseChannelMutator
* add autoslim
* tmp
* make SequentialMutableChannelGroup accept both num and ratio as choice, and support divisor
* update OneShotMutableChannelGroup
* pass supernet training of autoslim
* refine autoslim
* fix bug in OneShotMutableChannelGroup
* refactor make_divisible
* fix spell error: channl -> channel
* init_using_backward_tracer -> init_from_backward_tracer
init_from_fx_tracer -> init_from_fx_tracer
* refine SequentialMutableChannelGroup
* let mutator support models with dynamicop
* support define search space in model
* tracer_cfg -> parse_cfg
* refine
* using -> from
* update docstring
* update docstring
Co-authored-by: liukai <liukai@pjlab.org.cn>
* tmpsave
* migrate ut
* tmpsave2
* add loss collector
* refactor slimmable and add l1-norm (#291 )
* refactor slimmable and add l1-norm
* make l1-norm support convnd
* update get_channel_groups
* add l1-norm_resnet34_8xb32_in1k.py
* add pretrained to resnet34-l1
* remove old channel mutator
* BaseChannelMutator -> ChannelMutator
* update according to reviews
* add readme to l1-norm
* MBV2_slimmable -> MBV2_slimmable_config
Co-authored-by: liukai <liukai@pjlab.org.cn>
* update config
* fix md & pytorch support <1.9.0 in batchnorm init
* Clean old codes. (#296 )
* remove old dynamic ops
* move dynamic ops
* clean old mutable_channels
* rm OneShotMutableChannel
* rm MutableChannel
* refine
* refine
* use SquentialMutableChannel to replace OneshotMutableChannel
* refactor dynamicops folder
* let SquentialMutableChannel support float
Co-authored-by: liukai <liukai@pjlab.org.cn>
* fix ci
* ci fix py3.6.x & add mmpose
* ci fix py3.6.9 in utils/index_dict.py
* fix mmpose
* minimum_version_cpu=3.7
* fix ci 3.7.13
* fix pruning &meta ci
* support python3.6.9
* fix py3.6 import caused by circular import patch in py3.7
* fix py3.6.9
* Add channel-flow (#301 )
* base_channel_mutator -> channel_mutator
* init
* update docstring
* allow omitting redundant configs for channel
* add register_mutable_channel_to_a_module to MutableChannelContainer
* update according to reviews 1
* update according to reviews 2
* update according to reviews 3
* remove old docstring
* fix error
* using->from
* update according to reviews
* support self-define input channel number
* update docstring
* chanenl -> channel_elem
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: jacky <jacky@xx.com>
* support >=3.7
* support py3.6.9
* Rename: ChannelGroup -> ChannelUnit (#302 )
* refine repr of MutableChannelGroup
* rename folder name
* ChannelGroup -> ChannelUnit
* filename in units folder
* channel_group -> channel_unit
* groups -> units
* group -> unit
* update
* get_mutable_channel_groups -> get_mutable_channel_units
* fix bug
* refine docstring
* fix ci
* fix bug in tracer
Co-authored-by: liukai <liukai@pjlab.org.cn>
* update new channel config format
* update pruning refactor
* update merged pruning
* update commit
* fix dynamic_conv_mixin
* update comments: readme&dynamic_conv_mixins.py
* update readme
* move kl softmax channel pooling to op by comments
* fix comments: fix redundant & split README.md
* dcff in ItePruneAlgorithm
* partial dynamic params for fuseconv
* add step_freq & prune_time check
* update comments
* update comments
* update comments
* fix ut
* fix gpu ut & revise step_freq in ItePruneAlgorithm
* update readme
* revise ItePruneAlgorithm
* fix docs
* fix dynamic_conv attr
* fix ci
Co-authored-by: LKJacky <108643365+LKJacky@users.noreply.github.com>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: zengyi.vendor <zengyi.vendor@sensetime.com>
Co-authored-by: jacky <jacky@xx.com>
2022-11-23 09:55:33 +08:00
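Several bullets in the DCFF commit above mention `refactor make_divisible`. For context, the widely used `make_divisible` helper rounds a channel count to a hardware-friendly multiple while never shrinking it by more than roughly `1 - min_ratio`. This is the common MobileNet-style formulation; mmrazor's exact signature may differ:

```python
def make_divisible(value, divisor, min_value=None, min_ratio=0.9):
    """Round `value` to the nearest multiple of `divisor`, but never
    below `min_ratio * value` (common MobileNet-style formulation)."""
    if min_value is None:
        min_value = divisor
    new_value = max(min_value, int(value + divisor / 2) // divisor * divisor)
    # Rounding down must not remove more than (1 - min_ratio) of the channels.
    if new_value < min_ratio * value:
        new_value += divisor
    return new_value

print(make_divisible(37, 8))  # 40
print(make_divisible(30, 8))  # 32
```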
Yang Gao
18fc50f7bc
[Feature] Add performance predictor ( #306 )
...
* add predictor with 4 handlers
* [Improvement] Update Candidate with multi-dim search constraints. (#322 )
* update doc
* add support type
* clean code
* update candidates
* clean
* xx
* set_resource -> set_score
* fix ci bug
* py36 lint
* fix bug
* fix check constrain
* py36 ci
* redesign candidate
* fix pre-commit
* update cfg
* add build_resource_estimator
* fix ci bug
* remove runner.epoch in testcase
* update metric_predictor:
1. update MetricPredictor;
2. add predictor config for searching;
3. add predictor in evolution_search_loop.
* add UT for predictor
* add MLPHandler
* patch optional.txt for predictors
* patch test_evolution_search_loop
* refactor apis of predictor and handlers
* fix ut and remove predictor_cfg in predictor
* adapt new mutable & mutator design
* fix ut
* remove unness assert after rebase
* move predictor-build in __init__ & simplify estimator-build
Co-authored-by: Yue Sun <aptsunny@tongji.edu.cn>
2022-11-14 17:09:32 +08:00
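The performance predictor added above fits a cheap regressor on (architecture encoding, evaluated score) pairs so the search loop can rank candidates without fully evaluating each one. A deliberately tiny sketch of that idea using one scalar feature and ordinary least squares; illustrative only, since the actual MetricPredictor supports several handler types:

```python
def fit_1d_predictor(xs, ys):
    """Least-squares fit of score = a * feature + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return lambda x: a * x + b

# Pretend three subnets were benchmarked; score a fourth without evaluating it.
predict = fit_1d_predictor([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
print(predict(4.0))  # 8.0
```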
Yue Sun
fb42405af8
[Feature] Add Autoformer algorithm ( #315 )
...
* update candidates
* update subnet_sampler_loop
* update candidate
* add readme
* rename variable
* rename variable
* clean
* update
* add doc string
* Revert "[Improvement] Support for candidate multiple dimensional search constraints."
* [Improvement] Update Candidate with multi-dim search constraints. (#322 )
* update doc
* add support type
* clean code
* update candidates
* clean
* xx
* set_resource -> set_score
* fix ci bug
* py36 lint
* fix bug
* fix check constrain
* py36 ci
* redesign candidate
* fix pre-commit
* update cfg
* add build_resource_estimator
* fix ci bug
* remove runner.epoch in testcase
* [Feature] Autoformer architecture and dynamicOPs (#327 )
* add DynamicSequential
* dynamiclayernorm
* add dynamic_pathchembed
* add DynamicMultiheadAttention and DynamicRelativePosition2D
* add channel-level dynamicOP
* add autoformer algo
* clean notes
* adapt channel_mutator
* vit fly
* fix import
* mutable init
* remove annotation
* add DynamicInputResizer
* add unittest for mutables
* add OneShotMutableChannelUnit_VIT
* clean code
* reset unit for vit
* remove attr
* add autoformer backbone UT
* add valuemutator UT
* clean code
* add autoformer algo UT
* update classifier UT
* fix test error
* ignore
* make lint
* update
* fix lint
* mutable_attrs
* fix test
* fix error
* remove DynamicInputResizer
* fix test ci
* remove InputResizer
* rename variables
* modify type
* Continued improvements of ChannelUnit
* fix lint
* fix lint
* remove OneShotMutableChannelUnit
* adjust derived type
* combination mixins
* clean code
* fix sample subnet
* search loop fly
* more annotations
* avoid counter warning and modify batch_augment cfg by gy
* restore
* source_value_mutables restriction
* simply arch_setting api
* update
* clean
* fix ut
2022-11-14 13:01:04 +08:00
whcao
d90c786820
[Fix] Update readme ( #341 )
...
* update kl readme
* update dsnas readme
* fix url
2022-11-01 21:12:04 +08:00
pppppM
d37829eb60
[Refactor] Refactor Mutables and Mutators ( #324 )
...
* refactor mutables
* update load fix subnet
* add DumpChosen Typehint
* adapt UTs
* fix lint
* Add GroupMixin to ChannelMutator (temporarily)
* fix type hints
* add GroupMixin doc-string
* modified by comments
* fix type hits
* update subnet format
* fix channel group bugs and add UTs
* fix doc string
* fix comments
* refactor diff module forward
* fix error in channel mutator doc
* fix comments
Co-authored-by: liukai <liukai@pjlab.org.cn>
2022-11-01 12:49:42 +08:00
whcao
86c61539b1
[Feature] PyTorch version of `PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient`. ( #304 )
...
* add pkd
* add pytest for pkd
* fix cfg
* WIP: support fcos3d
* WIP: support fcos3d pkd
* support mmdet3d
* fix cfgs
* change eps to 1e-6 and add some comments
* fix docstring
* fix cfg
* add assert
* add type hint
* WIP: add readme and metafile
* fix readme
* update metafiles and readme
* fix metafile
* fix pipeline figure
2022-10-26 19:18:20 +08:00
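The PKD loss implemented above (the `change eps to 1e-6` bullet refers to the standardization step) minimizes the MSE between standardized feature maps, which is equivalent up to constants to maximizing the Pearson correlation coefficient between student and teacher features. A hedged, list-based sketch; the real implementation operates on 4-D feature tensors:

```python
import math

def pkd_loss(student_feat, teacher_feat, eps=1e-6):
    """MSE/2 between standardized (zero-mean, unit-variance) features."""
    def standardize(x):
        mean = sum(x) / len(x)
        var = sum((v - mean) ** 2 for v in x) / len(x)
        return [(v - mean) / math.sqrt(var + eps) for v in x]
    s = standardize(student_feat)
    t = standardize(teacher_feat)
    return sum((a - b) ** 2 for a, b in zip(s, t)) / (2 * len(s))

# Scaling/shifting the teacher features does not change the loss.
print(pkd_loss([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # ~0.0
```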
whcao
db32b32e38
[Feature] Add kd examples ( #305 )
...
* support kd for mbv2 and shufflenetv2
* WIP: fix ckpt path
* WIP: fix kd r34-r18
* add metafile
* fix metafile
* delete
2022-10-26 13:27:11 +08:00
whcao
1e8f886523
[Feature]Feature map visualization ( #293 )
...
* WIP: vis
* WIP: add visualization
* WIP: add visualization hook
* WIP: support razor visualizer
* WIP
* WIP: wrap draw_featmap
* support feature map visualization
* add a demo image for visualization
* fix typos
* change eps to 1e-6
* add pytest for visualization
* fix vis hook
* fix arguments' name
* fix img path
* support draw inference results
* add visualization doc
* fix figure url
* move files
Co-authored-by: weihan cao <HIT-cwh>
2022-10-26 13:26:20 +08:00
whcao
8c7cdb3c73
[Feature] Add deit-base ( #332 )
...
* WIP: support deit
* WIP: add deithead
* WIP: fix checkpoint hook
* fix data preprocessor
* fix cfg
* WIP: add readme
* reset single_teacher_distill
* add metafile
* add model to model-index
* fix configs and readme
2022-10-25 21:18:18 +08:00
LKJacky
b3c8bb9550
move autoslim to nas ( #326 )
...
Co-authored-by: jacky <jacky@xx.com>
2022-10-18 16:29:55 +08:00
PJDong
dd51ab8ca0
[Feature] Support unroll with MMDDP in darts algorithm ( #210 )
...
* support unroll in darts
* fix bugs in optimizer; add docstring
* update darts algorithm [untested]
* modify autograd.grad to optim_wrapper.backward
* add amp in train.py; support constructor
* rename mmcls.data to mmcls.structures
* modify darts algo to support apex [not done]
* fix code spell in diff_mutable_module
* modify optim_context of dartsddp
* add testcase for dartsddp
* fix bugs of apex in dartsddp
* standardized the unittest of darts
* adapt new data_preprocessor
* fix ut bugs
* remove unness code
Co-authored-by: gaoyang07 <1546308416@qq.com>
2022-10-14 17:41:11 +08:00
LKJacky
b4b7e2432a
merge pruning into dev-1.x ( #312 )
...
* add ChannelGroup (#250 )
* rebase new dev-1.x
* modification for adding config_template
* add docstring to channel_group.py
* add docstring to mutable_channel_group.py
* rm channel_group_cfg from Graph2ChannelGroups
* change choice type of SequentialChannelGroup from float to int
* add a warning about group-wise conv
* restore __init__ of dynamic op
* in_channel_mutable -> mutable_in_channel
* rm abstractproperty
* add a comment about VT
* rm registry for ChannelGroup
* MUTABLECHANNELGROUP -> ChannelGroupType
* refine docstring of IndexDict
* update docstring
* update docstring
* is_prunable -> is_mutable
* update docstring
* fix error in pre-commit
* update unittest
* add return type
* unify init_xxx api
* add unittest about init of MutableChannelGroup
* update according to reviews
* sequential_channel_group -> sequential_mutable_channel_group
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Add BaseChannelMutator and refactor Autoslim (#289 )
* add BaseChannelMutator
* add autoslim
* tmp
* make SequentialMutableChannelGroup accept both num and ratio as choice, and support divisor
* update OneShotMutableChannelGroup
* pass supernet training of autoslim
* refine autoslim
* fix bug in OneShotMutableChannelGroup
* refactor make_divisible
* fix spell error: channl -> channel
* init_using_backward_tracer -> init_from_backward_tracer
init_from_fx_tracer -> init_from_fx_tracer
* refine SequentialMutableChannelGroup
* let mutator support models with dynamicop
* support define search space in model
* tracer_cfg -> parse_cfg
* refine
* using -> from
* update docstring
* update docstring
Co-authored-by: liukai <liukai@pjlab.org.cn>
* refactor slimmable and add l1-norm (#291 )
* refactor slimmable and add l1-norm
* make l1-norm support convnd
* update get_channel_groups
* add l1-norm_resnet34_8xb32_in1k.py
* add pretrained to resnet34-l1
* remove old channel mutator
* BaseChannelMutator -> ChannelMutator
* update according to reviews
* add readme to l1-norm
* MBV2_slimmable -> MBV2_slimmable_config
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Clean old codes. (#296 )
* remove old dynamic ops
* move dynamic ops
* clean old mutable_channels
* rm OneShotMutableChannel
* rm MutableChannel
* refine
* refine
* use SquentialMutableChannel to replace OneshotMutableChannel
* refactor dynamicops folder
* let SquentialMutableChannel support float
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Add channel-flow (#301 )
* base_channel_mutator -> channel_mutator
* init
* update docstring
* allow omitting redundant configs for channel
* add register_mutable_channel_to_a_module to MutableChannelContainer
* update according to reviews 1
* update according to reviews 2
* update according to reviews 3
* remove old docstring
* fix error
* using->from
* update according to reviews
* support self-define input channel number
* update docstring
* chanenl -> channel_elem
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: jacky <jacky@xx.com>
* Rename: ChannelGroup -> ChannelUnit (#302 )
* refine repr of MutableChannelGroup
* rename folder name
* ChannelGroup -> ChannelUnit
* filename in units folder
* channel_group -> channel_unit
* groups -> units
* group -> unit
* update
* get_mutable_channel_groups -> get_mutable_channel_units
* fix bug
* refine docstring
* fix ci
* fix bug in tracer
Co-authored-by: liukai <liukai@pjlab.org.cn>
* Merge dev-1.x to pruning (#311 )
* [feature] CONTRASTIVE REPRESENTATION DISTILLATION with dataset wrapper (#281 )
* init
* TD: CRDLoss
* complete UT
* fix docstrings
* fix ci
* update
* fix CI
* DONE
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* add UT: CRD_ClsDataset
* init
* TODO: UT test formatting.
* init
* crd dataset wrapper
* update docstring
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
* [Improvement] Update estimator with api revision (#277 )
* update estimator usage and fix bugs
* refactor api of estimator & add inner check methods
* fix docstrings
* update search loop and config
* fix lint
* update unittest
* decouple mmdet dependency and fix lint
Co-authored-by: humu789 <humu@pjlab.org.cn>
* [Fix] Fix tracer (#273 )
* test image_classifier_loss_calculator
* fix backward tracer
* update SingleStageDetectorPseudoLoss
* merge
* [Feature] Add Dsnas Algorithm (#226 )
* [tmp] Update Dsnas
* [tmp] refactor arch_loss & flops_loss
* Update Dsnas & MMRAZOR_EVALUATOR:
1. finalized compute_loss & handle_grads in algorithm;
2. add MMRAZOR_EVALUATOR;
3. fix bugs.
* Update lr scheduler & fix a bug:
1. update param_scheduler & lr_scheduler for dsnas;
2. fix a bug of switching to finetune stage.
* remove old evaluators
* remove old evaluators
* update param_scheduler config
* merge dev-1.x into gy/estimator
* add flops_loss in Dsnas using ResourcesEstimator
* get resources before mutator.prepare_from_supernet
* delete unness broadcast api from gml
* broadcast spec_modules_resources when estimating
* update early fix mechanism for Dsnas
* fix merge
* update units in estimator
* minor change
* fix data_preprocessor api
* add flops_loss_coef
* remove DsnasOptimWrapper
* fix bn eps and data_preprocessor
* fix bn weight decay bug
* add betas for mutator optimizer
* set diff_rank_seed=True for dsnas
* fix start_factor of lr when warm up
* remove .module in non-ddp mode
* add GlobalAveragePoolingWithDropout
* add UT for dsnas
* remove unness channel adjustment for shufflenetv2
* update supernet configs
* delete unness dropout
* delete unness part with minor change on dsnas
* minor change on the flag of search stage
* update README and subnet configs
* add UT for OneHotMutableOP
* [Feature] Update train (#279 )
* support auto resume
* add enable auto_scale_lr in train.py
* support '--amp' option
* [Fix] Fix darts metafile (#278 )
fix darts metafile
* fix ci (#284 )
* fix ci for circle ci
* fix bug in test_metafiles
* add pr_stage_test for github ci
* add multiple version
* fix ut
* fix lint
* Temporarily skip dataset UT
* update github ci
* add github lint ci
* install wheel
* remove timm from requirements
* install wheel when test on windows
* fix error
* fix bug
* remove github windows ci
* fix device error of arch_params when DsnasDDP
* fix CRD dataset ut
* fix scope error
* rm test_cuda in workflows of github
* [Doc] fix typos in en/usr_guides
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
Co-authored-by: P.Huang <37200926+FreakieHuang@users.noreply.github.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: Yang Gao <Gary1546308416AL@gmail.com>
Co-authored-by: humu789 <humu@pjlab.org.cn>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
* Refine pruning branch (#307 )
* [feature] CONTRASTIVE REPRESENTATION DISTILLATION with dataset wrapper (#281 )
* init
* TD: CRDLoss
* complete UT
* fix docstrings
* fix ci
* update
* fix CI
* DONE
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* add UT: CRD_ClsDataset
* init
* TODO: UT test formatting.
* init
* crd dataset wrapper
* update docstring
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
* [Improvement] Update estimator with api revision (#277 )
* update estimator usage and fix bugs
* refactor api of estimator & add inner check methods
* fix docstrings
* update search loop and config
* fix lint
* update unittest
* decouple mmdet dependency and fix lint
Co-authored-by: humu789 <humu@pjlab.org.cn>
* [Fix] Fix tracer (#273 )
* test image_classifier_loss_calculator
* fix backward tracer
* update SingleStageDetectorPseudoLoss
* merge
* [Feature] Add Dsnas Algorithm (#226 )
* [tmp] Update Dsnas
* [tmp] refactor arch_loss & flops_loss
* Update Dsnas & MMRAZOR_EVALUATOR:
1. finalized compute_loss & handle_grads in algorithm;
2. add MMRAZOR_EVALUATOR;
3. fix bugs.
* Update lr scheduler & fix a bug:
1. update param_scheduler & lr_scheduler for dsnas;
2. fix a bug of switching to finetune stage.
* remove old evaluators
* remove old evaluators
* update param_scheduler config
* merge dev-1.x into gy/estimator
* add flops_loss in Dsnas using ResourcesEstimator
* get resources before mutator.prepare_from_supernet
* delete unness broadcast api from gml
* broadcast spec_modules_resources when estimating
* update early fix mechanism for Dsnas
* fix merge
* update units in estimator
* minor change
* fix data_preprocessor api
* add flops_loss_coef
* remove DsnasOptimWrapper
* fix bn eps and data_preprocessor
* fix bn weight decay bug
* add betas for mutator optimizer
* set diff_rank_seed=True for dsnas
* fix start_factor of lr when warm up
* remove .module in non-ddp mode
* add GlobalAveragePoolingWithDropout
* add UT for dsnas
* remove unness channel adjustment for shufflenetv2
* update supernet configs
* delete unness dropout
* delete unness part with minor change on dsnas
* minor change on the flag of search stage
* update README and subnet configs
* add UT for OneHotMutableOP
* [Feature] Update train (#279 )
* support auto resume
* add enable auto_scale_lr in train.py
* support '--amp' option
* [Fix] Fix darts metafile (#278 )
fix darts metafile
* fix ci (#284 )
* fix ci for circle ci
* fix bug in test_metafiles
* add pr_stage_test for github ci
* add multiple version
* fix ut
* fix lint
* Temporarily skip dataset UT
* update github ci
* add github lint ci
* install wheel
* remove timm from requirements
* install wheel when test on windows
* fix error
* fix bug
* remove github windows ci
* fix device error of arch_params when DsnasDDP
* fix CRD dataset ut
* fix scope error
* rm test_cuda in workflows of github
* [Doc] fix typos in en/usr_guides
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
* fix bug when python=3.6
* fix lint
* fix bug when test using cpu only
* refine ci
* fix error in ci
* try ci
* update repr of Channel
* fix error
* mv init_from_predefined_model to MutableChannelUnit
* move tests
* update SquentialMutableChannel
* update l1 mutable channel unit
* add OneShotMutableChannel
* candidate_mode -> choice_mode
* update docstring
* change ci
Co-authored-by: P.Huang <37200926+FreakieHuang@users.noreply.github.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: Yang Gao <Gary1546308416AL@gmail.com>
Co-authored-by: humu789 <humu@pjlab.org.cn>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: jacky <jacky@xx.com>
Co-authored-by: P.Huang <37200926+FreakieHuang@users.noreply.github.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: Yang Gao <Gary1546308416AL@gmail.com>
Co-authored-by: humu789 <humu@pjlab.org.cn>
Co-authored-by: whcao <41630003+HIT-cwh@users.noreply.github.com>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
2022-10-10 17:30:25 +08:00
LKJacky
f98ac3416b
fix ci ( #284 )
...
* fix ci for circle ci
* fix bug in test_metafiles
* add pr_stage_test for github ci
* add multiple version
* fix ut
* fix lint
* Temporarily skip dataset UT
* update github ci
* add github lint ci
* install wheel
* remove timm from requirements
* install wheel when test on windows
* fix error
* fix bug
* remove github windows ci
* fix device error of arch_params when DsnasDDP
* fix CRD dataset ut
* fix scope error
* rm test_cuda in workflows of github
* [Doc] fix typos in en/usr_guides
Co-authored-by: liukai <liukai@pjlab.org.cn>
Co-authored-by: pppppM <gjf_mail@126.com>
Co-authored-by: gaoyang07 <1546308416@qq.com>
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
Co-authored-by: SheffieldCao <1751899@tongji.edu.cn>
2022-10-10 10:06:57 +08:00
Yang Gao
6cd8c68d0f
[Fix] Fix darts metafile ( #278 )
...
fix darts metafile
2022-10-08 10:35:48 +08:00
Yang Gao
8d603d917e
[Feature] Add Dsnas Algorithm ( #226 )
...
* [tmp] Update Dsnas
* [tmp] refactor arch_loss & flops_loss
* Update Dsnas & MMRAZOR_EVALUATOR:
1. finalized compute_loss & handle_grads in algorithm;
2. add MMRAZOR_EVALUATOR;
3. fix bugs.
* Update lr scheduler & fix a bug:
1. update param_scheduler & lr_scheduler for dsnas;
2. fix a bug of switching to finetune stage.
* remove old evaluators
* remove old evaluators
* update param_scheduler config
* merge dev-1.x into gy/estimator
* add flops_loss in Dsnas using ResourcesEstimator
* get resources before mutator.prepare_from_supernet
* delete unness broadcast api from gml
* broadcast spec_modules_resources when estimating
* update early fix mechanism for Dsnas
* fix merge
* update units in estimator
* minor change
* fix data_preprocessor api
* add flops_loss_coef
* remove DsnasOptimWrapper
* fix bn eps and data_preprocessor
* fix bn weight decay bug
* add betas for mutator optimizer
* set diff_rank_seed=True for dsnas
* fix start_factor of lr when warm up
* remove .module in non-ddp mode
* add GlobalAveragePoolingWithDropout
* add UT for dsnas
* remove unness channel adjustment for shufflenetv2
* update supernet configs
* delete unness dropout
* delete unness part with minor change on dsnas
* minor change on the flag of search stage
* update README and subnet configs
* add UT for OneHotMutableOP
2022-09-29 16:48:47 +08:00
Yang Gao
4e80037393
[Improvement] Update estimator with api revision ( #277 )
...
* update estimator usage and fix bugs
* refactor api of estimator & add inner check methods
* fix docstrings
* update search loop and config
* fix lint
* update unittest
* decouple mmdet dependency and fix lint
Co-authored-by: humu789 <humu@pjlab.org.cn>
2022-09-14 20:39:49 +08:00
P.Huang
eb25bb7577
[feature] CONTRASTIVE REPRESENTATION DISTILLATION with dataset wrapper ( #281 )
...
* init
* TD: CRDLoss
* complete UT
* fix docstrings
* fix ci
* update
* fix CI
* DONE
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* maintain CRD dataset unique funcs as a mixin
* add UT: CRD_ClsDataset
* init
* TODO: UT test formatting.
* init
* crd dataset wrapper
* update docstring
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
2022-09-13 20:53:43 +08:00
pppppM
41464c6af3
Regression Benchmark ( #271 )
...
* update benchmark test
* fix circle ci gpu config
* move delivery, recorder, tracer from structures to task modules
* move ops from models to models.architectures
* rename dynamic_op to dynamic_ops
* fix configs and metafiles
* remove some github ci
* fix configs / readme / metafile
Co-authored-by: gaojianfei <gaojianfei@sensetime.com>
2022-09-01 11:54:18 +08:00
P.Huang
5e072c4031
[Improvement] Update OFD & FT metafiles ( #267 )
...
* update OFD & FT metafiles
* update wrn16 metadata
* update all metafiles
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
2022-09-01 09:36:28 +08:00
humu789
57bf6fa30e
Update search configs ( #269 )
...
* update score_key
* search_debug
* add search config & fix _check_constraints
* fix ut
2022-09-01 00:13:01 +08:00
zengyi
f9ac06f36c
[Docs] Add metafiles for KD algo (BYOT, DKD) ( #264 )
...
* fix config
* add metafile
add model & log link
* fix inference time
* fix recorder.py typo
* fix advanced_guides linting
* fix docs tailing space & eof
* fix readme style
* fix auto merge
Co-authored-by: zengyi.vendor <zengyi.vendor@sensetime.com>
2022-08-31 22:16:07 +08:00
zhongyu zhang
f69aeabc69
[Docs] Add metafiles for KD algos (ABLoss, DAFL, DFAD, FBKD, FitNets, ZSKT) ( #266 )
...
1. Add metafiles for 6 kd algos.
2. Add model and log links.
3. Revise data_samples in datafreedistillation for new feature of mmengine.
2022-08-31 22:12:25 +08:00
PJDong
24e106ba1d
[Doc] Optimize docs and Fix lint ( #261 )
...
* fix name of mmcv-full to mmcv
* [doc] move the location of nas/kd/pruning; fix lint errors
* optimize docs and fix pre-commit error
* [Doc&Fix] add note for installation; fix the requirements
* update docs
Co-authored-by: humu789 <humu@pjlab.org.cn>
2022-08-31 18:33:47 +08:00
pppppM
5105489d64
Revert "[Enhancement] Add benchmark test script" ( #263 )
...
Revert "[Enhancement] Add benchmark test script (#262 )"
This reverts commit f60cf9c469.
2022-08-30 22:03:54 +08:00
pppppM
f60cf9c469
[Enhancement] Add benchmark test script ( #262 )
...
* update benchmark test
* fix circle ci gpu config
* fix lints
Co-authored-by: gaojianfei <gaojianfei@sensetime.com>
2022-08-30 21:59:39 +08:00
humu789
ce22497b25
[Docs] Add docs and update algo README ( #259 )
...
* docs v0.1
* update picture links in algo README
2022-08-30 19:46:37 +08:00
pppppM
e3390ce8ae
[Bugs] Fix some bugs found during testing and add dev scripts ( #256 )
...
* update training benchmark
* fix detnas config
* fix pspnet config
* fix cwd configs
* fix cwd metafile
Co-authored-by: gaojianfei <gaojianfei@sensetime.com>
2022-08-29 22:45:46 +08:00
pppppM
179bd5287d
[Fix] Adapt latest mmcv ( #253 )
...
* Adapt to the latest mmcv and mmengine
* fixed ut_subnet_sampler_loop
* fix get_model
* fix lints
Co-authored-by: humu789 <humu@pjlab.org.cn>
2022-08-29 20:34:51 +08:00
P.Huang
5d9fcd8070
[feature] add `A Comprehensive Overhaul of Feature Distillation` ( #244 )
...
* init
* init
* linting
* add README
* add wrn docstrings
* add vanilla wrn configs
* fix UT
* OFD DONE
* update OFD readme
* update config path
* rename vanilla model files
* rename vanilla models config files
Co-authored-by: huangpengsheng <huangpengsheng@sensetime.com>
2022-08-29 14:06:58 +08:00
zhongyu zhang
1c0da58dae
[Feature] Add FBKD algorithm and torch_connectors ( #248 )
...
* 1. Add FBKD
* 1. Add torch_connector and its ut. 2. Revise readme and fbkd config.
* 1. Revise UT for torch_connectors
* 1. Revise nonlocalblock into a subclass of NonLocal2d in mmcv.cnn
2022-08-29 10:05:32 +08:00
zhongyu zhang
f3b964c521
[Feature] Add DFAD algorithm ( #247 )
...
1. Add DFAD algorithm. 2. Add L1Loss and its UT.
2022-08-25 15:46:45 +08:00