Mashiro
d6ad01a4cf
[Fix]: fix ci ( #441 )
2022-08-18 14:04:19 +08:00
Mashiro
e08b9031fc
[Enhance] Support building optimizer wrapper from built Optimizer instance ( #422 )
* support building optimizer wrapper from built Optimizer instance
* refine comments
2022-08-17 19:17:00 +08:00
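A minimal sketch of what this enables, assuming the mmengine.optim helpers shown here: after this change, the optimizer field of an optim_wrapper config may be an already-built torch.optim.Optimizer instead of a config dict.

    import torch
    import torch.nn as nn
    from mmengine.optim import build_optim_wrapper

    model = nn.Linear(2, 2)
    # A built Optimizer instance can now be passed directly in the config.
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    optim_wrapper = build_optim_wrapper(
        model, dict(type='OptimWrapper', optimizer=optimizer))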
Zaida Zhou
f98ba60629
[Enhancement] Improve unit tests of mmengine/runner ( #182 )
* [Enhancement] Add unit test for get_priority
* fix priority ut
* fix typo
Co-authored-by: Wenwei Zhang <40779233+ZwwWayne@users.noreply.github.com>
2022-08-15 10:57:58 +08:00
Mashiro
2708b7ed48
fix ci ( #424 )
2022-08-13 09:15:08 +08:00
Mashiro
ee56f151f6
[Fix] Support training with data without metainfo. ( #417 )
* support training with data without metainfo
* clean the code
* clean the code
2022-08-11 14:51:11 +08:00
Ma Zerun
9b2a0e02da
[Enhance] Add data_preprocessor config as an argument of runner. ( #343 )
* [Enhance] Add `preprocess_cfg` as an argument of runner.
* Rename `preprocess_cfg` to `data_preprocessor`
* Fix docstring
2022-08-09 11:25:29 +08:00
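A hedged sketch of the new argument; 'MyModel' is a placeholder for a registered model, and the preprocessor fields shown are illustrative.

    from mmengine.runner import Runner

    # data_preprocessor can now be passed to Runner directly instead of
    # being nested inside the model config.
    runner = Runner(
        model=dict(type='MyModel'),  # hypothetical registered model
        work_dir='./work_dir',
        data_preprocessor=dict(
            type='ImgDataPreprocessor',
            mean=[123.675, 116.28, 103.53],
            std=[58.395, 57.12, 57.375],
            bgr_to_rgb=True))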
Mashiro
a07a063306
[Enhance] Add build function for scheduler. ( #372 )
* add build function for scheduler
* add unit test
* handle convert_to_iter in build_scheduler_from_cfg
* restore deleted code
* format import
* fix lint
2022-08-08 20:34:16 +08:00
Mashiro
5580542666
[Fix] Fix building multiple lists of schedulers for multiple optimizers ( #383 )
* fix building multiple schedulers
* add new unit test
* fix comment and error message
* fix comment and error message
* extract _parse_scheduler_cfg
* always call build_param_scheduler during train and resume. If there is only one optimizer, the default value for scheduler will be a list; otherwise, with multiple optimizers, the default value will be a dict
* minor refine
* rename runner test exp name
* fix as comment
* minor refine
* fix ut
* only check parameter scheduler
* minor refine
2022-08-08 17:05:27 +08:00
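A config sketch of the convention described above, assuming a two-optimizer (e.g. GAN-style) setup; the generator/discriminator names are illustrative.

    # With a single optimizer, param_scheduler normalizes to a list of
    # schedulers; with multiple optimizers it is a dict keyed by the same
    # names as the optim_wrapper config.
    optim_wrapper = dict(
        generator=dict(
            type='OptimWrapper', optimizer=dict(type='SGD', lr=0.01)),
        discriminator=dict(
            type='OptimWrapper', optimizer=dict(type='SGD', lr=0.004)))
    param_scheduler = dict(
        generator=[
            dict(type='LinearLR', start_factor=0.1, by_epoch=False, end=500)],
        discriminator=[
            dict(type='MultiStepLR', by_epoch=True, milestones=[8, 11])])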
Mashiro
1a8f013937
[Refine] Make scheduler default to None ( #396 )
* make scheduler default to None
* fix bc breaking
* refine warning message
* fix as comment
* fix as comment
* fix lint
2022-08-04 20:13:13 +08:00
RangiLyu
1241c21296
[Fix] Fix weight initializing in test and refine registry logging. ( #367 )
* [Fix] Fix weight initializing and registry logging.
* sync params
* resolve comments
2022-07-19 18:28:57 +08:00
Ma Zerun
3da66d1f87
[Enhance] Auto set the end of param schedulers. ( #361 )
* [Enhance] Auto set the `end` of param schedulers.
* Add log output and unit test
* Update docstring
* Update unit tests of `CosineAnnealingParamScheduler`.
2022-07-15 19:53:28 +08:00
Mashiro
78fad67d0d
[Fix] fix resume message_hub ( #353 )
* fix resume message_hub
* add unit test
* support resume from messagehub
* minor refine
* add comment
* fix typo
* update docstring
2022-07-14 20:13:22 +08:00
Mashiro
2853045e96
[Fix] Fix build multiple runners error ( #348 )
* fix build multiple runner error
* fix comments
* fix cpu ci
2022-07-05 20:35:06 +08:00
Cedric Luo
9c55b4300c
[Enhance] Support dynamic interval ( #342 )
* support dynamic interval in IterBasedTrainLoop
* update typehint
* update typehint
* add dynamic interval in EpochBasedTrainLoop
* update
* fix
Co-authored-by: luochunhua.vendor <luochunhua@pjlab.org.cn>
2022-06-30 15:08:56 +08:00
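A sketch of the resulting option, assuming dynamic_intervals takes (milestone, interval) pairs.

    # Validate every 1000 iterations at first, then every 100 iterations
    # once training passes iteration 90000.
    train_cfg = dict(
        type='IterBasedTrainLoop',
        max_iters=100000,
        val_interval=1000,
        dynamic_intervals=[(90000, 100)])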
Mashiro
59b0ccfe6f
[Fix] Fix PyTorch version compatibility of autocast ( #339 )
* fix unit test of autocast
* fix compatibility of unit test of optimizer wrapper
* clean code
* fix as comment
* fix docstring
2022-06-29 20:30:53 +08:00
Yuan Liu
03d5c17ba6
[Feature]: Set a different seed for each rank ( #298 )
* [Feature]: Set different seed for diff rank
* [Feature]: Add log
* [Fix]: Fix lint
* [Fix]: Fix docstring
* [Fix]: Fix sampler seed
* [Fix]: Fix log bug
* [Fix]: Change diff_seed to diff_rank_seed
* [Fix]: Fix lint
2022-06-24 14:28:16 +08:00
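A sketch of how the option is exposed, assuming it travels through Runner's randomness argument.

    # With diff_rank_seed=True each process derives its own seed from the
    # base seed and its rank, so e.g. data augmentation differs per rank.
    randomness = dict(seed=2022, diff_rank_seed=True, deterministic=False)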
Alex Yang
e18832f046
[Feat] Support revert syncbn ( #326 )
* [Feat] Support revert syncbn
* use logger.info rather than warning
* fix info string
2022-06-22 19:50:54 +08:00
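A minimal sketch, assuming the helper is exported from mmengine.model.

    import torch.nn as nn
    from mmengine.model import revert_sync_batchnorm

    model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.SyncBatchNorm(8))
    # Convert SyncBN layers back to plain BN so the model can run on a
    # single device without a distributed process group.
    model = revert_sync_batchnorm(model)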
Mashiro
312f264ecd
[Feature] Add autocast wrapper ( #307 )
* add autocast wrapper
* fix docstring
* fix docstring
* fix compare version
* fix unit test
* fix incompatible arguments
* fix as comment
* fix unit test
* rename auto_cast to autocast
2022-06-22 19:49:20 +08:00
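A minimal sketch of the wrapper, assuming it is exported from mmengine.runner; it papers over the torch.autocast API differences between PyTorch versions.

    import torch
    from mmengine.runner import autocast

    with autocast(enabled=True):
        # Runs in mixed precision when the current device supports it.
        out = torch.mm(torch.randn(3, 3), torch.randn(3, 3))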
Alex Yang
216521a936
[Feat] Support save best ckpt ( #310 )
* [Feat] Support save best ckpt
* reformat code
* rename function and reformat code
* fix logging info
2022-06-22 19:48:46 +08:00
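A sketch of the option as a CheckpointHook config; the metric name 'accuracy' is a placeholder for whatever key the evaluator reports.

    # Keep the checkpoint that achieves the best validation accuracy.
    default_hooks = dict(
        checkpoint=dict(
            type='CheckpointHook', interval=1,
            save_best='accuracy', rule='greater'))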
Mashiro
4a4d6b1ab2
[Enhance] dump messagehub in runner.resume ( #237 )
* [Enhance] dump messagehub in runner.resume
* delete unnecessary code
* delete debugging code
Co-authored-by: imabackstabber <312276423@qq.com>
2022-06-17 11:10:37 +08:00
Jiazhen Wang
7b55c5bdbf
[Feature] Support resume from Ceph ( #294 )
* support resume from ceph
* move func and refine
* delete symlink
* fix unittest
* preserve _allow_symlink and symlink
2022-06-17 10:37:19 +08:00
RangiLyu
1c18f30854
[Enhance] Support infinite dataloader iterator wrapper for IterBasedTrainLoop. ( #289 )
2022-06-14 14:52:59 +08:00
Mashiro
b7866021c4
[Refactor] Refactor the gradient accumulation implementation of OptimWrapper ( #284 )
* merge context
* update unit test
* add docstring
* fix bug in AmpOptimWrapper
* add docstring for backward
* add warning and docstring for gradient accumulation
* fix docstring
* fix docstring
* add params_group method
* fix as comment
* fix as comment
* make the default value of loss_scale 'dynamic'
* Fix docstring
* decouple the should-update and should-no-sync checks
* rename attribute in OptimWrapper
* fix docstring
* fix comment
* fix comment
* fix as comment
* fix as comment and add unit test
2022-06-13 23:20:53 +08:00
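A minimal training-step sketch of the refactored interface, assuming optim_context and update_params behave as described in this PR; the synthetic data stands in for a real dataloader.

    import torch
    import torch.nn as nn
    from mmengine.optim import OptimWrapper

    model = nn.Linear(2, 2)
    optim_wrapper = OptimWrapper(
        torch.optim.SGD(model.parameters(), lr=0.01),
        accumulative_counts=4)  # step the optimizer once every 4 iters

    data = [(torch.randn(4, 2), torch.randn(4, 2)) for _ in range(8)]
    for inputs, targets in data:
        # optim_context can skip gradient sync on non-update iters under DDP.
        with optim_wrapper.optim_context(model):
            loss = nn.functional.mse_loss(model(inputs), targets)
        optim_wrapper.update_params(loss)  # backward + step/zero when due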
Mashiro
8b0c9c5f6f
[Fix] fix build train_loop during test ( #295 )
* fix build train_loop during test
* fix build train_loop during test
* fix build train_loop during test
* fix build train_loop during test
* Fix as comment
2022-06-13 21:23:46 +08:00
Alex Yang
df0c510444
[Feat]: Support customizing evaluator ( #287 )
* [Feat]: support customizing evaluator
* fix the key name that determines whether the default evaluator is used
* add assertion
* fix typo
2022-06-10 15:34:10 +08:00
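A sketch of the configuration this enables; 'Accuracy' and 'F1Score' are placeholder metric names from a downstream registry.

    # A single metric config, or a list of metrics, can be passed instead
    # of relying on the default evaluator.
    val_evaluator = dict(type='Accuracy')
    # val_evaluator = [dict(type='Accuracy'), dict(type='F1Score')]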
jbwang1997
7a5d3c83ea
[Fix] Replace auto_scale_lr_cfg with auto_scale_lr ( #286 )
* Replace auto_scale_lr_cfg with auto_scale_lr
* Update
2022-06-09 20:15:36 +08:00
Mashiro
f04fec736d
[Feature]: add base model, ddp model wrapper and unit test ( #268 )
* add base model, ddp model and unit test
* add unit test
* fix unit test
* fix docstring
* fix cpu unit test
* refine base data preprocessor
* refine base data preprocessor
* refine interface of ddp module
* remove optimizer hook
* add forward
* fix as comment
* fix unit test
* fix as comment
* fix build optimizer wrapper
* rebase main and fix unit test
* stack_batch supports stacking n-dim tensors; add docstring for merge dict
* fix lint
* fix test loop
* make precision_context apply to data_preprocessor
* fix as comment
* fix as comment
* refine docstring
* change collate_data output typehints
* rename to_rgb to bgr_to_rgb and rgb_to_bgr
* support build basemodel with built DataPreprocessor
* fix as comment
* fix docstring
2022-06-07 22:13:53 +08:00
RangiLyu
ad965a5309
[Enhance] Enhance checkpoint meta info. ( #279 )
2022-06-07 18:48:50 +08:00
jbwang1997
bd3c53b385
[Fix] Fix CI after merging support auto scale lr and support custom runner ( #280 )
2022-06-07 16:03:51 +08:00
jbwang1997
8f3fcee301
[Feature] Add auto scale lr function ( #270 )
* Add auto scale lr function
* Update
* Update
* Update
* Update
* Update
* Update
* Update
* Update
* Update
* Update
Co-authored-by: wangjiabao1.vendor <wangjiabao@pjlab.org.cn>
2022-06-06 22:27:15 +08:00
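A sketch of the resulting option, assuming it is passed as Runner's auto_scale_lr argument.

    # When enabled, the learning rate is scaled linearly by
    # real_batch_size / base_batch_size before training starts.
    auto_scale_lr = dict(enable=True, base_batch_size=256)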
Jiazhen Wang
65bc95036c
[Enhance] Support Custom Runner ( #258 )
* support custom runner
* change build_runner_from_cfg
* refine docstring
* refine docstring
2022-06-06 14:33:32 +08:00
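A sketch of the extension point, assuming runners are selected via a runner_type key in the config.

    from mmengine.registry import RUNNERS
    from mmengine.runner import Runner

    @RUNNERS.register_module()
    class CustomRunner(Runner):
        """Hypothetical runner overriding project-specific behaviour."""

    # cfg = dict(runner_type='CustomRunner', ...)  # remaining fields elided
    # runner = RUNNERS.build(cfg)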
RangiLyu
70c4ea191f
[Refactor]: Modify val_interval and val_begin to be attributes of TrainLoop. ( #274 )
* Modify val_interval and val_begin to be attributes of TrainLoop.
* update doc
* fix lint
* type hint
2022-06-06 11:13:25 +08:00
Mashiro
80a46c4848
[Fix] fix build optimizer wrapper without type ( #272 )
* fix build optimizer wrapper without type
* refine logic
* fix as comment
* fix optim_wrapper config error in docstring and unit test
* refine docstring of build_optim_wrapper
2022-06-05 22:35:16 +08:00
Mashiro
3e3866c1b9
[Feature] Add optimizer wrapper ( #265 )
* Support multiple optimizers
* minor refinement
* improve unit tests
* minor fix
* Update unit tests for resuming or saving ckpt for multiple optimizers
* refine docstring
* refine docstring
* fix typo
* update docstring
* refactor the logic to build multiple optimizers
* resolve comments
* ParamSchedulers support multiple optimizers
* add optimizer_wrapper
* fix comment and docstring
* fix unit test
* add unit test
* refine docstring
* RuntimeInfoHook supports printing multi learning rates
* resolve comments
* add optimizer_wrapper
* fix mypy
* fix lint
* fix OptimizerWrapperDict docstring and add unit test
* rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment
* Fix AmpOptimizerWrapper
* rename build_optimizer_wrapper to build_optim_wrapper
* refine optimizer wrapper
* fix AmpOptimWrapper.step, docstring
* resolve conflict
* rename DefaultOptimConstructor
* fix as comment
* rename clip grad arguments
* refactor optim_wrapper config
* fix docstring of DefaultOptimWrapperConstructor
* add get_lr method to OptimWrapper and OptimWrapperDict
* skip some amp unit test
* fix unit test
* fix get_lr, get_momentum docstring
* refactor get_lr, get_momentum, fix as comment
* fix error message
Co-authored-by: zhouzaida <zhouzaida@163.com>
2022-06-01 18:04:38 +08:00
Zaida Zhou
f1da9a1d7f
[Feature] Support multiple optimizers ( #235 )
* Support multiple optimizers
* minor refinement
* improve unit tests
* minor fix
* Update unit tests for resuming or saving ckpt for multiple optimizers
* refine docstring
* refine docstring
* fix typo
* update docstring
* refactor the logic to build multiple optimizers
* resolve comments
* ParamSchedulers support multiple optimizers
* refine docstring
* RuntimeInfoHook supports printing multi learning rates
* resolve comments
* fix typo
2022-05-31 16:54:39 +08:00
Jiazhen Wang
f2190de787
[Enhance] Improve Exception in call_hook ( #247 )
* improve exception in call_hook
* refine unit test
* add test_call_hook
* refine
* update docstring and ut
2022-05-31 11:34:30 +08:00
Jingwei Zhang
40daf46a45
Support running validation only after a certain epoch/iteration in ValLoop ( #257 )
* add the epoch/iter at which validation begins
* fix lint
* add property and fix unit test
* minor changes
* fix typos and add unit test
* add unit test about begin
* fix docstring
2022-05-27 15:10:12 +08:00
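A sketch of the new knob in train_cfg; val_begin names the first epoch (or iteration, for iter-based loops) at which validation runs.

    # Skip validation for the first 4 epochs, then validate every epoch.
    train_cfg = dict(by_epoch=True, max_epochs=12, val_begin=5, val_interval=1)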
RangiLyu
4705e1fe3d
[Enhance] Add RuntimeInfoHook to update runtime information. ( #254 )
* [Enhance] Add RuntimeInfoHook to update runtime information.
* move lr to runtime info
* docstring
* resolve comments
* update ut and doc
2022-05-26 14:35:37 +08:00
Jiazhen Wang
4cbbbc0c31
[Enhance] Refine sync random seed ( #256 )
* refine sync random seed
* cancel seed param in batch-sampler
2022-05-25 19:18:03 +08:00
Haian Huang(深度眸)
c197bdf359
[Feature] Profiling tools ( #241 )
* Add profiling tools
* fix docstr
* fix docstr
* update
* fix bug
* update
* update
* fix error
* fix mypy
* update
* merge main
* fix UT
2022-05-25 10:55:07 +08:00
Jiazhen Wang
a976257ca9
[Enhance] Support Custom LogProcessor ( #251 )
* support custom log processor
* supplementary docs
* format code
2022-05-24 17:17:35 +08:00
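A sketch of a custom log processor config, assuming window_size and by_epoch are its main knobs and that it is passed as Runner's log_processor argument.

    # Smooth logged scalars over the last 50 iterations and log by epoch.
    log_processor = dict(type='LogProcessor', window_size=50, by_epoch=True)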
RangiLyu
11688507ba
[Fix] Fix some bugs in hooks and runner. ( #242 )
* [Fix] Fix some bugs in hooks and runner.
* fix markdown
* fix latex formula
* resolve comments
2022-05-20 17:18:24 +08:00
RangiLyu
e37f1f905b
[Refactor] Make loop-related attributes runner's properties. ( #236 )
* [Enhance] Make loop-related attributes runner's properties.
* move iter and epoch to loop
* resolve comments
2022-05-18 22:35:10 +08:00
Mashiro
fd962437e9
[Fix] Support Runner dump cfg without filename ( #228 )
* fix runner dump cfg
* convert dict cfg to Config
2022-05-17 17:32:10 +08:00
RangiLyu
1912660db9
[Feature] Support convert epoch-based schedulers to iter-based. ( #221 )
* [Feature] Support convert epoch-based schedulers to iter-based.
* Support conversion and refactor LR and Momentum into mixins.
* Add unit tests
* fix args and add runner ut
* resolve comments
2022-05-10 15:17:51 +08:00
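A sketch of the conversion flag on a scheduler config, assuming it is named convert_to_iter_based.

    # Milestones are given in epochs, but the schedule is converted to
    # update the learning rate every iteration.
    param_scheduler = dict(
        type='MultiStepLR', by_epoch=True, milestones=[8, 11], gamma=0.1,
        convert_to_iter_based=True)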
Zaida Zhou
661e759063
[Fix] param_scheduler can not be None when training models ( #208 )
* [Fix] param_scheduler can not be None when training models
* update unit tests
* fix unit tests
* refactor ParamSchedulerHook
* refactor unit tests
* param_schedulers can be an empty list
2022-04-27 19:45:27 +08:00
Wenwei Zhang
96f3d97fc4
Try to fix lint issue ( #199 )
* try to fix lint
* upgrade yapf version
* use another way to bypass yapf
* update docstring
2022-04-26 13:53:00 +08:00
Mashiro
e0d00c5bdd
[Fix] resolve conflict between adapt and main. ( #198 )
* [Docs] Refine registry documentation (#186 )
* [Docs] Refine registry documentation
* resolve comments
* minor refinement
* Refine Visualizer docs (#177 )
* Refine Visualizer docs
* update
* update
* update featmap
* update docs
* update visualizer docs
* [Refactor] Refine LoggerHook (#155 )
* rename global accessible and integrate get_instance and create_instance
* move ManagerMixin to utils
* fix docstring and separate get_instance into get_instance and get_current_instance
* fix lint
* fix docstring, rename and move test_global_meta
* rename LogBuffer to HistoryBuffer, rename MessageHub methods, MessageHub supports resume
* refine MMLogger timestamp, update unit test
* MMLogger add logger_name arguments
* Fix docstring
* Add LogProcessor and some unit test
* update unit test
* complete LogProcessor unit test
* refine LoggerHook
* solve circular import
* change default logger_name to mmengine
* refactor eta
* Fix docstring, comment and unit test
* Fix with runner
* fix docstring
* fix docstring
* Add by_epoch attribute to LoggerHook and fix docstring
* Please mypy and fix comment
* remove \ in MMLogger
* Fix lint
* roll back pre-commit-hook
* Fix hook unit test
* Fix comments
* remove \t in log and add docstring
* Fix as comment
* should not accept other arguments if corresponding instance has been created
* fix logging ddp file saving
* fix logging ddp file saving
* move log processor to logging
* move log processor to logging
* remove current dataloader
* fix docstring
* fix unit test
* add learning rate in messagehub
* Support outputting training/validation/testing messages after iterations/epochs
* fix docstring
* Fix IterBasedRunner log string
* Fix IterBasedRunner log string
* Support parsing validation loss in log processor
* [Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR (#188 )
* [Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR
* min_lr -> eta_min, refined docstr
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
Co-authored-by: Haian Huang(深度眸) <1286304229@qq.com>
Co-authored-by: Tong Gao <gaotongxiao@gmail.com>
2022-04-26 00:37:16 +08:00
ZwwWayne
ae3b857480
Merge branch 'adapt' of github.com:open-mmlab/mmengine into adapt
2022-04-22 13:48:14 +08:00
Mashiro
45567b1d1c
Automatically update iter and epoch in message_hub ( #168 )
* automatically update iter and epoch in message_hub
* add docstring
* Update comment and docstring
* Fix as comment
* Fix docstring and comment
* refine comments
2022-04-21 11:45:03 +08:00