* fix the bug when the params in shared modules do not require grad
* test DefaultOptimWrapperConstructor when the params in shared modules do not require grad
* [Feature] Add ReduceOnPlateauParamScheduler and change ParamSchedulerHook (usage sketch after this group of commits)
* [Feature] add ReduceOnPlateauLR and ReduceOnPlateauMomentum
* pre-commit check
* add brief docs
* change position
* fix the conflict between isort and yapf
* fix ParamSchedulerHook.after_val_epoch executing before train_loop and param_schedulers are built
* Apply suggestions from code review
Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
* update ReduceOnPlateauParamScheduler, ReduceOnPlateauMomentum and ParamSchedulerHook
* fix AttributeError when getting need_step_args in ParamSchedulerHook
* fix load_state_dict error for rule in ReduceOnPlateauParamScheduler
* add docs for ParamSchedulerHook and fix a few code issues
* [Docs] add ReduceOnPlateauParamScheduler, ReduceOnPlateauMomentum and ReduceOnPlateauLR docs
* [Refactor] adjust the order of imports
* [Fix] add init check for threshold in ReduceOnPlateauParamScheduler
* [Test] add test for ReduceOnPlateauParamScheduler, ReduceOnPlateauLR and ReduceOnPlateauMomentum
* [Fix] fix missing attribute self.min_value
* [Fix] fix numerical problem in tests
* [Fix] fix error in tests
* [Fix] fix ignoring the first param in tests
* [Fix] fix bug in tests
* [Fix] increase coverage
* [Fix] fix self._global_step counting bug and docs
* [Fix] fix tests
* [Fix] modified ParamSchedulerHook test
* Update mmengine/optim/scheduler/param_scheduler.py
Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
* Apply suggestions from code review
Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
* [Fix] modified code according to review comments
* [Docs] add API docs for en and zh_cn
* [Fix] fix bug in test_param_scheduler_hook.py
* [Test] support more complex test modes (less, greater, rel, abs) for ReduceOnPlateauParamScheduler
* [Docs] add docs for rule
* [Fix] fix pop from empty list bug in test
* [Fix] fix the check for param_schedulers not yet built
* [Fix] fix step_args bug and missing runner._train_loop bug
* [Fix] fix scheduler type bug
* [Test] rename step_args to step_kwargs
* [Fix] remove redundant check
* [Test] remove redundant check
* Apply suggestions from code review
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
* [Test] fix some defects
Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
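The ReduceOnPlateau schedulers introduced above mirror `torch.optim.lr_scheduler.ReduceLROnPlateau`. Below is a minimal sketch of configuring `ReduceOnPlateauLR`, assuming the `monitor`, `rule`, `factor`, and `patience` arguments named in these commits; the values are illustrative, not verified defaults:

```python
# Sketch: an mmengine-style param_scheduler config for ReduceOnPlateauLR.
# Argument names follow the commits above; values are assumptions.
param_scheduler = dict(
    type='ReduceOnPlateauLR',
    monitor='loss',  # metric watched by ParamSchedulerHook.after_val_epoch
    rule='less',     # 'less': lower is better; 'greater': higher is better
    factor=0.1,      # multiply the lr by this factor on a plateau
    patience=10)     # epochs without improvement before reducing
```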
* fix zero_optimizer error with param groups when PyTorch < 1.12.0
* add docstring
* fix docstring
* add unittest
* change ut to use a valid paramwise_cfg
* modify ut
* fix as comments
* [Fix] fix CosineRestart eta_min
* add ut case
* Enhance unit test
* remove unused code
Co-authored-by: HAOCHENYE <21724054@zju.edu.cn>
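For the CosineRestart `eta_min` fix, a hedged usage sketch; the `periods`, `restart_weights`, and `eta_min` parameters and the `mmengine.optim.CosineRestartLR` import path are assumptions based on the cosine-restart schedule:

```python
import torch
from mmengine.optim import CosineRestartLR  # assumed import path

model = torch.nn.Linear(2, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Two cosine cycles (4 then 6 epochs), each annealing toward eta_min,
# the floor value whose handling the commit above fixes.
scheduler = CosineRestartLR(
    optimizer, periods=[4, 6], restart_weights=[1.0, 0.5], eta_min=1e-5)
```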
* [Enhance] add documents for `clip_grad`, and support clip grad by value
* refine docstring
* fix as comment
* Fix as comment
* minor refine
* remove error comment for clip grad
* refine docstring
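"Support clip grad by value" suggests OptimWrapper's `clip_grad` dict now accepts a value-based mode alongside norm-based clipping; a sketch, assuming `type='value'` selects `torch.nn.utils.clip_grad_value_` under the hood:

```python
import torch
from mmengine.optim import OptimWrapper

model = torch.nn.Linear(2, 2)

# Norm-based clipping (the pre-existing mode).
wrapper_norm = OptimWrapper(
    torch.optim.SGD(model.parameters(), lr=0.01),
    clip_grad=dict(max_norm=1.0))

# Value-based clipping added by this change; the `type='value'` and
# `clip_value` keys are assumptions about how the new mode is selected.
wrapper_value = OptimWrapper(
    torch.optim.SGD(model.parameters(), lr=0.01),
    clip_grad=dict(type='value', clip_value=0.1))
```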
* Rename data to structure
* adjust the way to import modules
* rename Structure to Data Structures in docs api
* rename structure to structures
* support using some modules of mmengine without torch
* fix CircleCI config
* fix registry ut
* minor fix
* move init method from model/utils to model/weight_init.py
* move sync_bn to model
* move functions depending on torch to dl_utils
* format import
* fix logging ut
* add weight init in model/__init__.py
* move get_config and get_model to mmengine/hub
* move log_processor.py to mmengine/runner
* fix ut
* Add TimeCounter in dl_utils/__init__.py
* fix saving scheduler state dict with optim wrapper
* remove for loop and inherit TestParameterScheduler
* minor refine
* [Enhance] Auto set the `end` of param schedulers.
* Add log output and unit test
* Update docstring
* Update unit tests of `CosineAnnealingParamScheduler`.
* merge context
* update unit test
* add docstring
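"Auto set the `end` of param schedulers" reads as: a scheduler config may omit `end` and have it filled in from the loop length. A sketch of the config shape, with the inference behaviour stated as an assumption:

```python
# Sketch: `end` omitted on purpose; per the commits above it should be
# auto-set (e.g. to max_epochs for a by_epoch scheduler) with a log
# message, rather than defaulting to infinity.
param_scheduler = dict(
    type='CosineAnnealingLR',
    by_epoch=True,
    begin=0)  # no `end` key: assumed to be inferred by the runner
```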
* fix bug in AmpOptimWrapper
* add docstring for backward
* add warning and docstring for accumulate gradient
* fix docstring
* add params_group method
* fix as comment
* make the default value of loss_scale 'dynamic'
* Fix docstring
* decouple should-update and should-no-sync logic
* rename attribute in OptimWrapper
* fix docstring
* fix comment
* fix as comment
* fix as comment and add unit test
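The AmpOptimWrapper entries above cover a dynamic default `loss_scale` and warnings around gradient accumulation; a sketch of a training step under those features, assuming the `optim_context`/`update_params` interface (AMP requires CUDA):

```python
import torch
from mmengine.optim import AmpOptimWrapper

model = torch.nn.Linear(2, 2).cuda()
optim_wrapper = AmpOptimWrapper(
    optimizer=torch.optim.SGD(model.parameters(), lr=0.01),
    loss_scale='dynamic',   # the new default per the commit above
    accumulative_counts=2)  # exercises the accumulate-gradient warning path

for data in (torch.randn(4, 2).cuda() for _ in range(4)):
    # optim_context is assumed to enable autocast (plus the no_sync
    # bookkeeping behind the decoupled should-update / should-no-sync logic).
    with optim_wrapper.optim_context(model):
        loss = model(data).sum()
    # Scales the loss, backwards, and steps once every 2 calls.
    optim_wrapper.update_params(loss)
```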
* [Feature] Support converting epoch-based schedulers to iter-based (config sketch at the end of this list)
* Support conversion and refactor LR and Momentum into a mixin.
* Add unit tests
* fix args and add runner ut
* resolve comments
* tmp
* add scheduler unit test
* disable yapf
* add more tests
* do not use torch test case
* solve comments
* update file
* add more unit tests
* resolve comments
* update cosine ut
* fix typo
* solve comments
* resolve comments
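Finally, "Support converting epoch-based schedulers to iter-based" looks like a config switch that rescales epoch milestones into iterations when the runner builds the scheduler; a sketch, with `convert_to_iter_based` taken from the feature's naming rather than verified docs:

```python
# Sketch: an epoch-based MultiStepLR asked to step per iteration.
# `convert_to_iter_based` is assumed to multiply the epoch milestones
# by the number of iterations per epoch at build time.
param_scheduler = dict(
    type='MultiStepLR',
    by_epoch=True,
    milestones=[8, 11],
    convert_to_iter_based=True)
```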