RangiLyu
a3d2916790
[Enhance] Support scheduling betas with MomentumScheduler. ( #346 )
* [Enhance] Support scheduling betas with MomentumScheduler.
* enhance ut
* test adam betas
* enhance ut
* enhance ut
2022-07-05 20:37:23 +08:00
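The betas support above matters for optimizers such as Adam, which expose a (beta1, beta2) pair instead of a momentum field, so a momentum scheduler can drive beta1 the same way it drives SGD's momentum. A minimal sketch of that idea in plain PyTorch (not the MomentumScheduler implementation itself; set_momentum is a made-up helper):

```python
import torch
from torch.optim import Adam

def set_momentum(optimizer, value):
    """Write a momentum value into each param group, falling back to
    betas[0] for Adam-style optimizers that have no 'momentum' key."""
    for group in optimizer.param_groups:
        if 'momentum' in group:
            group['momentum'] = value
        elif 'betas' in group:
            group['betas'] = (value, group['betas'][1])

params = [torch.nn.Parameter(torch.zeros(1))]
adam = Adam(params, lr=1e-3, betas=(0.9, 0.999))
set_momentum(adam, 0.85)
print(adam.param_groups[0]['betas'])  # (0.85, 0.999)
```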
Mashiro
2fd6beb972
[Fix] Fix UT of optimizer wrapper failing in pytorch1.6 ( #340 )
2022-06-28 10:31:14 +08:00
Mashiro
b7866021c4
[Refactor] Refactor the gradient accumulation implementation of OptimWrapper ( #284 )
* merge context
* update unit test
* add docstring
* fix bug in AmpOptimWrapper
* add docstring for backward
* add warning and docstring for gradient accumulation
* fix docstring
* fix docstring
* add params_group method
* fix as comment
* fix as comment
* make the default value of loss_scale dynamic
* Fix docstring
* decouple should update and should no sync
* rename attribute in OptimWrapper
* fix docstring
* fix comment
* fix comment
* fix as comment
* fix as comment and add unit test
2022-06-13 23:20:53 +08:00
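The refactor above centres on letting the wrapper decide when the optimizer should actually step while gradients accumulate across iterations. A bare-bones sketch of that pattern in plain PyTorch (the OptimWrapper API itself is not reproduced; the accumulation count is an arbitrary example value):

```python
import torch

accumulative_counts = 4  # step the optimizer once every 4 iterations
model = torch.nn.Linear(8, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for idx in range(16):
    x, y = torch.randn(2, 8), torch.randn(2, 1)
    loss = torch.nn.functional.mse_loss(model(x), y)
    (loss / accumulative_counts).backward()   # gradients keep accumulating
    if (idx + 1) % accumulative_counts == 0:  # the "should update" check
        optimizer.step()
        optimizer.zero_grad()
```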
Miao Zheng
fd295741ca
[Features] Add OneCycleLR ( #296 )
* [Features] Add OneCycleLR
* [Features] Add OneCycleLR
* yapf disable
* build_iter_from_epoch
* add epoch
* fix args
* fix according to comments;
* lr-param
* fix according to comments
* defaults -> default to
* remove epoch and steps per epoch
* variable names
2022-06-13 21:23:59 +08:00
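The scheduler added above follows the one-cycle policy also available in PyTorch; a minimal standalone example using torch.optim.lr_scheduler.OneCycleLR (all values are arbitrary, and this is not mmengine's class):

```python
import torch
from torch.optim.lr_scheduler import OneCycleLR

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
# Ramp the learning rate up to max_lr and back down over total_steps iterations.
scheduler = OneCycleLR(optimizer, max_lr=0.1, total_steps=100)

for _ in range(100):
    optimizer.step()   # training computation omitted for brevity
    scheduler.step()   # the one-cycle schedule advances every iteration
```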
Mashiro
3e3866c1b9
[Feature] Add optimizer wrapper ( #265 )
* Support multiple optimizers
* minor refinement
* improve unit tests
* minor fix
* Update unit tests for resuming or saving ckpt for multiple optimizers
* refine docstring
* refine docstring
* fix typo
* update docstring
* refactor the logic to build multiple optimizers
* resolve comments
* ParamSchedulers support multiple optimizers
* add optimizer_wrapper
* fix comment and docstring
* fix unit test
* add unit test
* refine docstring
* RuntimeInfoHook supports printing multiple learning rates
* resolve comments
* add optimizer_wrapper
* fix mypy
* fix lint
* fix OptimizerWrapperDict docstring and add unit test
* rename OptimizerWrapper to OptimWrapper, make OptimWrapperDict inherit from OptimWrapper, and fix as comment
* Fix AmpOptimizerWrapper
* rename build_optmizer_wrapper to build_optim_wrapper
* refine optimizer wrapper
* fix AmpOptimWrapper.step, docstring
* resolve conflict
* rename DefaultOptimConstructor
* fix as comment
* rename clip grad arguments
* refactor optim_wrapper config
* fix docstring of DefaultOptimWrapperConstructor
* add get_lr method to OptimWrapper and OptimWrapperDict
* skip some amp unit test
* fix unit test
* fix get_lr, get_momentum docstring
* refactor get_lr, get_momentum, fix as comment
* fix error message
Co-authored-by: zhouzaida <zhouzaida@163.com>
2022-06-01 18:04:38 +08:00
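In config terms, the change above means a model's optimization is described by an optim_wrapper dict that wraps the optimizer dict; a hedged sketch of what such a config can look like (field names follow the commit messages above, e.g. clip_grad and the dynamic loss_scale default, and are not a verified schema):

```python
# Plain OptimWrapper around an SGD optimizer, with gradient clipping.
optim_wrapper = dict(
    type='OptimWrapper',
    optimizer=dict(type='SGD', lr=0.01, momentum=0.9),
    clip_grad=dict(max_norm=1.0),
)

# Mixed-precision variant: swap the wrapper type and keep the optimizer dict.
amp_optim_wrapper = dict(
    type='AmpOptimWrapper',
    loss_scale='dynamic',
    optimizer=dict(type='SGD', lr=0.01, momentum=0.9),
)
```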
RangiLyu
1912660db9
[Feature] Support convert epoch-based schedulers to iter-based. ( #221 )
* [Feature] Support convert epoch-based schedulers to iter-based.
* Support convert and refactor LR and Momentum to mixin.
* Add unit tests
* fix args and add runner ut
* resolve comments
2022-05-10 15:17:51 +08:00
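The conversion above boils down to rescaling epoch-denominated settings by the number of iterations per epoch; a toy illustration of the arithmetic (not the mmengine implementation):

```python
def epochs_to_iters(milestones_epoch, iters_per_epoch):
    """Convert epoch-based milestones into iteration-based ones."""
    return [m * iters_per_epoch for m in milestones_epoch]

# Milestones at epochs 8 and 11 with 500 iterations per epoch
print(epochs_to_iters([8, 11], 500))  # [4000, 5500]
```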
Tong Gao
c3aff4fc9a
[Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR ( #188 )
* [Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR
* min_lr -> eta_min, refined docstr
2022-04-25 13:44:15 +08:00
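Polynomial decay interpolates from the initial value down to eta_min following a (1 - i / max_iters) ** power factor; a small self-contained sketch of that formula (the mmengine class interfaces are not reproduced here):

```python
def poly_decay(base_value, eta_min, cur_iter, max_iters, power=1.0):
    """Polynomial decay from base_value to eta_min over max_iters steps."""
    factor = (1 - cur_iter / max_iters) ** power
    return (base_value - eta_min) * factor + eta_min

print(poly_decay(0.01, 0.0001, cur_iter=50, max_iters=100, power=0.9))
```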
RangiLyu
49b7d0ce6f
Support default_scope when building optimizer and evaluator. ( #109 )
* Support default_scope when building optimizer and evaluator.
* add docstring
* fix
* fix
2022-03-08 16:05:29 +08:00
RangiLyu
c2c5664fad
Fix pt1.5 unit tests. ( #65 )
* Fix pt1.5 unit tests.
* move to mmengine.testing
2022-03-01 11:28:21 +08:00
Zaida Zhou
1e79b97444
Mock unimplemented modules and fix unit tests ( #54 )
* Mock unimplemented modules and fix unit tests
* add a comment
2022-02-25 15:24:27 +08:00
RangiLyu
7353778b7c
[Feature]: Add optimizer and constructor. ( #25 )
* [Feature]: Add optimizer and constructor.
* refactor unit tests
* add optimizer doc
* add parrots wrapper
* add parrots wrapper
* solve comments
* resolve comments
2022-02-19 14:09:37 +08:00
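The constructor's job is to turn an optimizer config dict into a torch optimizer over the model's parameters; a simplified stand-in, assuming a plain lookup into torch.optim by type name (paramwise options and registries are omitted):

```python
import torch

def build_optimizer(model, cfg):
    """Naive optimizer constructor: look up the class named in 'type'
    and pass the remaining keys through as keyword arguments."""
    cfg = dict(cfg)
    optim_cls = getattr(torch.optim, cfg.pop('type'))
    return optim_cls(model.parameters(), **cfg)

model = torch.nn.Linear(4, 2)
optimizer = build_optimizer(model, dict(type='SGD', lr=0.01, momentum=0.9))
```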
RangiLyu
7905f039b6
[Feature]: Add parameter schedulers. ( #22 )
* [Feature]: Add parameter schedulers.
* update
* update
* update
* update
* add docstring to lr and momentum
* resolve comments
2022-02-16 23:43:59 +08:00
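The schedulers step alongside the optimizer in the usual PyTorch fashion; a minimal example with torch's MultiStepLR standing in for the mmengine equivalents (milestones and rates are arbitrary):

```python
import torch
from torch.optim.lr_scheduler import MultiStepLR

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = MultiStepLR(optimizer, milestones=[8, 11], gamma=0.1)

for epoch in range(12):
    optimizer.step()   # training loop body omitted
    scheduler.step()   # LR drops by 10x after epochs 8 and 11
print(optimizer.param_groups[0]['lr'])  # 0.001
```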
RangiLyu
bbb7d625e6
add scheduler unit test ( #13 )
* tmp
* add scheduler unit test
* disable yapf
* add more test
* add more test
* not use torch test case
* solve comments
* update file
* add more unit tests
* resolve comments
* update cosine ut
* fix typo
* solve comments
* solve comments
* resolve comments
2022-02-16 22:33:46 +08:00