5 Commits

Mashiro
6b47035fdf
[Fix] Fix saving scheduler state dict with optim wrapper (#375)
* fix saving scheduler state dict with optim wrapper

* remove for loop and inherit TestParameterScheduler

* remove for loop and inherit TestParameterScheduler

* minor refine
2022-07-20 16:32:48 +08:00
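
A minimal sketch of the scenario this fix (#375) targets: a parameter scheduler driven by an OptimWrapper should save and reload its state dict cleanly. The toy model, milestones, and gamma below are illustrative assumptions, not code from the PR.

```python
# Hedged sketch of the behaviour covered by this fix: a parameter scheduler
# built on an OptimWrapper should round-trip its state dict. The toy model,
# milestones, and gamma are illustrative assumptions.
import torch
import torch.nn as nn
from mmengine.optim import MultiStepLR, OptimWrapper

model = nn.Linear(4, 2)
optim_wrapper = OptimWrapper(
    optimizer=torch.optim.SGD(model.parameters(), lr=0.01))

# Parameter schedulers accept an OptimWrapper in place of a raw optimizer.
scheduler = MultiStepLR(optim_wrapper, milestones=[2, 4], gamma=0.1)

# Save and restore the scheduler state without dragging the wrapper along.
state = scheduler.state_dict()
restored = MultiStepLR(optim_wrapper, milestones=[2, 4], gamma=0.1)
restored.load_state_dict(state)
```
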
Mashiro
b2ee9f8b11
[Fix] Fix loss possibly becoming NaN in optimizer wrapper (#345)
* fix optimizer wrapper counts

* fix ut
2022-07-06 16:42:49 +08:00
Mashiro
2fd6beb972
[Fix] Fix optimizer wrapper UT failing in PyTorch 1.6 (#340) 2022-06-28 10:31:14 +08:00
Mashiro
b7866021c4
[Refactor] Refactor the gradient accumulation implementation of OptimWrapper (#284)
* merge context

* update unit test

* add docstring

* fix bug in AmpOptimWrapper

* add docstring for backward

* add warning and docstring for gradient accumulation

* fix docstring

* fix docstring

* add params_group method

* fix as comment

* fix as comment

* make the default value of loss_scale 'dynamic'

* Fix docstring

* decouple should update and should no sync

* rename attribute in OptimWrapper

* fix docstring

* fix comment

* fix comment

* fix as comment

* fix as comment and add unit test
2022-06-13 23:20:53 +08:00
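
The bullets above describe the refactored gradient-accumulation path in #284 (the merged context manager and the decoupled update/no-sync checks). Below is a hedged sketch of how that path is typically exercised, assuming the OptimWrapper interface (optim_context, update_params, accumulative_counts); the toy model, loss, and iteration counts are illustrative assumptions only.

```python
# Hedged sketch of gradient accumulation through OptimWrapper; the toy model
# and data are illustrative, not taken from the PR.
import torch
import torch.nn as nn
from mmengine.optim import OptimWrapper

model = nn.Linear(4, 2)
optim_wrapper = OptimWrapper(
    optimizer=torch.optim.SGD(model.parameters(), lr=0.01),
    accumulative_counts=4)  # accumulate gradients over 4 iterations

for _ in range(8):
    inputs, targets = torch.randn(2, 4), torch.randn(2, 2)
    # optim_context handles the no-sync bookkeeping during accumulation
    # (a no-op for this non-distributed toy model).
    with optim_wrapper.optim_context(model):
        loss = nn.functional.mse_loss(model(inputs), targets)
    # update_params scales the loss, calls backward, and only steps and
    # zero-grads the optimizer every accumulative_counts-th iteration.
    optim_wrapper.update_params(loss)
```
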
Mashiro
3e3866c1b9
[Feature] Add optimizer wrapper (#265)
* Support multiple optimizers

* minor refinement

* improve unit tests

* minor fix

* Update unit tests for resuming or saving ckpt for multiple optimizers

* refine docstring

* refine docstring

* fix typo

* update docstring

* refactor the logic to build multiple optimizers

* resolve comments

* ParamSchedulers support multiple optimizers

* add optimizer_wrapper

* fix comment and docstring

* fix unit test

* add unit test

* refine docstring

* RuntimeInfoHook supports printing multi learning rates

* resolve comments

* add optimizer_wrapper

* fix mypy

* fix lint

* fix OptimizerWrapperDict docstring and add unit test

* rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment

* Fix AmpOptimizerWrapper

* rename build_optmizer_wrapper to build_optim_wrapper

* refine optimizer wrapper

* fix AmpOptimWrapper.step, docstring

* resolve conflict

* rename DefaultOptimConstructor

* fix as comment

* rename clip grad arguments

* refactor optim_wrapper config

* fix docstring of DefaultOptimWrapperConstructor

* add get_lr method to OptimWrapper and OptimWrapperDict

* skip some amp unit test

* fix unit test

* fix get_lr, get_momentum docstring

* refactor get_lr, get_momentum, fix as comment

* fix error message

Co-authored-by: zhouzaida <zhouzaida@163.com>
2022-06-01 18:04:38 +08:00
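
Taken together, #265 introduced the basic OptimWrapper interface (update_params, get_lr, OptimWrapperDict, build_optim_wrapper). Below is a hedged sketch of single-optimizer usage under that interface; the toy model and data are illustrative assumptions, not code from the PR.

```python
# Hedged sketch of basic OptimWrapper usage; the toy model and data are
# illustrative assumptions.
import torch
import torch.nn as nn
from mmengine.optim import OptimWrapper

model = nn.Linear(4, 2)
optim_wrapper = OptimWrapper(
    optimizer=torch.optim.SGD(model.parameters(), lr=0.01))

inputs, targets = torch.randn(8, 4), torch.randn(8, 2)
loss = nn.functional.mse_loss(model(inputs), targets)

# update_params bundles loss scaling, backward, step, and zero_grad.
optim_wrapper.update_params(loss)

# get_lr (added here) reports the current learning rate(s) as a dict.
print(optim_wrapper.get_lr())  # e.g. {'lr': [0.01]}
```
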