Zaida Zhou
fc9518e2c1
[Feature] Add Lion optimizer ( #952 )
2023-02-23 11:24:50 +08:00
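A minimal sketch of how the optimizer added here might be selected from a config. It assumes the lion-pytorch package is installed and that MMEngine registers the class under the name 'Lion'; both are assumptions, not confirmed by the log itself.

# Python config snippet; 'Lion' is assumed to come from lion-pytorch.
optim_wrapper = dict(
    type='OptimWrapper',
    optimizer=dict(type='Lion', lr=1e-4, weight_decay=1e-2))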
whcao
a5f48f7d99
[Bug] Fix the bug when the params in shared modules do not require grad ( #903 )
...
* fix the bug when the params in shared modules do not require grad
* test DefaultOptimWrapperConstructor when the params in shared modules do not require grad
2023-02-15 11:25:15 +08:00
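A sketch of the situation this fix covers: two attributes share one frozen module, so the constructor must handle parameters that are both duplicated and non-trainable. The model class and its names are illustrative only.

import torch.nn as nn
from mmengine.optim import DefaultOptimWrapperConstructor

class SharedModel(nn.Module):
    """Two attributes point at one frozen conv, sharing its params."""
    def __init__(self):
        super().__init__()
        shared = nn.Conv2d(3, 8, 1)
        shared.requires_grad_(False)  # shared params do not require grad
        self.branch_a = shared
        self.branch_b = shared
        self.head = nn.Linear(8, 2)

constructor = DefaultOptimWrapperConstructor(
    optim_wrapper_cfg=dict(
        type='OptimWrapper', optimizer=dict(type='SGD', lr=0.01)),
    paramwise_cfg=dict(bias_lr_mult=2.0))
optim_wrapper = constructor(SharedModel())  # should build without error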
takuoko
d1d4609fa2
[Feature] Support using optimizers from dadaptation ( #902 )
...
* add dadaptation
* Update mmengine/optim/optimizer/builder.py
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
* update dadaptation docs
---------
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
2023-02-03 15:00:32 +08:00
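As with Lion above, a hedged config sketch: it assumes pip install dadaptation and that the package's optimizers (DAdaptAdam, DAdaptSGD, ...) are registered under their class names.

optim_wrapper = dict(
    type='OptimWrapper',
    # D-Adaptation estimates the step size itself; lr=1.0 is the
    # authors' recommended setting.
    optimizer=dict(type='DAdaptAdam', lr=1.0))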
Qian Zhao
4147e976a6
[Fix] ZeroRedundancyOptimizer ambiguous error with param groups when PyTorch < 1.12.0 ( #818 )
...
* fix zero_optimizer error with param groups when pytorch < 1.12.0
* add docstring
* fix docstring
* add unittest
* change ut to use a valid paramwise_cfg
* modify ut
* fix as comments
2022-12-19 17:07:58 +08:00
RangiLyu
3582b4c787
[Enhance] Support setting decay multiplier for flattened parameters ( #771 )
...
* [Fix] Fix bias decay mult of depth-wise conv.
* support flatten param weight decay multiplier
* add unit tests
* remove TODO
* update doc
2022-12-16 17:37:22 +08:00
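A sketch of the new knob, assuming the key this PR adds to paramwise_cfg is flat_decay_mult, alongside the existing norm/bias multipliers:

optim_wrapper = dict(
    type='OptimWrapper',
    optimizer=dict(type='AdamW', lr=1e-3, weight_decay=0.05),
    paramwise_cfg=dict(
        norm_decay_mult=0.0,   # existing: no decay on norm layers
        flat_decay_mult=0.0))  # new: no decay on flattened (1-D) params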
Hakjin Lee
0857f9fb40
[Feature] Support torch ZeroRedundancyOptimizer ( #551 )
...
* [Feature] Support torch ZeroRedundancyOptimizer
Co-authored-by: Junhwa Song <ethan9867@gmail.com>
Signed-off-by: Junhwa Song <ethan9867@gmail.com>
Signed-off-by: Hakjin Lee <nijkah@gmail.com>
* lint
* Fix saving optimizer state_dict
* Fix handling import error
* Add test case
* fix UT
* Revert "fix UT"
This reverts commit dd64538960ff7440c6020f533d43945ffc23f2d2.
* fix handling import in UT
* Fix saving zero checkpoint and delete redundant master_only
* lint
* test unittest
* Fix handling import error
* Fix UT condition
* Edit docstrings
* Fix typo
* Skip redundant procedure in checkpoint hook
* fix typo again
* Update mmengine/optim/optimizer/zero_optimizer.py
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
* Add api info
* lint
* Fix lint
* Handling AmpOptimWrapper case
* handling overlap_with_ddp
* Fix error
Signed-off-by: Junhwa Song <ethan9867@gmail.com>
Signed-off-by: Hakjin Lee <nijkah@gmail.com>
Co-authored-by: Junhwa Song <ethan9867@gmail.com>
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
2022-10-27 20:31:50 +08:00
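A config sketch for this feature. It assumes MMEngine's wrapper forwards optimizer_type as the name of the inner torch optimizer to shard, and it requires a PyTorch build with torch.distributed.optim.ZeroRedundancyOptimizer plus a distributed run:

optim_wrapper = dict(
    type='OptimWrapper',
    optimizer=dict(
        type='ZeroRedundancyOptimizer',
        optimizer_type='AdamW',  # inner optimizer, resolved by name
        lr=1e-3,
        weight_decay=0.01))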
Zaida Zhou
7e1d7af2d9
[Refactor] Refactor code structure ( #395 )
...
* Rename data to structure
* adjust the way to import module
* adjust the way to import module
* rename Structure to Data Structures in docs API
* rename structure to structures
* support using some modules of mmengine without torch
* fix circleci config
* fix circleci config
* fix registry ut
* minor fix
* move init method from model/utils to model/weight_init.py
* move init method from model/utils to model/weight_init.py
* move sync_bn to model
* move functions depending on torch to dl_utils
* format import
* fix logging ut
* add weight init in model/__init__.py
* move get_config and get_model to mmengine/hub
* move log_processor.py to mmengine/runner
* fix ut
* Add TimeCounter in dl_utils/__init__.py
2022-08-24 19:14:07 +08:00
Mashiro
3e3866c1b9
[Feature] Add optimizer wrapper ( #265 )
...
* Support multiple optimizers
* minor refinement
* improve unit tests
* minor fix
* Update unit tests for resuming or saving ckpt for multiple optimizers
* refine docstring
* refine docstring
* fix typo
* update docstring
* refactor the logic to build multiple optimizers
* resolve comments
* ParamSchedulers supports multiple optimizers
* add optimizer_wrapper
* fix comment and docstring
* fix unit test
* add unit test
* refine docstring
* RuntimeInfoHook supports printing multiple learning rates
* resolve comments
* add optimizer_wrapper
* fix mypy
* fix lint
* fix OptimizerWrapperDict docstring and add unit test
* rename OptimizerWrapper to OptimWrapper, make OptimWrapperDict inherit from OptimWrapper, and fix as commented
* Fix AmpOptimizerWrapper
* rename build_optimizer_wrapper to build_optim_wrapper
* refine optimizer wrapper
* fix AmpOptimWrapper.step, docstring
* resolve conflict
* rename DefaultOptimConstructor
* fix as comment
* rename clip grad arguments
* refactor optim_wrapper config
* fix docstring of DefaultOptimWrapperConstructor
* add get_lr method to OptimWrapper and OptimWrapperDict
* skip some amp unit test
* fix unit test
* fix get_lr, get_momentum docstring
* refactor get_lr, get_momentum, fix as comment
* fix error message
Co-authored-by: zhouzaida <zhouzaida@163.com>
2022-06-01 18:04:38 +08:00
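A minimal end-to-end sketch of the wrapper introduced here: update_params is expected to run backward, optionally clip gradients, step the optimizer, and zero gradients in one call.

import torch
import torch.nn as nn
from torch.optim import SGD
from mmengine.optim import OptimWrapper

model = nn.Linear(4, 2)
optim_wrapper = OptimWrapper(
    optimizer=SGD(model.parameters(), lr=0.01),
    clip_grad=dict(max_norm=1.0))  # optional gradient clipping

loss = model(torch.randn(2, 4)).sum()
optim_wrapper.update_params(loss)  # backward + clip + step + zero_grad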
RangiLyu
49b7d0ce6f
Support default_scope when building optimizer and evaluator. ( #109 )
...
* Support default_scope when building optimizer and evaluator.
* add docstring
* fix
* fix
2022-03-08 16:05:29 +08:00
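A sketch of the scoping mechanism this commit hooks into the builders; DefaultScope is today's MMEngine name for the facility and is used here as an assumption.

from mmengine.registry import DefaultScope

# Creating a scope instance makes registries resolve unprefixed
# types (e.g. 'SGD') against that scope's registry tree by default.
DefaultScope.get_instance('demo_task', scope_name='mmengine')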
Zaida Zhou
1e79b97444
Mock unimplemented modules and fix unit tests ( #54 )
...
* Mock unimplemented modules and fix unit tests
* add a comment
2022-02-25 15:24:27 +08:00
RangiLyu
7353778b7c
[Feature]: Add optimizer and constructor. ( #25 )
...
* [Feature]: Add optimizer and constructor.
* refactor unit tests
* add optimizer doc
* add parrots wrapper
* add parrots wrapper
* solve comments
* resolve comments
2022-02-19 14:09:37 +08:00
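A sketch of what the registry and constructor added here enable, written with present-day names: OPTIMIZERS is assumed to have torch's optimizers pre-registered under their class names.

import torch.nn as nn
from mmengine.registry import OPTIMIZERS

model = nn.Linear(4, 2)
# Registry.build forwards the remaining keys as kwargs, so the config
# dict fully describes the optimizer.
optimizer = OPTIMIZERS.build(
    dict(type='SGD', lr=0.01, momentum=0.9, params=model.parameters()))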