Commit Graph

31 Commits (425ca99e901a9e2c2fb503b4a5f52efb4aa32f42)

Author SHA1 Message Date
whcao a5f48f7d99
[Bug] Fix the bug when the params in shared modules do not require grad (#903)
* fix the bug when the params in shared modules do not require grad

* test DefaultOptimWrapperConstructor when the params in shared modules do not require grad
2023-02-15 11:25:15 +08:00
xcnick e35ed5fd2e
[Feature] Add ApexOptimWrapper (#742)
* add ApexOptimWrapper

* typo fix

* add apex amp.initialize in optim_context

* assert apex_amp

* polish code

* add parameters of apex_amp.initialize

* add docs

* polish code

* polish code

* polish code

* fix calling of apex amp load_state_dict

* polish

* add comments

* Update apex_optimizer_wrapper.py

* Update apex_optimizer_wrapper.py

---------

Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
2023-02-06 15:30:10 +08:00
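
For context, a minimal sketch of how this wrapper might be configured (assuming NVIDIA Apex is installed; `opt_level` and `loss_scale` are forwarded to `apex.amp.initialize`, and the values here are illustrative):

```python
# Illustrative config sketch, not taken from the PR itself.
optim_wrapper = dict(
    type='ApexOptimWrapper',
    opt_level='O1',        # Apex AMP mixed-precision level
    loss_scale='dynamic',  # forwarded to apex.amp.initialize
    optimizer=dict(type='SGD', lr=0.01, momentum=0.9))
```
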
takuoko d1d4609fa2
[Feature] Support using optimizers from dadaptation (#902)
* add dadaptation

* Update mmengine/optim/optimizer/builder.py

Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>

* update dadaptation docs

---------

Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
2023-02-03 15:00:32 +08:00
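
A hedged sketch of what using one of these optimizers could look like (assumes `pip install dadaptation`; dadaptation estimates the step size itself, so `lr` acts as a multiplier and 1.0 is the conventional value):

```python
# Illustrative sketch; the optimizer name comes from the dadaptation package.
optim_wrapper = dict(
    optimizer=dict(type='DAdaptAdaGrad', lr=1.0))
```
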
LEFTeyes 0b59a90a21
[Feature] Support ReduceOnPlateauParamScheduler (#819)
* [Feature] Add ReduceOnPlateauParamScheduler and change ParamSchedulerHook

* [Feature] add ReduceOnPlateauLR and ReduceOnPlateauMomentum

* pre-commit check

* add a little docs

* change position

* fix the conflict between isort and yapf

* fix ParamSchedulerHook.after_val_epoch executing when train_loop and param_schedulers are not yet built

* Apply suggestions from code review

Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>

* update ReduceOnPlateauParamScheduler, ReduceOnPlateauMomentum and ParamSchedulerHook

* fix get need_step_args attribute error in ParamSchedulerHook

* fix load_state_dict error for rule in ReduceOnPlateauParamScheduler

* add docs for ParamSchedulerHook and fix a few codes

* [Docs] add ReduceOnPlateauParamScheduler, ReduceOnPlateauMomentum and ReduceOnPlateauLR docs

* [Refactor] adjust the order of import

* [Fix] add init check for threshold in ReduceOnPlateauParamScheduler

* [Test] add test for ReduceOnPlateauParamScheduler, ReduceOnPlateauLR and ReduceOnPlateauMomentum

* [Fix] fix no attribute self.min_value

* [Fix] fix numerical problem in tests

* [Fix] fix error in tests

* [Fix] fix ignore first param in tests

* [Fix] fix bug in tests

* [Fix] fix bug in tests

* [Fix] fix bug in tests

* [Fix] increase coverage

* [Fix] fix count self._global_step bug and docs

* [Fix] fix tests

* [Fix] modified ParamSchedulerHook test

* Update mmengine/optim/scheduler/param_scheduler.py

Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>

* [Fix] modified something according to comments

* [Docs] add api for en and zh_cn

* [Fix] fix bug in test_param_scheduler_hook.py

* [Test] support more complicated test modes (less, greater, rel, abs) for ReduceOnPlateauParamScheduler

* [Docs] add docs for rule

* [Fix] fix pop from empty list bug in test

* [Fix] fix check param_schedulers is not built bug

* [Fix] fix step_args bug and without runner._train_loop bug

* [Fix] fix step_args bug and without runner._train_loop bug

* [Fix] fix scheduler type bug

* [Test] rename step_args to step_kwargs

* [Fix] remove redundancy check

* [Test] remove redundancy check

* Apply suggestions from code review

Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>

* [Test] fix some defects

Co-authored-by: Mashiro <57566630+HAOCHENYE@users.noreply.github.com>
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
2023-01-16 11:39:03 +08:00
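
A minimal config sketch of the resulting scheduler (field values are illustrative; `monitor` names a metric produced during validation and watched by `ParamSchedulerHook`):

```python
# Illustrative sketch of ReduceOnPlateauLR usage.
param_scheduler = dict(
    type='ReduceOnPlateauLR',
    monitor='loss',  # metric watched after each validation
    rule='less',     # 'less' means a lower value is better
    factor=0.1,      # multiply the LR by this factor on a plateau
    patience=10)     # intervals without improvement before decaying
```
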
Qian Zhao 4147e976a6
[Fix] ZeroRedundancyOptimizer ambiguous error with param groups when pytorch < 1.12.0 (#818)
* fix zero_optimizer error with param groups when pytorch < 1.12.0

* add docstring

* fix docstring

* add unittest

* change ut to use a valid paramwise_cfg

* modify ut

* fix as comments
2022-12-19 17:07:58 +08:00
RangiLyu 3582b4c787
[Enhance] Support setting decay multiplier for flatten parameter (#771)
* [Fix] Fix bias decay mult of depth-wise conv.

* support flatten param weight decay multiplier

* add unit tests

* REMOVE TODO

* update doc
2022-12-16 17:37:22 +08:00
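
A sketch of the new `paramwise_cfg` key in the usual constructor config style (values illustrative):

```python
# Illustrative sketch: disable weight decay for flattened (1-D)
# parameters such as biases and normalization weights.
optim_wrapper = dict(
    optimizer=dict(type='AdamW', lr=1e-3, weight_decay=0.05),
    paramwise_cfg=dict(flat_decay_mult=0.0))
```
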
cir7 0e6bb48b12
[Enhance] Support eta_min_ratio in CosineAnnealingParamScheduler (#725)
* [Enhance] support eta_min_ratio in CosineAnnealingParamScheduler

* [doc] fix docstring

* [Enhance] add ut for eta_min_ratio

* [doc] update docstring

* avoid bc-breaking of eta_min

* [doc] add docstring in CosineAnnealingParamScheduler and CosineAnnealingMomentum

* Apply suggestions from code review

Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
2022-11-22 20:19:16 +08:00
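
A sketch of the added option (values illustrative): `eta_min_ratio` expresses the final value as a fraction of the initial one instead of the absolute `eta_min`, so presumably only one of the two should be set:

```python
# Illustrative sketch of eta_min_ratio.
param_scheduler = dict(
    type='CosineAnnealingLR',
    T_max=100,
    eta_min_ratio=0.01)  # final LR = 0.01 * initial LR
```
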
Z-Fran 090104df21
[Fix] Fix the calculation error of eta_min in CosineRestart (#639)
* [Fix] fix CosineRestart eta_min

* add ut case

* Enhance unit test

* remove unused code

Co-authored-by: HAOCHENYE <21724054@zju.edu.cn>
2022-11-01 15:48:39 +08:00
Hakjin Lee 0857f9fb40
[Feature] Support torch ZeroRedundancyOptimizer (#551)
* [Feature] Support torch ZeroRedundancyOptimizer

Co-authored-by: Junhwa Song <ethan9867@gmail.com>
Signed-off-by: Junhwa Song <ethan9867@gmail.com>
Signed-off-by: Hakjin Lee <nijkah@gmail.com>

* lint

* Fix saving optimizer state_dict

* Fix handling import error

* Add test case

* fix UT

* Revert "fix UT"

This reverts commit dd64538960.

* fix handling import in UT

* Fix saving zero checkpoint and delete redundant master_only

* lint

* test unittest

* Fix handling import error

* Fix UT condition

* Edit docstrings

* Fix typo

* Skip redundant procedure in checkpoint hook

* fix typo again

* Update mmengine/optim/optimizer/zero_optimizer.py

Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>

* Add api info

* lint

* Fix lint

* Handling AmpOptimWrapper case

* handling overlap_with_ddp

* Fix error

Signed-off-by: Junhwa Song <ethan9867@gmail.com>
Signed-off-by: Hakjin Lee <nijkah@gmail.com>
Co-authored-by: Junhwa Song <ethan9867@gmail.com>
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
2022-10-27 20:31:50 +08:00
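
A hedged sketch of a typical config (distributed training only; `optimizer_type` names the optimizer class whose state gets sharded across ranks, and values are illustrative):

```python
# Illustrative sketch; requires torch.distributed to be initialized.
optim_wrapper = dict(
    optimizer=dict(
        type='ZeroRedundancyOptimizer',
        optimizer_type='AdamW',  # the wrapped optimizer class
        lr=1e-3,
        weight_decay=0.05))
```
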
Mashiro 6073d9ebd8
[Enhance] add documents for `clip_grad`, and support clip grad by value. (#513)
* [Enhance] add documents for `clip_grad`, and support clip grad by value

* refine docstring

* fix as comment

* Fix as comment

* minor refine

* minor refine

* remove error comment for clip grad

* refine docstring
2022-10-18 18:02:46 +08:00
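
A sketch of the two clipping modes these documents cover (illustrative values; the `clip_grad` dict is forwarded to torch's clipping utilities):

```python
# Clip by norm (torch.nn.utils.clip_grad_norm_):
optim_wrapper = dict(
    optimizer=dict(type='SGD', lr=0.01),
    clip_grad=dict(max_norm=1.0, norm_type=2))

# Clip by value (torch.nn.utils.clip_grad_value_):
optim_wrapper = dict(
    optimizer=dict(type='SGD', lr=0.01),
    clip_grad=dict(type='value', clip_value=0.5))
```
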
Mashiro 5d7527d75e
[Fix] Fix unit tests that may fail due to `MultiProcessTestCase` (#535)
* fix merge ci

* make timeout larger

* tmp test merge

* debug

* sleep 1 in optimizer ut

* skip test clip grad

* skip test clip grad

* fix merge ci
2022-09-15 18:08:56 +08:00
Zaida Zhou 7e1d7af2d9
[Refactor] Refactor code structure (#395)
* Rename data to structure

* adjust the way to import module

* adjust the way to import module

* rename Structure to Data Structures in docs api

* rename structure to structures

* support using some modules of mmengine without torch

* fix circleci config

* fix circleci config

* fix registry ut

* minor fix

* move init method from model/utils to model/weight_init.py

* move init method from model/utils to model/weight_init.py

* move sync_bn to model

* move functions depending on torch to dl_utils

* format import

* fix logging ut

* add weight init in model/__init__.py

* move get_config and get_model to mmengine/hub

* move log_processor.py to mmengine/runner

* fix ut

* Add TimeCounter in dl_utils/__init__.py
2022-08-24 19:14:07 +08:00
Zaida Zhou 486d8cda56
[Refactor] Refactor the import rule (#459)
* [Refactor] Refactor the import rule

* minor refinement

* add a comment
2022-08-23 18:58:36 +08:00
RangiLyu 813f49bf23
[Feature] Support CosineRestartParamScheduler. (#397)
* [Feature] Support CosineRestartParamScheduler.

* add ut and docstring

* add docstring
2022-08-11 17:57:35 +08:00
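
A minimal sketch of the scheduler added here (illustrative values): `periods` gives the length of each cosine cycle and `restart_weights` rescales the starting value of each cycle:

```python
# Illustrative sketch of cosine annealing with warm restarts.
param_scheduler = dict(
    type='CosineRestartLR',
    periods=[8, 16, 32],
    restart_weights=[1.0, 0.5, 0.25],
    eta_min=1e-5)
```
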
Miao Zheng 39e7efb04d
[Fix] Revise UT of OneCycle scheduler (#388) 2022-07-27 16:22:00 +08:00
Mashiro 6b47035fdf
[Fix] Fix save scheduler state dict with optim wrapper (#375)
* fix save scheduler state dict with optim wrapper

* remove for loop and inherit TestParameterScheduler

* remove for loop and inherit TestParameterScheduler

* minor refine
2022-07-20 16:32:48 +08:00
Ma Zerun 3da66d1f87
[Enhance] Auto set the `end` of param schedulers. (#361)
* [Enhance] Auto set the `end` of param schedulers.

* Add log output and unit test

* Update docstring

* Update unit tests of `CosineAnnealingParamScheduler`.
2022-07-15 19:53:28 +08:00
Mashiro b2ee9f8b11
[Fix] Fix loss that could be NaN in optimizer wrapper (#345)
* fix optimizer wrapper counts

* fix ut
2022-07-06 16:42:49 +08:00
RangiLyu a3d2916790
[Enhance] Support scheduling betas with MomentumScheduler. (#346)
* [Enhance] Support scheduling betas with MomentumScheduler.

* enhance ut

* test adam betas

* enhance ut

* enhance ut
2022-07-05 20:37:23 +08:00
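
A hedged sketch of the behavior this enables: with Adam-family optimizers, which expose `betas` rather than `momentum`, the momentum schedulers act on `betas[0]` (values illustrative):

```python
# Illustrative sketch; the scheduler updates betas[0] for Adam.
optim_wrapper = dict(
    optimizer=dict(type='Adam', lr=1e-3, betas=(0.9, 0.999)))
param_scheduler = dict(
    type='CosineAnnealingMomentum',
    T_max=100,
    eta_min=0.85)
```
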
Mashiro 2fd6beb972
[Fix] Fix UT of optimizer wrapper failed in pytorch1.6 (#340) 2022-06-28 10:31:14 +08:00
Mashiro b7866021c4
[Refactor] Refactor the gradient accumulation implementation of OptimWrapper (#284)
* merge context

* update unit test

* add docstring

* fix bug in AmpOptimWrapper

* add docstring for backward

* add warning and docstring for accumulate gradient

* fix docstring

* fix docstring

* add params_group method

* fix as comment

* fix as comment

* make default_value of loss_scale to dynamic

* Fix docstring

* decouple should update and should no sync

* rename attribute in OptimWrapper

* fix docstring

* fix comment

* fix comment

* fix as comment

* fix as comment and add unit test
2022-06-13 23:20:53 +08:00
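
A sketch of the merged accumulation workflow after this refactor (`model` and `loader` are assumed to exist; `optim_context` also suppresses DDP gradient synchronization on non-update iterations):

```python
from torch.optim import SGD
from mmengine.optim import OptimWrapper

# Illustrative sketch; `model` and `loader` are assumed to exist.
optim_wrapper = OptimWrapper(
    optimizer=SGD(model.parameters(), lr=0.01),
    accumulative_counts=4)  # step once every 4 iterations

for data in loader:
    with optim_wrapper.optim_context(model):
        loss = model(data)
    optim_wrapper.update_params(loss)  # backward each time; step/zero_grad every 4th
```
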
Miao Zheng fd295741ca
[Features] Add OneCycleLR (#296)
* [Features] Add OneCycleLR

* [Features] Add OneCycleLR

* yapf disable

* build_iter_from_epoch

* add epoch

* fix args

* fix according to comments;

* lr-param

* fix according to comments

* defaults -> default to

* remove epoch and steps per step

* variable names
2022-06-13 21:23:59 +08:00
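
A minimal config sketch (values illustrative; unlike torch's class, the peak learning rate is named `eta_max` here):

```python
# Illustrative sketch of a one-cycle schedule over 1000 iterations.
param_scheduler = dict(
    type='OneCycleLR',
    eta_max=0.01,      # peak learning rate
    total_steps=1000,  # length of the whole cycle
    pct_start=0.3,     # fraction of the cycle spent increasing the LR
    by_epoch=False)
```
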
Mashiro 3e3866c1b9
[Feature] Add optimizer wrapper (#265)
* Support multiple optimizers

* minor refinement

* improve unit tests

* minor fix

* Update unit tests for resuming or saving ckpt for multiple optimizers

* refine docstring

* refine docstring

* fix typo

* update docstring

* refactor the logic to build multiple optimizers

* resolve comments

* ParamSchedulers supports multiple optimizers

* add optimizer_wrapper

* fix comment and docstring

* fix unit test

* add unit test

* refine docstring

* RuntimeInfoHook supports printing multiple learning rates

* resolve comments

* add optimizer_wrapper

* fix mypy

* fix lint

* fix OptimizerWrapperDict docstring and add unit test

* rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment

* Fix AmpOptimizerWrapper

* rename build_optmizer_wrapper to build_optim_wrapper

* refine optimizer wrapper

* fix AmpOptimWrapper.step, docstring

* resolve conflict

* rename DefaultOptimConstructor

* fix as comment

* rename clip grad arguments

* refactor optim_wrapper config

* fix docstring of DefaultOptimWrapperConstructor

* add get_lr method to OptimWrapper and OptimWrapperDict

* skip some amp unit test

* fix unit test

* fix get_lr, get_momentum docstring

* refactor get_lr, get_momentum, fix as comment

* fix error message

Co-authored-by: zhouzaida <zhouzaida@163.com>
2022-06-01 18:04:38 +08:00
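
A hedged sketch of the basic workflow this PR introduces: `update_params` bundles `backward`, `step`, and `zero_grad` into a single call (`model` and `batch` are assumed to exist):

```python
from torch.optim import SGD
from mmengine.optim import OptimWrapper

# Illustrative sketch; `model` and `batch` are assumed to exist.
optim_wrapper = OptimWrapper(optimizer=SGD(model.parameters(), lr=0.01))

loss = model(batch)
optim_wrapper.update_params(loss)  # loss.backward(); optimizer.step(); zero_grad()
print(optim_wrapper.get_lr())      # e.g. {'lr': [0.01]}
```
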
RangiLyu 1912660db9
[Feature] Support convert epoch-based schedulers to iter-based. (#221)
* [Feature] Support convert epoch-based schedulers to iter-based.

* Support convert and refactor LR and Momentum to mixin.

* Add unit tests

* fix args and add runner ut

* resolve comments
2022-05-10 15:17:51 +08:00
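
A sketch of the added option (values illustrative): milestones are written in epochs, but the runner rescales them with the dataloader length and steps the scheduler per iteration:

```python
# Illustrative sketch of converting an epoch-based schedule.
param_scheduler = dict(
    type='MultiStepLR',
    by_epoch=True,               # milestones are given in epochs
    milestones=[8, 11],
    convert_to_iter_based=True)  # rescaled to iterations by the runner
```
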
Tong Gao c3aff4fc9a
[Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR (#188)
* [Enhancement] Add PolyParamScheduler, PolyMomentum and PolyLR

* min_lr -> eta_min, refined docstr
2022-04-25 13:44:15 +08:00
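
A minimal sketch of the polynomial schedule (values illustrative):

```python
# Illustrative sketch of PolyLR.
param_scheduler = dict(
    type='PolyLR',
    power=0.9,      # exponent of the polynomial decay
    eta_min=1e-4,   # floor for the learning rate
    by_epoch=False)
```
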
RangiLyu 49b7d0ce6f
Support default_scope when building optimizer and evaluator. (#109)
* Support default_scope when building optimizer and evaluator.

* add docstring

* fix

* fix
2022-03-08 16:05:29 +08:00
RangiLyu c2c5664fad
Fix pt1.5 unit tests. (#65)
* Fix pt1.5 unit tests.

* move to mmengine.testing
2022-03-01 11:28:21 +08:00
Zaida Zhou 1e79b97444
Mock unimplemented modules and fix unit tests (#54)
* Mock unimplemented modules and fix unit tests

* add a comment
2022-02-25 15:24:27 +08:00
RangiLyu 7353778b7c
[Feature]: Add optimizer and constructor. (#25)
* [Feature]: Add optimizer and constructor.

* refactor unit tests

* add optimizer doc

* add parrots wrapper

* add parrots wrapper

* solve comments

* resolve comments
2022-02-19 14:09:37 +08:00
RangiLyu 7905f039b6
[Feature]: Add parameter schedulers. (#22)
* [Feature]: Add parameter schedulers.

* update

* update

* update

* update

* add docstring to lr and momentum

* resolve comments
2022-02-16 23:43:59 +08:00
RangiLyu bbb7d625e6
add scheduler unit test (#13)
* tmp

* add scheduler unit test

* disable yapf

* add more test

* add more test

* not use torch test case

* solve comments

* update file

* add more unit tests

* resolve comments

* update cosine ut

* fix typo

* solve comments

* solve comments

* resolve comments
2022-02-16 22:33:46 +08:00