Commit Graph

17 Commits (e877862d5b94876fda62c47ec7f5c0aa0c5d3275)

Mashiro 312f264ecd
[Feature] Add autocast wrapper (#307)
* add autocast wrapper

* fix docstring

* fix docstring

* fix compare version

* fix unit test

* fix incompatible arguments

* fix as comment

* fix unit test

* rename auto_cast to autocast
2022-06-22 19:49:20 +08:00
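A minimal sketch of the autocast wrapper added in #307, assuming the released entry point `mmengine.runner.autocast` (a thin shim over `torch.autocast` that also handles older PyTorch versions, per the version-comparison fixes above):

```python
import torch
from mmengine.runner import autocast

model = torch.nn.Linear(4, 2)
data = torch.randn(8, 4)
if torch.cuda.is_available():
    model, data = model.cuda(), data.cuda()
    with autocast(enabled=True):       # mixed precision on CUDA
        out = model(data)
    print(out.dtype)                   # torch.float16 inside autocast
else:
    with autocast(enabled=False):      # full-precision fallback
        out = model(data)
```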
Alex Yang ef946404e6
[Feat] Support FSDP Training (#304)
* [Feat] Support FSDP Training

* fix version comparison

* change param format and move `FSDP_WRAP_POLICY` to wrapper file

* add docstring and type hint, reformat code

* fix type hint

* fix typo, reformat code
2022-06-21 15:32:56 +08:00
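A hedged config sketch of enabling the FSDP wrapper from #304; the class name `MMFullyShardedDataParallel` and the `model_wrapper_cfg` key are assumptions inferred from the commit messages, and FSDP itself requires an initialized process group:

```python
# Hypothetical runner config fragment, not verified against this revision.
model_wrapper_cfg = dict(
    type='MMFullyShardedDataParallel',
    # entries registered in FSDP_WRAP_POLICY (moved into the wrapper file
    # above) choose how submodules are grouped into FSDP shards
)
```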
Mashiro b7866021c4
[Refactor] Refactor the accumulate gradient implementation of OptimWrapper (#284)
* merge context

* update unit test

* add docstring

* fix bug in AmpOptimWrapper

* add docstring for backward

* add warning and docstring for accumulate gradient

* fix docstring

* fix docstring

* add params_group method

* fix as comment

* fix as comment

* change the default value of loss_scale to 'dynamic'

* Fix docstring

* decouple the should-update and should-no-sync checks

* rename attribute in OptimWrapper

* fix docstring

* fix comment

* fix comment

* fix as comment

* fix as comment and add unit test
2022-06-13 23:20:53 +08:00
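A minimal sketch of the accumulate-gradient flow this refactor produces, assuming the released `OptimWrapper` API (`accumulative_counts`, `optim_context`, `update_params`):

```python
import torch
from torch.optim import SGD
from mmengine.optim import OptimWrapper

model = torch.nn.Linear(4, 2)
optim_wrapper = OptimWrapper(SGD(model.parameters(), lr=0.01),
                             accumulative_counts=4)  # step every 4th iter

for _ in range(8):
    with optim_wrapper.optim_context(model):   # the merged context above;
        loss = model(torch.randn(8, 4)).sum()  # skips DDP grad sync on
    optim_wrapper.update_params(loss)          # non-update iterations
```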
RangiLyu 819e10c24c
[Fix] Fix image dtype when enable_normalize=False. (#301)
* [Fix] Fix image dtype when enable_normalize=False.

* update ut

* move to collate

* update ut
2022-06-13 21:21:19 +08:00
Mashiro bcab813242
[Feature] Add ModuleList Sequential and ModuleDict (#299)
* add module list

* add module list

* fix docstring
2022-06-13 13:51:07 +08:00
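A short sketch of the `init_cfg`-aware containers from #299, assuming they are exported from `mmengine.model` as in the released package:

```python
import torch.nn as nn
from mmengine.model import ModuleList, Sequential, ModuleDict

convs = ModuleList(
    [nn.Conv2d(3, 8, 3), nn.Conv2d(8, 8, 3)],
    init_cfg=dict(type='Kaiming', layer='Conv2d'),
)
convs.init_weights()   # init_cfg is applied here, via BaseModule
```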
Mashiro 931db99005
[Enhance] Enhance img data preprocessor (#290)
* fix BaseDataPreprocessor

* fix BaseDataPreprocessor

* change device type to torch.device

* change device type to torch.device

* fix cpu method of base model

* Allow ImgDataPreprocessor do not normalize

* remove unnecessary type ignore

* make mean and std optional

* refine docstring
2022-06-09 20:12:15 +08:00
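A sketch of the behavior after #290, assuming the released `ImgDataPreprocessor` signature: `mean` and `std` are now optional, and omitting them skips normalization entirely.

```python
from mmengine.model import ImgDataPreprocessor

# normalizing preprocessor (ImageNet statistics, channel order flipped)
norm = ImgDataPreprocessor(mean=[123.675, 116.28, 103.53],
                           std=[58.395, 57.12, 57.375],
                           bgr_to_rgb=True)
# mean/std omitted: images are only stacked, padded and cast
raw = ImgDataPreprocessor()
```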
Mashiro a9afdad7a8
[Fix] Fix BaseDataPreprocessor and BaseModel (#285)
* fix BaseDataPreprocessor

* fix BaseDataPreprocessor

* change device type to torch.device

* change device type to torch.device

* fix cpu method of base model
2022-06-09 11:45:19 +08:00
Mashiro 6ee675430f
[Refactor]: change order of BaseModel arguments (#282) 2022-06-08 13:28:00 +08:00
Mashiro f04fec736d
[Feature]: add base model, ddp model wrapper and unit test (#268)
* add base model, ddp model and unit test

* add unit test

* fix unit test

* fix docstring

* fix cpu unit test

* refine base data preprocessor

* refine base data preprocessor

* refine interface of ddp module

* remove optimizer hook

* add forward

* fix as comment

* fix unit test

* fix as comment

* fix build optimizer wrapper

* rebase main and fix unit test

* stack_batch support stacking ndim tensor, add docstring for merge dict

* fix lint

* fix test loop

* make precision_context effective to data_preprocessor

* fix as comment

* fix as comment

* refine docstring

* change collate_data output typehints

* rename to_rgb to bgr_to_rgb and rgb_to_bgr

* support build basemodel with built DataPreprocessor

* fix as comment

* fix docstring
2022-06-07 22:13:53 +08:00
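A minimal sketch of the `BaseModel` contract introduced here, assuming the released `forward(inputs, data_samples, mode)` convention; the attached data preprocessor stacks and normalizes raw batches (`stack_batch`, per the commits above):

```python
import torch
from mmengine.model import BaseModel

class ToyModel(BaseModel):
    def __init__(self):
        super().__init__()        # optionally takes a data_preprocessor
        self.linear = torch.nn.Linear(4, 2)

    def forward(self, inputs, data_samples=None, mode='tensor'):
        out = self.linear(inputs)
        if mode == 'loss':        # train_step consumes the loss dict
            return {'loss': out.sum()}
        return out                # 'tensor' / 'predict' paths
```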
Alex Yang 13606040ac
[Feat]: Add base module (#277) 2022-06-06 10:51:23 +08:00
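A sketch of what #277 adds, assuming the released `mmengine.model.BaseModule` API: an `nn.Module` with `init_cfg`-driven weight initialization.

```python
import torch.nn as nn
from mmengine.model import BaseModule

class ConvBlock(BaseModule):
    def __init__(self, init_cfg=None):
        super().__init__(init_cfg=init_cfg)
        self.conv = nn.Conv2d(3, 8, 3)

block = ConvBlock(init_cfg=dict(type='Kaiming', layer='Conv2d'))
block.init_weights()   # applies init_cfg recursively, exactly once
```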
Mashiro 3e3866c1b9
[Feature] Add optimizer wrapper (#265)
* Support multiple optimizers

* minor refinement

* improve unit tests

* minor fix

* Update unit tests for resuming or saving ckpt for multiple optimizers

* refine docstring

* refine docstring

* fix typo

* update docstring

* refactor the logic to build multiple optimizers

* resolve comments

* ParamSchedulers support multiple optimizers

* add optimizer_wrapper

* fix comment and docstring

* fix unit test

* add unit test

* refine docstring

* RuntimeInfoHook supports printing multi learning rates

* resolve comments

* add optimizer_wrapper

* fix mypy

* fix lint

* fix OptimizerWrapperDict docstring and add unit test

* rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment

* Fix AmpOptimizerWrapper

* rename build_optmizer_wrapper to build_optim_wrapper

* refine optimizer wrapper

* fix AmpOptimWrapper.step, docstring

* resolve conflict

* rename DefaultOptimConstructor

* fix as comment

* rename clip grad arguments

* refactor optim_wrapper config

* fix docstring of DefaultOptimWrapperConstructor

* add get_lr method to OptimWrapper and OptimWrapperDict

* skip some amp unit test

* fix unit test

* fix get_lr, get_momentum docstring

* refactor get_lr, get_momentum, fix as comment

* fix error message

Co-authored-by: zhouzaida <zhouzaida@163.com>
2022-06-01 18:04:38 +08:00
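A sketch of the multi-optimizer support from #265 (e.g. for GAN-style models), assuming the released `OptimWrapperDict` API described in the renames above:

```python
import torch
from torch.optim import SGD
from mmengine.optim import OptimWrapper, OptimWrapperDict

gen, disc = torch.nn.Linear(4, 4), torch.nn.Linear(4, 1)
optim_wrapper = OptimWrapperDict(
    generator=OptimWrapper(SGD(gen.parameters(), lr=0.01)),
    discriminator=OptimWrapper(SGD(disc.parameters(), lr=0.001)),
)
print(optim_wrapper.get_lr())   # one entry per wrapped optimizer
```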
RangiLyu 172b9ded4a
[Fix] Fix ema state dict swapping in EMAHook and torch 1.5 ut. (#266)
* [Fix] Fix ema state dict swapping in EMAHook.

* fix pt1.5 ut

* add more comments
2022-05-30 16:51:06 +08:00
RangiLyu 0279ac2e8d
[Feature] Support EMA and SWA. (#239)
* [Feature] Support EMA and SWA.

* add ema hook

* add avg model ut

* add more unit tests

* resolve comments

* fix warmup ema

* rename

* fix comments

* add assert

* fix typehint

* add comments
2022-05-19 18:53:04 +08:00
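A sketch of the EMA side of #239, assuming the released `ExponentialMovingAverage` and `EMAHook` names (the PR also adds an SWA averaged-model variant); the hook swaps the averaged weights in for validation and checkpointing:

```python
import torch
from mmengine.model import ExponentialMovingAverage

model = torch.nn.Linear(4, 2)
ema_model = ExponentialMovingAverage(model, momentum=0.0002)

ema_model.update_parameters(model)     # called after each optimizer step

# runner config entry letting EMAHook manage the swap automatically
custom_hooks = [dict(type='EMAHook')]
```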
Zaida Zhou 86ffc19c9c
Add pyupgrade pre-commit hook (#232)
* Add pyupgrade pre-commit hook

* fix ut

* remove comments
2022-05-19 17:56:31 +08:00
RangiLyu 3d830a28b6
[Fix]: Fix is_model_wrapper and add DistSamplerSeedHook to default hooks. (#172)
* [Fix]: Fix model_wrapper and add DistSamplerSeedHook as default hook.

* add comments
2022-04-08 22:18:23 +08:00
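A sketch of what `is_model_wrapper` checks, assuming it is exported from `mmengine.model` as in the released package:

```python
import torch
from mmengine.model import is_model_wrapper

model = torch.nn.Linear(4, 2)
assert not is_model_wrapper(model)   # plain modules are not wrappers

# in a launched distributed job, unwrap before touching custom attributes:
# if is_model_wrapper(model):
#     model = model.module
```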
Haian Huang(深度眸) 5170676a2f
fix mmdp unittest (#60) 2022-02-27 20:34:52 +08:00
Haian Huang(深度眸) 492b2f2fa8
[Feature] Add MMDataParallel and MMDistributedDataParallel (#44)
* add mmddp

* update

* update code

* update unittest

* fix comment
2022-02-26 10:21:20 +08:00
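An illustrative fragment for the wrappers added in #44; the import path is assumed from later releases (at this early revision they may live elsewhere), and DDP needs an initialized process group, so the call is shown commented out:

```python
from mmengine.model import MMDistributedDataParallel

# inside a distributed launch (torch.distributed initialized):
# ddp_model = MMDistributedDataParallel(model.cuda(),
#                                       device_ids=[local_rank])
# train_step/val_step then dispatch to the wrapped module's methods
```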