Commit Graph

233 Commits (d09af9ead4b1871d2105e3fd03dc88287325c859)
 

Author SHA1 Message Date
Zaida Zhou d09af9ead4
[Doc]: update root registries in docs (#316) 2022-06-21 15:12:49 +08:00
Tao Gong 45f5859b50
[Doc]: refactor docs for basedataset (#318) 2022-06-21 14:58:10 +08:00
Mashiro 44538e56c5
[Doc]: refine logging doc (#320) 2022-06-21 14:55:21 +08:00
Jiazhen Wang e1422a34a3
[Fix]: Fix missing schedulers in __init__.py of schedulers (#319) 2022-06-21 14:40:00 +08:00
RangiLyu e470c3aa1b
[Fix]: fix SWA in pytorch 1.6 (#312) 2022-06-21 14:35:22 +08:00
Mashiro bc763758d8
Fix resource package on Windows (#308)
* move import resource

* move import resource
2022-06-17 14:43:27 +08:00
Mashiro 4a4d6b1ab2
[Enhance] dump messagehub in runner.resume (#237)
* [Enhance] dump messagehub in runner.resume

* delete unnecessary code

* delete debugging code

Co-authored-by: imabackstabber <312276423@qq.com>
2022-06-17 11:10:37 +08:00
Mashiro 7129a98e36
[Fix]: fix log processor to log average time and grad norm (#292) 2022-06-17 10:54:20 +08:00
Jiazhen Wang 7b55c5bdbf
[Feature] Support resume from Ceph (#294)
* support resume from ceph

* move func and refine

* delete symlink

* fix unittest

* preserve _allow_symlink and symlink
2022-06-17 10:37:19 +08:00
Jiazhen Wang d0d7174274
[Feature] Support MLU Devices (#288)
* support mlu

* add ut and refine docstring
2022-06-16 20:28:09 +08:00
Mashiro e1ed5669d5
set resource limit in runner (#306) 2022-06-15 21:01:13 +08:00
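The two entries above both touch Python's Unix-only `resource` module: #306 raises the runner's open-file limit, and #308 guards the import so Windows installs keep working. A minimal sketch of that pattern (the helper name is illustrative, not MMEngine's actual function):

```python
import platform

# `resource` exists only on Unix-like systems, so the import must be guarded.
if platform.system() != 'Windows':
    import resource


def set_open_files_limit(soft_limit: int = 4096) -> None:
    """Raise the soft limit on open file descriptors where supported."""
    if platform.system() == 'Windows':
        return  # no-op on Windows, where this limit does not apply
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    if hard != resource.RLIM_INFINITY:
        soft_limit = min(soft_limit, hard)  # never exceed the hard limit
    resource.setrlimit(resource.RLIMIT_NOFILE, (soft_limit, hard))
```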
Mashiro 7d3224bf46
[Fix] Fix setLevel of MMLogger (#297)
* Fix setLevel of MMLogger

Fix setLevel of MMLogger

* add docstring and comment
2022-06-14 14:54:25 +08:00
RangiLyu 1c18f30854
[Enhance] Support infinite dataloader iterator wrapper for IterBasedTrainLoop. (#289) 2022-06-14 14:52:59 +08:00
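#289 above wraps the dataloader so `IterBasedTrainLoop` can draw batches indefinitely. A stripped-down sketch of the idea; MMEngine's real wrapper also reseeds distributed samplers between passes, which is omitted here:

```python
from torch.utils.data import DataLoader


class InfiniteDataLoaderIterator:
    """Iterate over a DataLoader forever by restarting it on exhaustion."""

    def __init__(self, dataloader: DataLoader) -> None:
        self._dataloader = dataloader
        self._iterator = iter(dataloader)

    def __iter__(self):
        return self

    def __next__(self):
        try:
            return next(self._iterator)
        except StopIteration:
            # The epoch is exhausted: start a fresh pass over the data.
            self._iterator = iter(self._dataloader)
            return next(self._iterator)
```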
Alex Yang 5016332588
[Feat] support registering function (#302) 2022-06-14 14:50:24 +08:00
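#302 above lets plain functions be registered, not just classes. A minimal sketch against `mmengine.registry.Registry`; the `FUNCTIONS` registry and `flip_list` function are made up for illustration:

```python
from mmengine.registry import Registry

FUNCTIONS = Registry('function')  # a hypothetical registry for this example


@FUNCTIONS.register_module()
def flip_list(data):
    """An ordinary function registered the same way a class would be."""
    return list(reversed(data))


# Retrieve it by name later, e.g. when resolving a config entry.
flip = FUNCTIONS.get('flip_list')
assert flip([1, 2, 3]) == [3, 2, 1]
```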
RangiLyu 4cd91ffe15
[Feature] Dump predictions to a pickle file for offline evaluation. (#293)
* [Feature] Dump predictions to pickle file for offline evaluation.

* print_log
2022-06-14 14:48:21 +08:00
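#293 above adds dumping of test-time predictions so metrics can be recomputed offline. The sketch below shows the generic pattern with the standard `pickle` module only; MMEngine's actual hook/metric names are not assumed:

```python
import pickle
from typing import Any, List


def dump_predictions(predictions: List[Any], out_path: str = 'results.pkl') -> None:
    """Serialize per-sample model outputs collected during a test run."""
    with open(out_path, 'wb') as f:
        pickle.dump(predictions, f)


def load_predictions(path: str = 'results.pkl') -> List[Any]:
    """Reload the dumped outputs for offline evaluation."""
    with open(path, 'rb') as f:
        return pickle.load(f)
```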
Mashiro b7866021c4
[Refactor] Refactor the gradient accumulation implementation of OptimWrapper (#284)
* merge context

* update unit test

* add docstring

* fix bug in AmpOptimWrapper

* add docstring for backward

* add warning and docstring for accumulate gradient

* fix docstring

* fix docstring

* add params_group method

* fix as comment

* fix as comment

* make default_value of loss_scale to dynamic

* Fix docstring

* decouple should update and should no sync

* rename attribute in OptimWrapper

* fix docstring

* fix comment

* fix comment

* fix as comment

* fix as comment and add unit test
2022-06-13 23:20:53 +08:00
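#284 above reworks how `OptimWrapper` accumulates gradients and decouples "should update" from "should not sync". The plain-PyTorch sketch below shows only the underlying technique, not MMEngine's interface:

```python
import torch
import torch.nn.functional as F


def train_with_accumulation(model, optimizer, data_iter, accumulation_steps=4):
    """Scale each loss by the accumulation factor and step every N batches."""
    optimizer.zero_grad()
    for idx, (inputs, targets) in enumerate(data_iter):
        loss = F.mse_loss(model(inputs), targets)
        (loss / accumulation_steps).backward()  # gradients accumulate in .grad
        if (idx + 1) % accumulation_steps == 0:
            optimizer.step()
            optimizer.zero_grad()
```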
Miao Zheng fd295741ca
[Features] Add OneCycleLR (#296)
* [Features] Add OneCycleLR

* [Features] Add OneCycleLR

* yapf disable

* build_iter_from_epoch

* add epoch

* fix args

* fix according to comments;

* lr-param

* fix according to comments

* defaults -> default to

* remove epoch and steps per step

* variable names
2022-06-13 21:23:59 +08:00
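#296 above adds a OneCycleLR parameter scheduler. The sketch below drives the underlying `torch.optim.lr_scheduler.OneCycleLR` directly to show the per-iteration stepping the commit builds on; the MMEngine wrapper's exact config keys are not assumed:

```python
import torch
from torch.optim.lr_scheduler import OneCycleLR

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# total_steps is usually epochs * iters_per_epoch; 1000 is illustrative.
scheduler = OneCycleLR(optimizer, max_lr=0.1, total_steps=1000)

for _ in range(1000):
    optimizer.step()   # one training iteration (forward/backward omitted)
    scheduler.step()   # OneCycleLR is stepped once per iteration
```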
Mashiro 8b0c9c5f6f
[Fix] fix build train_loop during test (#295)
* fix build train_loop during test

* fix build train_loop during test

* fix build train_loop during test

* fix build train_loop during test

* Fix as comment
2022-06-13 21:23:46 +08:00
RangiLyu 819e10c24c
[Fix] Fix image dtype when enable_normalize=False. (#301)
* [Fix] Fix image dtype when enable_normalize=False.

* update ut

* move to collate

* update ut
2022-06-13 21:21:19 +08:00
Mashiro bcab813242
[Feature] Add ModuleList, Sequential and ModuleDict (#299)
* add module list

* add module list

* fix docstring
2022-06-13 13:51:07 +08:00
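#299 above adds `ModuleList`, `Sequential` and `ModuleDict` containers that behave like their `torch.nn` counterparts while carrying the `init_cfg` handling of `BaseModule`. A sketch under the assumption that they are importable from `mmengine.model`; `TinyNeck` is a made-up example module:

```python
import torch.nn as nn
from mmengine.model import BaseModule, ModuleList


class TinyNeck(BaseModule):
    """Illustrative module built from the init-aware ModuleList container."""

    def __init__(self, channels=(16, 32), init_cfg=None):
        super().__init__(init_cfg=init_cfg)
        self.convs = ModuleList(
            nn.Conv2d(c, c, kernel_size=3, padding=1) for c in channels)

    def forward(self, feats):
        return [conv(f) for conv, f in zip(self.convs, feats)]
```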
Alex Yang df0c510444
[Feat]:support customizing evaluator (#287)
* [Feat]:support customizing evaluator

* fix key name that determines whether to use the default evaluator

* add assertion

* fix typo
2022-06-10 15:34:10 +08:00
liukuikun c90b95a44b
[Fix]: fix label data and support empty tensor in label_to_onehot (#291) 2022-06-10 15:12:41 +08:00
RangiLyu 2f16ec69fb
[Feature] Support overwrite default scope with "_scope_". (#275)
* [Feature] Support overwrite default scope with "_scope_".

* add ut

* add ut
2022-06-09 20:16:31 +08:00
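#275 above lets a single config node escape the default registry scope via a `_scope_` key. A config-level sketch; the model names and scopes (`mmdet`, `mmcls`) are purely illustrative:

```python
# The outer model is resolved in the default scope (say, mmdet), while the
# backbone dict is looked up in the mmcls registries because of `_scope_`.
model = dict(
    type='RetinaNet',
    backbone=dict(
        _scope_='mmcls',
        type='ResNet',
        depth=50),
)
```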
jbwang1997 7a5d3c83ea
[Fix] Replace auto_scale_lr_cfg with auto_scale_lr (#286)
* Replace auto_scale_lr_cfg with auto_scale_lr

* Update
2022-06-09 20:15:36 +08:00
Mashiro 931db99005
[Enhance] Enhance img data preprocessor (#290)
* fix BaseDataPreprocessor

* fix BaseDataPreprocessor

* change device type to torch.device

* change device type to torch.device

* fix cpu method of base model

* Allow ImgDataPreprocessor to skip normalization

* remove unnecessary type ignore

* make mean and std optional

* refine docstring
2022-06-09 20:12:15 +08:00
Yixiao Fang 8b3675a2aa
[Enhance] support to add custom settings to param_group (#283) 2022-06-09 20:11:19 +08:00
Mashiro a9afdad7a8
[Fix] Fix BaseDataPreprocessor and BaseModel (#285)
* fix BaseDataPreprocessor

* fix BaseDataPreprocessor

* change device type to torch.device

* change device type to torch.device

* fix cpu method of base model
2022-06-09 11:45:19 +08:00
RangiLyu 6f321f88ee
[Enhance] Optimize parameter updating speed in AveragedModel. (#281)
* [Enhance] Optimize parameter updating speed in AveragedModel.

* add docstring
2022-06-08 16:38:27 +08:00
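#281 above speeds up how `AveragedModel` (e.g. the EMA model) folds new parameters into the running average. The sketch below shows the general in-place update such an optimization targets, not MMEngine's exact code:

```python
import torch


@torch.no_grad()
def ema_update(averaged_params, source_params, momentum: float = 0.001):
    """Exponential moving average applied in-place with lerp_."""
    for avg, src in zip(averaged_params, source_params):
        if avg.dtype.is_floating_point:
            avg.lerp_(src, momentum)  # avg = (1 - m) * avg + m * src
        else:
            avg.copy_(src)            # integer buffers are copied directly
```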
Mashiro 6ee675430f
[Refactor]: change order of BaseModel arguments (#282) 2022-06-08 13:28:00 +08:00
Mashiro f04fec736d
[Feature]: add base model, ddp model wrapper and unit test (#268)
* add base model, ddp model and unit test

* add unit test

* fix unit test

* fix docstring

* fix cpu unit test

* refine base data preprocessor

* refine base data preprocessor

* refine interface of ddp module

* remove optimizer hook

* add forward

* fix as comment

* fix unit test

* fix as comment

* fix build optimizer wrapper

* rebase main and fix unit test

* stack_batch supports stacking ndim tensors; add docstring for merge dict

* fix lint

* fix test loop

* make precision_context effective to data_preprocessor

* fix as comment

* fix as comment

* refine docstring

* change collate_data output typehints

* rename to_rgb to bgr_to_rgb and rgb_to_bgr

* support build basemodel with built DataPreprocessor

* fix as comment

* fix docstring
2022-06-07 22:13:53 +08:00
RangiLyu ad965a5309
[Enhance] Enhance checkpoint meta info. (#279) 2022-06-07 18:48:50 +08:00
Mashiro 538ff48aec
[Fix] Rename data_list and support loading from ceph in dataset (#240)
* rename datalist and support load ceph

* rename datalist and support load ceph

* remove check disk file path in _load_metainfo

* fix rename error

* fix rename error

* unit test error

* fix rename error

* remove unnecessary code

* fix lint
2022-06-07 17:09:33 +08:00
jbwang1997 bd3c53b385
[Fix] Fix CI after merging support auto scale lr and support custom runner (#280) 2022-06-07 16:03:51 +08:00
jbwang1997 8f3fcee301
[Feature] Add auto scale lr function (#270)
* Add auto scale lr function

* Update

* Update

* Update

* Update

* Update

* Update

* Update

* Update

* Update

* Update

Co-authored-by: wangjiabao1.vendor <wangjiabao@pjlab.org.cn>
2022-06-06 22:27:15 +08:00
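#270 above implements automatic learning-rate scaling. A sketch of the linear scaling rule it is based on; the helper name and numbers are illustrative:

```python
def auto_scale_lr(base_lr: float, base_batch_size: int, real_batch_size: int) -> float:
    """Scale the learning rate in proportion to the total batch size used."""
    return base_lr * real_batch_size / base_batch_size


# A config tuned for a total batch size of 16, run with a total batch size of 64:
lr = auto_scale_lr(base_lr=0.02, base_batch_size=16, real_batch_size=64)  # 0.08
```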
Jiazhen Wang 65bc95036c
[Enhance] Support Custom Runner (#258)
* support custom runner

* change build_runner_from_cfg

* refine docstring

* refine docstring
2022-06-06 14:33:32 +08:00
Haian Huang(深度眸) 94c7c3be2c
[Enhance]: remove warning in vis_backend (#273)
* fix visbackend warning

* fix
2022-06-06 14:02:15 +08:00
RangiLyu 70c4ea191f
[Refactor]: Modify val_interval and val_begin to be the attributes of TrainLoop. (#274)
* Modify val_interval and val_begin to be the attributes of TrainLoop.

* update doc

* fix lint

* type hint
2022-06-06 11:13:25 +08:00
Alex Yang 13606040ac
[Feat]:Add base module (#277) 2022-06-06 10:51:23 +08:00
Mashiro 80a46c4848
[Fix] fix build optimizer wrapper without type (#272)
* fix build optimizer wrapper without type

* refine logic

* fix as comment

* fix optim_wrapper config error in docstring and unit test

* refine docstring of build_optim_wrapper
2022-06-05 22:35:16 +08:00
Mashiro 3e3866c1b9
[Feature] Add optimizer wrapper (#265)
* Support multiple optimizers

* minor refinement

* improve unit tests

* minor fix

* Update unit tests for resuming or saving ckpt for multiple optimizers

* refine docstring

* refine docstring

* fix typo

* update docstring

* refactor the logic to build multiple optimizers

* resolve comments

* ParamSchedulers supports multiple optimizers

* add optimizer_wrapper

* fix comment and docstring

* fix unit test

* add unit test

* refine docstring

* RuntimeInfoHook supports printing multi learning rates

* resolve comments

* add optimizer_wrapper

* fix mypy

* fix lint

* fix OptimizerWrapperDict docstring and add unit test

* rename OptimizerWrapper to OptimWrapper, OptimWrapperDict inherit OptimWrapper, and fix as comment

* Fix AmpOptimizerWrapper

* rename build_optmizer_wrapper to build_optim_wrapper

* refine optimizer wrapper

* fix AmpOptimWrapper.step, docstring

* resolve conflict

* rename DefaultOptimConstructor

* fix as comment

* rename clip grad arguments

* refactor optim_wrapper config

* fix docstring of DefaultOptimWrapperConstructor

* add get_lr method to OptimWrapper and OptimWrapperDict

* skip some amp unit test

* fix unit test

* fix get_lr, get_momentum docstring

* refactor get_lr, get_momentum, fix as comment

* fix error message

Co-authored-by: zhouzaida <zhouzaida@163.com>
2022-06-01 18:04:38 +08:00
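#265 above introduces `OptimWrapper`, which bundles backward, optional gradient clipping/accumulation, `step` and `zero_grad` behind one call. A minimal usage sketch, assuming the interface matches what the commit describes:

```python
import torch
import torch.nn.functional as F
from mmengine.optim import OptimWrapper

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
optim_wrapper = OptimWrapper(optimizer=optimizer)

inputs, targets = torch.randn(8, 4), torch.randn(8, 2)
loss = F.mse_loss(model(inputs), targets)

# One call replaces the usual backward / step / zero_grad sequence.
optim_wrapper.update_params(loss)
```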
RangiLyu 987e5b83f9
fix wrong import (#271) 2022-05-31 22:59:08 +08:00
Mashiro 73da44805f
[Enhancement] add TypeVar type hint for ManagerMixin (#269) 2022-05-31 17:00:45 +08:00
Alex Yang b01b3ff97c
[Feat]: support displaying paramwise results when constructing the optimizer (#262)
* [Feat]: support displaying paramwise results when constructing the optimizer

* [fix]:fix format issue

* delete unnecessary rank logic and fix format
2022-05-31 16:59:46 +08:00
Zaida Zhou f1da9a1d7f
[Feature] Support multiple optimizers (#235)
* Support multiple optimizers

* minor refinement

* improve unit tests

* minor fix

* Update unit tests for resuming or saving ckpt for multiple optimizers

* refine docstring

* refine docstring

* fix typo

* update docstring

* refactor the logic to build multiple optimizers

* resolve comments

* ParamSchedulers supports multiple optimizers

* refine docstring

* RuntimeInfoHook supports printing multi learning rates

* resolve comments

* fix typo
2022-05-31 16:54:39 +08:00
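#235 above allows one optimizer per sub-module (e.g. a GAN's generator and discriminator). A config-style sketch of the shape this takes; the keys and hyper-parameters are illustrative and use the later `OptimWrapper`-based layout:

```python
# One optimizer (wrapper) per sub-module, keyed by the sub-module's name.
optim_wrapper = dict(
    generator=dict(
        type='OptimWrapper',
        optimizer=dict(type='Adam', lr=2e-4, betas=(0.5, 0.999))),
    discriminator=dict(
        type='OptimWrapper',
        optimizer=dict(type='Adam', lr=1e-4, betas=(0.5, 0.999))),
)
```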
Jiazhen Wang f2190de787
[Enhance] Improve Exception in call_hook (#247)
* improve exception in call_hook

* refine unit test

* add test_call_hook

* refine

* update docstring and ut
2022-05-31 11:34:30 +08:00
jbwang1997 38b22d9e68
[Enhance] Enhance error report when a module has been registered in the registry. (#264)
* Update

* Add unittest
2022-05-31 11:31:04 +08:00
RangiLyu 172b9ded4a
[Fix] Fix ema state dict swapping in EMAHook and torch1.5 ut. (#266)
* [Fix] Fix ema state dict swapping in EMAHook.

* fix pt1.5 ut

* add more comments
2022-05-30 16:51:06 +08:00
Jingwei Zhang 40daf46a45
Support validation only after some epoch/iteration in ValLoop (#257)
* add the epoch/iter that begins validating

* fix lint

* add property and fix unit test

* minor changes

* fix typos and add unit test

* add unit test about begin

* fix docstring
2022-05-27 15:10:12 +08:00
Haian Huang(深度眸) 08a3adb5d7
Fix error of 'Runner' object has no attribute 'log_buffer' (#259)
* fix 'Runner' object has no attribute 'log_buffer'

* update

* add train
2022-05-27 10:51:25 +08:00
RangiLyu 4705e1fe3d
[Enhance] Add RuntimeInfoHook to update runtime information. (#254)
* [Enhance] Add RuntimeInfoHook to update runtime information.

* move lr to runtime info

* docstring

* resolve comments

* update ut and doc
2022-05-26 14:35:37 +08:00