* [Enhance] Auto set the `end` of param schedulers.
* Add log output and unit test
* Update docstring
* Update unit tests of `CosineAnnealingParamScheduler`.
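A minimal config sketch of the auto-`end` behaviour above, assuming the usual MMEngine `param_scheduler` conventions; the exact inference rules are assumptions:

```python
# A sketch, not verbatim from the PR.
param_scheduler = [
    # explicit begin/end for a warm-up phase
    dict(type='LinearLR', start_factor=0.001, by_epoch=False, begin=0, end=500),
    # `end` omitted: assumed to be set automatically to the end of training
    dict(type='CosineAnnealingLR', by_epoch=True, begin=0),
]
```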
* first commit
* Support modifying the base config and add unit tests
* remove `import mmengine` from config
* add unit test
* fix lint
* add unit test
* move RemoveAssignFromAST to config utils
* git add utils
* fix format issue in test file
* refine unit test
* refine unit test
* clean code
* fix as comment
* fix as comment
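The entries above extend `Config` inheritance; the sketch below only illustrates the general base-config override mechanism, with hypothetical file names:

```python
# configs/_base_/resnet50.py  (hypothetical base config)
model = dict(type='ResNet', depth=50, norm_eval=True)

# configs/resnet101.py  (hypothetical child config)
_base_ = ['./_base_/resnet50.py']
# Redeclared keys are merged into the base dict, so only `depth` changes here.
model = dict(depth=101)
```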
* add get_registry_by_scope method
* add unit test and docstring example
* rename get_registry_by_scope to switch_scope_and_registry
* move build function to registry/builder
* fix docstring
* rename builder->registry_builder, move build_from_cfg to registry_builder
* rename registry_builder to build_function
* fix docstring and type hint
* rename build_function to build_functions
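A hedged usage sketch of the renamed `switch_scope_and_registry` context manager; `'mmdet'` is only an example of a registered downstream scope:

```python
from mmengine.registry import MODELS

# Temporarily switch the default scope and get that scope's MODELS registry
# (assumes a downstream package registered under the 'mmdet' scope).
with MODELS.switch_scope_and_registry('mmdet') as registry:
    print(registry.scope)
```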
* Fix ema hook and add unit test
* save state_dict of ema.module
* replace `warnings.warn` with `MMLogger.warn`
* fix as comment
* fix bug
* fix bug
* add autocast wrapper
* fix docstring
* fix docstring
* fix version comparison
* fix unit test
* fix incompatible arguments
* fix as comment
* fix unit test
* rename auto_cast to autocast
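A small sketch of the renamed `autocast` wrapper, assumed to mirror `torch.cuda.amp.autocast` while smoothing over PyTorch version and device differences:

```python
import torch
from mmengine.runner import autocast

with autocast(enabled=True):
    # ops run in mixed precision where the backend supports it
    out = torch.mm(torch.randn(2, 3), torch.randn(3, 2))
```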
* [Refactor] Refactor `after_val_epoch` to make it output metrics by epoch
* add an option for users to choose how metrics are output
* rename variable
* reformat docstring
* add type alias
* reformat code
* add test function
* add comment and test code
* add comment and test code
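A hedged sketch of the new logging option; the flag name `log_metric_by_epoch` is an assumption based on the entries above:

```python
# Report validation metrics per epoch rather than per iteration
# (flag name assumed, not confirmed by the entries above).
default_hooks = dict(
    logger=dict(type='LoggerHook', log_metric_by_epoch=True),
)
```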
* [Feat] Support FSDP Training
* fix version comparison
* change param format and move `FSDP_WRAP_POLICY` to wrapper file
* add docstring and type hints, reformat code
* fix type hint
* fix typo, reformat code
* merge context
* update unit test
* add docstring
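A hedged configuration sketch for the FSDP support above; the wrapper type and the wrap-policy key are assumptions:

```python
# Swap the default DDP wrapper for FSDP via the model wrapper config
# (names below are assumptions, not confirmed by the entries above).
model_wrapper_cfg = dict(
    type='MMFullyShardedDataParallel',
    fsdp_auto_wrap_policy='size_based_auto_wrap_policy',
)
```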
* fix bug in AmpOptimWrapper
* add docstring for backward
* add warning and docstring for gradient accumulation
* fix docstring
* fix docstring
* add params_group method
* fix as comment
* fix as comment
* make the default value of `loss_scale` 'dynamic'
* Fix docstring
* decouple the should-update and should-no-sync checks
* rename attribute in OptimWrapper
* fix docstring
* fix comment
* fix comment
* fix as comment
* fix as comment and add unit test
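A minimal sketch tying the `AmpOptimWrapper` entries together (dynamic loss scaling by default, gradient accumulation); it assumes a CUDA device and the public `update_params` API:

```python
import torch
import torch.nn as nn
from mmengine.optim import AmpOptimWrapper

model = nn.Linear(4, 2).cuda()
optim_wrapper = AmpOptimWrapper(
    optimizer=torch.optim.SGD(model.parameters(), lr=0.01),
    loss_scale='dynamic',        # now the default per the entry above
    accumulative_counts=2)       # step every 2 iterations

for chunk in torch.randn(8, 4).cuda().split(2):
    loss = model(chunk).sum()
    # backward + (possibly deferred) step + zero_grad in one call
    optim_wrapper.update_params(loss)
```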
* fix build train_loop during test
* Fix as comment
* fix BaseDataPreprocessor
* fix BaseDataPreprocessor
* change device type to torch.device
* change device type to torch.device
* fix cpu method of base model
* Allow `ImgDataPreprocessor` to skip normalization
* remove unnecessary type ignore
* make mean and std optional
* refine docstring
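A short sketch of the optional normalization described above: when `mean` and `std` are omitted, `ImgDataPreprocessor` is assumed to skip normalization and only pad and stack inputs:

```python
from mmengine.model import ImgDataPreprocessor

# No mean/std given -> inputs are stacked and padded but not normalized
# (behaviour assumed from the entries above).
data_preprocessor = ImgDataPreprocessor(pad_size_divisor=32, pad_value=0)
```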
* add base model, ddp model and unit test
* add unit test
* fix unit test
* fix docstring
* fix cpu unit test
* refine base data preprocessor
* refine base data preprocessor
* refine interface of ddp module
* remove optimizer hook
* add forward
* fix as comment
* fix unit test
* fix as comment
* fix build optimizer wrapper
* rebase main and fix unit test
* `stack_batch` supports stacking N-dim tensors; add docstring for `merge_dict`
* fix lint
* fix test loop
* make `precision_context` apply to `data_preprocessor`
* fix as comment
* fix as comment
* refine docstring
* change `collate_data` output type hints
* rename to_rgb to bgr_to_rgb and rgb_to_bgr
* support building `BaseModel` with an already-built `DataPreprocessor`
* fix as comment
* fix docstring
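A hedged sketch of `stack_batch` padding and stacking variable-sized tensors; the import path and signature match current MMEngine but are stated here as assumptions:

```python
import torch
from mmengine.model.utils import stack_batch

imgs = [torch.rand(3, 20, 24), torch.rand(3, 18, 30)]
# Pads every tensor to a common shape (divisible by 32) and stacks them.
batch = stack_batch(imgs, pad_size_divisor=32, pad_value=0)
print(batch.shape)  # torch.Size([2, 3, 32, 32])
```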