Ma Zerun 274a67223e
[Feature] Implement layer-wise learning rate decay optimizer constructor. (#1399)
* [Feature] Implement layer-wise learning rate decay optimizer constructor.

* Use num_layers instead of max_depth to avoid a misleading name (see the decay-rule sketch after this commit message)

* Add unit tests

* Update docstring

* Update log info

* Update LearningRateDecay configs

---------

Co-authored-by: fangyixiao18 <fangyx18@hotmail.com>
2023-03-07 17:30:39 +08:00
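
For context, layer-wise learning rate decay scales the base learning rate geometrically with depth, so layers near the input train more gently than the head. Below is a minimal sketch of that rule; the names `base_lr`, `decay_rate`, `num_layers`, and `layer_id` are illustrative, not this constructor's actual signature.

# Minimal sketch of the layer-wise decay rule; names are illustrative
# assumptions, not the constructor's real API.
def layer_wise_lr(base_lr, decay_rate, num_layers, layer_id):
    # layer_id counts from 0 (stem/embedding) up to num_layers (head);
    # deeper layers keep more of the base learning rate.
    return base_lr * decay_rate ** (num_layers - layer_id)

# e.g. with base_lr=1e-3, decay_rate=0.75, num_layers=12, layer 0
# trains at 1e-3 * 0.75**12 ≈ 3.2e-5 while the head trains at 1e-3.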


# Copyright (c) OpenMMLab. All rights reserved.
from .adan_t import Adan
from .lamb import Lamb
from .lars import LARS
from .layer_decay_optim_wrapper_constructor import \
    LearningRateDecayOptimWrapperConstructor

__all__ = ['Lamb', 'Adan', 'LARS', 'LearningRateDecayOptimWrapperConstructor']
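
In OpenMMLab-style configs, the new constructor would typically be selected through the `constructor` field of an optimizer wrapper. The snippet below is a hedged sketch: the `layer_decay_rate` key and all concrete values are assumptions for illustration, not taken from this commit.

# Hypothetical config snippet (key names and values besides
# `constructor` are illustrative assumptions).
optim_wrapper = dict(
    optimizer=dict(type='AdamW', lr=5e-4, weight_decay=0.05),
    # Select the constructor added by this commit.
    constructor='LearningRateDecayOptimWrapperConstructor',
    # Assumed paramwise key: multiply the learning rate by this factor
    # once per layer, moving from the head toward the input.
    paramwise_cfg=dict(layer_decay_rate=0.75),
)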