mmclassification/mmpretrain/engine/optimizers
Ma Zerun 274a67223e
[Feature] Implement layer-wise learning rate decay optimizer constructor. (#1399)
* [Feature] Implement layer-wise learning rate decay optimizer constructor.

* Use `num_layers` instead of `max_depth` to avoid misleading naming

* Add unit tests

* Update docstring

* Update log info

* Update LearningRateDecay configs

---------

Co-authored-by: fangyixiao18 <fangyx18@hotmail.com>
2023-03-07 17:30:39 +08:00
__init__.py [Feature] Implement layer-wise learning rate decay optimizer constructor. (#1399) 2023-03-07 17:30:39 +08:00
adan_t.py Rename the package name to `mmpretrain`. 2023-02-17 15:20:55 +08:00
lamb.py Rename the package name to `mmpretrain`. 2023-02-17 15:20:55 +08:00
lars.py [Refactor] Add selfsup algorithms. (#1389) 2023-03-06 16:53:15 +08:00
layer_decay_optim_wrapper_constructor.py [Feature] Implement layer-wise learning rate decay optimizer constructor. (#1399) 2023-03-07 17:30:39 +08:00
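
The constructor added in layer_decay_optim_wrapper_constructor.py builds per-parameter-group learning rates that shrink geometrically with layer depth, so earlier backbone layers are fine-tuned more gently than later ones. Below is a minimal config sketch; the registered name `LearningRateDecayOptimWrapperConstructor` and the `layer_decay_rate` / `norm_decay_mult` keys are assumptions inferred from this commit and typical MMEngine-style configs, not a verified API.

```python
# Hedged sketch of an optim_wrapper config using the layer-wise
# learning-rate decay constructor added in this commit. The constructor
# name and paramwise_cfg keys are assumptions, not verified API.
optim_wrapper = dict(
    optimizer=dict(type='AdamW', lr=4e-3, weight_decay=0.05),
    constructor='LearningRateDecayOptimWrapperConstructor',
    paramwise_cfg=dict(
        # Learning rate of the i-th layer is roughly
        # base_lr * rate ** (num_layers - i), counted with num_layers
        # (see the commit note) rather than max_depth.
        layer_decay_rate=0.75,
        # Normalization and bias parameters commonly skip weight decay.
        norm_decay_mult=0.0,
    ),
)
```

Expressing the exponent through `num_layers`, as the commit message notes, counts backbone blocks directly and is less ambiguous than a generic `max_depth`.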