Commit Graph

3 Commits (9ee846ff0cbbc05a99b45140aa6d84083bcf6488)

Author SHA1 Message Date
Ross Wightman a426511c95 More optimizer cleanup. Change all optimizers to no longer use .data. Improve (b)float16 use with adabelief. Add XLA-compatible Lars. 2021-08-18 17:21:56 -07:00
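
The ".data-free" update pattern that commit describes is worth a sketch: mutate parameters with in-place ops inside `torch.no_grad()` rather than through `p.data`, which keeps autograd bookkeeping intact and is also what tracing backends like XLA expect. A minimal hypothetical optimizer (illustrative only, not the actual timm code; `PlainSGD` and its hyperparameters are made up):

```python
import torch

class PlainSGD(torch.optim.Optimizer):
    def __init__(self, params, lr=0.1):
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()  # all updates happen without autograd tracking
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group['params']:
                if p.grad is None:
                    continue
                # in-place update on p itself, not p.data.add_(...)
                p.add_(p.grad, alpha=-group['lr'])
        return loss
```
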
Ross Wightman ac469b50da Optimizer improvements, additions, cleanup
* Add MADGRAD code
* Fix Lamb (non-fused variant) to work w/ PyTorch XLA
* Tweak optimizer factory args (lr/learning_rate and opt/optimizer_name), may break compat (see the factory sketch after this entry)
* Use newer fn signatures for all add, addcdiv, addcmul in optimizers (see the signature sketch after this entry)
* Use upcoming PyTorch native Nadam if it's available
* Clean up lookahead opt
* Add optimizer tests
* Remove novograd.py impl as it was messy, keep nvnovograd
* Make AdamP/SGDP work in channels_last layout
* Add rectified adabelief mode (radabelief)
* Support a few more PyTorch optimizers: adamax, adagrad
2021-08-17 17:51:20 -07:00
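
Two bullets above are concrete enough to sketch. First, the newer fn signatures: PyTorch deprecated the positional-scalar forms of `add_`/`addcdiv_`/`addcmul_` in favor of keyword `alpha=`/`value=` arguments. A standalone before/after (values illustrative, not timm code):

```python
import torch

lr = 0.01
p = torch.zeros(3)
g = torch.ones(3)
v = torch.full((3,), 2.0)

p.add_(g, alpha=-lr)         # was: p.add_(-lr, g)
p.addcdiv_(g, v, value=-lr)  # was: p.addcdiv_(-lr, g, v)
p.addcmul_(g, v, value=0.9)  # was: p.addcmul_(0.9, g, v)
```

Second, the factory-arg tweak. A hedged sketch of the factory call; the function name `create_optimizer_v2`, the `opt` string, and the argument values are assumptions read off the commit message, so check optim_factory.py for the real signature:

```python
import torch.nn as nn
from timm.optim import create_optimizer_v2  # name assumed

model = nn.Linear(10, 2)
# opt selects the optimizer by name; lr/weight_decay values are illustrative
optimizer = create_optimizer_v2(model, opt='madgrad', lr=1e-3, weight_decay=0.01)
```
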
Sangdoo Yun e93e571f7a Add `adamp` and `sgdp` optimizers.
* Update requirements.txt
* Update optim_factory.py
* Add `adamp` optimizer
* Update __init__.py
* Copy files of adamp & sgdp
* Create adamp.py
* Update __init__.py
* Create sgdp.py
* Update optim_factory.py
* Update optim_factory.py
* Update requirements.txt
* Update adamp.py
* Update sgdp.py
* Update sgdp.py
* Update adamp.py
2020-07-25 15:33:20 -07:00
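
The `adamp`/`sgdp` optimizers this commit adds are exported from `timm.optim`; a hedged usage sketch (model and hyperparameter values are illustrative; `delta` and `wd_ratio` follow the AdamP reference implementation's defaults):

```python
import torch
import torch.nn as nn
from timm.optim import AdamP  # SGDP is exported alongside it

model = nn.Linear(10, 2)
# delta / wd_ratio control AdamP's projection-based weight-decay scaling
optimizer = AdamP(model.parameters(), lr=1e-3, weight_decay=1e-2,
                  delta=0.1, wd_ratio=0.1)

for _ in range(3):  # toy training loop on random data
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).pow(2).mean()
    loss.backward()
    optimizer.step()
```
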