Commit Graph

12 Commits (7cea88e2c40ea45188f0e904384902523aeaf43b)

Author SHA1 Message Date
alec.tu 74d6afb4cd Add Adan to __init__.py 2022-12-15 11:37:29 +08:00
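This export makes Adan importable directly from the package; a minimal usage sketch, assuming a standard torch.optim-style constructor (the lr value is illustrative, not a recommendation):

```python
import torch.nn as nn
from timm.optim import Adan  # export added to timm/optim/__init__.py by this commit

# Adan follows the usual torch.optim.Optimizer interface; only lr is passed here,
# other hyperparameters keep their library defaults.
optimizer = Adan(nn.Linear(8, 2).parameters(), lr=1e-3)
```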
Ross Wightman a16a753852 Add lamb/lars to optim init imports, remove stray comment 2021-08-18 22:55:02 -07:00
Ross Wightman 8a9eca5157 A few optimizer comments, dead import, missing import 2021-08-17 18:01:33 -07:00
Ross Wightman ac469b50da Optimizer improvements, additions, cleanup
* Add MADGRAD code
* Fix Lamb (non-fused variant) to work w/ PyTorch XLA
* Tweak optimizer factory args (lr/learning_rate and opt/optimizer_name), may break compatibility (usage sketch after this entry)
* Use newer fn signatures for all add, addcdiv, addcmul in optimizers
* Use upcoming PyTorch native Nadam if it's available
* Cleanup lookahead opt
* Add optimizer tests
* Remove novograd.py impl as it was messy, keep nvnovograd
* Make AdamP/SGDP work in channels_last layout
* Add rectified adabelief mode (radabelief)
* Support a few more PyTorch optimizers: adamax, adagrad
2021-08-17 17:51:20 -07:00
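The factory-arg tweak above changes how optimizers are requested by name. A minimal sketch of the call pattern, assuming the `create_optimizer_v2` helper and the `opt`/`lr`/`weight_decay` keyword names from timm of this era (check the installed version for the exact signature):

```python
import torch.nn as nn
from timm.optim import create_optimizer_v2  # factory touched by this commit; import path assumed

model = nn.Linear(128, 10)

# Request an optimizer by its short name; 'lamb', 'madgrad', 'adabelief', etc.
# were added or reworked in this commit. Keyword names follow the post-tweak factory.
optimizer = create_optimizer_v2(model, opt='lamb', lr=1e-3, weight_decay=0.01)
```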
juntang addfc7c1ac Add adabelief optimizer 2021-04-04 23:48:15 -04:00
Ross Wightman 288682796f Update benchmark script to add precision arg. Fix some downstream (DeiT) compat issues with latest changes. Bump version to 0.4.7 2021-04-01 16:40:12 -07:00
Ross Wightman 0e16d4e9fb Add benchmark.py script, and update the optimizer factory to be friendlier to use outside of the argparse interface. 2021-02-23 15:38:12 -08:00
Ross Wightman 80078c47bb Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support. 2020-10-09 17:24:43 -07:00
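The gradient clipping mentioned above is plain PyTorch underneath; a minimal sketch of one training step with norm-based clipping (the model, batch, and clip value are placeholders, not the commit's actual defaults):

```python
import torch
import torch.nn.functional as F

def train_step(model, batch, optimizer, clip_grad=1.0):
    # One forward/backward pass, then clip the global gradient norm before stepping.
    inputs, targets = batch
    loss = F.cross_entropy(model(inputs), targets)
    optimizer.zero_grad()
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), clip_grad)
    optimizer.step()
    return loss.item()
```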
Sangdoo Yun e93e571f7a Add `adamp` and `sgdp` optimizers (usage sketch follows this entry).
* Create adamp.py and sgdp.py (copy files of adamp & sgdp)
* Update optim_factory.py and __init__.py
* Update requirements.txt
2020-07-25 15:33:20 -07:00
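AdamP and SGDP are drop-in torch.optim-style optimizers; a minimal sketch assuming `timm.optim.AdamP` takes the usual constructor arguments (the hyperparameters shown are illustrative):

```python
import torch.nn as nn
from timm.optim import AdamP  # exported via the __init__.py updates in this commit

model = nn.Conv2d(3, 16, kernel_size=3)

# AdamP follows the standard torch.optim.Optimizer interface, so it can replace
# Adam in an existing training loop without further changes.
optimizer = AdamP(model.parameters(), lr=1e-3, weight_decay=1e-2)
```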
Ross Wightman 64966f61f7 Add Nvidia's NovoGrad impl from Jasper (cleaner/faster than current) and Apex Fused optimizers 2019-08-29 15:21:38 -07:00
Ross Wightman fac58f609a Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks, and a scheduler factory tweak.
* Add some of the trendy new optimizers. Decent results but not clearly better than the standards.
* Can create a None scheduler for constant LR
* ResNet defaults to zero_init of last BN in residual
* Add resnet50d config
2019-08-28 00:14:10 -07:00
Ross Wightman aa4354f466 Big re-org, working towards packaging the module as 'timm' for pip 2019-06-19 17:20:51 -07:00