Ross Wightman
cd3dc4979f
Fix adabelief imports, remove prints; preserve memory format is the default arg for zeros_like
2021-04-12 08:25:31 -07:00
juntang
addfc7c1ac
adabelief
2021-04-04 23:48:15 -04:00
Ross Wightman
37c71a5609
Some further create_optimizer_v2 tweaks, remove some redundant code, add back safe model str. Benchmark step times per batch.
2021-04-01 22:34:55 -07:00
Ross Wightman
288682796f
Update benchmark script to add precision arg. Fix some downstream (DeiT) compat issues with latest changes. Bump version to 0.4.7
2021-04-01 16:40:12 -07:00
Ross Wightman
0e16d4e9fb
Add benchmark.py script, and update optimizer factory to be more friendly to use outside of argparse interface.
2021-02-23 15:38:12 -08:00
Jasha
7c56c718f3
Configure create_optimizer with args.opt_args
Closes #301
2020-12-08 00:03:09 -06:00
Ross Wightman
30ab4a1494
Fix issue in optim factory with sgd / eps flag. Bump version to 0.3.1
2020-10-31 18:05:30 -07:00
Ross Wightman
f944242cb0
Fix #262, num_classes arg mixup. Make vision_transformers a bit closer to other models wrt get/reset classifier/forward_features. Fix torchscript for ViT.
2020-10-29 13:58:28 -07:00
Ross Wightman
477a78ed81
Fix optimizer factory regression for optimizers like sgd/momentum that don't have an eps arg
2020-10-22 15:59:47 -07:00
Ross Wightman
a4d8fea61e
Add model-based wd skip support. Improve cross-version compat of optimizer factory. Fix #247
2020-10-13 12:49:47 -07:00
Ross Wightman
80078c47bb
Add Adafactor and Adahessian optimizers, cleanup optimizer arg passing, add gradient clipping support.
2020-10-09 17:24:43 -07:00
Ross Wightman
7995295968
Merge branch 'logger' into features. Change 'logger' to '_logger'.
2020-07-27 18:00:46 -07:00
Ross Wightman
6c17d57a2c
Fix some attributions, add copyrights to some file docstrings
2020-07-27 13:44:56 -07:00
Sangdoo Yun
e93e571f7a
Add `adamp` and `sgdp` optimizers.
Update requirements.txt
Update optim_factory.py
Add `adamp` optimizer
Update __init__.py
copy files of adamp & sgdp
Create adamp.py
Update __init__.py
Create sgdp.py
Update optim_factory.py
Update optim_factory.py
Update requirements.txt
Update adamp.py
Update sgdp.py
Update sgdp.py
Update adamp.py
2020-07-25 15:33:20 -07:00
Ross Wightman
e6f24e5578
Add 'momentum' optimizer (SGD w/o nesterov) for stable EfficientDet training defaults
2020-04-25 19:42:13 -07:00
Ross Wightman
64966f61f7
Add Nvidia's NovoGrad impl from Jasper (cleaner/faster than current) and Apex Fused optimizers
2019-08-29 15:21:38 -07:00
Ross Wightman
ba3c97c3ad
Some Lookahead cleanup and fixes
2019-08-29 15:14:35 -07:00
Ross Wightman
fac58f609a
Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.
* Add some of the trendy new optimizers. Decent results but not clearly better than the standards.
* Can create a None scheduler for constant LR
* ResNet defaults to zero_init of last BN in residual
* add resnet50d config
2019-08-28 00:14:10 -07:00
Ross Wightman
aa4354f466
Big re-org, working towards making pip/module as 'timm'
2019-06-19 17:20:51 -07:00