614 Commits

Author SHA1 Message Date
Ross Wightman
f1808e0970 Post crossvit merge cleanup, change model names to reflect input size, cleanup img size vs scale handling, fix tests 2021-09-13 11:49:54 -07:00
Richard Chen
9fe5798bee fix reset_classifier bug and fix dimension validation 2021-09-08 21:58:17 -04:00
Richard Chen
3718c5a5bd fix loading pretrained model 2021-09-08 11:53:05 -04:00
Richard Chen
bb50b69a57 fix TorchScript compatibility 2021-09-08 11:20:59 -04:00
Richard Chen
7ab9d4555c add crossvit 2021-09-01 17:13:12 -04:00
Ross Wightman
78933122c9 Fix silly typo 2021-08-27 09:22:20 -07:00
Ross Wightman
708d87a813 Fix ViT SAM weight compat as weights at URL changed to not use repr layer. Fix #825. Tweak optim test. 2021-08-27 09:20:13 -07:00
Ross Wightman
d667351eac Tweak accuracy topk safety. Fix #807 2021-08-19 14:18:53 -07:00
Yohann Lereclus
35c9740826 Fix accuracy when topk > num_classes 2021-08-19 11:58:59 +02:00
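The two commits above guard the accuracy metric against a requested top-k larger than the number of classes. A minimal sketch of that guard (a hypothetical standalone helper illustrating the idea, not timm's exact code):

```python
import torch

def accuracy(output, target, topk=(1,)):
    """Top-k accuracy that clamps each requested k to the number of classes."""
    maxk = min(max(topk), output.size(1))  # guard: k cannot exceed num_classes
    _, pred = output.topk(maxk, dim=1)
    pred = pred.t()
    correct = pred.eq(target.reshape(1, -1).expand_as(pred))
    return [correct[:min(k, maxk)].reshape(-1).float().sum(0) * 100.0 / target.size(0)
            for k in topk]
```

With only two classes, `accuracy(logits, target, topk=(1, 5))` no longer raises on the out-of-range k; the oversized k is silently clamped to 2.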
Ross Wightman
a16a753852 Add lamb/lars to optim init imports, remove stray comment 2021-08-18 22:55:02 -07:00
Ross Wightman
c207e02782 MOAR optimizer changes. Woo! 2021-08-18 22:20:35 -07:00
Ross Wightman
a426511c95 More optimizer cleanup. Change all to no longer use .data. Improve (b)float16 use with adabelief. Add XLA compatible Lars. 2021-08-18 17:21:56 -07:00
Ross Wightman
9541f4963b One more scalar -> tensor fix for lamb optimizer 2021-08-18 11:20:25 -07:00
Ross Wightman
8f68193c91 Update lamb.py comment 2021-08-18 09:27:40 -07:00
Ross Wightman
4d284017b8 Merge pull request #813 from rwightman/opt_cleanup: Optimizer cleanup and additions 2021-08-18 09:12:00 -07:00
Ross Wightman
a6af48be64 add madgradw optimizer 2021-08-17 22:19:27 -07:00
Ross Wightman
55fb5eedf6 Remove experiment from lamb impl 2021-08-17 21:48:26 -07:00
Ross Wightman
8a9eca5157 A few optimizer comments, dead import, missing import 2021-08-17 18:01:33 -07:00
Ross Wightman
ac469b50da Optimizer improvements, additions, cleanup
* Add MADGRAD code
* Fix Lamb (non-fused variant) to work w/ PyTorch XLA
* Tweak optimizer factory args (lr/learning_rate and opt/optimizer_name), may break compat
* Use newer fn signatures for all add, addcdiv, addcmul in optimizers
* Use upcoming PyTorch native Nadam if it's available
* Cleanup lookahead opt
* Add optimizer tests
* Remove novograd.py impl as it was messy, keep nvnovograd
* Make AdamP/SGDP work in channels_last layout
* Add rectified adabelief mode (radabelief)
* Support a few more PyTorch optimizers: adamax, adagrad
2021-08-17 17:51:20 -07:00
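One bullet above mentions moving to the newer PyTorch signatures for `add`, `addcdiv`, and `addcmul` (the keyword `alpha`/`value` forms rather than the deprecated positional-scalar forms, which also matters for XLA). A small sketch of the pattern, independent of any particular optimizer:

```python
import torch

p = torch.zeros(3)
grad = torch.ones(3)
v = torch.ones(3)
lr = 0.1

# Old, deprecated positional-scalar form was: p.add_(-lr, grad)
# Newer keyword form:
p.add_(grad, alpha=-lr)              # p += -lr * grad

# Same keyword pattern for the fused update helpers:
p.addcdiv_(grad, v, value=-lr)       # p += -lr * grad / v
p.addcmul_(grad, v, value=-lr)       # p += -lr * grad * v
```

After the three in-place updates, each element of `p` is -0.3 (three steps of -0.1).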
Sepehr Sameni
abf3e044bb Update scheduler_factory.py: remove duplicate code from create_scheduler() 2021-08-14 22:53:17 +02:00
Ross Wightman
3cdaf5ed56 Add mmax config key to auto_augment to raise the upper bound of RandAugment magnitude beyond 10. Make the AugMix uniform-sampling default not override the config setting. 2021-08-12 15:39:05 -07:00
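The mmax key above raises the ceiling that a sampled RandAugment magnitude is clamped against (classically fixed at 10). A hypothetical standalone sketch of that sampling logic, not timm's actual auto_augment code:

```python
import random

def sample_magnitude(m, mstd=0.5, mmax=10):
    """Sample an augmentation magnitude around m, clamped to [0, mmax].

    With a configurable mmax, configs can request distortion strengths
    beyond the classic RandAugment ceiling of 10.
    """
    if mstd > 0:
        m = random.gauss(m, mstd)  # jitter magnitude per-op
    return min(max(m, 0.0), mmax)  # clamp to the configured bound
```

For example, `sample_magnitude(12, mstd=0.5, mmax=15)` stays within [0, 15], while with the default `mmax=10` any larger request is clamped back to 10.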
Ross Wightman
1042b8a146 Add non fused LAMB optimizer option 2021-08-09 13:13:43 -07:00
Ross Wightman
01cb46a9a5 Add gc_efficientnetv2_rw_t weights (global context instead of SE attn). Add TF XL weights even though the fine-tuned ones don't validate that well. Change default arg for GlobalContext to use scale (mul) mode. 2021-08-07 16:45:29 -07:00
Ross Wightman
d3f7440650 Add EfficientNetV2 XL model defs 2021-07-22 13:15:24 -07:00
Ross Wightman
72b227dcf5 Merge pull request #750 from drjinying/master: Specify "interpolation" mode in vision_transformer's resize_pos_embed 2021-07-13 11:01:20 -07:00
Ross Wightman
2907c1f967 Merge pull request #746 from samarth4149/master: Adding a Multi Step LR Scheduler 2021-07-13 10:55:54 -07:00
Ross Wightman
748ab852ca Allow act_layer switch for xcit, fix in_chans for some variants 2021-07-12 13:27:29 -07:00
Ying Jin
20b2d4b69d Use bicubic interpolation in resize_pos_embed() 2021-07-12 10:38:31 -07:00
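The commit above switches position-embedding resizing to bicubic interpolation. The core of that operation, resizing a square ViT position-embedding grid, can be sketched as follows (a hypothetical standalone function illustrating the idea, not timm's exact resize_pos_embed):

```python
import torch
import torch.nn.functional as F

def resize_pos_embed_grid(posemb, new_hw):
    """Resize a (1, H*W, C) square position-embedding grid to a new grid size."""
    _, n, c = posemb.shape
    old = int(n ** 0.5)                                        # assume square grid
    grid = posemb.reshape(1, old, old, c).permute(0, 3, 1, 2)  # -> (1, C, H, W)
    grid = F.interpolate(grid, size=new_hw, mode='bicubic', align_corners=False)
    return grid.permute(0, 2, 3, 1).reshape(1, new_hw[0] * new_hw[1], c)
```

For example, a 4x4 grid of 8-dim embeddings `(1, 16, 8)` resized to 7x7 yields shape `(1, 49, 8)`.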
Ross Wightman
d3255adf8e Merge branch 'xcit' of https://github.com/alexander-soare/pytorch-image-models into alexander-soare-xcit 2021-07-12 08:30:30 -07:00
Ross Wightman
f8039c7492 Fix gc effv2 model cfg name 2021-07-11 12:14:31 -07:00
Alexander Soare
3a55a30ed1 add notes from author 2021-07-11 14:25:58 +01:00
Alexander Soare
899cf84ccc bug fix - missing _dist postfix for many of the 224_dist models 2021-07-11 12:41:51 +01:00
Alexander Soare
623e8b8eb8 wip xcit 2021-07-11 09:39:38 +01:00
Ross Wightman
392368e210 Add efficientnetv2_rw_t defs w/ weights, and gc variant, as well as gcresnet26ts for experiments. Version 0.4.13 2021-07-09 16:46:52 -07:00
samarth
daab57a6d9 Added a simple multi step LR scheduler 2021-07-09 16:18:27 -04:00
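The multi-step scheduler added above decays the learning rate by a fixed factor at listed epoch milestones. PyTorch's built-in `MultiStepLR` shows the same behavior (this demonstrates the concept with the stock PyTorch scheduler, not timm's own scheduler class):

```python
import torch

model = torch.nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
# Multiply lr by gamma when the epoch count passes each milestone.
sched = torch.optim.lr_scheduler.MultiStepLR(opt, milestones=[2, 4], gamma=0.1)

lrs = []
for epoch in range(5):
    lrs.append(opt.param_groups[0]['lr'])
    opt.step()     # normally: training loop for this epoch
    sched.step()
# lrs is approximately [0.1, 0.1, 0.01, 0.01, 0.001]
```

The rate holds at 0.1 until epoch 2, drops to 0.01, then drops again to 0.001 at epoch 4.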
Ross Wightman
6d8272e92c Add SAM pretrained model defs/weights for ViT B16 and B32 models. 2021-07-08 11:51:12 -07:00
Ross Wightman
ee4d8fc69a Remove unnecessary line from nest post refactor 2021-07-05 21:22:46 -07:00
Ross Wightman
8165cacd82 Realized LayerNorm2d won't work in all cases as is, fixed. 2021-07-05 18:21:34 -07:00
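The commit above fixes a LayerNorm2d that didn't hold up in all cases. A common robust pattern for layer-norm over the channel dim of NCHW tensors is to permute to NHWC, normalize the last axis, and permute back (a hypothetical sketch of that pattern, not necessarily timm's fixed implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LayerNorm2d(nn.LayerNorm):
    """LayerNorm applied over the channel dim of NCHW tensors.

    Permutes to NHWC so F.layer_norm normalizes the last (channel) axis,
    then permutes back to NCHW.
    """
    def forward(self, x):
        x = x.permute(0, 2, 3, 1)                     # NCHW -> NHWC
        x = F.layer_norm(x, self.normalized_shape,
                         self.weight, self.bias, self.eps)
        return x.permute(0, 3, 1, 2)                  # NHWC -> NCHW
```

With default affine init (weight 1, bias 0), each spatial position's channel vector comes out zero-mean, unit-variance.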
Ross Wightman
81cd6863c8 Move aggregation (convpool) for nest into NestLevel, cleanup and enable features_only use. Finalize weight url. 2021-07-05 18:20:49 -07:00
Ross Wightman
6ae0ac6420 Merge branch 'nested_transformer' of https://github.com/alexander-soare/pytorch-image-models into alexander-soare-nested_transformer 2021-07-03 12:45:26 -07:00
Alexander Soare
7b8a0017f1 wip to review 2021-07-03 12:10:12 +01:00
Alexander Soare
b11d949a06 wip checkpoint with some feature extraction work 2021-07-03 11:45:19 +01:00
Alexander Soare
23bb72ce5e nested_transformer wip 2021-07-02 20:12:29 +01:00
Ross Wightman
766b4d3262 Fix features for resnetv2_50t 2021-06-28 15:56:24 -07:00
Ross Wightman
e8045e712f Fix BatchNorm for ResNetV2 non GN models, add more ResNetV2 model defs for future experimentation, fix zero_init of last residual for pre-act. 2021-06-28 10:52:45 -07:00
Ross Wightman
20a2be14c3 Add gMLP-S weights, 79.6 top-1 2021-06-23 10:40:30 -07:00
Ross Wightman
85f894e03d Fix ViT in21k representation (pre_logits) layer handling across old and new npz checkpoints 2021-06-23 10:38:34 -07:00
Ross Wightman
b41cffaa93 Fix a few issues loading pretrained vit/bit npz weights w/ num_classes=0 __init__ arg. Missed a few other small classifier handling details on Mlp, GhostNet, Levit. Should fix #713 2021-06-22 23:16:05 -07:00
Ross Wightman
9c9755a808 AugReg release 2021-06-20 17:46:06 -07:00
Ross Wightman
381b279785 Add hybrid model fwds back 2021-06-19 22:28:44 -07:00