Ross Wightman
35fb00c779
Add flexivit to non-std tests list
2022-12-22 21:32:31 -08:00
Fredo Guan
84178fca60
Merge branch 'rwightman:main' into main
2022-12-12 23:13:58 -08:00
Fredo Guan
c43340ddd4
Davit std (#5)
...
* Update davit.py (repeated 118 times)
* starting point
* Update davit.py (repeated 51 times)
* Update test_models.py
* Update davit.py (repeated 29 times)
* Davit revised (#4)
* Update davit.py (repeated 45 times)
  clean up
* Update test_models.py
* Update davit.py (repeated 21 times)
* Update test_models.py
* Update davit.py
2022-12-11 03:03:22 -08:00
Ross Wightman
d5e7d6b27e
Merge remote-tracking branch 'origin/main' into refactor-imports
2022-12-09 14:49:44 -08:00
Fredo Guan
edea013dd1
Davit std (#3)
...
Davit with all features working
2022-12-09 02:53:21 -08:00
Ross Wightman
98047ef5e3
Add EVA FT results, hopefully fix BEiT test failures
2022-12-07 08:54:06 -08:00
Ross Wightman
927f031293
Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models
2022-12-06 15:00:06 -08:00
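For downstream users, the restructure in the commit above mostly shows up as an import-path change. A minimal sketch, assuming the commonly used DropPath / trunc_normal_ helpers (exact symbols available depend on the timm version; the old path was kept as a deprecation shim for a while):

```python
# Before the restructure, layer utilities were imported like this:
# from timm.models.layers import DropPath, trunc_normal_

# After this commit, timm.layers is the supported location.
from timm.layers import DropPath, trunc_normal_
import torch.nn as nn

class Block(nn.Module):
    def __init__(self, dim: int, drop_path: float = 0.1):
        super().__init__()
        self.fc = nn.Linear(dim, dim)
        self.drop_path = DropPath(drop_path)  # stochastic depth
        trunc_normal_(self.fc.weight, std=0.02)

    def forward(self, x):
        return x + self.drop_path(self.fc(x))
```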
Ross Wightman
0dadb4a6e9
Initial multi-weight support, handled so the old pretrained config handling co-exists with new tags.
2022-12-05 10:21:34 -08:00
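Under the multi-weight scheme described above, one architecture can carry several pretrained "tags" while plain names keep working. A hedged sketch (the specific tag below is illustrative and depends on the installed timm version):

```python
import timm

# Legacy-style name: resolves through the old pretrained config handling
model = timm.create_model('resnet50', pretrained=True)

# Tagged name '<arch>.<tag>': selects one specific weight set under the new scheme
model_tagged = timm.create_model('resnet50.a1_in1k', pretrained=True)
```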
Ross Wightman
da6f8f5a40
Fix beitv2 tests
2022-09-07 08:09:47 -07:00
Ross Wightman
cac0a4570a
More test fixes, pool size for 256x256 maxvit models
2022-08-23 13:38:26 -07:00
Ross Wightman
8c9696c9df
More model and test fixes
2022-08-22 17:40:31 -07:00
Ross Wightman
f332fc2db7
Fix some test failures, torchscript issues
2022-08-18 16:19:46 -07:00
Ross Wightman
29afe79c8b
Attempt to fix unit tests by removing subset of tests on mac runner
2022-07-17 14:55:47 -07:00
Ross Wightman
c0211b0bf7
Swin-V2 test fixes, typo
2022-05-12 22:31:55 -07:00
Ross Wightman
39b725e1c9
Fix tests for rank-4 output where feature channels dim is -1 (3) and not 1
2022-05-09 15:20:24 -07:00
okojoalg
2fec08e923
Add Sequencer to non std filters
2022-05-06 23:08:10 +09:00
Ross Wightman
b049a5c5c6
Merge remote-tracking branch 'origin/master' into norm_norm_norm
2022-03-21 13:41:43 -07:00
Ross Wightman
372ad5fa0d
Significant model refactor and additions:
...
* All models updated with revised forward_features / forward_head interface
* Vision transformer and MLP based models consistently output sequence from forward_features (pooling or token selection considered part of 'head')
* WIP param grouping interface to allow consistent grouping of parameters for layer-wise decay across all model types
* Add gradient checkpointing support to a significant % of models, especially popular architectures
* Formatting and interface consistency improvements across models
* layer-wise LR decay impl part of optimizer factory w/ scale support in scheduler
* Poolformer and Volo architectures added
2022-02-28 13:56:23 -08:00
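A minimal sketch of the revised interface described in the bullets above, using a ViT as an example (shapes are indicative only):

```python
import torch
import timm

model = timm.create_model('vit_base_patch16_224', pretrained=False)
x = torch.randn(1, 3, 224, 224)

feats = model.forward_features(x)   # unpooled token sequence, e.g. (1, 197, 768)
logits = model.forward_head(feats)  # pooling / token selection + classifier -> (1, 1000)

# Gradient checkpointing toggle added across many architectures in this refactor
model.set_grad_checkpointing(True)
```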
Ross Wightman
1420c118df
Missed committing outstanding changes to default_cfg keys and test exclusions for swin v2
2022-02-23 19:50:26 -08:00
Ross Wightman
5f81d4de23
Move DeiT to its own file, vit getting crowded. Working towards fixing #1029, make pooling interface for transformers and mlp closer to convnets. Still working through some details...
2022-01-26 22:53:57 -08:00
Ross Wightman
95cfc9b3e8
Merge remote-tracking branch 'origin/master' into norm_norm_norm
2022-01-25 22:20:45 -08:00
Ross Wightman
abc9ba2544
Transitioning default_cfg -> pretrained_cfg. Improving handling of pretrained_cfg source (HF-Hub, files, timm config, etc). Checkpoint handling tweaks.
2022-01-25 21:54:13 -08:00
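In practice, the transition above means reading the config from model.pretrained_cfg rather than model.default_cfg (the old attribute was kept as an alias for a while). A small sketch; key names assumed from common timm configs:

```python
import timm

model = timm.create_model('resnet18', pretrained=False)

cfg = model.pretrained_cfg          # formerly model.default_cfg
print(cfg.get('input_size'), cfg.get('num_classes'), cfg.get('interpolation'))
```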
Ross Wightman
010b486590
Add Dino pretrained weights (no head) for vit models. Add support to tests and helpers for models w/ no classifier (num_classes=0 in pretrained cfg)
2022-01-17 12:20:02 -08:00
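Creating a head-less model as this commit enables, sketched with num_classes=0 (the model name is illustrative, not necessarily one of the DINO-pretrained defs):

```python
import torch
import timm

backbone = timm.create_model('vit_small_patch16_224', pretrained=False, num_classes=0)
features = backbone(torch.randn(1, 3, 224, 224))  # pooled features, no classifier applied
print(features.shape)                              # e.g. (1, backbone.num_features)
```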
Ross Wightman
a8d103e18b
Giant/gigantic vits snuck through in a test and broke GitHub test runner, add filter
2022-01-14 17:23:35 -08:00
Ross Wightman
ef72ad4177
Extra vit_huge model likely to cause test issue (non in21k variant), adding to filters
2022-01-14 16:28:27 -08:00
Ross Wightman
e967c72875
Update README.md. Sneak in g/G (giant / gigantic?) ViT defs from scaling paper
2022-01-14 16:28:27 -08:00
Ross Wightman
4df51f3932
Add lcnet_100 and mnasnet_small weights
2022-01-06 22:21:05 -08:00
Ross Wightman
5ccf682a8f
Remove deprecated bn-tf train arg and create_model handler. Add evos/evob models back into fx test filter until norm_norm_norm branch merged.
2022-01-06 18:08:39 -08:00
Ross Wightman
25d1526092
Update pytest for GitHub runner to use --forked with xdist, hopefully eliminate memory buildup
2022-01-06 16:04:23 -08:00
Ross Wightman
cd059cbe9c
Add FX backward tests back
2021-12-01 14:58:56 -08:00
Ross Wightman
58ffa2bfb7
Update pytest for GitHub runner to use --forked with xdist, hopefully eliminate memory buildup
2021-12-01 12:09:23 -08:00
Ross Wightman
f7d210d759
Remove evonorm models from FX tests
2021-11-24 13:21:24 -08:00
Ross Wightman
f83b0b01e3
Would like to pass GitHub tests again, disabling both FX feature extract backward and torchscript tests
2021-11-23 22:24:58 -08:00
Ross Wightman
147e1059a8
Remove FX backward test from GitHub actions runs for now.
2021-11-23 14:32:32 -08:00
Ross Wightman
878bee1d5e
Add patch8 vit model to FX exclusion filter
2021-11-22 14:00:27 -08:00
Ross Wightman
ce76a810c2
New FX test strategy, filter based on param count
2021-11-22 11:48:40 -08:00
Ross Wightman
1e51c2d02e
More FX test tweaks
2021-11-22 09:46:43 -08:00
Ross Wightman
90448031ea
Filter more large models from FX tests
2021-11-21 21:26:44 -08:00
Ross Wightman
8dc269c303
Filter more models for FX tests
2021-11-21 19:49:33 -08:00
Ross Wightman
2482652027
Add nfnet_f2 to FX test exclusion
2021-11-21 14:08:53 -08:00
Ross Wightman
05092e2fbe
Add more models to FX filter
2021-11-20 15:51:48 -08:00
Ross Wightman
3819bef93e
Add FX test exclusion since it uses more ram and barfs on GitHub actions. Will take a few iterations to include needed models :(
2021-11-19 17:35:41 -08:00
Ross Wightman
9b3519545d
Attempt to reduce memory footprint of FX tests for GitHub actions runs
2021-11-19 14:24:12 -08:00
Ross Wightman
bdd3dff0ca
beit_large models killing GitHub actions test, filter out
2021-11-19 08:39:48 -08:00
Ross Wightman
f2006b2437
Cleanup qkv_bias cat in beit model so it can be traced
2021-11-18 21:25:00 -08:00
Ross Wightman
1076a65df1
Minor post FX merge cleanup
2021-11-18 19:47:07 -08:00
Alexander Soare
0262a0e8e1
fx ready for review
2021-11-13 00:06:33 +00:00
Alexander Soare
d2994016e9
Add try/except guards
2021-11-12 21:16:53 +00:00
Alexander Soare
b25ff96768
wip - pre-rebase
2021-11-12 20:45:05 +00:00
Alexander Soare
a6c24b936b
Tests to enforce all models FX traceable
2021-11-12 20:45:05 +00:00
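The point of enforcing FX traceability is that models can then be symbolically traced for graph-based feature extraction. A hedged sketch of the kind of check such a test performs (not the actual test code; model name illustrative):

```python
import torch
import timm

model = timm.create_model('resnet26', pretrained=False).eval()

# If the model is FX traceable, symbolic_trace succeeds and the traced
# GraphModule produces the same output as the eager model.
traced = torch.fx.symbolic_trace(model)
x = torch.randn(1, 3, 224, 224)
assert torch.allclose(traced(x), model(x), atol=1e-5)
```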
Alexander Soare
6d2acec1bb
Fix ordering of tests
2021-10-02 16:10:11 +01:00
Alexander Soare
65c3d78b96
Freeze unfreeze functionality finalized. Tests added
2021-10-02 15:55:08 +01:00
Ross Wightman
24720abe3b
Merge branch 'master' into attn_update
2021-09-13 16:51:10 -07:00
Ross Wightman
1c9284c640
Add BeiT 'finetuned' 1k weights and pretrained 22k weights, pretraining specific (masked) model excluded for now
2021-09-13 16:38:23 -07:00
Ross Wightman
7ab2491ab7
Better handling of crossvit for tests / forward_features, fix torchscript regression in my changes
2021-09-13 13:01:05 -07:00
Ross Wightman
f1808e0970
Post crossvit merge cleanup, change model names to reflect input size, cleanup img size vs scale handling, fix tests
2021-09-13 11:49:54 -07:00
Ross Wightman
a897e0ebcc
Merge branch 'feature/crossvit' of https://github.com/chunfuchen/pytorch-image-models into chunfuchen-feature/crossvit
2021-09-10 17:38:37 -07:00
Ross Wightman
8642401e88
Swap botnet 26/50 weights/models after realizing a mistake in arch def, now figuring out why they were so low...
2021-09-05 15:17:19 -07:00
Ross Wightman
5f12de4875
Add initial AttentionPool2d that's being trialed. Fix comment and still trying to improve reliability of sgd test.
2021-09-05 12:41:14 -07:00
Ross Wightman
54e90e82a5
Another attempt at sgd momentum test passing...
2021-09-03 20:50:26 -07:00
Richard Chen
7ab9d4555c
add crossvit
2021-09-01 17:13:12 -04:00
Ross Wightman
fc894c375c
Another attempt at sgd momentum test passing...
2021-08-27 10:39:31 -07:00
Ross Wightman
708d87a813
Fix ViT SAM weight compat as weights at URL changed to not use repr layer. Fix #825. Tweak optim test.
2021-08-27 09:20:13 -07:00
Ross Wightman
c207e02782
MOAR optimizer changes. Woo!
2021-08-18 22:20:35 -07:00
Ross Wightman
42c1f0cf6c
Fix lars tests
2021-08-18 21:05:34 -07:00
Ross Wightman
a426511c95
More optimizer cleanup. Change all to no longer use .data. Improve (b)float16 use with adabelief. Add XLA compatible Lars.
2021-08-18 17:21:56 -07:00
Ross Wightman
a6af48be64
add madgradw optimizer
2021-08-17 22:19:27 -07:00
Ross Wightman
55fb5eedf6
Remove experiment from lamb impl
2021-08-17 21:48:26 -07:00
Ross Wightman
959eaff121
Add optimizer tests and update testing to pytorch 1.9
2021-08-17 17:59:15 -07:00
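A sketch of exercising timm's optimizer factory the way such a test might, assuming create_optimizer_v2 and the optimizer names shown are available in the installed version:

```python
import torch
import timm
from timm.optim import create_optimizer_v2

model = timm.create_model('resnet18', pretrained=False)

for opt_name in ('sgd', 'adamw', 'lamb', 'lars'):
    optimizer = create_optimizer_v2(model, opt=opt_name, lr=1e-3, weight_decay=1e-4)
    loss = model(torch.randn(2, 3, 224, 224)).sum()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```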
Ross Wightman
01cb46a9a5
Add gc_efficientnetv2_rw_t weights (global context instead of SE attn). Add TF XL weights even though the fine-tuned ones don't validate that well. Change default arg for GlobalContext to use scale (mul) mode.
2021-08-07 16:45:29 -07:00
Ross Wightman
ef1e2e12be
Attempt to fix xcit test failures on GitHub runner by filtering largest models
2021-07-13 16:33:55 -07:00
Alexander Soare
623e8b8eb8
wip xcit
2021-07-11 09:39:38 +01:00
Alexander Soare
7b8a0017f1
wip to review
2021-07-03 12:10:12 +01:00
Ross Wightman
b41cffaa93
Fix a few issues loading pretrained vit/bit npz weights w/ num_classes=0 __init__ arg. Missed a few other small classifier handling details on Mlp, GhostNet, Levit. Should fix #713
2021-06-22 23:16:05 -07:00
Ross Wightman
381b279785
Add hybrid model fwds back
2021-06-19 22:28:44 -07:00
Ross Wightman
0020268d9b
Try lower max size for non_std default_cfg test
2021-06-12 23:31:24 -07:00
Ross Wightman
8880f696b6
Refactoring, cleanup, improved test coverage.
...
* Add eca_nfnet_l2 weights, 84.7 @ 384x384
* All 'non-std' (ie transformer / mlp) models have classifier / default_cfg test added
* Fix #694 reset_classifier / num_features / forward_features / num_classes=0 consistency for transformer / mlp models
* Add direct loading of npz to vision transformer (pure transformer so far, hybrid to come)
* Rename vit_deit* to deit_*
* Remove some deprecated vit hybrid model defs
* Clean up classifier flatten for conv classifiers and unusual cases (mobilenetv3/ghostnet)
* Remove explicit model fns for levit conv, just pass in arg
2021-06-12 16:40:02 -07:00
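The consistency contract described in the bullet points above, sketched for an MLP model (model name illustrative; behavior assumed from the commit description):

```python
import torch
import timm

model = timm.create_model('mixer_b16_224', pretrained=False)
x = torch.randn(1, 3, 224, 224)

logits = model(x)                  # (1, num_classes)
feats = model.forward_features(x)  # unpooled features before the head

model.reset_classifier(0)          # drop the classifier entirely
pooled = model(x)                  # now returns (1, model.num_features)
```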
Ross Wightman
17dc47c8e6
Missed comma in test filters.
2021-05-30 22:00:43 -07:00
Ross Wightman
8bf63b6c6c
Able to use other attn layer in EfficientNet now. Create test ECA + GC B0 configs. Make ECA more configurable.
2021-05-30 12:47:02 -07:00
Ross Wightman
9c78de8c02
Fix #661 , move hardswish out of default args for LeViT. Enable native torch support for hardswish, hardsigmoid, mish if present.
2021-05-26 15:28:42 -07:00
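For reference, these are the native torch ops the commit above refers to; nn.Mish only exists from PyTorch 1.9, which is why the commit gates on them being present:

```python
import torch
import torch.nn as nn

x = torch.randn(2, 8)

hswish = nn.Hardswish()      # native since torch 1.6
hsigmoid = nn.Hardsigmoid()  # native since torch 1.6
mish = nn.Mish()             # native since torch 1.9

print(hswish(x).shape, hsigmoid(x).shape, mish(x).shape)
```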
Ross Wightman
5db7452173
Fix visformer in_chans stem handling
2021-05-25 14:11:36 -07:00
Ross Wightman
fd92ba0de8
Filter large vit models from torchscript tests
2021-05-25 12:52:07 -07:00
Ross Wightman
99d97e0d67
Hopefully the last test update for this PR...
2021-05-25 11:10:17 -07:00
Ross Wightman
d400f1dbdd
Filter test models before creation for backward/torchscript tests
2021-05-25 10:14:45 -07:00
Ross Wightman
c4572cc5aa
Add Visformer-small weights, tweak torchscript jit test img size.
2021-05-24 22:50:12 -07:00
Ross Wightman
83487e2a0d
Lower max backward size for tests.
2021-05-24 21:36:56 -07:00
Ross Wightman
bfc72f75d3
Expand scope of testing for non-std vision transformer / mlp models. Some related cleanup and create fn cleanup for all vision transformer and mlp models. More CoaT weights.
2021-05-24 21:13:26 -07:00
Ross Wightman
f45de37690
Merge branch 'master' into levit_visformer_rednet
2021-05-22 16:34:31 -07:00
Ross Wightman
306c86b668
Merge branch 'convit' of https://github.com/amaarora/pytorch-image-models into amaarora-convit
2021-05-21 16:27:10 -07:00
Aman Arora
50d6aab0ef
Add convit to non-std filters as vit_
2021-05-21 03:46:47 +00:00
Aman Arora
1633317489
update tests and exclude convit_base
2021-05-21 01:11:56 +00:00
李鑫杰
d046498e0b
update test_models.py
2021-05-20 11:20:39 +08:00
Ross Wightman
6d81374b88
Update tests for new mlp models
2021-05-19 11:09:42 -07:00
Ross Wightman
ecc7552c5c
Add levit, levit_c, and visformer model defs. Largely untested and not finished cleanup.
2021-05-14 17:16:34 -07:00
Ross Wightman
d45e50b9db
Update test for cait 448x448 model
2021-05-05 17:51:23 -07:00
Ross Wightman
5fcddb96a8
Merge branch 'master' into cait
2021-05-05 17:29:38 -07:00
Ross Wightman
2d8b09fe8b
Add official pretrained weights to MLP-Mixer, complete model cfgs.
2021-05-05 15:59:40 -07:00
Ross Wightman
1daa15ecc3
Initial Cait commit. Still some cleanup to do.
2021-05-04 11:19:27 -07:00
Ross Wightman
67d0665b46
Post ResNet-RS merge cleanup. Add weight urls, adjust train/test/crop pct.
2021-05-04 11:04:23 -07:00
Aman Arora
560eae38f5
[WIP] Add ResNet-RS models (#554)
...
* Add ResNet-RS models
* Only include resnet-rs changes
* remove whitespace diff
* EOF newline
* Update time
* increase time
* Add first conv
* Try running only resnetv2_101x1_bitm on Linux runner
* Add to exclude filter
* Run test_model_forward_features for all
* Add to exclude ftrs
* back to defaults
* only run test_forward_features
* run all tests
* Run all tests
* Add bigger resnetrs to model filters to fix Github CLI
* Remove resnetv2_101x1_bitm from exclude feat features
* Remove hardcoded values
* Make sure reduction ratio in resnetrs is 0.25
* There is no bias in replaced maxpool so remove it
2021-05-04 10:59:44 -07:00