Ross Wightman
a5a2ad2e48
Fix consistency and testing for forward_head() w/ pre_logits, reset_classifier(), and models where pre_logits size != unpooled feature size
* add test that model supports forward_head(x, pre_logits=True)
* add head_hidden_size attr to all models and set differently from num_features attr when head has hidden layers
* test forward_features() feat dim == model.num_features and pre_logits feat dim == model.head_hidden_size
* more consistency in reset_classifier signature, add typing
* asserts in some heads where pooling cannot be disabled
Fix #2194
2024-06-07 13:53:00 -07:00
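A minimal sketch of the contract these changes standardize, assuming a recent timm; `vit_base_patch16_224` is just an illustrative model choice:

```python
import timm
import torch

model = timm.create_model('vit_base_patch16_224', pretrained=False)
x = torch.randn(1, 3, 224, 224)

feats = model.forward_features(x)                 # unpooled features
pre = model.forward_head(feats, pre_logits=True)  # pooled, pre-classifier features
logits = model.forward_head(feats)                # classifier output

assert feats.shape[-1] == model.num_features
# head_hidden_size == num_features unless the head has hidden layers
assert pre.shape[-1] == model.head_hidden_size
```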
Ross Wightman
c838c4233f
Add typing to reset_classifier() on other models
2024-05-12 11:12:00 -07:00
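The rough shape of the standardized signature, and the idiom it enables (defaults vary per model; a sketch, not the exact code):

```python
import timm

model = timm.create_model('resnet50', pretrained=False)

# reset_classifier is typed roughly as (num_classes: int, global_pool: str = 'avg') -> None;
# num_classes=0 with global_pool='' strips both head and pooling for backbone use.
model.reset_classifier(0, '')
```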
Ross Wightman
cb57a96862
Fix early stop for efficientnet/mobilenetv3 fwd inter. Fix indices typing for all fwd inter.
2024-05-04 10:21:58 -07:00
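A hedged usage sketch of the interface being fixed here; argument names follow the current timm forward_intermediates() signature as I understand it:

```python
import timm
import torch

model = timm.create_model('efficientnet_b0', pretrained=False)
x = torch.randn(1, 3, 224, 224)

# indices accepts an int (take the last n feature points) or explicit indices
final, intermediates = model.forward_intermediates(x, indices=2)

# stop_early=True is the early-stop path fixed above: blocks past the last
# requested index are skipped entirely
final, intermediates = model.forward_intermediates(x, indices=[1, 3], stop_early=True)
```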
Ross Wightman
c719f7eb86
More forward_intermediates() updates
* add convnext, resnet, efficientformer, levit support
* remove kwarg-only args from fn signatures so that torchscript isn't broken for all :(
* use reset_classifier() consistently in prune
2024-05-03 16:22:32 -07:00
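The pruning path mentioned in the last bullet, sketched (assuming the prune_intermediate_layers() helper that pairs with forward_intermediates()):

```python
import timm

model = timm.create_model('convnext_tiny', pretrained=False)

# Drop stages past the last requested index; with prune_head=True the head is
# removed through the same reset_classifier(0, '') path used everywhere else.
model.prune_intermediate_layers(indices=[0, 1], prune_head=True)
```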
Ross Wightman
67332fce24
Add forward_intermediates() support to coatnet, maxvit, swin* models. Refine feature interface. Start prep of new vit weights.
2024-04-30 16:56:33 -07:00
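With the refined feature interface, these families can be created as pure feature extractors; a quick sketch (model name is just an example):

```python
import timm
import torch

model = timm.create_model('maxvit_tiny_tf_224', pretrained=False, features_only=True)
feats = model(torch.randn(1, 3, 224, 224))

print(model.feature_info.channels())   # channels per returned feature map
print(model.feature_info.reduction())  # stride of each feature map
for f in feats:
    print(f.shape)
```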
user-miner1
740f4983b3
Assert messages added
2024-04-30 10:10:02 +03:00
Ross Wightman
ef9c6fb846
forward_head(): consistent pre_logits handling to reduce the likelihood of issues when users manually replace the .head module
2024-04-09 21:54:59 -07:00
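A toy module showing the contract, not timm's exact code: pooling happens inside forward_head() and the classifier is only consulted when producing logits, so a manually replaced .head keeps working:

```python
import torch
import torch.nn as nn

class TinyHeadModel(nn.Module):
    def __init__(self, dim: int = 64, num_classes: int = 10):
        super().__init__()
        self.head = nn.Linear(dim, num_classes)

    def forward_head(self, x: torch.Tensor, pre_logits: bool = False):
        x = x.mean(dim=1)  # pool (B, N, C) -> (B, C)
        # pre_logits short-circuits before the classifier is touched
        return x if pre_logits else self.head(x)

m = TinyHeadModel()
m.head = nn.Linear(64, 5)  # user-replaced head still behaves
print(m.forward_head(torch.randn(2, 196, 64)).shape)  # torch.Size([2, 5])
```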
Yassine
884ef88818
fix all SDPA dropouts
2023-10-05 08:58:41 -07:00
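The gist of the fix, in a minimal attention module (a sketch, not the exact timm code): F.scaled_dot_product_attention applies dropout_p unconditionally, unlike an nn.Dropout which is inert in eval mode, so the probability must be gated on self.training:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SDPAAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 8, attn_drop: float = 0.1):
        super().__init__()
        self.num_heads = num_heads
        self.qkv = nn.Linear(dim, dim * 3)
        self.attn_drop = nn.Dropout(attn_drop)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, C = x.shape
        q, k, v = self.qkv(x).reshape(B, N, 3, self.num_heads, -1).permute(2, 0, 3, 1, 4)
        # the fix: zero dropout unless actually training
        x = F.scaled_dot_product_attention(
            q, k, v, dropout_p=self.attn_drop.p if self.training else 0.)
        return x.transpose(1, 2).reshape(B, N, C)
```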
Ross Wightman
c153cd4a3e
Add more advanced interpolation method from BEiT and support non-square window & image size adaptation for:
* beit/beit-v2
* maxxvit/coatnet
* swin transformer
And non-square windows for swin-v2
2023-08-08 16:41:16 -07:00
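The core of window/image-size adaptation is resampling the relative position bias table. A naive bicubic version for intuition (BEiT's actual method re-grids on a geometric progression to better preserve outer positions; this sketch is not that code):

```python
import torch
import torch.nn.functional as F

def resize_rel_pos_table(table: torch.Tensor, old_win, new_win) -> torch.Tensor:
    # table: (num_positions, num_heads) with num_positions == (2h-1)*(2w-1)
    num_heads = table.shape[1]
    old_h, old_w = 2 * old_win[0] - 1, 2 * old_win[1] - 1
    new_h, new_w = 2 * new_win[0] - 1, 2 * new_win[1] - 1
    t = table.transpose(0, 1).reshape(1, num_heads, old_h, old_w)
    t = F.interpolate(t, size=(new_h, new_w), mode='bicubic', align_corners=False)
    return t.reshape(num_heads, new_h * new_w).transpose(0, 1)

# adapt a square 7x7-window table to a non-square 8x12 window
table = torch.randn((2 * 7 - 1) ** 2, 12)
print(resize_rel_pos_table(table, (7, 7), (8, 12)).shape)  # torch.Size([345, 12])
```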
Ross Wightman
cf1884bfeb
Add 21k maxvit tf weights
2023-05-10 18:23:32 -07:00
Ross Wightman
e4e43190ce
Add typing to all model entrypoint fns, add old cache check env var to builder
2023-05-08 08:52:38 -07:00
Ross Wightman
965d0a2d36
fast_attn -> fused_attn, implement global config for enable/disable of fused_attn, add to more models. Add ViT CLIP OpenAI 336 weights.
2023-04-10 12:04:33 -07:00
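The resulting pattern, roughly (assuming use_fused_attn() exported from timm.layers, with the env override I believe is TIMM_FUSED_ATTN):

```python
import torch
import torch.nn.functional as F
from timm.layers import use_fused_attn

fused = use_fused_attn()  # reflects the global config / env override

def attend(q, k, v, scale: float):
    if fused:
        return F.scaled_dot_product_attention(q, k, v)
    attn = (q * scale) @ k.transpose(-2, -1)
    attn = attn.softmax(dim=-1)
    return attn @ v

q = k = v = torch.randn(1, 8, 16, 32)
print(attend(q, k, v, 32 ** -0.5).shape)
```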
Ross Wightman
572f05096a
Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks.
2023-03-18 14:55:09 -07:00
Ross Wightman
122621daef
Add Final annotation to fast_attn flags to avoid symbol lookup of the new scaled_dot_product_attention fn on old PyTorch in jit
2023-02-16 16:57:42 -08:00
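The pattern, roughly: marking the flag Final lets torchscript treat it as a constant and eliminate the dead branch, so the F.scaled_dot_product_attention symbol is never looked up on older PyTorch:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Attn(nn.Module):
    fast_attn: torch.jit.Final[bool]  # constant as far as torchscript is concerned

    def __init__(self):
        super().__init__()
        self.fast_attn = hasattr(F, 'scaled_dot_product_attention')
```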
Ross Wightman
621e1b2182
Add ideas from 'Scaling ViT to 22-B Params', testing PyTorch 2.0 fused F.scaled_dot_product_attention impl in vit, vit_relpos, maxxvit / coatnet.
2023-02-16 16:57:42 -08:00
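One of the 22B-param ViT ideas is qk normalization: LayerNorm q and k per-head before attention to stabilize training. A dimension-only sketch:

```python
import torch
import torch.nn as nn

head_dim = 64
q_norm = nn.LayerNorm(head_dim)
k_norm = nn.LayerNorm(head_dim)

# (B, heads, N, head_dim) tensors normalized over the last dim before attention
q = q_norm(torch.randn(1, 8, 16, head_dim))
k = k_norm(torch.randn(1, 8, 16, head_dim))
```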
Ross Wightman
2cb2699dc8
Apply fix from #1649 to main
2023-02-03 11:28:57 -08:00
Ross Wightman
6f28b562c6
Factor NormMlpClassifierHead from MaxxViT and use across MaxxViT / ConvNeXt / DaViT, refactor some type hints & comments
2023-01-27 14:57:01 -08:00
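The rough shape of the factored-out head (illustrative only; see timm.layers.NormMlpClassifierHead for the real implementation). This is also the kind of head where the pre-logits width differs from num_features:

```python
import torch
import torch.nn as nn

class NormMlpHeadSketch(nn.Module):
    def __init__(self, in_features: int, num_classes: int, hidden_size: int):
        super().__init__()
        self.norm = nn.LayerNorm(in_features)
        self.pre_logits = nn.Sequential(nn.Linear(in_features, hidden_size), nn.Tanh())
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor, pre_logits: bool = False):
        x = self.norm(x.mean(dim=(2, 3)))  # global pool NCHW, then norm
        x = self.pre_logits(x)
        return x if pre_logits else self.fc(x)

head = NormMlpHeadSketch(768, 1000, 512)
print(head(torch.randn(2, 768, 7, 7)).shape)  # torch.Size([2, 1000])
```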
Ross Wightman
bed350f5e5
Push all MaxxViT weights to HF hub, cleanup impl, add feature map extraction support and prompote to 'std' architecture. Fix norm head for proper embedding / feat map output. Add new in12k + ft 1k weights.
2023-01-20 14:45:25 -08:00
Ross Wightman
1825b5e314
maxxvit type
2023-01-09 08:57:31 -08:00
Ross Wightman
5078b28f8a
More kwarg handling tweaks, maxvit_base_rw def added
2023-01-09 08:57:31 -08:00
Ross Wightman
c0d7388a1b
Improving kwarg merging in more models
2023-01-09 08:57:31 -08:00
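The merging pattern, roughly (all names here are illustrative stand-ins, not timm's actual defs):

```python
def _create_model(name: str, pretrained: bool, **kwargs):
    print(name, pretrained, kwargs)  # stand-in for the internal model builder

def maxvit_example(pretrained: bool = False, **kwargs):
    # per-model defaults first; user kwargs override them via the dict merge
    model_args = dict(depths=(2, 2, 5, 2), embed_dim=(96, 192, 384, 768))
    return _create_model('maxvit_example', pretrained, **dict(model_args, **kwargs))

maxvit_example(depths=(1, 1, 2, 1))  # user override wins
```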
Ross Wightman
9a51e4ea2e
Add FlexiViT models and weights, refactoring, push more weights
* push all vision_transformer*.py weights to HF hub
* finalize more pretrained tags for pushed weights
* refactor pos_embed files and module locations, move some pos embed modules to layers
* tweak hf hub helpers to aid bulk uploading and updating
2022-12-22 17:23:09 -08:00
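One of the relocated pos-embed helpers in action, assuming resample_abs_pos_embed is exported from timm.layers after the refactor:

```python
import torch
from timm.layers import resample_abs_pos_embed

# resample a 14x14-grid absolute pos embed (plus 1 class token) to a 16x16 grid
pos_embed = torch.randn(1, 14 * 14 + 1, 384)
resized = resample_abs_pos_embed(pos_embed, new_size=(16, 16), num_prefix_tokens=1)
print(resized.shape)  # torch.Size([1, 257, 384])
```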
Ross Wightman
927f031293
Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models
2022-12-06 15:00:06 -08:00
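What the restructure means for imports (the old path was kept working via a deprecation shim):

```python
# old location, now emits a deprecation warning:
from timm.models.layers import DropPath
# new canonical location:
from timm.layers import DropPath
```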
Ross Wightman
755570e2d6
Rename _pretrained.py -> pretrained.py, not feasible to change the other files to the same scheme without breaking uses
2022-12-05 10:21:34 -08:00
Ross Wightman
72cfa57761
Add ported Tensorflow MaxVit weights. Add a few more CLIP ViT fine-tunes. Tweak some model tag names. Improve model tag name sorting. Update HF hub push config layout.
2022-12-05 10:21:34 -08:00
Ross Wightman
4d5c395160
MaxVit, ViT, ConvNeXt, and EfficientNet-v2 updates
* Add support for TF weights and modelling specifics to MaxVit (testing ported weights)
* More fine-tuned CLIP ViT configs
* ConvNeXt and MaxVit updated to use the new pretrained cfgs
* EfficientNetV2, MaxVit and ConvNeXt high res models use squash crop/resize
2022-12-05 10:21:34 -08:00
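The squash mode mentioned in the last bullet, sketched via the eval transform factory (assuming the crop_mode argument available in recent timm):

```python
from timm.data import create_transform

# 'squash' resizes both edges straight to the target size instead of an
# aspect-preserving resize + center crop
tfm = create_transform(384, is_training=False, crop_pct=1.0, crop_mode='squash')
```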
Ross Wightman
9914f744dc
Add more maxxvit weights including ConvNeXt conv block based experiments.
2022-10-10 21:49:18 -07:00
Ross Wightman
fa8c84eede
Update maxvit_tiny_256 weights to a better training iteration, add coatnet / maxvit / maxxvit model defs for future runs
2022-09-07 12:37:37 -07:00
Ross Wightman
c1b3cea19d
Add maxvit_rmlp_tiny_rw_256 model def and weights w/ 84.2 top-1 @ 256, 84.8 @ 320
2022-09-07 10:27:11 -07:00
Ross Wightman
dc90816f26
Add `maxvit_tiny_rw_224` weights 83.5 @ 224 and `maxvit_rmlp_pico_rw_256` relpos weights, 80.5 @ 256, 81.3 @ 320
2022-09-06 16:14:41 -07:00
Ross Wightman
7f1b223c02
Add maxvit_rmlp_nano_rw_256 model def & weights, make window/grid size dynamic wrt img_size by default
2022-08-29 15:49:32 -07:00
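The default sizing scheme, roughly: the window/grid partition size is derived from img_size at a fixed ratio (32 matching the stride-32 stage is my reading; treat it as an assumption):

```python
def default_partition_size(img_size, ratio: int = 32):
    h, w = (img_size, img_size) if isinstance(img_size, int) else img_size
    return h // ratio, w // ratio

print(default_partition_size(256))         # (8, 8)
print(default_partition_size((192, 256)))  # (6, 8)
```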
Ross Wightman
f1d2160d85
Update a few maxxvit comments, rename PartitionAttention -> PartitionAttentionCl for consistency with other blocks
2022-08-26 12:53:49 -07:00
Ross Wightman
eca6f0a25c
Fix syntax error (extra dataclass comma) in maxxvit.py
2022-08-26 11:29:09 -07:00
Ross Wightman
7c2660576d
Tweak init for the ConvNeXt block used in maxxvit/coatnet.
2022-08-25 15:30:59 -07:00
Ross Wightman
527f9a4cb2
Updated to correct maxvit_nano weights...
2022-08-24 12:42:11 -07:00
Ross Wightman
b2e8426fca
Make k=stride=2 ('avg2') pooling default for coatnet/maxvit. Add weight links. Rename 'combined' partition to 'parallel'.
2022-08-24 11:01:20 -07:00
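The 'avg2' default in plain PyTorch terms, next to the overlapping variant it replaces (my reading of the prior default; treat the 3x3 config as an assumption):

```python
import torch.nn as nn

pool_avg2 = nn.AvgPool2d(kernel_size=2, stride=2)            # new default: k = stride = 2
pool_avg = nn.AvgPool2d(kernel_size=3, stride=2, padding=1)  # overlapping 3x3 alternative
```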
Ross Wightman
cac0a4570a
More test fixes, pool size for 256x256 maxvit models
2022-08-23 13:38:26 -07:00
Ross Wightman
e939ed19b9
Rename internal creation fn for maxvit; it has not been just coatnet for a while...
2022-08-22 17:44:51 -07:00
Ross Wightman
ffaf97f813
MaxxVit! A very configurable MaxVit and CoAtNet impl with lots of goodies...
2022-08-22 17:42:10 -07:00