889 Commits

Author SHA1 Message Date
Ross Wightman
cf324ea38f Fix grad checkpointing in focalnet 2023-02-17 16:26:26 -08:00
Ross Wightman
848d200767 Overhaul FocalNet implementation 2023-02-17 16:24:59 -08:00
Ross Wightman
7266c5c716 Merge branch 'main' into focalnet_and_swin_refactor 2023-02-17 09:20:14 -08:00
Ross Wightman
7d9e321b76 Improve tracing of window attn models with simpler reshape logic 2023-02-17 07:59:06 -08:00
Ross Wightman
2e38d53dca Remove dead line 2023-02-16 16:57:42 -08:00
Ross Wightman
f77c04ff36 Torchscript fixes/hacks for rms_norm, refactor ParallelScalingBlock with manual combination of input projections, closer paper match 2023-02-16 16:57:42 -08:00
Ross Wightman
122621daef Add Final annotation to attn_fas to avoid symbol lookup of new scaled_dot_product_attn fn on old PyTorch in jit 2023-02-16 16:57:42 -08:00
Ross Wightman
621e1b2182 Add ideas from 'Scaling ViT to 22-B Params', testing PyTorch 2.0 fused F.scaled_dot_product_attention impl in vit, vit_relpos, maxxvit / coatnet. 2023-02-16 16:57:42 -08:00
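The two commits above (the `torch.jit.Final` annotation and the PyTorch 2.0 fused attention path) combine into a pattern roughly like the following sketch. The class and attribute names here are illustrative, not timm's exact code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Attention(nn.Module):
    # torch.jit.Final marks the flag as a compile-time constant, so when
    # scripting on older PyTorch the dead branch that references
    # F.scaled_dot_product_attention is pruned instead of failing symbol lookup.
    fused_attn: torch.jit.Final[bool]

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.qkv = nn.Linear(dim, dim * 3)
        self.fused_attn = hasattr(F, 'scaled_dot_product_attention')

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, C = x.shape
        qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4).unbind(0)  # each (B, heads, N, head_dim)
        if self.fused_attn:
            x = F.scaled_dot_product_attention(q, k, v)  # fused kernel, PyTorch >= 2.0
        else:
            attn = (q @ k.transpose(-2, -1)) * self.scale  # manual fallback path
            x = attn.softmax(dim=-1) @ v
        return x.transpose(1, 2).reshape(B, N, C)
```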
testbot
a09d403c24 changed warning to info 2023-02-16 16:20:31 -08:00
testbot
8470e29541 Add support to load safetensors weights 2023-02-16 16:20:31 -08:00
Ross Wightman
624266148d Remove unused imports from _hub helpers 2023-02-09 17:47:26 -08:00
Ross Wightman
2cfff0581b Add grad_checkpointing support to features_only, test in EfficientDet. 2023-02-09 17:45:40 -08:00
Ross Wightman
9c14654a0d Improve support for custom dataset label name/description through HF hub export, via pretrained_cfg 2023-02-08 08:29:20 -08:00
Ross Wightman
0d33127df2 Add 384x384 convnext_large_mlp laion2b fine-tune on in1k 2023-02-06 22:01:04 -08:00
Ross Wightman
7a0bd095cb Update model prune loader to use pkgutil 2023-02-06 17:45:16 -08:00
Ross Wightman
13acac8c5e Update head metadata for effformerv2 2023-02-04 23:11:51 -08:00
Ross Wightman
8682528096 Add first conv metadata for efficientformer_v2 2023-02-04 23:02:02 -08:00
Ross Wightman
72fba669a8 is_scripting() guard on checkpoint_seq 2023-02-04 14:21:49 -08:00
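The guard above exists because activation checkpointing is incompatible with TorchScript. A sketch of the pattern, using torch's own `checkpoint_sequential` in place of timm's `checkpoint_seq`:

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint_sequential


def forward_features(blocks: nn.Sequential, x: torch.Tensor,
                     grad_checkpointing: bool = False) -> torch.Tensor:
    # Only take the checkpointing path when not scripting; a scripted model
    # falls through to the plain sequential call.
    if grad_checkpointing and not torch.jit.is_scripting():
        x = checkpoint_sequential(blocks, len(blocks), x)
    else:
        x = blocks(x)
    return x
```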
Ross Wightman
95ec255f7f Finish timm mode api for efficientformer_v2, add grad checkpointing support to both efficientformers 2023-02-03 21:21:23 -08:00
Ross Wightman
9d03c6f526 Merge remote-tracking branch 'origin/main' into levit_efficientformer_redux 2023-02-03 14:47:01 -08:00
Ross Wightman
086bd55a94 Add EfficientFormer-V2, refactor EfficientFormer and Levit for more uniformity across the 3 related architectures. Add features_out support to levit conv models and efficientformer_v2. All weights on hub. 2023-02-03 14:12:29 -08:00
Ross Wightman
2cb2699dc8 Apply fix from #1649 to main 2023-02-03 11:28:57 -08:00
Ross Wightman
b3042081b4 Add laion -> in1k fine-tuned base and large_mlp weights for convnext 2023-02-03 10:58:02 -08:00
Ross Wightman
316bdf8955 Add mlp head support for convnext_large, add laion2b CLIP weights, prep fine-tuned weight tags 2023-02-01 08:27:02 -08:00
Ross Wightman
6f28b562c6 Factor NormMlpClassifierHead from MaxxViT and use across MaxxViT / ConvNeXt / DaViT, refactor some type hints & comments 2023-01-27 14:57:01 -08:00
Ross Wightman
9a53c3f727 Finalize DaViT, some formatting and modelling simplifications (separate PatchEmbed to Stem + Downsample), weights on HF hub. 2023-01-27 13:54:04 -08:00
Fredo Guan
fb717056da Merge remote-tracking branch 'upstream/main' 2023-01-26 10:49:15 -08:00
Ross Wightman
64667bfa0e Add 'gigantic' vit clip variant for feature extraction and future fine-tuning 2023-01-25 18:02:10 -08:00
Ross Wightman
36989cfae4 Factor out readme generation in hub helper, add more readme fields 2023-01-20 14:49:40 -08:00
Ross Wightman
32f252381d Change order of checkpoint filtering fn application in builder, try dict, model variant first 2023-01-20 14:48:54 -08:00
Ross Wightman
bed350f5e5 Push all MaxxViT weights to HF hub, cleanup impl, add feature map extraction support and promote to 'std' architecture. Fix norm head for proper embedding / feat map output. Add new in12k + ft 1k weights. 2023-01-20 14:45:25 -08:00
Ross Wightman
ca38e1e73f Update ClassifierHead module, add reset() method, update in_chs -> in_features for consistency 2023-01-20 14:44:05 -08:00
Ross Wightman
8ab573cd26 Add convnext_tiny and convnext_small 384x384 fine-tunes of in12k weights, fix pool size for laion CLIP convnext weights 2023-01-20 14:40:16 -08:00
Fredo Guan
81ca323751
Davit update formatting and fix grad checkpointing (#7)
fixed head to gap->norm->fc as per convnext, along with option for norm->gap->fc
failed tests due to clip convnext models, davit tests passed
2023-01-15 14:34:56 -08:00
Ross Wightman
e9aac412de Correct mean/std for CLIP convnexts 2023-01-14 22:53:56 -08:00
Ross Wightman
42bd8f7bcb Add convnext_base CLIP image tower weights for fine-tuning / features 2023-01-14 21:16:29 -08:00
Ross Wightman
a2c14c2064 Add tiny/small in12k pretrained and fine-tuned ConvNeXt models 2023-01-11 14:50:39 -08:00
Ross Wightman
01fdf44438 Initial focalnet import, more refactoring needed for timm. 2023-01-09 16:18:19 -08:00
Ross Wightman
2e83bba142 Revert head norm changes to ConvNeXt as it broke some downstream use, alternate workaround for fcmae weights 2023-01-09 13:37:40 -08:00
Ross Wightman
1825b5e314 maxxvit type 2023-01-09 08:57:31 -08:00
Ross Wightman
5078b28f8a More kwarg handling tweaks, maxvit_base_rw def added 2023-01-09 08:57:31 -08:00
Ross Wightman
c0d7388a1b Improving kwarg merging in more models 2023-01-09 08:57:31 -08:00
Ross Wightman
60ebb6cefa Re-order vit pretrained entries for more sensible default weights (no .tag specified) 2023-01-06 16:12:33 -08:00
Ross Wightman
e861b74cf8 Pass through --model-kwargs (and --opt-kwargs for train) from command line through to model __init__. Update some models to improve arg overlay. Cleanup along the way. 2023-01-06 16:12:33 -08:00
Ross Wightman
add3fb864e Working on improved model card template for push_to_hf_hub 2023-01-06 16:12:33 -08:00
Ross Wightman
6e5553da5f
Add ConvNeXt-V2 support (model additions and weights) (#1614)
* Add ConvNeXt-V2 support (model additions and weights)

* ConvNeXt-V2 weights on HF Hub, tweaking some tests

* Update README, fixing convnextv2 tests
2023-01-05 07:53:32 -08:00
Ross Wightman
6902c48a5f Fix ResNet based models to work w/ norm layers w/o affine params. Reformat long arg lists into vertical form. 2022-12-29 16:32:26 -08:00
Ross Wightman
8ece53e194 Switch BEiT to HF hub weights 2022-12-22 21:43:04 -08:00
Ross Wightman
9a51e4ea2e Add FlexiViT models and weights, refactoring, push more weights
* push all vision_transformer*.py weights to HF hub
* finalize more pretrained tags for pushed weights
* refactor pos_embed files and module locations, move some pos embed modules to layers
* tweak hf hub helpers to aid bulk uploading and updating
2022-12-22 17:23:09 -08:00
Fredo Guan
10b3f696b4
Davit std (#6)
Separate patch_embed module
2022-12-16 21:50:28 -08:00