1389 Commits

Author SHA1 Message Date
Ross Wightman
82ae247879 MambaOut weights on hub, configs finalized 2024-10-11 11:07:40 -07:00
Ross Wightman
7efb60c299 Add first_conv for mambaout 2024-10-09 14:11:40 -07:00
Ross Wightman
5dc5ee5b42 Add global_pool to mambaout __init__ and pass to heads 2024-10-09 14:11:40 -07:00
Ross Wightman
9d1dfe8dbe Incorrectly named head_hidden_size 2024-10-09 14:11:40 -07:00
Ross Wightman
91e743f2dd Mambaout tweaks 2024-10-09 14:11:40 -07:00
Ross Wightman
4542cf03f9 Add features_only, other bits to mambaout, define different base alternatives 2024-10-09 14:11:40 -07:00
Ross Wightman
c2da12c7e1 Update rw models, fix heads 2024-10-09 14:11:40 -07:00
Ross Wightman
f2086f51a0 Add mambaout builder support, pretrained weight remap 2024-10-09 14:11:40 -07:00
Ross Wightman
c6ef54eefa Initial mambaout work 2024-10-09 14:11:40 -07:00
Ross Wightman
d9321b0e10 Add weights for fine-tuned siglip so400m. Add webli_i18n pretrained tags for the multi-lingual model variants (incl older base) 2024-10-09 09:04:44 -07:00
Ross Wightman
01b62264af Add i18n variant of so400m model w/ weights. Add two in1k fine-tunes of original so400m 384x384 but at 378x378 (better matches patch14) 2024-10-08 23:40:24 -07:00
Ross Wightman
72f0edb7e8 missed first_conv for rnv2 18d 2024-10-08 12:38:54 -07:00
Ross Wightman
3ed603a2ce Add resnet18/18d pre-act model configs for potential training. Fix #2289 2024-10-08 11:28:07 -07:00
Ross Wightman
41a79e0fcb Add overlapped stem convnext zepto weights 2024-10-08 11:26:34 -07:00
Ross Wightman
545bd4056c Tag along test_vit3 weights 2024-09-30 12:03:32 -07:00
Ross Wightman
69b687d4cc Add zepto weights 2024-09-30 11:43:23 -07:00
Ross Wightman
c6e5557a5a Mismatch pretrained_cfg 2024-09-30 11:43:23 -07:00
Ross Wightman
5d7bd2973e convnext zepto, rmsnorm experiments 2024-09-30 11:43:23 -07:00
Ross Wightman
e3242a5258
Merge pull request #2277 from huggingface/more_tiny_test_models
Adding some more tiny test models to train...
2024-09-22 10:28:29 -07:00
Ross Wightman
c1cb5641c7 Add weight for mobilenetv4 small 0.5, change 0.25 -> 0.35 2024-09-22 10:27:01 -07:00
Ross Wightman
a22ce0a329 Merge branch 'patch-1' of https://github.com/baorepo/pytorch-image-models into baorepo-patch-1 2024-09-22 10:14:35 -07:00
Ross Wightman
9067be6a30 Add weights for new tiny test models 2024-09-22 07:59:23 -07:00
Ross Wightman
65564f7da5 Fix reversed H & W padding for swin patch merging 2024-09-21 16:51:02 -07:00
Ross Wightman
a2f539f055 Add a few more test model defs in prep for weight upload 2024-09-21 11:38:38 -07:00
Ross Wightman
6ab2af610d Adding some more tiny test models to train 2024-09-06 15:35:57 -07:00
alias pillar1989
d6b8816eda MobilenetV4: add two more lightweight models
MobileNetV4 is very fast and ideal for embedded devices. However, many low-cost, low-power embedded MCU devices require smaller models. Hopefully this PR will be merged.
2024-09-05 02:34:11 +00:00
Ross Wightman
f81cbdcca9
Merge pull request #2274 from huggingface/bulk_runner_tweaks
Better all res resolution for bulk runner
2024-09-03 12:11:56 -07:00
Ross Wightman
a50713ce6e Fix #2272 2024-09-02 13:20:05 -07:00
Ross Wightman
ebbe530ee4 Add MobileNetV3 RA4 (mnv4 recipe) weights 2024-09-02 13:10:34 -07:00
Ross Wightman
fa4a1e597f Better all res resolution for bulk runner 2024-08-26 22:28:01 -07:00
Ross Wightman
76b0e9931a Placeholder for new mnv3 model 2024-08-23 10:11:20 -07:00
Ross Wightman
39e92f0c0d mobilenet_edgetpu can use group_size override, more consistency in arg wrap/sadface w/ extra group_size arg 2024-08-22 11:44:02 -07:00
Ross Wightman
b9f020a509 Allow group_size override for more efficientnet and mobilenetv3 based models 2024-08-21 16:51:38 -07:00
Ross Wightman
17923a66bb Add layer scale to hieradet 2024-08-21 11:23:39 -07:00
Ross Wightman
47e6958263 Add hierdet_small (non sam) model def 2024-08-21 11:05:54 -07:00
Ross Wightman
9fcbf39cdc Add remaining sbb vit betwixt/mediumd fine-tunes 2024-08-21 10:09:38 -07:00
Ross Wightman
dc94cca0e5 Remaining Hiera sbb weights uploaded 2024-08-21 10:06:27 -07:00
Ross Wightman
a256e50457 Move padding back in front of windowing 2024-08-17 11:22:53 -07:00
Ross Wightman
7d83749207 pool size test fixes 2024-08-17 08:27:13 -07:00
Ross Wightman
1bd92bca0e Add fused_attn flag to HieraDet attn block 2024-08-16 22:57:49 -07:00
Ross Wightman
691bb54443 Larger min input size needed 2024-08-16 17:09:19 -07:00
Ross Wightman
de3a91a7a0 Add min_input_size of 128 for hieradet/sam2 2024-08-16 15:13:56 -07:00
Ross Wightman
0b05122cda Fixing hieradet (sam2) tests 2024-08-16 14:33:40 -07:00
Ross Wightman
e035381171 Move padding out of windowing code for hieradet, fix torchscript typing issues, make pooling MaxPool unique instances across two modules 2024-08-16 13:36:33 -07:00
Ross Wightman
146c2fbe34 Add resnet50d and efficientnet_b1 ra4 (mnv4) hparam weights 2024-08-16 12:10:00 -07:00
Ross Wightman
962958723c More Hiera updates. Add forward_intermediates to hieradet/sam2 impl. Make both use same classifier module. Add coarse bool to intermediates. 2024-08-16 11:10:04 -07:00
Ross Wightman
f2cfb4c677 Add WIP HieraDet impl (SAM2 backbone support) 2024-08-15 17:58:15 -07:00
Ross Wightman
a50e53d41f Rename global pos embed for Hiera abswin, factor out commonly used vit weight init fns to layers. Add a channels-last ver of normmlp head. 2024-08-15 17:46:36 -07:00
Ross Wightman
2f3fed43b8 Fix hiera init with num_classes=0, fix weight tag names for sbb2 hiera/vit weights, add LayerScale/LayerScale2d to layers 2024-08-15 11:14:38 -07:00
Ross Wightman
fee91fdd41 Update Hiera model for abswin, more stable weight init, layer-scale. ImageNet-12k weights for hiera_small_abswin, and two of the sbb vits with improved reg4 init. 2024-08-14 12:22:40 -07:00