Ross Wightman
3eaf729f3f
F.sdpa for visformer fails w/o contiguous on qkv, make experimental
2023-05-11 11:37:37 -07:00
Ross Wightman
cf1884bfeb
Add 21k maxvit tf weights
2023-05-10 18:23:32 -07:00
Ross Wightman
6c2edf4d74
Missed hub_id entries for byoanet models
2023-05-10 15:58:55 -07:00
Ross Wightman
850ab4931f
Missed a few pretrained tags...
2023-05-10 12:16:30 -07:00
Ross Wightman
ff2464e2a0
Throw when pretrained weights not available and pretrained=True (principle of least surprise).
2023-05-10 10:44:34 -07:00
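The "least surprise" behavior this commit describes can be sketched in isolation; `create_model` and `_REGISTRY` below are hypothetical stand-ins, not timm's actual internals:

```python
# Hypothetical sketch of the behavior described above: requesting
# pretrained=True for a model with no published weights raises instead of
# silently returning a randomly initialized model.
_REGISTRY = {
    "vit_tiny": {"has_weights": True},
    "vit_experimental": {"has_weights": False},
}

def create_model(name: str, pretrained: bool = False) -> dict:
    cfg = _REGISTRY[name]
    if pretrained and not cfg["has_weights"]:
        raise RuntimeError(f"No pretrained weights exist for {name!r}.")
    return {"name": name, "loaded_pretrained": pretrained}
```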
Ross Wightman
e0ec0f7252
Merge pull request #1643 from nateraw/docstrings-update
...
Update Docstring for create_model
2023-05-09 21:33:20 -07:00
Ross Wightman
627b6315ba
Add typing to dinov2 entrypt fns, use hf hub for mae & dinov2 weights
2023-05-09 20:42:11 -07:00
Ross Wightman
960a882510
Remove label offsets and remove old weight url for 1001 class (background + in1k) TF origin weights
2023-05-09 18:00:41 -07:00
Ross Wightman
a01d8f86f4
Tweak DinoV2 add, add MAE ViT weights, add initial intermediate layer getter experiment
2023-05-09 17:59:22 -07:00
Ross Wightman
59bea4c306
Merge branch 'main' into dot_nine_cleanup
2023-05-09 12:27:32 -07:00
Leng Yue
5cc87e6485
Add dinov2 pretrained models (#1797)
...
* add dinov2 small, base, and large
* fix input size
* fix swiglu & dinov2 vit giant
* use SwiGLUPacked to replace GluMlp
* clean up & add ffn_layer placeholder for ParallelScalingBlock
2023-05-09 12:24:47 -07:00
Ross Wightman
21e57c0b9e
Add missing beitv2 in1k -> in1k models
2023-05-08 17:03:51 -07:00
Ross Wightman
8c6fccb879
Allow passing state_dict directly via pretrained cfg mechanism as an override
2023-05-08 15:15:44 -07:00
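The state_dict override this commit describes can be sketched as follows; `load_pretrained` is illustrative, not timm's exact builder code:

```python
# Hypothetical sketch: if the pretrained cfg carries weights directly via a
# 'state_dict' entry, they win over any remote source and no download happens.
def load_pretrained(pretrained_cfg: dict) -> dict:
    state_dict = pretrained_cfg.get("state_dict")
    if state_dict is not None:
        return state_dict  # caller-supplied override
    url = pretrained_cfg.get("url")
    if url is not None:
        raise NotImplementedError(f"would fetch weights from {url}")
    raise ValueError("pretrained cfg has neither 'state_dict' nor 'url'")
```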
Ross Wightman
e4e43190ce
Add typing to all model entrypoint fns, add old cache check env var to builder
2023-05-08 08:52:38 -07:00
Ross Wightman
cb3f9c23bb
Metaformer baselines for vision (final PR with cleanup) (#1793)
...
* update
* Update metaformers.py (repeated 44 times)
* merge with poolformer, initial version
* Update metaformers.py (repeated 18 times)
* Revert "Update metaformers.py"
This reverts commit 2916f37f8d.
* Revert "Update metaformers.py"
This reverts commit 1d882eb494.
* Revert "Update metaformers.py"
This reverts commit 2209d0830e.
* Revert "Update metaformers.py"
This reverts commit 32bede4e27.
* Revert "Update metaformers.py"
This reverts commit 4ed934e000.
* Revert "Update metaformers.py"
This reverts commit 3f0b075367.
* Revert "Update metaformers.py"
This reverts commit 2fef9006d7.
* Update metaformers.py
* Update metaformers.py
* rename model
* Update metaformers.py (repeated 28 times)
* Stem/Downsample rework
* Update metaformers.py
* try NHWC
* Update metaformers.py (repeated 8 times)
* Squashed commit of the following:
commit b7696a30a7 (Fredo Guan, Fri Feb 10 01:46:44 2023 -0800): Update metaformers.py
commit 41fe5c3626 (Fredo Guan, Fri Feb 10 01:03:47 2023 -0800): Update metaformers.py
commit a3aee37c35 (Fredo Guan, Fri Feb 10 00:32:04 2023 -0800): Update metaformers.py
commit f938beb81b (Fredo Guan, Fri Feb 10 00:24:58 2023 -0800): Update metaformers.py
commit 10bde717e5 (Fredo Guan, Sun Feb 5 02:11:28 2023 -0800): Update metaformers.py
commit 39274bd45e (Fredo Guan, Sun Feb 5 02:06:58 2023 -0800): Update metaformers.py
commit a2329ab8ec (Fredo Guan, Sun Feb 5 02:03:34 2023 -0800): Update metaformers.py
commit 53b8ce5b8a (Fredo Guan, Sun Feb 5 02:02:37 2023 -0800): Update metaformers.py
commit ab6225b941 (Fredo Guan, Sun Feb 5 01:04:55 2023 -0800): try NHWC
commit 02fcc30eaa (Fredo Guan, Sat Feb 4 23:47:06 2023 -0800): Update metaformers.py
commit 366aae9304 (Fredo Guan, Sat Feb 4 23:37:30 2023 -0800): Stem/Downsample rework
commit 26a8e481a5 (Fredo Guan, Wed Feb 1 07:42:07 2023 -0800): Update metaformers.py
commit a913f5d438 (Fredo Guan, Wed Feb 1 07:41:24 2023 -0800): Update metaformers.py
* Update metaformers.py
* Update metaformers.py
* channels first for whole network
* Channels first
* Update metaformers.py
* Use buffer for randformer
* Update metaformers.py
* Remove einsum
* don't test randformer for feature extraction
* arbitrary input sizes for randformer
* Squashed commit of the following:
commit 6c089ca4325ab10942fe56e0999dcc1a11e1d2f0 (Fredo Guan, Mon Mar 6 02:11:17 2023 -0800): Update metaformers.py
commit 521528a900e49ef8f462f5ccd795efb3a5d14214 (Fredo Guan, Mon Mar 6 02:06:08 2023 -0800): Update metaformers.py
commit 3827eec7963698ff727fbb13ace53594ceb374d5 (Fredo Guan, Mon Mar 6 02:03:08 2023 -0800): Update metaformers.py
commit ac1c6fea8adcd846e031ea0f5fa81ffe63d3c4bb (Fredo Guan, Mon Mar 6 02:01:04 2023 -0800): Update metaformers.py
commit 26f3d343cdc46183543f83482187f669f3181ddf (Merge: d577129 f736730; Fredo Guan, Mon Mar 6 01:57:29 2023 -0800): Merge branch 'metaformer_workspace' of https://github.com/fffffgggg54/pytorch-image-models into metaformer_workspace
commit d577129aaa23fb348a8bb93bcd17cf1d5a4e8ff8 (Fredo Guan, Mon Mar 6 01:57:20 2023 -0800): Update metaformers.py
commit f7367304e8f3b7a9a7f16e0a032bb72546afcc2a (Fredo Guan, Mon Mar 6 01:56:11 2023 -0800): Metaformer baselines for vision (#12)
* formatting, cleanup, fix dropout
* fix regression, pass kwargs
* fix poolformerv1 weights, formatting
* Update metaformers.py (repeated 13 times)
* some cleanup
* SDPA from ViT, fix imports
* Update metaformers.py
* fix head reset
* fast norm bias patch for metaformers
* Metaformer refactor, remove rand/ident models, fix issues, remove old poolformer
* Switch to hub weights
---------
Co-authored-by: Fredo Guan <fredo.guan@hotmail.com>
2023-05-05 11:18:26 -07:00
Ross Wightman
320bf9c469
Remove redundant types, kwargs back in own section (lesser of many evils?)
2023-05-01 14:21:48 -07:00
Ross Wightman
8fa86a28a8
Add datacomp L/14 (79.2 zs) image tower weights
2023-05-01 10:24:08 -07:00
Ross Wightman
5e64777804
0.8.21dev0
2023-04-28 13:46:59 -07:00
Ross Wightman
493c730ffc
Fix pit regression
2023-04-26 23:16:06 -07:00
Ross Wightman
437d344e03
Always some torchscript issues
2023-04-26 20:42:34 -07:00
Ross Wightman
528faa0e04
Some fixes
2023-04-26 17:46:20 -07:00
Ross Wightman
3386af8c86
Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub
2023-04-26 15:52:13 -07:00
Ross Wightman
7ad7ddb7ad
DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API
2023-04-21 16:56:44 -07:00
Ross Wightman
864bfd43d0
HardCoRe-NAS weights on HF hub
2023-04-21 14:35:10 -07:00
Ross Wightman
6e4529ae35
TResNet weights now on HF hub, modified to remove InplaceABN dependency
2023-04-21 14:20:48 -07:00
Ross Wightman
04dcbc02ec
Fix weight remap for tresnet_v2_l
2023-04-21 09:05:04 -07:00
Ross Wightman
a08e5aed1d
More models w/ multi-weight support, moving to HF hub. Removing inplace_abn from all models including TResNet
2023-04-20 22:44:49 -07:00
Ross Wightman
34df125be6
cait, volo, xvit hub weights
2023-04-14 10:13:13 -07:00
Ross Wightman
f6d5767551
cspnet models on HF hub w/ multi-weight support
2023-04-12 14:02:38 -07:00
Ross Wightman
80b247d843
Update swin_v2 attn_mask buffer change in #1790 to apply to updated checkpoints in hub
2023-04-11 14:40:32 -07:00
Ross Wightman
1a1aca0cee
Merge pull request #1761 from huggingface/patch_drop_refactor
...
Implement patch dropout for eva / vision_transformer, refactor dropout args
2023-04-11 14:37:36 -07:00
Ross Wightman
c0670822d2
Small factory handling fix for pretrained tag vs cfg
2023-04-11 07:42:13 -07:00
Ross Wightman
2f25f73b90
Missed a fused_attn update in relpos vit
2023-04-10 23:30:50 -07:00
Ross Wightman
0b65b5c0ac
Add finalized eva CLIP weights pointing to remapped timm hub models
2023-04-10 23:13:12 -07:00
Ross Wightman
965d0a2d36
fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights.
2023-04-10 12:04:33 -07:00
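The global enable/disable switch for fused attention that this commit describes can be sketched as a module-level flag seeded from an env var; names and the env var are illustrative, not timm's exact API:

```python
import os

# Hypothetical sketch: attention layers consult this module-level flag at
# init to decide whether to use the fused scaled_dot_product_attention path.
_USE_FUSED_ATTN = os.environ.get("TIMM_FUSED_ATTN", "1") not in ("0", "false")

def set_fused_attn(enable: bool = True) -> None:
    global _USE_FUSED_ATTN
    _USE_FUSED_ATTN = enable

def use_fused_attn() -> bool:
    return _USE_FUSED_ATTN
```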
Ross Wightman
4d135421a3
Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models
2023-04-07 20:27:23 -07:00
Marco Forte
c76818a592
skip attention mask buffers
...
Allows more flexibility in the resolutions accepted by SwinV2.
2023-04-07 18:50:02 +02:00
Ross Wightman
1bb3989b61
Improve kwarg passthrough for swin, vit, deit, beit, eva
2023-04-05 21:37:16 -07:00
Ross Wightman
35c94b836c
Update warning message for deprecated model names
2023-04-05 17:24:17 -07:00
Ross Wightman
9eaab795c2
Add some vit model deprecations
2023-04-05 17:21:03 -07:00
Ross Wightman
abff3f12ec
Wrong pool_size for 288 ft
2023-04-05 16:07:51 -07:00
Ross Wightman
356309959c
ResNet models on HF hub, multi-weight support, add torchvision v2 weights, new 12k pretrained and fine-tuned timm anti-aliased weights
2023-04-05 14:19:42 -07:00
Ross Wightman
beef7f0a22
Add ImageNet-12k intermediate fine-tunes of convnext base & large CLIP models, add first 1k fine-tune of xxlarge
2023-03-31 16:45:01 -07:00
Ross Wightman
9aa1133bd2
Fix #1750, uncomment weight that exists on HF hub, add FIXME to 3 others that are still on local storage
2023-03-31 14:49:30 -07:00
Ross Wightman
7326470514
Merge pull request #1746 from huggingface/eva02
...
Adding EVA02 weights and model defs
2023-03-31 12:17:00 -07:00
Ross Wightman
adeb9de7c6
Mismatch in eva pretrained_cfg vs model for one of the clip variants
2023-03-31 10:30:30 -07:00
Ross Wightman
0737bd3ec8
eva02 non-CLIP weights on HF hub, add initial eva02 clip model configs w/ postnorm variant & attn LN
2023-03-30 23:43:59 -07:00
Ross Wightman
ac67098147
Add final attr for fast_attn on beit / eva
2023-03-28 08:40:40 -07:00
Ross Wightman
1885bdc431
Merge pull request #1745 from huggingface/mw-mlp_mixer
...
MLP-Mixer multi-weight support, HF hub push
2023-03-28 07:55:17 -07:00
Ross Wightman
e9f427b953
Add hf hub entries for mlp_mixer
2023-03-27 22:50:43 -07:00
Ross Wightman
cff81deb78
multi-weight and hf hub for deit / deit3
2023-03-27 22:47:16 -07:00
Ross Wightman
3863d63516
Adding EVA02 weights and model defs, move beit based eva_giant to same eva.py file. Cleanup rotary pos, add lang oriented freq bands to be compat with eva design choice. Fix #1738
2023-03-27 17:16:07 -07:00
Ross Wightman
b12060996c
MLP-Mixer multi-weight support, hf hub push
2023-03-27 16:42:13 -07:00
Ross Wightman
d196fa536d
Fix last min torchscript regression in nfnet changes
2023-03-24 00:10:17 -07:00
Ross Wightman
33ada0cbca
Add group_matcher to focalnet for proper layer-wise LR decay
2023-03-23 23:21:49 -07:00
Ross Wightman
b271dc0e16
NFNet multi-weight support + HF hub push
2023-03-23 23:20:38 -07:00
Ross Wightman
dbd33e4b62
Update crop settings for new rexnet weights
2023-03-22 15:39:49 -07:00
Ross Wightman
da6bdd4560
Update resnetv2.py for multi-weight and HF hub weights
2023-03-22 15:38:04 -07:00
Ross Wightman
b3e816d6d7
Improve filtering behaviour for tag + non-tagged model wildcard consistency.
2023-03-22 10:21:22 -07:00
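The tag/non-tag wildcard consistency this commit describes can be sketched with `fnmatch`; the model list and function below are hypothetical, not timm's registry:

```python
import fnmatch

# Hypothetical sketch: a pattern matches either the full tagged name or the
# bare architecture name, so "resnet50" also surfaces its tagged variants.
_MODELS = [
    "resnet50",
    "resnet50.a1_in1k",
    "resnet50.tv_in1k",
    "convnext_tiny.fb_in22k",
]

def list_models(pattern: str) -> list:
    matched = set()
    for name in _MODELS:
        base = name.split(".")[0]
        if fnmatch.fnmatch(name, pattern) or fnmatch.fnmatch(base, pattern):
            matched.add(name)
    return sorted(matched)
```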
Ross Wightman
7aba64ebdb
Update byobnet.py w/ models pushed to HF hub
2023-03-22 10:00:00 -07:00
Ross Wightman
e7ef8335bf
regnet.py multi-weight conversion, new ImageNet-12k pretrain/ft from timm for y_120 and y_160, also new tv v2, swag, & seer weights for push to Hf hub.
2023-03-21 15:51:49 -07:00
Ross Wightman
c78319adce
Add ImageNet-12k ReXNet-R 200 & 300 weights, and push existing ReXNet models to HF hub. Dilation support added to rexnet
2023-03-20 13:48:17 -07:00
Ross Wightman
041de79f9e
Fix numel use in helpers for checkpoint remap
2023-03-20 09:36:48 -07:00
Ross Wightman
49b9c3be80
Include pretrained tag in deprecated mapping warning
2023-03-19 21:21:19 -07:00
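The deprecated-name mapping with the pretrained tag carried through into the warning, as this commit describes, can be sketched like so; the mapping contents are illustrative, not timm's actual deprecation table:

```python
import warnings

# Hypothetical sketch: deprecated base names map to current ones, and any
# pretrained tag on the requested name is preserved in the result and warning.
_DEPRECATED = {"vit_deit_base_patch16_224": "deit_base_patch16_224"}

def resolve_model_name(name: str) -> str:
    base, _, tag = name.partition(".")
    new_base = _DEPRECATED.get(base)
    if new_base is None:
        return name
    new_name = f"{new_base}.{tag}" if tag else new_base
    warnings.warn(
        f"Model name {name} is deprecated, mapping to {new_name}.",
        DeprecationWarning,
    )
    return new_name
```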
Ross Wightman
572f05096a
Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks.
2023-03-18 14:55:09 -07:00
Ross Wightman
5aebad3fbc
return_map back to out_map for _feature helpers
2023-03-16 14:50:55 -07:00
Ross Wightman
acfd85ad68
All swin models support spatial output, add output_fmt to v1/v2 and use ClassifierHead.
...
* update ClassifierHead to allow different input format
* add output format support to patch embed
* fix some flatten issues for a few conv head models
* add Format enum and helpers for tensor format (layout) choices
2023-03-15 23:21:51 -07:00
Ross Wightman
c30a160d3e
Merge remote-tracking branch 'origin/main' into focalnet_and_swin_refactor
2023-03-15 15:58:39 -07:00
Ross Wightman
ad94d737b7
Add support to ConvNextBlock for downsample and channel expansion to improve stand alone use. Fix #1699
2023-03-13 14:06:24 -07:00
Piotr Sebastian Kluska
992bf7c3d4
chore: Modify the MobileVitV2Block to be coreml exportable
...
based on is_exportable() set variable controlling behaviour of the block
CoreMLTools support im2col from 6.2 version, unfortunately col2im
is still not supported.
Tested with exporting to ONNX, Torchscript, CoreML, and TVM.
2023-03-03 09:38:24 +01:00
Ross Wightman
4b8cfa6c0a
Add convnext_xxlarge CLIP image tower weights, version 0.8.15dev0
2023-02-26 21:51:48 -08:00
Ross Wightman
1c13ef7b46
Add default norm_eps=1e-5 for convnext_xxlarge, improve kwarg merging for all convnext models
2023-02-26 12:11:49 -08:00
Benjamin Bossan
a5b01ec04e
Add type annotations to _registry.py
...
Description
Add type annotations to _registry.py so that they will pass mypy --strict.
Comment
I was reading the code and felt that this module would be easier to understand with type annotations. Therefore, I went ahead and added the annotations.
The idea with this PR is to start small to see if we can align on _how_ to annotate types. I've seen people in the past disagree on how strictly to annotate the code base, so before spending too much time on this, I wanted to check if you agree, Ross.
Most of the added types should be straightforward. Some notes on the non-trivial changes:
- I made no assumption about the fn passed to register_model, but maybe the type could be stricter. Are all models nn.Modules?
- If I'm not mistaken, the type hint for get_arch_name was incorrect
- I had to add a # type: ignore to model.__all__ = ...
- I made some minor code changes to list_models to facilitate the typing. I think the changes should not affect the logic of the function.
- I removed list from list(sorted(...)) because sorted always returns a list.
2023-02-22 09:19:30 -08:00
Ross Wightman
4d9c3ae2fb
Add laion2b 320x320 ConvNeXt-Large CLIP weights
2023-02-18 16:34:03 -08:00
Ross Wightman
d0b45c9b4d
Make safetensors import optional for now. Improve avg/clean checkpoint ext handling a bit (more consistent).
2023-02-18 16:06:42 -08:00
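The optional-import pattern this commit describes can be sketched as below; `select_loader` is a hypothetical helper, though `safetensors.torch.load_file` is the real loading entrypoint in the safetensors library:

```python
# Hypothetical sketch: safetensors is imported if available, and only a
# request for a .safetensors checkpoint without the library installed errors.
try:
    import safetensors.torch  # noqa: F401
    _HAS_SAFETENSORS = True
except ImportError:
    _HAS_SAFETENSORS = False

def select_loader(path: str, has_safetensors: bool = _HAS_SAFETENSORS) -> str:
    if path.endswith(".safetensors"):
        if not has_safetensors:
            raise RuntimeError(
                "loading .safetensors checkpoints requires `pip install safetensors`"
            )
        return "safetensors.torch.load_file"
    return "torch.load"
```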
Ross Wightman
947c1d757a
Merge branch 'main' into focalnet_and_swin_refactor
2023-02-17 16:28:52 -08:00
Ross Wightman
cf324ea38f
Fix grad checkpointing in focalnet
2023-02-17 16:26:26 -08:00
Ross Wightman
848d200767
Overhaul FocalNet implementation
2023-02-17 16:24:59 -08:00
Ross Wightman
7266c5c716
Merge branch 'main' into focalnet_and_swin_refactor
2023-02-17 09:20:14 -08:00
Ross Wightman
7d9e321b76
Improve tracing of window attn models with simpler reshape logic
2023-02-17 07:59:06 -08:00
Ross Wightman
2e38d53dca
Remove dead line
2023-02-16 16:57:42 -08:00
Ross Wightman
f77c04ff36
Torchscript fixes/hacks for rms_norm, refactor ParallelScalingBlock with manual combination of input projections, closer paper match
2023-02-16 16:57:42 -08:00
Ross Wightman
122621daef
Add Final annotation to attn_fas to avoid symbol lookup of new scaled_dot_product_attn fn on old PyTorch in jit
2023-02-16 16:57:42 -08:00
Ross Wightman
621e1b2182
Add ideas from 'Scaling ViT to 22-B Params', testing PyTorch 2.0 fused F.scaled_dot_product_attention impl in vit, vit_relpos, maxxvit / coatnet.
2023-02-16 16:57:42 -08:00
testbot
a09d403c24
changed warning to info
2023-02-16 16:20:31 -08:00
testbot
8470e29541
Add support to load safetensors weights
2023-02-16 16:20:31 -08:00
Ross Wightman
624266148d
Remove unused imports from _hub helpers
2023-02-09 17:47:26 -08:00
Ross Wightman
2cfff0581b
Add grad_checkpointing support to features_only, test in EfficientDet.
2023-02-09 17:45:40 -08:00
Ross Wightman
9c14654a0d
Improve support for custom dataset label name/description through HF hub export, via pretrained_cfg
2023-02-08 08:29:20 -08:00
Ross Wightman
0d33127df2
Add 384x384 convnext_large_mlp laion2b fine-tune on in1k
2023-02-06 22:01:04 -08:00
Ross Wightman
7a0bd095cb
Update model prune loader to use pkgutil
2023-02-06 17:45:16 -08:00
Ross Wightman
13acac8c5e
Update head metadata for effformerv2
2023-02-04 23:11:51 -08:00
Ross Wightman
8682528096
Add first conv metadata for efficientformer_v2
2023-02-04 23:02:02 -08:00
Ross Wightman
72fba669a8
is_scripting() guard on checkpoint_seq
2023-02-04 14:21:49 -08:00
Ross Wightman
95ec255f7f
Finish timm mode api for efficientformer_v2, add grad checkpointing support to both efficientformers
2023-02-03 21:21:23 -08:00
Ross Wightman
9d03c6f526
Merge remote-tracking branch 'origin/main' into levit_efficientformer_redux
2023-02-03 14:47:01 -08:00
Ross Wightman
086bd55a94
Add EfficientFormer-V2, refactor EfficientFormer and Levit for more uniformity across the 3 related arch. Add features_out support to levit conv models and efficientformer_v2. All weights on hub.
2023-02-03 14:12:29 -08:00
Ross Wightman
2cb2699dc8
Apply fix from #1649 to main
2023-02-03 11:28:57 -08:00
Ross Wightman
b3042081b4
Add laion -> in1k fine-tuned base and large_mlp weights for convnext
2023-02-03 10:58:02 -08:00
Ross Wightman
316bdf8955
Add mlp head support for convnext_large, add laion2b CLIP weights, prep fine-tuned weight tags
2023-02-01 08:27:02 -08:00
Ross Wightman
6f28b562c6
Factor NormMlpClassifierHead from MaxxViT and use across MaxxViT / ConvNeXt / DaViT, refactor some type hints & comments
2023-01-27 14:57:01 -08:00
Ross Wightman
9a53c3f727
Finalize DaViT, some formatting and modelling simplifications (separate PatchEmbed into Stem + Downsample), weights on HF hub.
2023-01-27 13:54:04 -08:00
Fredo Guan
fb717056da
Merge remote-tracking branch 'upstream/main'
2023-01-26 10:49:15 -08:00
nateraw
14b84e8895
📝 update docstrings
2023-01-26 00:49:44 -05:00
nateraw
f0dc8a8267
📝 update docstrings for create_model
2023-01-25 21:10:41 -05:00
Ross Wightman
64667bfa0e
Add 'gigantic' vit clip variant for feature extraction and future fine-tuning
2023-01-25 18:02:10 -08:00
Ross Wightman
36989cfae4
Factor out readme generation in hub helper, add more readme fields
2023-01-20 14:49:40 -08:00
Ross Wightman
32f252381d
Change order of checkpoint filtering fn application in builder, try dict, model variant first
2023-01-20 14:48:54 -08:00
Ross Wightman
bed350f5e5
Push all MaxxViT weights to HF hub, cleanup impl, add feature map extraction support and promote to 'std' architecture. Fix norm head for proper embedding / feat map output. Add new in12k + ft 1k weights.
2023-01-20 14:45:25 -08:00
Ross Wightman
ca38e1e73f
Update ClassifierHead module, add reset() method, update in_chs -> in_features for consistency
2023-01-20 14:44:05 -08:00
Ross Wightman
8ab573cd26
Add convnext_tiny and convnext_small 384x384 fine-tunes of in12k weights, fix pool size for laion CLIP convnext weights
2023-01-20 14:40:16 -08:00
Fredo Guan
81ca323751
Davit update formatting and fix grad checkpointing (#7)
...
Fixed head to gap->norm->fc as per ConvNeXt, along with option for norm->gap->fc.
Failed tests were due to CLIP ConvNeXt models; DaViT tests passed.
2023-01-15 14:34:56 -08:00
Ross Wightman
e9aac412de
Correct mean/std for CLIP convnexts
2023-01-14 22:53:56 -08:00
Ross Wightman
42bd8f7bcb
Add convnext_base CLIP image tower weights for fine-tuning / features
2023-01-14 21:16:29 -08:00
Ross Wightman
a2c14c2064
Add tiny/small in12k pretrained and fine-tuned ConvNeXt models
2023-01-11 14:50:39 -08:00
Ross Wightman
01fdf44438
Initial focalnet import, more refactoring needed for timm.
2023-01-09 16:18:19 -08:00
Ross Wightman
2e83bba142
Revert head norm changes to ConvNeXt as it broke some downstream use, alternate workaround for fcmae weights
2023-01-09 13:37:40 -08:00
Ross Wightman
1825b5e314
maxxvit type
2023-01-09 08:57:31 -08:00
Ross Wightman
5078b28f8a
More kwarg handling tweaks, maxvit_base_rw def added
2023-01-09 08:57:31 -08:00
Ross Wightman
c0d7388a1b
Improving kwarg merging in more models
2023-01-09 08:57:31 -08:00
Ross Wightman
60ebb6cefa
Re-order vit pretrained entries for more sensible default weights (no .tag specified)
2023-01-06 16:12:33 -08:00
Ross Wightman
e861b74cf8
Pass through --model-kwargs (and --opt-kwargs for train) from command line through to model __init__. Update some models to improve arg overlay. Cleanup along the way.
2023-01-06 16:12:33 -08:00
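The `--model-kwargs` pass-through this commit describes can be sketched with `argparse` plus `ast.literal_eval` coercion; the exact coercion rules and helper name are an assumption for illustration:

```python
import argparse
import ast

# Hypothetical sketch: KEY=VALUE pairs from the CLI are coerced to Python
# literals where possible and forwarded as kwargs to a model __init__.
def parse_model_kwargs(pairs: list) -> dict:
    out = {}
    for pair in pairs:
        key, _, value = pair.partition("=")
        try:
            out[key] = ast.literal_eval(value)  # ints, floats, bools, tuples...
        except (ValueError, SyntaxError):
            out[key] = value  # bare words stay strings, e.g. act_layer=gelu
    return out

parser = argparse.ArgumentParser()
parser.add_argument("--model-kwargs", nargs="*", default=[], metavar="KEY=VALUE")
```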
Ross Wightman
add3fb864e
Working on improved model card template for push_to_hf_hub
2023-01-06 16:12:33 -08:00
Ross Wightman
6e5553da5f
Add ConvNeXt-V2 support (model additions and weights) (#1614)
...
* Add ConvNeXt-V2 support (model additions and weights)
* ConvNeXt-V2 weights on HF Hub, tweaking some tests
* Update README, fixing convnextv2 tests
2023-01-05 07:53:32 -08:00
Ross Wightman
6902c48a5f
Fix ResNet based models to work w/ norm layers w/o affine params. Reformat long arg lists into vertical form.
2022-12-29 16:32:26 -08:00
Ross Wightman
8ece53e194
Switch BEiT to HF hub weights
2022-12-22 21:43:04 -08:00
Ross Wightman
9a51e4ea2e
Add FlexiViT models and weights, refactoring, push more weights
...
* push all vision_transformer*.py weights to HF hub
* finalize more pretrained tags for pushed weights
* refactor pos_embed files and module locations, move some pos embed modules to layers
* tweak hf hub helpers to aid bulk uploading and updating
2022-12-22 17:23:09 -08:00
Fredo Guan
10b3f696b4
Davit std (#6)
...
Separate patch_embed module
2022-12-16 21:50:28 -08:00
Ross Wightman
656e1776de
Convert mobilenetv3 to multi-weight, tweak PretrainedCfg metadata
2022-12-16 09:29:13 -08:00
Ross Wightman
6a01101905
Update efficientnet.py and convnext.py to multi-weight, add ImageNet-12k pretrained EfficientNet-B5 and ConvNeXt-Nano.
2022-12-14 20:33:23 -08:00
Fredo Guan
84178fca60
Merge branch 'rwightman:main' into main
2022-12-12 23:13:58 -08:00
Fredo Guan
c43340ddd4
Davit std (#5)
...
* Update davit.py
* Update test_models.py
* Update davit.py (repeated 116 times)
* starting point
* Update davit.py (repeated 51 times)
* Update test_models.py
* Update davit.py (repeated 29 times)
* Davit revised (#4)
* Update davit.py (repeated 45 times)
clean up
* Update test_models.py
* Update davit.py (repeated 21 times)
* Update test_models.py
* Update davit.py
2022-12-11 03:03:22 -08:00
Ross Wightman
d5e7d6b27e
Merge remote-tracking branch 'origin/main' into refactor-imports
2022-12-09 14:49:44 -08:00
Ross Wightman
cda39b35bd
Add a deprecation phase to module re-org
2022-12-09 14:39:45 -08:00
Fredo Guan
edea013dd1
Davit std (#3)
...
Davit with all features working
2022-12-09 02:53:21 -08:00
Ross Wightman
7c4ed4d5a4
Add EVA-large models
2022-12-08 16:21:30 -08:00
Fredo Guan
434a03937d
Merge branch 'rwightman:main' into main
2022-12-08 08:05:16 -08:00
Ross Wightman
98047ef5e3
Add EVA FT results, hopefully fix BEiT test failures
2022-12-07 08:54:06 -08:00
Ross Wightman
3cc4d7a894
Fix missing register for 224 eva model
2022-12-07 08:54:06 -08:00
Ross Wightman
eba07b0de7
Add eva models to beit.py
2022-12-07 08:54:06 -08:00
Fredo Guan
3bd96609c8
Davit (#1)
...
Implement the davit model from https://arxiv.org/abs/2204.03645 and https://github.com/dingmyu/davit
2022-12-06 17:19:25 -08:00
Ross Wightman
927f031293
Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models
2022-12-06 15:00:06 -08:00
Ross Wightman
3785c234d7
Remove clip vit models that won't be ft and comment two that aren't uploaded yet
2022-12-05 10:21:34 -08:00
Ross Wightman
755570e2d6
Rename _pretrained.py -> pretrained.py, not feasible to change the other files to same scheme without breaking uses
2022-12-05 10:21:34 -08:00
Ross Wightman
72cfa57761
Add ported Tensorflow MaxVit weights. Add a few more CLIP ViT fine-tunes. Tweak some model tag names. Improve model tag name sorting. Update HF hub push config layout.
2022-12-05 10:21:34 -08:00
Ross Wightman
4d5c395160
MaxVit, ViT, ConvNeXt, and EfficientNet-v2 updates
...
* Add support for TF weights and modelling specifics to MaxVit (testing ported weights)
* More fine-tuned CLIP ViT configs
* ConvNeXt and MaxVit updated to new pretrained cfgs use
* EfficientNetV2, MaxVit and ConvNeXt high res models use squash crop/resize
2022-12-05 10:21:34 -08:00
Ross Wightman
9da7e3a799
Add crop_mode for pretraind config / image transforms. Add support for dynamo compilation to benchmark/train/validate
2022-12-05 10:21:34 -08:00
Ross Wightman
b2b6285af7
Add two more FT clip weights
2022-12-05 10:21:34 -08:00
Ross Wightman
5895056dc4
Add openai b32 ft
2022-12-05 10:21:34 -08:00
Ross Wightman
9dea5143d5
Adding more clip ft variants
2022-12-05 10:21:34 -08:00