Ross Wightman
c28324a150
Update efficient_vit (msra), hf hub weights
2023-08-18 16:45:37 -07:00
Ross Wightman
e700a32626
Cleanup of efficient_vit (mit), tweak eps for better AMP behaviour, formatting/cleanup, weights on hf hub
2023-08-18 16:06:07 -07:00
方曦
00f670fa69
fix bug in ci for efficientvits
2023-08-17 14:40:17 +08:00
Chengpeng Chen
e7f97cb5ce
Fix typos in RepGhost models
2023-08-16 14:27:45 +08:00
Chengpeng Chen
d1d0193615
Add RepGhost models and weights
2023-08-16 11:54:53 +08:00
Minseo Kang
7938f28542
Fix typo in efficientformer_v2
2023-08-16 03:29:01 +09:00
yehuitang
b407794e3a
Add GhostNetV2
2023-08-13 18:20:27 +08:00
yehuitang
fc865282e5
Add ghostnetv2.py
2023-08-13 18:16:26 +08:00
Ross Wightman
da75cdd212
Merge pull request #1900 from huggingface/swin_maxvit_resize
...
Add support for resizing swin transformer, maxvit, coatnet at creation time
2023-08-11 15:05:28 -07:00
Ross Wightman
78a04a0e7d
Merge pull request #1911 from dsuess/1910-fixes-batchnormact-fx
...
Register norm_act layers as leaf modules
2023-08-11 14:34:16 -07:00
Yonghye Kwon
2048f6f20f
set self.num_features to neck_chans if neck_chans > 0
2023-08-11 13:45:06 +09:00
Ross Wightman
3a44e6c602
Fix #1912 CoaT model not loading w/ return_interm_layers
2023-08-10 11:15:58 -07:00
Daniel Suess
986de90360
Register norm_act layers as leaf modules
2023-08-10 15:37:26 +10:00
Ross Wightman
c692715388
Some RepVit tweaks
...
* add head dropout to RepVit as all models have that arg
* default train to non-distilled head output via distilled_training flag (set_distilled_training) so fine-tune works by default w/o distillation script
* camel case naming tweaks to match other models
2023-08-09 12:41:12 -07:00
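The distilled-head behaviour described above can be sketched as a toy class (a hypothetical stand-in, not timm's actual RepViT head): training defaults to the plain classifier output unless `set_distilled_training` is called, so ordinary fine-tuning works without a distillation script, while eval averages both heads.

```python
class RepVitHeadSketch:
    """Hypothetical sketch of the distilled_training switch; scalars stand in
    for tensors. Not timm's real class."""

    def __init__(self):
        self.training = True
        self.distilled_training = False

    def set_distilled_training(self, enable: bool = True):
        self.distilled_training = enable

    def forward(self, cls_out: float, dist_out: float) -> float:
        if self.training and not self.distilled_training:
            return cls_out  # default train path: non-distilled head only
        return (cls_out + dist_out) / 2  # eval / distillation: average heads
```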
Ross Wightman
c153cd4a3e
Add more advanced interpolation method from BEiT and support non-square window & image size adaptation for
...
* beit/beit-v2
* maxxvit/coatnet
* swin transformer
And non-square windows for swin-v2
2023-08-08 16:41:16 -07:00
alec.tu
bb2b6b5f09
fix num_classes not found
2023-08-07 15:16:03 +08:00
Ross Wightman
1dab536cb1
Fix torch.fx for swin padding change
2023-08-05 13:09:55 -07:00
Ross Wightman
7c0f492dbb
Fix type annotation for torchscript
2023-08-04 23:03:52 -07:00
Ross Wightman
7790ea709b
Add support for resizing swin transformer img_size and window_size on init and load from pretrained weights. Add support for non-square window_size to both swin v1/v2
2023-08-04 22:10:46 -07:00
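One piece of the size-adaptation machinery above is padding the feature map so it divides evenly into (possibly non-square) attention windows. A minimal sketch of that arithmetic (a hypothetical helper, not timm's internal code):

```python
def pad_to_window(h: int, w: int, win_h: int, win_w: int):
    """Bottom/right padding needed so an (h, w) feature map tiles exactly
    into (win_h, win_w) windows; win_h != win_w covers non-square windows."""
    pad_h = (win_h - h % win_h) % win_h
    pad_w = (win_w - w % win_w) % win_w
    return pad_h, pad_w
```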
Ross Wightman
81089b10a2
Remove unnecessary LongTensor in EfficientFormer. Possibly fix #1878
2023-08-03 16:38:53 -07:00
Ross Wightman
d138a9bf88
Add gluon hrnet small weights, fix #1895
2023-08-03 12:15:04 -07:00
Ross Wightman
150356c493
Fix unfortunate selecsls case bug caused by aggressive IDE rename
2023-08-03 10:37:06 -07:00
Ross Wightman
6e8c53d0d3
Comment out beit url, no longer valid as it now requires a long query string; left for reference, must use HF hub now.
2023-08-03 10:00:46 -07:00
方曦
a56e2bbf19
fix efficientvit_msra pretrained load
2023-08-03 18:44:38 +08:00
方曦
e94c60b546
efficientvit_msra refactor
2023-08-03 17:45:50 +08:00
方曦
047bab6ab2
efficientvit_mit stage refactor
2023-08-03 14:59:35 +08:00
方曦
e8fb866ccf
fix efficientvit_msra pool
2023-08-02 14:40:01 +08:00
方曦
43443f64eb
fix efficientvits
2023-08-02 14:12:37 +08:00
方曦
82d1e99e1a
add efficientvit(msra)
2023-08-01 18:51:08 +08:00
方曦
b91a77fab7
add EfficientVit (MIT)
2023-08-01 12:42:21 +08:00
Ross Wightman
a9d0615f42
Fix ijepa vit issue with 448 model, minor formatting fixes
2023-07-26 20:46:27 -07:00
Ross Wightman
5874d1bfc7
Merge pull request #1876 from jameslahm/main
...
Add RepViT models
2023-07-26 14:38:41 -07:00
Ross Wightman
b10310cc27
Add proper pool size for new resnexts
2023-07-26 14:36:03 -07:00
Ross Wightman
b71d60cdb7
Two small fixes, num_classes in base class, add model tag
2023-07-26 13:18:49 -07:00
Ross Wightman
3561f8e885
Add seresnextaa201d_32x8d 12k and 1k weights
2023-07-26 13:17:05 -07:00
jameslahm
3318e7614d
Add RepViT models
2023-07-21 14:56:53 +08:00
Ruslan Baikulov
158bf129c4
Replace deprecated NumPy aliases of builtin types
2023-07-03 22:24:25 +03:00
Ross Wightman
c241081251
Merge pull request #1850 from huggingface/effnet_improve_features_only
...
Support other features only modes for EfficientNet. Fix #1848 fix #1849
2023-06-23 22:56:08 -07:00
Ross Wightman
47517dbefd
Clean more feature extract issues
...
* EfficientNet/MobileNetV3/HRNetFeatures cls and FX mode support -ve index
* MobileNetV3 allows feature_cfg mode to bypass MobileNetV3Features
2023-06-14 14:46:22 -07:00
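Negative `out_indices` support boils down to mapping indices like `-1` (last stage) onto absolute stage ids. A small sketch of that normalization (hypothetical helper name, assuming the resolution described in the commit):

```python
def normalize_out_indices(out_indices, num_stages: int):
    """Map negative feature-stage indices (e.g. -1 for the last stage)
    to absolute stage ids, as a features_only extractor might."""
    return tuple(i if i >= 0 else num_stages + i for i in out_indices)
```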
Ross Wightman
a09c88ed0f
Support other features only modes for EfficientNet
2023-06-14 12:57:39 -07:00
SeeFun
c3f24a5ae5
Add ViT weight from I-JEPA pretrain
2023-06-14 22:30:31 +08:00
Lengyue
c308dbc6f2
update dinov2 layerscale init values
2023-05-24 12:20:17 -04:00
Ross Wightman
e9373b1b92
Cleanup before samvit merge. Resize abs posembed on the fly, undo some line-wraps, remove redundant unbind, fix HF hub weight load
2023-05-18 16:43:48 -07:00
方曦
c1c6eeb909
fix loading pretrained weight for samvit
2023-05-18 08:49:29 +08:00
方曦
15de561f2c
fix unit test for samvit
2023-05-17 12:51:12 +08:00
方曦
ea1f52df3e
add ViT for Segment-Anything Model
2023-05-17 11:39:29 +08:00
Ross Wightman
c5d3ee47f3
Add B/16 datacompxl CLIP weights
2023-05-16 11:27:20 -07:00
Philip Keller
fc77e9ecc5
Update hub.py
...
fixed import of _hub modules
2023-05-12 21:48:46 +02:00
Ross Wightman
f744bda994
use torch.jit.Final instead of Final for beit, eva
2023-05-12 09:12:14 -07:00
Ross Wightman
2e99bcaedd
Update README, prep for version 0.9.0 release
2023-05-11 15:22:50 -07:00
Ross Wightman
3eaf729f3f
F.sdpa for visformer fails w/o contiguous on qkv, make experimental
2023-05-11 11:37:37 -07:00
Ross Wightman
cf1884bfeb
Add 21k maxvit tf weights
2023-05-10 18:23:32 -07:00
Ross Wightman
6c2edf4d74
Missed hub_id entries for byoanet models
2023-05-10 15:58:55 -07:00
Ross Wightman
850ab4931f
Missed a few pretrained tags...
2023-05-10 12:16:30 -07:00
Ross Wightman
ff2464e2a0
Throw when pretrained weights not available and pretrained=True (principle of least surprise).
2023-05-10 10:44:34 -07:00
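The "least surprise" rule above can be illustrated with a minimal sketch (hypothetical helper and registry, not timm's real builder): requesting `pretrained=True` for a model with no registered weights raises instead of silently returning random init.

```python
def resolve_pretrained_url(model_name: str, pretrained: bool, registry: dict) -> str:
    """Return the weight URL for model_name; raise if pretrained weights
    were requested but none exist."""
    url = registry.get(model_name, '')
    if pretrained and not url:
        raise RuntimeError(
            f'No pretrained weights exist for {model_name}; use pretrained=False')
    return url
```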
Ross Wightman
e0ec0f7252
Merge pull request #1643 from nateraw/docstrings-update
...
Update Docstring for create_model
2023-05-09 21:33:20 -07:00
Ross Wightman
627b6315ba
Add typing to dinov2 entrypt fns, use hf hub for mae & dinov2 weights
2023-05-09 20:42:11 -07:00
Ross Wightman
960a882510
Remove label offsets and remove old weight url for 1001 class (background + in1k) TF origin weights
2023-05-09 18:00:41 -07:00
Ross Wightman
a01d8f86f4
Tweak DinoV2 add, add MAE ViT weights, add initial intermediate layer getter experiment
2023-05-09 17:59:22 -07:00
Ross Wightman
59bea4c306
Merge branch 'main' into dot_nine_cleanup
2023-05-09 12:27:32 -07:00
Leng Yue
5cc87e6485
Add dinov2 pretrained models (#1797)
...
* add dinov2 small, base, and large
* fix input size
* fix swiglu & dinov2 vit giant
* use SwiGLUPacked to replace GluMlp
* clean up & add ffn_layer placeholder for ParallelScalingBlock
2023-05-09 12:24:47 -07:00
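The SwiGLU-packed idea mentioned in the bullets, where one projection output is split into gate and value halves, can be sketched on plain floats (a toy illustration, not timm's SwiGLUPacked module):

```python
import math

def silu(x: float) -> float:
    return x / (1.0 + math.exp(-x))

def swiglu_packed(packed):
    """Split one packed projection output into gate/value halves and
    combine them SwiGLU-style: silu(gate) * value."""
    half = len(packed) // 2
    gate, value = packed[:half], packed[half:]
    return [silu(g) * v for g, v in zip(gate, value)]
```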
Ross Wightman
21e57c0b9e
Add missing beitv2 in1k -> in1k models
2023-05-08 17:03:51 -07:00
Ross Wightman
8c6fccb879
Allow passing state_dict directly via pretrained cfg mechanism as an override
2023-05-08 15:15:44 -07:00
Ross Wightman
e4e43190ce
Add typing to all model entrypoint fns, add old cache check env var to builder
2023-05-08 08:52:38 -07:00
Ross Wightman
cb3f9c23bb
Metaformer baselines for vision (final PR with cleanup) ( #1793 )
...
* update
* merge with poolformer, initial version
* rename model
* Stem/Downsample rework
* try NHWC
* channels first for whole network
* Use buffer for randformer
* Remove einsum
* don't test randformer for feature extraction
* arbitrary input sizes for randformer
* formatting, cleanup, fix dropout
* fix regression, pass kwargs
* fix poolformerv1 weights, formatting
* some cleanup
* SDPA from ViT, fix imports
* fix head reset
* fast norm bias patch for metaformers
* Metaformer refactor, remove rand/ident models, fix issues, remove old poolformer
* Switch to hub weights
---------
Co-authored-by: Fredo Guan <fredo.guan@hotmail.com>
2023-05-05 11:18:26 -07:00
Ross Wightman
320bf9c469
Remove redundant types, kwargs back in own section (lesser of many evils?)
2023-05-01 14:21:48 -07:00
Ross Wightman
8fa86a28a8
Add datacomp L/14 (79.2 zs) image tower weights
2023-05-01 10:24:08 -07:00
Ross Wightman
5e64777804
0.8.21dev0
2023-04-28 13:46:59 -07:00
Ross Wightman
493c730ffc
Fix pit regression
2023-04-26 23:16:06 -07:00
Ross Wightman
437d344e03
Always some torchscript issues
2023-04-26 20:42:34 -07:00
Ross Wightman
528faa0e04
Some fixes
2023-04-26 17:46:20 -07:00
Ross Wightman
3386af8c86
Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub
2023-04-26 15:52:13 -07:00
Ross Wightman
7ad7ddb7ad
DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API
2023-04-21 16:56:44 -07:00
Ross Wightman
864bfd43d0
HardCoRe-NAS weights on hf hub
2023-04-21 14:35:10 -07:00
Ross Wightman
6e4529ae35
TResNet weights now on HF hub, modified to remove InplaceABN dependency
2023-04-21 14:20:48 -07:00
Ross Wightman
04dcbc02ec
Fix weight remap for tresnet_v2_l
2023-04-21 09:05:04 -07:00
Ross Wightman
a08e5aed1d
More models w/ multi-weight support, moving to HF hub. Removing inplace_abn from all models including TResNet
2023-04-20 22:44:49 -07:00
Ross Wightman
34df125be6
cait, volo, xvit hub weights
2023-04-14 10:13:13 -07:00
Ross Wightman
f6d5767551
cspnet models on HF hub w/ multi-weight support
2023-04-12 14:02:38 -07:00
Ross Wightman
80b247d843
Update swin_v2 attn_mask buffer change in #1790 to apply to updated checkpoints in hub
2023-04-11 14:40:32 -07:00
Ross Wightman
1a1aca0cee
Merge pull request #1761 from huggingface/patch_drop_refactor
...
Implement patch dropout for eva / vision_transformer, refactor dropout args
2023-04-11 14:37:36 -07:00
Ross Wightman
c0670822d2
Small factory handling fix for pretrained tag vs cfg
2023-04-11 07:42:13 -07:00
Ross Wightman
2f25f73b90
Missed a fused_attn update in relpos vit
2023-04-10 23:30:50 -07:00
Ross Wightman
0b65b5c0ac
Add finalized eva CLIP weights pointing to remapped timm hub models
2023-04-10 23:13:12 -07:00
Ross Wightman
965d0a2d36
fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights.
2023-04-10 12:04:33 -07:00
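A global enable/disable switch like the one described can be sketched as a module-level flag with accessor functions (names resemble timm's `use_fused_attn` but this is an assumed simplification, not its real implementation, which also consults env vars and PyTorch version):

```python
_FUSED_ATTN_ENABLED = True  # module-level default: fused attention on

def set_fused_attn(enable: bool = True) -> None:
    """Toggle the sketched global fused-attention switch."""
    global _FUSED_ATTN_ENABLED
    _FUSED_ATTN_ENABLED = enable

def use_fused_attn() -> bool:
    """Query the switch; models would branch to F.sdpa when True."""
    return _FUSED_ATTN_ENABLED
```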
Ross Wightman
4d135421a3
Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models
2023-04-07 20:27:23 -07:00
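Patch dropout amounts to keeping a random subset of patch tokens during training. A list-based sketch of the mechanism (a hypothetical stand-in for the tensor version, assuming uniform sampling without replacement):

```python
import random

def patch_dropout(tokens, drop_rate: float, training: bool = True):
    """Keep a random (1 - drop_rate) fraction of patch tokens while
    training; pass tokens through unchanged otherwise."""
    if not training or drop_rate <= 0.0:
        return tokens
    keep = max(1, int(len(tokens) * (1.0 - drop_rate)))
    idx = sorted(random.sample(range(len(tokens)), keep))  # preserve order
    return [tokens[i] for i in idx]
```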
Marco Forte
c76818a592
skip attention mask buffers
...
Allows more flexibility in the resolutions accepted by SwinV2.
2023-04-07 18:50:02 +02:00
Ross Wightman
1bb3989b61
Improve kwarg passthrough for swin, vit, deit, beit, eva
2023-04-05 21:37:16 -07:00
Ross Wightman
35c94b836c
Update warning message for deprecated model names
2023-04-05 17:24:17 -07:00
Ross Wightman
9eaab795c2
Add some vit model deprecations
2023-04-05 17:21:03 -07:00
Ross Wightman
abff3f12ec
Wrong pool_size for 288 ft
2023-04-05 16:07:51 -07:00
Ross Wightman
356309959c
ResNet models on HF hub, multi-weight support, add torchvision v2 weights, new 12k pretrained and fine-tuned timm anti-aliased weights
2023-04-05 14:19:42 -07:00
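Multi-weight support hangs on pretrained tags of the form `model.tag` (e.g. `resnet50.a1_in1k`). Splitting a name into model and tag is roughly this (a sketch of the convention, not timm's exact helper):

```python
def split_model_tag(model_name: str):
    """'resnet50.a1_in1k' -> ('resnet50', 'a1_in1k'); untagged names
    return an empty tag."""
    model, _, tag = model_name.partition('.')
    return model, tag
```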
Ross Wightman
beef7f0a22
Add ImageNet-12k intermediate fine-tunes of convnext base & large CLIP models, add first 1k fine-tune of xxlarge
2023-03-31 16:45:01 -07:00
Ross Wightman
9aa1133bd2
Fix #1750, uncomment weight that exists on HF hub, add FIXME to 3 others that are still on local storage
2023-03-31 14:49:30 -07:00
Ross Wightman
7326470514
Merge pull request #1746 from huggingface/eva02
...
Adding EVA02 weights and model defs
2023-03-31 12:17:00 -07:00
Ross Wightman
adeb9de7c6
Mismatch in eva pretrained_cfg vs model for one of the clip variants
2023-03-31 10:30:30 -07:00
Ross Wightman
0737bd3ec8
eva02 non-CLIP weights on HF hub, add initial eva02 clip model configs w/ postnorm variant & attn LN
2023-03-30 23:43:59 -07:00
Ross Wightman
ac67098147
Add final attr for fast_attn on beit / eva
2023-03-28 08:40:40 -07:00
Ross Wightman
1885bdc431
Merge pull request #1745 from huggingface/mw-mlp_mixer
...
MLP-Mixer multi-weight support, HF hub push
2023-03-28 07:55:17 -07:00
Ross Wightman
e9f427b953
Add hf hub entries for mlp_mixer
2023-03-27 22:50:43 -07:00