Ross Wightman
70176a2dae
torchscript typing fixes
2024-05-23 11:43:05 -07:00
Ross Wightman
2a1a6b1236
Adding missing attention2d.py
2024-05-23 11:06:32 -07:00
Ross Wightman
cee79dada0
Merge remote-tracking branch 'origin/main' into efficientnet_x
2024-05-23 11:01:39 -07:00
Ross Wightman
6a8bb03330
Initial MobileNetV4 pass
2024-05-23 10:49:18 -07:00
Fernando Cossio
9b11801cb4
Credit earlier work with the same idea.
...
Hi, this earlier work has the same name and the same idea as this layer. It could be useful for readers to keep both links here if they want to see the effect of introducing this layer in a very different domain. 😄
2024-05-16 22:50:34 +02:00
Ross Wightman
211d18d8ac
Move norm & pool into Hiera ClassifierHead. Misc fixes, update features_intermediate() naming
2024-05-11 23:37:35 -07:00
Ross Wightman
2bfa5e5d74
Remove JIT activations, take jit out of ME activations. Remove other instances of torch.jit.script, as it breaks torch.compile and is much less performant. Remove SpaceToDepthModule
2024-05-06 16:32:49 -07:00
Ross Wightman
301d0bb21f
Stricter check on pool_type for adaptive pooling module. Fix #2159
2024-05-03 16:16:51 -07:00
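The stricter pool_type check above can be sketched as a simple up-front validation (hypothetical helper and pool-type list for illustration, not timm's exact code):

```python
# Hedged sketch: reject unsupported pool_type strings immediately instead
# of silently falling through to a default pooling behavior.
_POOL_TYPES = ('avg', 'max', 'avgmax', 'catavgmax', 'fast', '')  # illustrative subset

def check_pool_type(pool_type: str) -> str:
    pool_type = pool_type.lower()
    if pool_type not in _POOL_TYPES:
        raise ValueError(f'Invalid pool type: {pool_type!r}')
    return pool_type
```

Failing fast here surfaces typos like `'mean'` at module construction rather than producing a silently wrong head.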
Ross Wightman
4b2565e4cb
More forward_intermediates() / FeatureGetterNet work
...
* include relpos vit
* refactor reduction / size calcs so hybrid vits work and dynamic_img_size works
* fix negative feature indices when pruning
* fix mvitv2 w/ class token
* refine naming
* add tests
2024-04-10 15:11:34 -07:00
Ross Wightman
d6c2cc91af
Make NormMlpClassifier head reset args consistent with ClassifierHead
2024-02-10 16:25:33 -08:00
Ross Wightman
7bc7798d0e
Type annotation correctness for create_act
2024-02-10 14:57:58 -08:00
Ross Wightman
88889de923
Fix meshgrid deprecation warnings and backward compat with explicit 'ndgrid' and 'meshgrid' fn w/o indexing arg
2024-01-27 13:48:33 -08:00
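The meshgrid compat fix above can be sketched roughly as follows (hypothetical wrappers, not necessarily timm's exact implementation): `torch.meshgrid` warns when the `indexing` argument is omitted, so explicit wrappers silence the warning while keeping older torch versions working.

```python
# Hedged sketch: make the meshgrid indexing choice explicit while staying
# compatible with older torch versions that lack the 'indexing' argument.
import torch

def ndgrid(*tensors):
    """Grid with matrix ('ij') indexing, the old implicit torch.meshgrid default."""
    try:
        return torch.meshgrid(*tensors, indexing='ij')
    except TypeError:
        # pre-1.10 torch has no 'indexing' kwarg; 'ij' was the default
        return torch.meshgrid(*tensors)

def meshgrid(*tensors, indexing='xy'):
    """Cartesian ('xy') indexing by default, matching numpy.meshgrid."""
    return torch.meshgrid(*tensors, indexing=indexing)
```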
Ross Wightman
d4386219c6
Improve type handling for arange & rel pos embeds, keep calculations in float32 until application (may change to apply in float32 in future). Prevent arange type hijacking by DeepSpeed Zero
2024-01-26 16:35:51 -08:00
kalazus
7f19a4cce7
fix fast catavgmax selection
2024-01-16 10:30:08 -08:00
Ross Wightman
df7ae11eb2
Add device arg for patch embed resize, fix #2024
2023-12-04 11:42:13 -08:00
Ross Wightman
9fab8d8f58
Fix breakage with 2-year-old torchvision installs :/
2023-11-04 02:32:09 -07:00
Ross Wightman
f7762fee78
Consistent handling of None / empty string inputs to norm / act create fns
2023-11-03 11:01:41 -07:00
Ross Wightman
a2e4a4c148
Add quickgelu vit clip variants, simplify get_norm_layer and allow string args in vit norm/act. Add metaclip CLIP weights
2023-11-03 11:01:41 -07:00
a-r-r-o-w
d5f1525334
include suggestions from review
...
Co-Authored-By: Ross Wightman <rwightman@gmail.com>
2023-10-30 13:47:54 -07:00
a-r-r-o-w
5f14bdd564
include typing suggestions by @rwightman
2023-10-30 13:47:54 -07:00
Laureηt
fe92fd93e5
fix adaptive_avgmax_pool.py
...
remove extra whitespace in `SelectAdaptivePool2d`'s `__repr__`
2023-10-29 23:03:36 -07:00
Tush9905
89ba0da910
Fixed Typos
...
Fixed the typos in helpers.py and CONTRIBUTING.md
2023-10-21 21:46:31 -07:00
Ross Wightman
49a459e8f1
Merge remote-tracking branch 'upstream/main' into vit_siglip_and_reg
2023-10-17 09:36:48 -07:00
Ross Wightman
a58f9162d7
Missed __init__.py update for attention pooling layer add
2023-10-17 09:28:21 -07:00
Ross Wightman
71365165a2
Add SigLIP weights
2023-10-16 23:26:08 -07:00
lucapericlp
7ce65a83a2
Removing unused self.drop
2023-10-05 11:20:57 -07:00
Ross Wightman
9caf32b93f
Move levit style pos bias resize with other rel pos bias utils
2023-09-01 11:05:56 -07:00
方曦
170a5b6e27
add tinyvit
2023-09-01 11:05:56 -07:00
Ross Wightman
fc5d705b83
dynamic_size -> dynamic_img_size, add dynamic_img_pad for padding option
2023-08-27 15:58:35 -07:00
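The dynamic_img_pad option above boils down to padding inputs up to a multiple of the patch size so any resolution can be patch-embedded; a minimal sketch (hypothetical helper name, not timm's exact code):

```python
# Hedged sketch: pad H/W up to the next multiple of patch_size so a
# non-divisible input still splits into an integer number of patches.
import math
import torch
import torch.nn.functional as F

def pad_to_patch_multiple(x: torch.Tensor, patch_size: int) -> torch.Tensor:
    # x: (batch, channels, H, W)
    _, _, h, w = x.shape
    pad_h = math.ceil(h / patch_size) * patch_size - h
    pad_w = math.ceil(w / patch_size) * patch_size - w
    # F.pad order for the last two dims is (left, right, top, bottom)
    return F.pad(x, (0, pad_w, 0, pad_h))
```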
Ross Wightman
1f4512fca3
Support dynamic_resize in eva.py models
2023-08-27 15:58:35 -07:00
Ross Wightman
fdd8c7c2da
Initial impl of dynamic resize for existing vit models (incl vit-resnet hybrids)
2023-08-27 15:58:35 -07:00
Ross Wightman
c153cd4a3e
Add more advanced interpolation method from BEiT and support non-square window & image size adaptation for
...
* beit/beit-v2
* maxxvit/coatnet
* swin transformer
And non-square windows for swin-v2
2023-08-08 16:41:16 -07:00
Ross Wightman
76d166981d
Fix missing norm call in Mlp forward (not used by default, but can be enabled for normformer MLP scale). Fix #1851 fix #1852
2023-08-03 11:36:30 -07:00
Ross Wightman
8e4480e4b6
Patch and pos embed resample done in float32 always (cast to float and back). Fix #1811
2023-08-03 11:32:17 -07:00
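The float32 resample pattern above (cast to float and back) can be sketched as follows (hypothetical helper, not timm's exact signature); doing the interpolation in float32 avoids precision loss and dtype-support gaps in lower-precision formats like bfloat16:

```python
# Hedged sketch: always interpolate position embeddings in float32,
# then cast back to the caller's dtype.
import torch
import torch.nn.functional as F

def resample_pos_embed(pos: torch.Tensor, new_hw) -> torch.Tensor:
    # pos: (1, C, H, W) grid of position embeddings
    orig_dtype = pos.dtype
    pos = pos.float()  # do the resize in float32 regardless of input dtype
    pos = F.interpolate(pos, size=new_hw, mode='bicubic', align_corners=False)
    return pos.to(orig_dtype)
```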
Sepehr Sameni
40a518c194
use float in resample_abs_pos_embed_nhwc
...
since F.interpolate doesn't always support BFloat16
2023-07-28 16:01:42 -07:00
Ross Wightman
e9373b1b92
Cleanup before samvit merge. Resize abs posembed on the fly, undo some line-wraps, remove redundant unbind, fix HF hub weight load
2023-05-18 16:43:48 -07:00
Ross Wightman
a01d8f86f4
Tweak DinoV2 add, add MAE ViT weights, add initial intermediate layer getter experiment
2023-05-09 17:59:22 -07:00
Leng Yue
5cc87e6485
Add dinov2 pretrained models ( #1797 )
...
* add dinov2 small, base, and large
* fix input size
* fix swiglu & dinov2 vit giant
* use SwiGLUPacked to replace GluMlp
* clean up & add ffn_layer placeholder for ParallelScalingBlock
2023-05-09 12:24:47 -07:00
Ross Wightman
af48246a9a
Add SwiGLUPacked to layers __init__
2023-05-08 13:52:34 -07:00
Ross Wightman
3fdb31de2e
Small SwiGLU tweak, remove default LN arg in unpacked variant, add packed alias for GluMLP
2023-05-08 12:28:00 -07:00
Ross Wightman
cb3f9c23bb
Metaformer baselines for vision (final PR with cleanup) ( #1793 )
...
* update
* Update metaformers.py (×44)
* merge with poolformer, initial version
* Update metaformers.py (×18)
* Revert "Update metaformers.py" (reverts commit 2916f37f8d)
* Revert "Update metaformers.py" (reverts commit 1d882eb494)
* Revert "Update metaformers.py" (reverts commit 2209d0830e)
* Revert "Update metaformers.py" (reverts commit 32bede4e27)
* Revert "Update metaformers.py" (reverts commit 4ed934e000)
* Revert "Update metaformers.py" (reverts commit 3f0b075367)
* Revert "Update metaformers.py" (reverts commit 2fef9006d7)
* Update metaformers.py (×2)
* rename model
* Update metaformers.py (×28)
* Stem/Downsample rework
* Update metaformers.py
* try NHWC
* Update metaformers.py (×8)
* Squashed commit of the following (13 commits by Fredo Guan <fredo.guan@hotmail.com>, Feb 1-10, 2023):
b7696a30a7, 41fe5c3626, a3aee37c35, f938beb81b, 10bde717e5, 39274bd45e, a2329ab8ec, 53b8ce5b8a, 02fcc30eaa, 26a8e481a5, a913f5d438: Update metaformers.py
ab6225b941: try NHWC
366aae9304: Stem/Downsample rework
* Update metaformers.py (×2)
* channels first for whole network
* Channels first
* Update metaformers.py
* Use buffer for randformer
* Update metaformers.py
* Remove einsum
* don't test randformer for feature extraction
* arbitrary input sizes for randformer
* Squashed commit of the following (7 commits by Fredo Guan <fredo.guan@hotmail.com>, Mon Mar 6, 2023):
6c089ca432, 521528a900, 3827eec796, ac1c6fea8a, d577129aaa: Update metaformers.py
26f3d343cd: Merge branch 'metaformer_workspace' of https://github.com/fffffgggg54/pytorch-image-models into metaformer_workspace
f7367304e8: Metaformer baselines for vision (#12), comprising the changes below:
* formatting, cleanup, fix dropout
* fix regression, pass kwargs
* fix poolformerv1 weights, formatting
* Update metaformers.py (×13)
* some cleanup
* SDPA from ViT, fix imports
* Update metaformers.py
* fix head reset
* fast norm bias patch for metaformers
* Metaformer refactor, remove rand/ident models, fix issues, remove old poolformer
* Switch to hub weights
---------
Co-authored-by: Fredo Guan <fredo.guan@hotmail.com>
2023-05-05 11:18:26 -07:00
Ross Wightman
437d344e03
Always some torchscript issues
2023-04-26 20:42:34 -07:00
Ross Wightman
3386af8c86
Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub
2023-04-26 15:52:13 -07:00
Ross Wightman
a08e5aed1d
More models w/ multi-weight support, moving to HF hub. Removing inplace_abn from all models including TResNet
2023-04-20 22:44:49 -07:00
Ross Wightman
aef6e562e4
Add onnx utils and export code, tweak padding and conv2d_same for better dynamic export with recent PyTorch
2023-04-11 17:03:57 -07:00
Ross Wightman
965d0a2d36
fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights.
2023-04-10 12:04:33 -07:00
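The fused_attn gating above can be sketched as a feature check plus fallback (hypothetical function, not timm's exact module code): use `F.scaled_dot_product_attention` when the installed torch provides it, otherwise run the explicit softmax attention.

```python
# Hedged sketch: gate fused attention on availability, falling back to an
# explicit softmax(q @ k^T * scale) @ v implementation.
import torch
import torch.nn.functional as F

_HAS_FUSED_ATTN = hasattr(F, 'scaled_dot_product_attention')

def attention(q, k, v, use_fused: bool = True):
    # q, k, v: (batch, heads, seq, head_dim)
    if use_fused and _HAS_FUSED_ATTN:
        return F.scaled_dot_product_attention(q, k, v)
    scale = q.shape[-1] ** -0.5
    attn = (q @ k.transpose(-2, -1)) * scale
    return attn.softmax(dim=-1) @ v
```

In timm this kind of switch is exposed as a global enable/disable so users can force the unfused path when debugging or exporting.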
Ross Wightman
4d135421a3
Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models
2023-04-07 20:27:23 -07:00
Ross Wightman
3863d63516
Adding EVA02 weights and model defs, move beit based eva_giant to same eva.py file. Cleanup rotary pos, add lang oriented freq bands to be compat with eva design choice. Fix #1738
2023-03-27 17:16:07 -07:00
Ross Wightman
8db20dc240
Fix #1726 , dropout not used in NormMlpClassifierHead. Make dropout more consistent across both classifier heads (nn.Dropout)
2023-03-20 09:37:05 -07:00
Ross Wightman
acfd85ad68
All swin models support spatial output, add output_fmt to v1/v2 and use ClassifierHead.
...
* update ClassifierHead to allow different input format
* add output format support to patch embed
* fix some flatten issues for a few conv head models
* add Format enum and helpers for tensor format (layout) choices
2023-03-15 23:21:51 -07:00