Ross Wightman
7a13be67a5
Update version.py
2023-02-05 10:06:15 -08:00
Ross Wightman
13acac8c5e
Update head metadata for efficientformer_v2
2023-02-04 23:11:51 -08:00
Ross Wightman
8682528096
Add first conv metadata for efficientformer_v2
2023-02-04 23:02:02 -08:00
Ross Wightman
72fba669a8
is_scripting() guard on checkpoint_seq
2023-02-04 14:21:49 -08:00
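The guard above addresses a real constraint: `torch.utils.checkpoint` is not TorchScript-compatible, so a sequential checkpointing helper must fall back to a plain forward pass under scripting. A minimal sketch of the pattern (greatly simplified from timm's actual `checkpoint_seq`, which also handles segmenting and module flattening):

```python
import torch
import torch.utils.checkpoint


def checkpoint_seq(functions, x):
    # Apply each function in sequence, using gradient checkpointing to save
    # activation memory. torch.jit.is_scripting() guards the checkpoint call
    # because torch.utils.checkpoint cannot be scripted.
    for fn in functions:
        if torch.jit.is_scripting():
            x = fn(x)
        else:
            x = torch.utils.checkpoint.checkpoint(fn, x, use_reentrant=False)
    return x
```

Under scripting (or when memory is not a concern) the loop degrades to an ordinary sequential forward, so behavior is identical either way; only the memory/recompute trade-off changes.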
Ross Wightman
95ec255f7f
Finish timm model api for efficientformer_v2, add grad checkpointing support to both efficientformers
2023-02-03 21:21:23 -08:00
Ross Wightman
9d03c6f526
Merge remote-tracking branch 'origin/main' into levit_efficientformer_redux
2023-02-03 14:47:01 -08:00
Ross Wightman
086bd55a94
Add EfficientFormer-V2, refactor EfficientFormer and Levit for more uniformity across the 3 related architectures. Add features_out support to levit conv models and efficientformer_v2. All weights on hub.
2023-02-03 14:12:29 -08:00
Ross Wightman
2cb2699dc8
Apply fix from #1649 to main
2023-02-03 11:28:57 -08:00
Ross Wightman
b3042081b4
Add laion -> in1k fine-tuned base and large_mlp weights for convnext
2023-02-03 10:58:02 -08:00
Ross Wightman
316bdf8955
Add mlp head support for convnext_large, add laion2b CLIP weights, prep fine-tuned weight tags
2023-02-01 08:27:02 -08:00
Ross Wightman
6f28b562c6
Factor NormMlpClassifierHead from MaxxViT and use across MaxxViT / ConvNeXt / DaViT, refactor some type hints & comments
2023-01-27 14:57:01 -08:00
Ross Wightman
9a53c3f727
Finalize DaViT, some formatting and modelling simplifications (separate PatchEmbed into Stem + Downsample), weights on HF hub.
2023-01-27 13:54:04 -08:00
Fredo Guan
fb717056da
Merge remote-tracking branch 'upstream/main'
2023-01-26 10:49:15 -08:00
Ross Wightman
2bbc26dd82
version 0.8.8dev0
2023-01-25 18:02:48 -08:00
Ross Wightman
64667bfa0e
Add 'gigantic' vit clip variant for feature extraction and future fine-tuning
2023-01-25 18:02:10 -08:00
Ross Wightman
c2822568ec
Update version to 0.8.7dev0
2023-01-20 15:01:10 -08:00
Ross Wightman
36989cfae4
Factor out readme generation in hub helper, add more readme fields
2023-01-20 14:49:40 -08:00
Ross Wightman
32f252381d
Change order of checkpoint filtering fn application in builder, try dict, model variant first
2023-01-20 14:48:54 -08:00
Ross Wightman
e9f1376cde
Cleanup resolve data config fns, add 'model' variant that takes model as first arg, make 'args' arg optional in original fn
2023-01-20 14:47:55 -08:00
Ross Wightman
bed350f5e5
Push all MaxxViT weights to HF hub, cleanup impl, add feature map extraction support and promote to 'std' architecture. Fix norm head for proper embedding / feat map output. Add new in12k + ft 1k weights.
2023-01-20 14:45:25 -08:00
Ross Wightman
ca38e1e73f
Update ClassifierHead module, add reset() method, update in_chs -> in_features for consistency
2023-01-20 14:44:05 -08:00
Ross Wightman
8ab573cd26
Add convnext_tiny and convnext_small 384x384 fine-tunes of in12k weights, fix pool size for laion CLIP convnext weights
2023-01-20 14:40:16 -08:00
Fredo Guan
81ca323751
Davit update formatting and fix grad checkpointing (#7)
fixed head to gap->norm->fc as per convnext, along with option for norm->gap->fc
failed tests due to clip convnext models, davit tests passed
2023-01-15 14:34:56 -08:00
Ross Wightman
e9aac412de
Correct mean/std for CLIP convnexts
2023-01-14 22:53:56 -08:00
Ross Wightman
42bd8f7bcb
Add convnext_base CLIP image tower weights for fine-tuning / features
2023-01-14 21:16:29 -08:00
Ross Wightman
e520553e3e
Update batchnorm freezing to handle NormAct variants, Add GroupNorm1Act, update BatchNormAct2d tracing change from PyTorch
2023-01-12 16:55:47 -08:00
Ross Wightman
a2c14c2064
Add tiny/small in12k pretrained and fine-tuned ConvNeXt models
2023-01-11 14:50:39 -08:00
Ross Wightman
01aea8c1bf
Version 0.8.6dev0
2023-01-09 13:38:31 -08:00
Ross Wightman
2e83bba142
Revert head norm changes to ConvNeXt as it broke some downstream use, alternate workaround for fcmae weights
2023-01-09 13:37:40 -08:00
Ross Wightman
1825b5e314
maxxvit type
2023-01-09 08:57:31 -08:00
Ross Wightman
5078b28f8a
More kwarg handling tweaks, maxvit_base_rw def added
2023-01-09 08:57:31 -08:00
Ross Wightman
c0d7388a1b
Improving kwarg merging in more models
2023-01-09 08:57:31 -08:00
Ross Wightman
ae9153052f
Update version.py
2023-01-06 17:17:35 -08:00
Ross Wightman
60ebb6cefa
Re-order vit pretrained entries for more sensible default weights (no .tag specified)
2023-01-06 16:12:33 -08:00
Ross Wightman
e861b74cf8
Pass through --model-kwargs (and --opt-kwargs for train) from command line through to model __init__. Update some models to improve arg overlay. Cleanup along the way.
2023-01-06 16:12:33 -08:00
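A rough sketch of how such a command-line pass-through can work. The helper name `parse_kv` and the exact flag handling here are illustrative assumptions, not timm's actual implementation:

```python
import argparse
import ast


def parse_kv(pairs):
    # Convert ['key=val', ...] strings into a kwargs dict, literal-eval'ing
    # values so numbers, bools, and tuples survive the round-trip; anything
    # that fails to parse is kept as a plain string.
    out = {}
    for pair in pairs:
        key, _, val = pair.partition('=')
        try:
            out[key] = ast.literal_eval(val)
        except (ValueError, SyntaxError):
            out[key] = val
    return out


parser = argparse.ArgumentParser()
parser.add_argument('--model-kwargs', nargs='*', default=[], metavar='KEY=VALUE')
args = parser.parse_args(['--model-kwargs', 'depth=12', 'drop_rate=0.1'])
kwargs = parse_kv(args.model_kwargs)
# kwargs would then be splatted into model creation,
# e.g. timm.create_model(model_name, **kwargs)
```

The appeal of this pattern is that new model `__init__` arguments become tunable from the CLI without adding a dedicated flag for each one.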
Ross Wightman
add3fb864e
Working on improved model card template for push_to_hf_hub
2023-01-06 16:12:33 -08:00
Ross Wightman
dd0bb327e9
Update version.py
Ver 0.8.4dev0
2023-01-05 07:55:18 -08:00
Ross Wightman
6e5553da5f
Add ConvNeXt-V2 support (model additions and weights) (#1614)
* Add ConvNeXt-V2 support (model additions and weights)
* ConvNeXt-V2 weights on HF Hub, tweaking some tests
* Update README, fixing convnextv2 tests
2023-01-05 07:53:32 -08:00
Ross Wightman
6902c48a5f
Fix ResNet based models to work w/ norm layers w/o affine params. Reformat long arg lists into vertical form.
2022-12-29 16:32:26 -08:00
Ross Wightman
d5aa17e415
Remove print from auto_augment
2022-12-28 17:11:35 -08:00
Ross Wightman
7c846d9970
Better vmap compat across recent torch versions
2022-12-24 14:37:04 -08:00
Ross Wightman
4e24f75289
Merge pull request #1593 from rwightman/multi-weight_effnet_convnext
Update efficientnet.py and convnext.py to multi-weight, add new 12k pretrained weights
2022-12-23 10:09:08 -08:00
Ross Wightman
8ece53e194
Switch BEiT to HF hub weights
2022-12-22 21:43:04 -08:00
Ross Wightman
d1bfa9a000
Support HF datasets and TFDS w/ a sub-path by fixing split, fix #1598 ... add class mapping support to HF datasets in case class label isn't in info.
2022-12-22 21:34:13 -08:00
Ross Wightman
e2fc43bc63
Version 0.8.2dev0
2022-12-22 17:34:09 -08:00
Ross Wightman
9a51e4ea2e
Add FlexiViT models and weights, refactoring, push more weights
* push all vision_transformer*.py weights to HF hub
* finalize more pretrained tags for pushed weights
* refactor pos_embed files and module locations, move some pos embed modules to layers
* tweak hf hub helpers to aid bulk uploading and updating
2022-12-22 17:23:09 -08:00
Fredo Guan
10b3f696b4
Davit std (#6)
Separate patch_embed module
2022-12-16 21:50:28 -08:00
Ross Wightman
656e1776de
Convert mobilenetv3 to multi-weight, tweak PretrainedCfg metadata
2022-12-16 09:29:13 -08:00
Fredo Guan
546590c5f5
Merge branch 'rwightman:main' into main
2022-12-14 23:44:15 -08:00
Ross Wightman
6a01101905
Update efficientnet.py and convnext.py to multi-weight, add ImageNet-12k pretrained EfficientNet-B5 and ConvNeXt-Nano.
2022-12-14 20:33:23 -08:00