pytorch-image-models/timm/models
Latest commit: 493c730ffc "Fix pit regression" by Ross Wightman, 2023-04-26 23:16:06 -07:00
| Name | Last commit message | Last commit date |
|------|---------------------|------------------|
| _pruned/ | | |
| layers/ | | |
| __init__.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| _builder.py | | |
| _efficientnet_blocks.py | | |
| _efficientnet_builder.py | | |
| _factory.py | Small factory handling fix for pretrained tag vs cfg | 2023-04-11 07:42:13 -07:00 |
| _features.py | | |
| _features_fx.py | | |
| _helpers.py | Fix numel use in helpers for checkpoint remap | 2023-03-20 09:36:48 -07:00 |
| _hub.py | More models w/ multi-weight support, moving to HF hub. Removing inplace_abn from all models including TResNet | 2023-04-20 22:44:49 -07:00 |
| _manipulate.py | Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks. | 2023-03-18 14:55:09 -07:00 |
| _pretrained.py | Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks. | 2023-03-18 14:55:09 -07:00 |
| _prune.py | | |
| _registry.py | Update warning message for deprecated model names | 2023-04-05 17:24:17 -07:00 |
| beit.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| byoanet.py | More models w/ multi-weight support, moving to HF hub. Removing inplace_abn from all models including TResNet | 2023-04-20 22:44:49 -07:00 |
| byobnet.py | Add update byobnet.py w/ models pushed to HF hub | 2023-03-22 10:00:00 -07:00 |
| cait.py | cait, volo, xvit hub weights | 2023-04-14 10:13:13 -07:00 |
| coat.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| convit.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| convmixer.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| convnext.py | Add ImageNet-12k intermediate fine-tunes of convnext base & large CLIP models, add first 1k fine-tune of xxlarge | 2023-03-31 16:45:01 -07:00 |
| crossvit.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| cspnet.py | cspnet models on HF hub w/ multi-weight support | 2023-04-12 14:02:38 -07:00 |
| davit.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| deit.py | Improve kwarg passthrough for swin, vit, deit, beit, eva | 2023-04-05 21:37:16 -07:00 |
| densenet.py | DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API | 2023-04-21 16:56:44 -07:00 |
| dla.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| dpn.py | DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API | 2023-04-21 16:56:44 -07:00 |
| edgenext.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| efficientformer.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| efficientformer_v2.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| efficientnet.py | Some fixes | 2023-04-26 17:46:20 -07:00 |
| eva.py | Add finalized eva CLIP weights pointing to remapped timm hub models | 2023-04-10 23:13:12 -07:00 |
| factory.py | | |
| features.py | | |
| focalnet.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| fx_features.py | | |
| gcvit.py | | |
| ghostnet.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| hardcorenas.py | hardcore nas weights on hf hub | 2023-04-21 14:35:10 -07:00 |
| helpers.py | | |
| hrnet.py | Some fixes | 2023-04-26 17:46:20 -07:00 |
| hub.py | | |
| inception_resnet_v2.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| inception_v3.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| inception_v4.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| levit.py | Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks. | 2023-03-18 14:55:09 -07:00 |
| maxxvit.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| mlp_mixer.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| mobilenetv3.py | Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks. | 2023-03-18 14:55:09 -07:00 |
| mobilevit.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| mvitv2.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| nasnet.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| nest.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| nfnet.py | Fix last min torchscript regression in nfnet changes | 2023-03-24 00:10:17 -07:00 |
| pit.py | Fix pit regression | 2023-04-26 23:16:06 -07:00 |
| pnasnet.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| poolformer.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| pvt_v2.py | Always some torchscript issues | 2023-04-26 20:42:34 -07:00 |
| registry.py | | |
| regnet.py | regnet.py multi-weight conversion, new ImageNet-12k pretrain/ft from timm for y_120 and y_160, also new tv v2, swag, & seer weights for push to Hf hub. | 2023-03-21 15:51:49 -07:00 |
| res2net.py | Some fixes | 2023-04-26 17:46:20 -07:00 |
| resnest.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| resnet.py | Wrong pool_size for 288 ft | 2023-04-05 16:07:51 -07:00 |
| resnetv2.py | Update resnetv2.py for multi-weight and HF hub weights | 2023-03-22 15:38:04 -07:00 |
| rexnet.py | Update crop settings for new rexnet weights | 2023-03-22 15:39:49 -07:00 |
| selecsls.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| senet.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| sequencer.py | Some fixes | 2023-04-26 17:46:20 -07:00 |
| sknet.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| swin_transformer.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| swin_transformer_v2.py | Update swin_v2 attn_mask buffer change in #1790 to apply to updated checkpoints in hub | 2023-04-11 14:40:32 -07:00 |
| swin_transformer_v2_cr.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| tnt.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| tresnet.py | TResNet weights now on HF hub, modified to remove InplaceABN dependency | 2023-04-21 14:20:48 -07:00 |
| twins.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| vgg.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| visformer.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| vision_transformer.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| vision_transformer_hybrid.py | Improve kwarg passthrough for swin, vit, deit, beit, eva | 2023-04-05 21:37:16 -07:00 |
| vision_transformer_relpos.py | Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub | 2023-04-26 15:52:13 -07:00 |
| volo.py | cait, volo, xvit hub weights | 2023-04-14 10:13:13 -07:00 |
| vovnet.py | DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API | 2023-04-21 16:56:44 -07:00 |
| xception.py | Some fixes | 2023-04-26 17:46:20 -07:00 |
| xception_aligned.py | DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API | 2023-04-21 16:56:44 -07:00 |
| xcit.py | cait, volo, xvit hub weights | 2023-04-14 10:13:13 -07:00 |
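
Many of the commit messages above refer to multi-weight pretrained configs, where a single architecture can carry several pretrained weight "tags", most of them hosted on the HF hub. A minimal sketch of how those tags are typically consumed; the specific model and tag names below are examples and may differ across timm versions:

```python
import timm

# List architectures that have at least one pretrained weight available.
print(timm.list_models('convnext*', pretrained=True)[:5])

# With multi-weight support, a pretrained tag is appended to the model name
# after a dot. 'resnet50.a1_in1k' is an example tag; omitting the tag should
# fall back to that architecture's default weights.
model = timm.create_model('resnet50.a1_in1k', pretrained=True)
model.eval()

# The resolved pretrained config (weight source, input size, crop, etc.)
# is attached to the created model.
print(model.pretrained_cfg)
```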
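Several entries (beit.py, davit.py, maxxvit.py, swin_transformer.py, vision_transformer.py) mention the fast_attn -> fused_attn rename and a global config to enable or disable fused attention. A minimal sketch of how that toggle is typically used, assuming the use_fused_attn / set_fused_attn helpers in timm.layers (names and behaviour may vary by version):

```python
import timm
from timm import layers

# Query whether fused attention (torch's scaled_dot_product_attention path)
# is currently enabled; attention modules consult this at construction time.
print(layers.use_fused_attn())

# Globally disable fused attention before building models, e.g. to compare
# against the unfused math path or to work around export/tracing issues.
layers.set_fused_attn(False)
model = timm.create_model('vit_base_patch16_224', pretrained=False)
```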
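The densenet.py, dpn.py, and vovnet.py entries also note that DenseNet gradient checkpointing now goes through the common timm API. A small sketch, assuming the set_grad_checkpointing() method that most timm models expose; densenet121 is used purely as an example architecture:

```python
import timm
import torch

# Gradient checkpointing trades extra compute for lower activation memory
# during training: activations are recomputed in the backward pass.
model = timm.create_model('densenet121', pretrained=False, num_classes=10)
model.set_grad_checkpointing(True)  # enable checkpointed blocks
model.train()

x = torch.randn(2, 3, 224, 224, requires_grad=True)
out = model(x)          # forward runs through checkpointed blocks
out.sum().backward()    # intermediate activations recomputed here
print(out.shape)        # torch.Size([2, 10])
```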