pytorch-image-models/timm/models
Latest commit: Ross Wightman · 7ad7ddb7ad · "DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API" · 2023-04-21 16:56:44 -07:00
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| _pruned | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| layers | Davit update formatting and fix grad checkpointing (#7) | 2023-01-15 14:34:56 -08:00 |
| __init__.py | ResNet models on HF hub, multi-weight support, add torchvision v2 weights, new 12k pretrained and fine-tuned timm anti-aliased weights | 2023-04-05 14:19:42 -07:00 |
| _builder.py | All swin models support spatial output, add output_fmt to v1/v2 and use ClassifierHead. | 2023-03-15 23:21:51 -07:00 |
| _efficientnet_blocks.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| _efficientnet_builder.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| _factory.py | Small factory handling fix for pretrained tag vs cfg | 2023-04-11 07:42:13 -07:00 |
| _features.py | return_map back to out_map for _feature helpers | 2023-03-16 14:50:55 -07:00 |
| _features_fx.py | All swin models support spatial output, add output_fmt to v1/v2 and use ClassifierHead. | 2023-03-15 23:21:51 -07:00 |
| _helpers.py | Fix numel use in helpers for checkpoint remap | 2023-03-20 09:36:48 -07:00 |
| _hub.py | More models w/ multi-weight support, moving to HF hub. Removing inplace_abn from all models including TResNet | 2023-04-20 22:44:49 -07:00 |
| _manipulate.py | Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks. | 2023-03-18 14:55:09 -07:00 |
| _pretrained.py | Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks. | 2023-03-18 14:55:09 -07:00 |
| _prune.py | Update model prune loader to use pkgutil | 2023-02-06 17:45:16 -08:00 |
| _registry.py | Update warning message for deprecated model names | 2023-04-05 17:24:17 -07:00 |
| beit.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| byoanet.py | More models w/ multi-weight support, moving to HF hub. Removing inplace_abn from all models including TResNet | 2023-04-20 22:44:49 -07:00 |
| byobnet.py | Add update byobnet.py w/ models pushed to HF hub | 2023-03-22 10:00:00 -07:00 |
| cait.py | cait, volo, xvit hub weights | 2023-04-14 10:13:13 -07:00 |
| coat.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| convit.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| convmixer.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| convnext.py | Add ImageNet-12k intermediate fine-tunes of convnext base & large CLIP models, add first 1k fine-tune of xxlarge | 2023-03-31 16:45:01 -07:00 |
| crossvit.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| cspnet.py | cspnet models on HF hub w/ multi-weight support | 2023-04-12 14:02:38 -07:00 |
| davit.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| deit.py | Improve kwarg passthrough for swin, vit, deit, beit, eva | 2023-04-05 21:37:16 -07:00 |
| densenet.py | DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API | 2023-04-21 16:56:44 -07:00 |
| dla.py | All swin models support spatial output, add output_fmt to v1/v2 and use ClassifierHead. | 2023-03-15 23:21:51 -07:00 |
| dpn.py | DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API | 2023-04-21 16:56:44 -07:00 |
| edgenext.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| efficientformer.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| efficientformer_v2.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| efficientnet.py | Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks. | 2023-03-18 14:55:09 -07:00 |
| eva.py | Add finalized eva CLIP weights pointing to remapped timm hub models | 2023-04-10 23:13:12 -07:00 |
| factory.py | Add a deprecation phase to module re-org | 2022-12-09 14:39:45 -08:00 |
| features.py | Add a deprecation phase to module re-org | 2022-12-09 14:39:45 -08:00 |
| focalnet.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| fx_features.py | Add a deprecation phase to module re-org | 2022-12-09 14:39:45 -08:00 |
| gcvit.py | All swin models support spatial output, add output_fmt to v1/v2 and use ClassifierHead. | 2023-03-15 23:21:51 -07:00 |
| ghostnet.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| gluon_xception.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| hardcorenas.py | hardcore nas weights on hf hub | 2023-04-21 14:35:10 -07:00 |
| helpers.py | Add a deprecation phase to module re-org | 2022-12-09 14:39:45 -08:00 |
| hrnet.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| hub.py | Add a deprecation phase to module re-org | 2022-12-09 14:39:45 -08:00 |
| inception_resnet_v2.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| inception_v3.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| inception_v4.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| levit.py | Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks. | 2023-03-18 14:55:09 -07:00 |
| maxxvit.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| mlp_mixer.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| mobilenetv3.py | Swin and FocalNet weights on HF hub. Add model deprecation functionality w/ some registry tweaks. | 2023-03-18 14:55:09 -07:00 |
| mobilevit.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| mvitv2.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| nasnet.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| nest.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| nfnet.py | Fix last min torchscript regression in nfnet changes | 2023-03-24 00:10:17 -07:00 |
| pit.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| pnasnet.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| poolformer.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| pvt_v2.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| registry.py | Add a deprecation phase to module re-org | 2022-12-09 14:39:45 -08:00 |
| regnet.py | regnet.py multi-weight conversion, new ImageNet-12k pretrain/ft from timm for y_120 and y_160, also new tv v2, swag, & seer weights for push to Hf hub. | 2023-03-21 15:51:49 -07:00 |
| res2net.py | Davit update formatting and fix grad checkpointing (#7) | 2023-01-15 14:34:56 -08:00 |
| resnest.py | Davit update formatting and fix grad checkpointing (#7) | 2023-01-15 14:34:56 -08:00 |
| resnet.py | Wrong pool_size for 288 ft | 2023-04-05 16:07:51 -07:00 |
| resnetv2.py | Update resnetv2.py for multi-weight and HF hub weights | 2023-03-22 15:38:04 -07:00 |
| rexnet.py | Update crop settings for new rexnet weights | 2023-03-22 15:39:49 -07:00 |
| selecsls.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| senet.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| sequencer.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| sknet.py | Davit update formatting and fix grad checkpointing (#7) | 2023-01-15 14:34:56 -08:00 |
| swin_transformer.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| swin_transformer_v2.py | Update swin_v2 attn_mask buffer change in #1790 to apply to updated checkpoints in hub | 2023-04-11 14:40:32 -07:00 |
| swin_transformer_v2_cr.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| tnt.py | Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all vit based models | 2023-04-07 20:27:23 -07:00 |
| tresnet.py | TResNet weights now on HF hub, modified to remove InplaceABN dependency | 2023-04-21 14:20:48 -07:00 |
| twins.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| vgg.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| visformer.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| vision_transformer.py | fast_attn -> fused_attn, implement global config for enable/disable fused_attn, add to more models. vit clip openai 336 weights. | 2023-04-10 12:04:33 -07:00 |
| vision_transformer_hybrid.py | Improve kwarg passthrough for swin, vit, deit, beit, eva | 2023-04-05 21:37:16 -07:00 |
| vision_transformer_relpos.py | Missed a fused_attn update in relpos vit | 2023-04-10 23:30:50 -07:00 |
| volo.py | cait, volo, xvit hub weights | 2023-04-14 10:13:13 -07:00 |
| vovnet.py | DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API | 2023-04-21 16:56:44 -07:00 |
| xception.py | Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models | 2022-12-06 15:00:06 -08:00 |
| xception_aligned.py | DenseNet, DPN, VoVNet, Aligned Xception weights on HF hub. DenseNet grad_checkpointing using timm API | 2023-04-21 16:56:44 -07:00 |
| xcit.py | cait, volo, xvit hub weights | 2023-04-14 10:13:13 -07:00 |
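
Many of the commit messages above describe the same user-facing workflow: pretrained weights resolved from the Hugging Face hub, multi-weight support via pretrained tags, and gradient checkpointing exposed through the shared timm API (e.g. "DenseNet grad_checkpointing using timm API"). The sketch below assumes a timm version from around these commits; `densenet121`, the `"densenet*"` filter, and the 224x224 input are illustrative choices, not taken from the listing.

```python
import timm
import torch

# Model names that ship with pretrained weights; with multi-weight support a name
# may also carry a ".tag" suffix selecting one of several pretrained weight sets.
print(timm.list_models("densenet*", pretrained=True)[:5])

# pretrained=True resolves the default pretrained weights for this architecture
# (hosted on the HF hub for the models converted in the commits above).
model = timm.create_model("densenet121", pretrained=True)

# Gradient checkpointing via the shared timm API, trading compute for activation memory.
model.set_grad_checkpointing(True)

# Dummy training-style forward/backward at a common 224x224 input size.
model.train()
out = model(torch.randn(2, 3, 224, 224))
out.mean().backward()
print(out.shape)  # torch.Size([2, 1000])
```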