pytorch-image-models/timm/models/layers
Latest commit: Ross Wightman · 0d87650fea · Remove filter hack from BlurPool w/ non-persistent buffer. Use BlurPool2d instead of AntiAliasing.. for TResNet. Breaks PyTorch < 1.6. · 2021-05-04 16:56:28 -07:00
__init__.py Remove filter hack from BlurPool w/ non-persistent buffer. Use BlurPool2d instead of AntiAliasing.. for TResNet. Breaks PyTorch < 1.6. 2021-05-04 16:56:28 -07:00
activations.py Fix inplace arg compat for GELU and PReLU via activation factory 2020-11-30 13:27:40 -08:00
activations_jit.py
activations_me.py Merge pull request #282 from tigert1998/patch-1 2021-02-04 12:18:40 -08:00
adaptive_avgmax_pool.py
blur_pool.py Remove filter hack from BlurPool w/ non-persistent buffer. Use BlurPool2d instead of AntiAliasing.. for TResNet. Breaks PyTorch < 1.6 (see the sketch after this listing). 2021-05-04 16:56:28 -07:00
bottleneck_attn.py ByoaNet with bottleneck transformer, lambda resnet, and halo net experiments 2021-04-12 09:38:02 -07:00
cbam.py
classifier.py ResNetV2 pre-act and non-preact model, w/ BiT pretrained weights and support for ViT R50 model. Tweaks for in21k num_classes passing. More to do... tests failing. 2020-12-28 16:59:15 -08:00
cond_conv2d.py
config.py
conv2d_same.py
conv_bn_act.py Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models. 2021-02-09 16:22:52 -08:00
create_act.py Fix inplace arg compat for GELU and PReLU via activation factory 2020-11-30 13:27:40 -08:00
create_attn.py Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models. 2021-02-09 16:22:52 -08:00
create_conv2d.py Use in_channels for depthwise groups, allows using `out_channels=N * in_channels` (does not impact existing models). Fix #354. 2021-02-09 16:22:52 -08:00
create_norm_act.py Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models. 2021-02-09 16:22:52 -08:00
create_self_attn.py Fixup byoanet configs to pass unit tests. Add swin_attn and swinnet26t model for testing. 2021-04-29 21:08:37 -07:00
drop.py
eca.py
evo_norm.py
halo_attn.py Default lambda r=7. Define '26t' stage 4/5 256x256 variants for all of bot/halo/lambda nets for experiment. Add resnet50t for exp. Fix a few comments. 2021-04-29 10:58:49 -07:00
helpers.py update collections.abc import 2021-02-10 23:54:35 +11:00
inplace_abn.py Update README, fix iabn pip version print. 2021-03-07 16:17:06 -08:00
lambda_layer.py Default lambda r=7. Define '26t' stage 4/5 256x256 variants for all of bot/halo/lambda nets for experiment. Add resnet50t for exp. Fix a few comments. 2021-04-29 10:58:49 -07:00
linear.py A few more changes for 0.3.2 maint release. Linear layer change for mobilenetv3 and inception_v3, support no bias for linear wrapper. 2020-11-30 16:19:52 -08:00
median_pool.py
mixed_conv2d.py Use in_channels for depthwise groups, allows using `out_channels=N * in_channels` (does not impact existing models). Fix #354. 2021-02-09 16:22:52 -08:00
norm.py Missed norm.py 2021-04-12 09:57:56 -07:00
norm_act.py Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models. 2021-02-09 16:22:52 -08:00
padding.py
pool2d_same.py
se.py Initial Normalizer-Free Reg/ResNet impl. A bit of related layer refactoring. 2021-01-27 22:06:57 -08:00
selective_kernel.py
separable_conv.py Fix regression in models with 1001 class pretrained weights. Improve batchnorm arg and BatchNormAct layer handling in several models. 2021-02-09 16:22:52 -08:00
space_to_depth.py
split_attn.py
split_batchnorm.py
std_conv.py Add ECA-NFNet-L0 weights and update model name. Update README and bump version to 0.4.6 2021-03-17 13:55:32 -07:00
swin_attn.py Fixup byoanet configs to pass unit tests. Add swin_attn and swinnet26t model for testing. 2021-04-29 21:08:37 -07:00
test_time_pool.py Add new weights for ecaresnet26t/50t/269d models. Remove distinction between 't' and 'tn' (tiered models), tn is now t. Add test time img size spec to default cfg. 2021-02-06 16:30:02 -08:00
weight_init.py Cleanup experimental vit weight init a bit 2021-03-20 09:44:24 -07:00
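
The blur_pool.py entry (and the latest commit above) mention storing the blur filter as a non-persistent buffer, which is why the change requires PyTorch >= 1.6: register_buffer() only gained its persistent argument in that release. The sketch below illustrates that technique under stated assumptions; it is not the actual timm BlurPool2d implementation, and the class name, filt_size=3/stride=2 defaults, and binomial filter are chosen here purely for illustration.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class BlurPool2dSketch(nn.Module):
    """Anti-aliased downsampling with a fixed binomial blur filter (sketch).

    The filter is registered as a non-persistent buffer so it follows the
    module across devices/dtypes but is excluded from the state_dict; the
    persistent=False argument is what requires PyTorch >= 1.6.
    """

    def __init__(self, channels: int, filt_size: int = 3, stride: int = 2):
        super().__init__()
        self.channels = channels
        self.stride = stride
        self.padding = [filt_size // 2] * 4  # F.pad order: left, right, top, bottom

        # 1D binomial coefficients, e.g. [1, 2, 1] for filt_size=3, expanded
        # into a normalized 2D filter and replicated once per channel.
        coeffs = torch.tensor(
            [math.comb(filt_size - 1, i) for i in range(filt_size)],
            dtype=torch.float32,
        )
        blur = coeffs[:, None] * coeffs[None, :]
        blur = blur / blur.sum()
        blur = blur[None, None].repeat(channels, 1, 1, 1)  # shape (C, 1, k, k)
        self.register_buffer('filt', blur, persistent=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = F.pad(x, self.padding, mode='reflect')
        # Depthwise conv (groups=channels) blurs each channel with the same
        # fixed filter before strided downsampling.
        return F.conv2d(x, self.filt, stride=self.stride, groups=self.channels)
```

Because the buffer is non-persistent, older checkpoints that never saved a filter tensor still load cleanly, while the filter itself is recreated from the constructor arguments on every instantiation.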