Commit Graph

140 Commits (ebf82b84ac13d3aeed1714a67a5c52cf1193bbad)

Author SHA1 Message Date
Ross Wightman ebf82b84ac
Merge pull request #122 from mrT23/master
TResNet models
2020-04-12 18:23:46 -07:00
Alexey Chernov bdb165a8a4 Merge changes in feature extraction interface to MobileNetV3
The experimental feature extraction interface appears to have changed slightly,
with the most up-to-date version found in the EfficientNet class. These changes
are applied to the MobileNetV3 class here so that it supports the new interface
and works again.
2020-04-13 02:02:14 +03:00
talrid 8a63c1add8 finalizing 2020-04-12 19:08:35 +03:00
talrid 6209146738 TResNet models 2020-04-12 18:44:12 +03:00
Ross Wightman 1a8f5900ab Update EfficientNet feature extraction for EfficientDet. Add needed MaxPoolSame as well. 2020-04-09 01:41:54 -07:00
Chris Ha 06a50a94a8 Fix minor typos in create_attn.py and resnet.py
'eca' -> 'ceca'
'doest not' -> 'does not'
2020-04-07 21:15:57 +09:00
Ross Wightman c99a5abed4
Merge pull request #115 from rwightman/mobilenetv2-experiment
MobileNet-V2 experiments
2020-04-05 17:32:06 -07:00
Ross Wightman e34074b4da Add final weights for MobileNet-V2 experiments 2020-04-05 17:31:02 -07:00
Ross Wightman a6a5565de7 Fiddling... 2020-03-19 10:20:20 -07:00
Ross Wightman 5a16c533ff Add better resnext50_32x4d weights trained by andravin 2020-03-18 14:43:50 -07:00
Ross Wightman bc998cad91 Experimenting with some MobileNetV2 variations to compare against EfficientNet-Lite 2020-03-18 13:54:06 -07:00
Ross Wightman 3406e582cf Add EfficientNet-Lite results, update README 2020-03-18 13:12:30 -07:00
Ross Wightman bd05258f7b EfficientNet-Lite model added w/ converted checkpoints, validation in progress... 2020-03-17 23:31:45 -07:00
Ross Wightman 56e2ac3a6d
Merge pull request #94 from rwightman/lr_noise
Learning rate noise, MobileNetV3 weights, and activate MobileNetV3/EfficientNet weight init change
2020-02-29 20:41:05 -08:00
Ross Wightman c16f25ced2 Add MobileNetV3 Large weights, results, update README and sotabench for merge 2020-02-29 20:37:20 -08:00
Ross Wightman c60069c1eb Annotate types on drop fns to avoid torchscript error 2020-02-27 09:30:23 -08:00
Ross Wightman 9fee316752 Enable fixed fanout calc in EfficientNet/MobileNetV3 weight init by default. Fix #84 2020-02-24 15:11:26 -08:00
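The "fixed fanout" calculation refers to initializing conv weights from a fan-out based on kernel area and output channels. A minimal sketch of that style of init; the helper name is assumed and the exact fan-out convention enabled by this commit may differ:

```python
import math
import torch.nn as nn

def init_conv_fan_out(conv: nn.Conv2d):
    # Fan-out = kernel area * output channels; dividing by groups is one
    # common convention (assumption here), important for depthwise convs.
    fan_out = conv.kernel_size[0] * conv.kernel_size[1] * conv.out_channels
    fan_out //= conv.groups
    # Kaiming-normal style: zero-mean Gaussian with std = sqrt(2 / fan_out).
    nn.init.normal_(conv.weight, 0, math.sqrt(2.0 / fan_out))
    if conv.bias is not None:
        nn.init.zeros_(conv.bias)
```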
Ross Wightman 43225d110c Unify drop connect vs drop path under 'drop path' name, switch all EfficientNet/MobilenetV3 refs to 'drop_path'. Update factory to handle new drop args. 2020-02-18 14:00:26 -08:00
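For context, "drop path" (a.k.a. drop connect / stochastic depth) randomly zeroes an entire residual branch per sample during training and rescales the survivors. A minimal sketch of the technique, not the library's exact implementation:

```python
import torch

def drop_path(x: torch.Tensor, drop_prob: float = 0., training: bool = False) -> torch.Tensor:
    """Randomly drop the whole residual branch for some samples in the batch."""
    if drop_prob == 0. or not training:
        return x
    keep_prob = 1.0 - drop_prob
    # One Bernoulli draw per sample, broadcast over all remaining dims.
    mask_shape = (x.shape[0],) + (1,) * (x.ndim - 1)
    mask = x.new_empty(mask_shape).bernoulli_(keep_prob)
    return x * mask / keep_prob  # rescale to keep expected activation magnitude
```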
Ross Wightman f1d5f8a6c4 Update comments for Selective Kernel and DropBlock/Path impl, add skresnet34 weights 2020-02-18 13:58:30 -08:00
Ross Wightman 569419b38d Tweak some comments, add SKNet models with weights to sotabench, remove an unused branch 2020-02-15 21:18:25 -08:00
Ross Wightman 53c47479c4 Batch validation batch size adjustment, tweak L2 crop pct 2020-02-15 20:37:04 -08:00
Ross Wightman 08553e16b3 Merge branch 'master' into attention 2020-02-14 18:24:21 -08:00
Ross Wightman fa38f24967 Update SK network configs, add weights for skresnet8 and skresnext50 2020-02-14 15:37:00 -08:00
Ross Wightman ba15ca47e8 Add ported EfficientNet-L2, B0-B7 NoisyStudent weights from TF TPU 2020-02-12 11:26:38 -08:00
Ross Wightman 5e6dbbaf30 Add CBAM for experimentation 2020-02-10 16:23:09 -08:00
Ross Wightman d725991870 Remove debug print from ECA module 2020-02-10 16:21:33 -08:00
Ross Wightman 2a7d256fd5 Re-enable mem-efficient/jit activations after torchscript tests 2020-02-10 11:59:36 -08:00
Ross Wightman f902bcd54c Layer refactoring continues, ResNet downsample rewrite for proper dilation in 3x3 and avg_pool cases
* select_conv2d -> create_conv2d
* added create_attn to create attention module from string/bool/module
* factor padding helpers into own file, use in both conv2d_same and avg_pool2d_same
* add some more test eca resnet variants
* minor tweaks, naming, comments, consistency
2020-02-10 11:55:03 -08:00
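The padding helpers shared by conv2d_same and avg_pool2d_same compute TF-style 'SAME' padding, where the output spatial size is ceil(input / stride) and any odd padding lands on the bottom/right. A rough sketch under assumed helper names:

```python
import math
import torch.nn.functional as F

def get_same_padding(size: int, kernel: int, stride: int, dilation: int = 1) -> int:
    # Total padding needed so that out = ceil(size / stride).
    return max((math.ceil(size / stride) - 1) * stride + (kernel - 1) * dilation + 1 - size, 0)

def pad_same(x, kernel: int, stride: int, dilation: int = 1, value: float = 0.):
    ih, iw = x.shape[-2:]
    pad_h = get_same_padding(ih, kernel, stride, dilation)
    pad_w = get_same_padding(iw, kernel, stride, dilation)
    # Split padding between the two sides; the extra pixel goes to the bottom/right.
    return F.pad(x, [pad_w // 2, pad_w - pad_w // 2, pad_h // 2, pad_h - pad_h // 2], value=value)
```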
Ross Wightman a99ec4e7d1 A bunch more layer reorg, splitting many layers into own files. Improve torchscript compatibility. 2020-02-09 14:46:28 -08:00
Ross Wightman 13746a33fc Big move, layer modules and fn to timm/models/layers 2020-02-09 13:13:08 -08:00
Ross Wightman f54612f648 Merge branch 'select_kernel' into attention 2020-02-09 12:59:24 -08:00
Ross Wightman 4defbbbaa8 Fix module name mistake, start layers sub-package 2020-02-09 12:44:26 -08:00
Ross Wightman 7011cd0902 A little bit of ECA cleanup 2020-02-09 12:41:18 -08:00
Ross Wightman 46471df7b2 Merge pull request #82 from VRandme/eca
ECA-Net Efficient Channel Attention
2020-02-09 12:31:05 -08:00
Ross Wightman d0eb59ef46 Remove unused default_init for EfficientNets, experimenting with fanout calc for #84 2020-02-09 11:33:32 -08:00
Chris Ha e6a762346a Implement Adaptive Kernel selection
When the channel size is given, calculate the adaptive kernel size
according to the original paper. Otherwise use the given kernel size
(k_size), which defaults to 3.
2020-02-09 11:58:03 +09:00
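The adaptive rule from the ECA-Net paper derives the kernel size from the channel count C as the nearest odd value of |log2(C)/gamma + b/gamma| with gamma = 2, b = 1. A small illustrative sketch (function name assumed):

```python
import math

def eca_kernel_size(channels=None, k_size=3, gamma=2, b=1):
    # If the channel count is unknown, fall back to the fixed default kernel size.
    if not channels:
        return k_size
    # ECA-Net adaptive rule: t = |log2(C) / gamma + b / gamma|, rounded to the nearest odd value.
    t = int(abs(math.log2(channels) / gamma + b / gamma))
    return t if t % 2 else t + 1
```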
Ross Wightman 13e8da2b46 SelectKernel split_input works best when input channels split like grouped conv, but output is full width. Disable zero_init for SK nets, seems a bad combo. 2020-02-07 22:42:04 -08:00
Chris Ha 6db087a1ff Merge remote-tracking branch 'upstream/master' into eca 2020-02-07 19:36:35 +09:00
Chris Ha 904c618040 Update EcaModule.py
Make pylint happy
(commas, unused imports, missed imports)
2020-02-07 19:36:18 +09:00
Chris Ha db91ba053b EcaModule(CamelCase)
CamelCased EcaModule.
Renamed all instances of ecalayer to EcaModule.
eca_module.py -> EcaModule.py
2020-02-07 19:28:07 +09:00
Ross Wightman 5c4991a088 Add PyTorch trained EfficientNet-ES weights from Andrew Lavin 2020-02-06 12:53:55 -08:00
Chris Ha d63ae121d5 Clean up eca_module code
Functionally similar; adjusts rwightman's version of reshaping and viewing.
Uses F.pad for the circular ECA version for cleaner code.
2020-02-06 22:44:33 +09:00
Chris Ha f87fcd7e88 Implement Eca modules
Implement the ECA module by:
1. adopting the original eca_module.py into the models folder
2. adding a use_eca layer alongside every instance of the SE layer
2020-02-04 23:15:29 +09:00
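For reference, an ECA block swaps the SE block's two FC layers for a single cheap 1D convolution over the globally pooled channel descriptor. A minimal PyTorch sketch with assumed names, not the eca_module.py added in this commit:

```python
import torch
import torch.nn as nn

class EcaAttn(nn.Module):
    """Efficient Channel Attention: 1D conv over the pooled channel descriptor."""
    def __init__(self, k_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size=k_size, padding=(k_size - 1) // 2, bias=False)
        self.gate = nn.Sigmoid()

    def forward(self, x):
        # x: (N, C, H, W) -> channel descriptor of shape (N, 1, C)
        y = x.mean(dim=(2, 3)).unsqueeze(1)
        y = self.gate(self.conv(y))                   # per-channel attention weights
        return x * y.transpose(1, 2).unsqueeze(-1)    # broadcast back as (N, C, 1, 1)
```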
Ross Wightman 7d07ebb660 Adding some configs to sknet, incl ResNet50 variants from 'Compounding ... Assembled Techniques' paper and original SKNet50 2020-02-01 23:28:48 -08:00
Ross Wightman a9d2424fd1 Add separate zero_init_last_bn function to support more block variety without a mess 2020-02-01 22:11:00 -08:00
Ross Wightman 355aa152d5 Just leave it float for now, will look at fp16 later. Remove unused reference code. 2020-02-01 22:11:00 -08:00
Ross Wightman ef457555d3 DropBlock working on GPU 2020-02-01 22:11:00 -08:00
Ross Wightman 3ff19079f9 Missed nn_ops.py from last commit 2020-02-01 22:11:00 -08:00
Ross Wightman 9f11b4e8a2 Add ConvBnAct layer to parallel integrated SelectKernelConv, add support for DropPath and DropBlock to ResNet base and SK blocks 2020-02-01 22:11:00 -08:00
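A ConvBnAct layer is simply a convolution, batch norm, and activation packaged as one reusable module. Roughly, under assumed names and simple symmetric padding:

```python
import torch.nn as nn

class ConvBnAct(nn.Module):
    """Conv2d + BatchNorm2d + activation as a single block (illustrative sketch)."""
    def __init__(self, in_chs, out_chs, kernel_size=3, stride=1, dilation=1, act_layer=nn.ReLU):
        super().__init__()
        padding = dilation * (kernel_size - 1) // 2  # symmetric 'same'-style padding
        self.conv = nn.Conv2d(in_chs, out_chs, kernel_size, stride=stride,
                              padding=padding, dilation=dilation, bias=False)
        self.bn = nn.BatchNorm2d(out_chs)
        self.act = act_layer(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))
```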
Ross Wightman cefc9b7761 Move SelectKernelConv to conv2d_layers and more
* always apply attention in SelectKernelConv, leave MixedConv for no attention alternative
* make MixedConv torchscript compatible
* refactor first/previous dilation name to make more sense in ResNet* networks
2020-02-01 22:11:00 -08:00
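Selective Kernel convolution runs parallel branches with different kernel sizes and mixes their outputs with learned per-channel attention over the branches (softmax across paths). A compact sketch of the idea with assumed names, omitting the split_input and grouping details of the actual implementation:

```python
import torch
import torch.nn as nn

class SelectKernelConvSketch(nn.Module):
    """Two-branch Selective Kernel conv: branch outputs mixed by softmax attention."""
    def __init__(self, channels, kernel_sizes=(3, 5), rd_ratio=1 / 16):
        super().__init__()
        self.paths = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, k, padding=k // 2, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True))
            for k in kernel_sizes])
        rd_chs = max(int(channels * rd_ratio), 8)
        # Attention head: pooled sum of branches -> per-branch, per-channel logits.
        self.fc_reduce = nn.Conv2d(channels, rd_chs, 1, bias=False)
        self.fc_select = nn.Conv2d(rd_chs, channels * len(kernel_sizes), 1)
        self.num_paths = len(kernel_sizes)

    def forward(self, x):
        feats = torch.stack([p(x) for p in self.paths], dim=1)    # (N, P, C, H, W)
        pooled = feats.sum(dim=1).mean(dim=(2, 3), keepdim=True)  # (N, C, 1, 1)
        attn = self.fc_select(torch.relu(self.fc_reduce(pooled)))
        attn = attn.view(x.shape[0], self.num_paths, -1, 1, 1).softmax(dim=1)
        return (feats * attn).sum(dim=1)
```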