127 Commits

Author SHA1 Message Date
Chris Ha
e6a762346a Implement Adaptive Kernel selection
When the channel size is given,
calculate the adaptive kernel size according to the original paper.
Otherwise use the given kernel size (k_size), which defaults to 3 (see the sketch below)
2020-02-09 11:58:03 +09:00
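
For reference, the adaptive kernel size in the ECA-Net paper is k = |log2(C)/gamma + beta/gamma| mapped to the nearest odd number, with gamma = 2 and beta = 1. A minimal sketch of that heuristic (defaults follow the paper, not necessarily this commit's exact code):

```python
import math

def adaptive_kernel_size(channels, gamma=2, beta=1):
    # k = |log2(C) / gamma + beta / gamma|, forced to the nearest odd number
    t = int(abs(math.log2(channels) / gamma + beta / gamma))
    return t if t % 2 else t + 1

# e.g. adaptive_kernel_size(64) == 3, adaptive_kernel_size(256) == 5
```
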
Chris Ha
6db087a1ff Merge remote-tracking branch 'upstream/master' into eca 2020-02-07 19:36:35 +09:00
Chris Ha
904c618040 Update EcaModule.py
Make pylint happy
(commas, unused imports, missing imports)
2020-02-07 19:36:18 +09:00
Chris Ha
db91ba053b EcaModule(CamelCase)
CamelCased EcaModule.
Renamed all instances of ecalayer to EcaModule.
eca_module.py -> EcaModule.py
2020-02-07 19:28:07 +09:00
Ross Wightman
5c4991a088 Add PyTorch trained EfficientNet-ES weights from Andrew Lavin 2020-02-06 12:53:55 -08:00
Chris Ha
d04ff95eda Merge branch 'master' into eca 2020-02-06 22:44:52 +09:00
Chris Ha
d63ae121d5 Clean up eca_module code
Functionally similar:
adjusted rwightman's version of reshaping and viewing.
Use F.pad for the circular ECA version for cleaner code (see the sketch below)
2020-02-06 22:44:33 +09:00
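
Context for the F.pad change above: the circular ECA variant wraps the pooled channel descriptor before the 1D conv so the first and last channels are treated as neighbours. A minimal sketch, with the (B, 1, C) shape and a padding-free Conv1d assumed rather than taken from the commit:

```python
import torch.nn.functional as F

def circular_eca_conv(y, conv, k_size):
    """y: (B, 1, C) pooled channel descriptor; conv: nn.Conv1d(1, 1, k_size, padding=0)."""
    pad = (k_size - 1) // 2
    y = F.pad(y, (pad, pad), mode='circular')  # wrap channels instead of zero-padding
    return conv(y)
```
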
Ross Wightman
d66819d1f3 Indentation mistake. Fixes #81 2020-02-04 22:56:00 -08:00
Chris Ha
f87fcd7e88 Implement Eca modules
Implement the ECA module by
1. adopting the original eca_module.py into the models folder
2. adding a use_eca layer beside every instance of an SE layer (see the sketch below)
2020-02-04 23:15:29 +09:00
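
The ECA pattern being added here, for orientation: a single 1D convolution over the globally pooled channel descriptor stands in for the SE block's two FC layers. This is a minimal sketch; the commit's actual eca_module.py may differ in details such as argument handling:

```python
import torch.nn as nn

class EcaModule(nn.Module):
    def __init__(self, k_size=3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size=k_size,
                              padding=(k_size - 1) // 2, bias=False)
        self.gate = nn.Sigmoid()

    def forward(self, x):
        # global average pool -> (B, 1, C) -> 1D conv across channels -> sigmoid gate
        y = x.mean((2, 3)).view(x.shape[0], 1, -1)
        y = self.gate(self.conv(y)).view(x.shape[0], -1, 1, 1)
        return x * y.expand_as(x)
```
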
Ross Wightman
4808b3c32f Bump version for PyPI update, fix a few out-of-date README items/mistakes, add README updates for TF EfficientNet-B8 (RandAugment) 2020-02-03 11:44:17 -08:00
Ross Wightman
1daa303744 Add support to Dataset for class id mapping file, clean up a bit of old logic. Add results file arg for validation and update script. 2020-02-01 18:07:32 -08:00
Ross Wightman
91534522f9 Add newly added TF ported EfficientNet-B8 weights (RandAugment) 2020-02-01 18:01:14 -08:00
Ross Wightman
12dbc74742 New ResNet50 JSD + RandAugment weights 2020-01-31 10:55:54 -08:00
Ross Wightman
2f41905ba5 Update ResNet50 weights to AugMix-trained 78.994 top-1. A few comments re 'tiered_narrow' tn variant. 2020-01-12 17:55:58 -08:00
Ross Wightman
d9a6a9d0af
Merge pull request #74 from rwightman/augmix-jsd
AugMix, JSD loss, SplitBatchNorm (Auxiliary BN), and more
2020-01-11 12:04:29 -08:00
Ross Wightman
3eb4a96eda Update AugMix, JSD, etc comments and references 2020-01-11 12:02:05 -08:00
Ross Wightman
a28117ea46 Add tiered narrow ResNet (tn) and weights for seresnext26tn_32x4d 2020-01-11 11:29:01 -08:00
Ross Wightman
833066b540 A few minor things in SplitBN 2020-01-05 20:07:03 -08:00
Ross Wightman
7547119891 Add SplitBatchNorm. AugMix, Rand/AutoAugment, Split (Aux) BatchNorm, Jensen-Shannon Divergence, RandomErasing all working together 2020-01-05 19:58:59 -08:00
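
The 'Split (Aux) BatchNorm' idea, sketched: the batch is split into groups (e.g. clean vs. AugMix-augmented samples) and each group is normalized with its own statistics, while eval uses only the primary statistics. A rough illustration assuming a fixed number of splits, not the commit's exact class:

```python
import torch
import torch.nn as nn

class SplitBatchNorm2d(nn.BatchNorm2d):
    def __init__(self, num_features, num_splits=2, **kwargs):
        super().__init__(num_features, **kwargs)
        self.aux_bn = nn.ModuleList(
            nn.BatchNorm2d(num_features, **kwargs) for _ in range(num_splits - 1))

    def forward(self, x):
        if self.training:
            # each chunk of the batch keeps its own running statistics
            splits = torch.chunk(x, len(self.aux_bn) + 1, dim=0)
            out = [super().forward(splits[0])]
            out += [bn(s) for bn, s in zip(self.aux_bn, splits[1:])]
            return torch.cat(out, dim=0)
        return super().forward(x)  # eval: primary (clean) statistics only
```
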
Ross Wightman
2e955cfd0c Update RandomErasing with some improved arg names, tweak to aspect range 2020-01-05 14:31:48 -08:00
Ross Wightman
3cc0f91e23 Fix augmix variable name scope overlap, default non-blended mode 2020-01-05 14:27:27 -08:00
Ross Wightman
ec0dd4053a Add updated RandAugment trained EfficientNet-B0 trained weights from @michaelklachko 2020-01-03 17:18:46 -08:00
Ross Wightman
40fea63ebe Add checkpoint averaging script. Add headers, shebangs, exec perms to all scripts 2020-01-03 14:57:46 -08:00
Ross Wightman
4666cc9aed Add --pin-mem arg to enable dataloader pin_memory (showing more benefit in some scenarios now); also add --torchscript arg to validate.py for testing models with jit.script 2020-01-02 16:22:06 -08:00
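
Roughly what the two new flags toggle; the helper below is hypothetical, and dataset/model are placeholders rather than the scripts' actual variables:

```python
import torch
from torch.utils.data import DataLoader

def build_eval_objects(dataset, model, pin_mem=True, torchscript=True):
    # --pin-mem: page-locked host memory can speed up host-to-GPU copies
    loader = DataLoader(dataset, batch_size=256, num_workers=4, pin_memory=pin_mem)
    # --torchscript: compile the model with TorchScript before running validation
    if torchscript:
        model = torch.jit.script(model)
    return loader, model
```
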
Ross Wightman
53001dd292 ResNet / Res2Net additions:
* ResNet torchscript compat
* output_stride arg supported to limit network stride via dilations (support for dilation added to Res2Net); see the sketch after this list
* allow activation layer to be changed via act_layer arg
2020-01-01 17:15:56 -08:00
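
On the output_stride bullet: the usual trick is to stop downsampling once the network reaches the requested stride and grow dilation instead, preserving spatial resolution while the receptive field keeps expanding. A hypothetical helper illustrating the bookkeeping (the ResNet-style stem stride of 4 and per-stage strides are assumptions, not this commit's code):

```python
def stage_stride_dilation(stage_strides=(1, 2, 2, 2), output_stride=32):
    # A ResNet-style stem already downsamples by 4x before the residual stages.
    net_stride, dilation, plan = 4, 1, []
    for s in stage_strides:
        if net_stride >= output_stride:
            dilation *= s   # convert further striding into dilation
            s = 1
        else:
            net_stride *= s
        plan.append((s, dilation))
    return plan

# stage_stride_dilation(output_stride=8) -> [(1, 1), (2, 1), (1, 2), (1, 4)]
```
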
Ross Wightman
f96b3e5e92 InceptionResNetV2 torchscript compatible 2020-01-01 17:13:37 -08:00
Ross Wightman
19d93fe454 Add selecsls60 weights 2019-12-31 16:49:04 -08:00
Ross Wightman
0062c15fb0 Update checkpoint URLs with modelzoo-compatible ones. 2019-12-30 15:59:19 -08:00
Ross Wightman
b5315e66b5 Streamline SelecSLS model without breaking checkpoint compat. Move cfg handling out of model class. Update feature/pooling behaviour to match current. 2019-12-30 15:44:47 -08:00
Ross Wightman
d59a756c16 Run PyCharm autoformat on selecsls and change mixed-case variables and model names to all lowercase 2019-12-30 14:30:46 -08:00
Ross Wightman
fb3a0f4bb8
Merge pull request #65 from mehtadushy/selecsls
Incorporate SelecSLS Models
2019-12-30 14:23:53 -08:00
Ross Wightman
19fc205a4d Update comments on the new SE-ResNeXt26 models 2019-12-28 17:33:10 -08:00
Ross Wightman
acc3ed2b8c Add EfficientNet-B3 weights, trained from scratch with RA. 2019-12-28 17:24:15 -08:00
Dushyant Mehta
2404361f62 correct asset paths 2019-12-28 23:32:20 +01:00
Dushyant Mehta
31939311f6 Added SelecSLS models 2019-12-28 23:06:00 +01:00
rwightman
1f4498f217 Add ResNet deep tiered stem and model weights for seresnext26t_32x4d and seresnext26d_32x4d 2019-12-28 11:43:50 -08:00
Dushyant Mehta
32012a44fd Added SelecSLS model 2019-12-28 20:41:55 +01:00
Ross Wightman
73b78459dc Add updated RandAugment MixNet-XL weights 2019-12-24 10:08:24 -08:00
Ross Wightman
3afc2a4dc0 Some cleanup/improvements to AugMix impl:
* make 'increasing' levels for Contrast, Color, Brightness, Saturation ops (see the sketch after this list)
* remove recursion from faster blending mix
* add config string parsing for AugMix
2019-12-20 23:04:11 -08:00
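
The 'increasing' enhance levels from the first bullet, sketched: the magnitude is mapped so a higher level always means a stronger change, with the direction (e.g. more vs. less contrast) chosen at random. The names and the 0.9 range below are assumptions, not taken from the commit:

```python
import random

def enhance_increasing_level(level, max_level=10.0):
    # higher level -> enhancement factor further from 1.0 (the identity for PIL Enhance ops)
    v = (level / max_level) * 0.9
    return 1.0 + random.choice((-1.0, 1.0)) * v
```
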
Ross Wightman
232ab7fb12 Working on an implementation of AugMix with JensenShannonDivergence loss that's compatible with my AutoAugment and RandAugment impl 2019-12-20 23:04:11 -08:00
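
For orientation, the Jensen-Shannon consistency objective from the AugMix paper: standard cross-entropy on the clean view plus a JSD term that pulls the clean and two augmented predictions toward their mixture. A minimal sketch (alpha = 12 follows the paper; the training code here may differ):

```python
import torch
import torch.nn.functional as F

def jsd_cross_entropy(logits_clean, logits_aug1, logits_aug2, target, alpha=12.0):
    loss = F.cross_entropy(logits_clean, target)
    p_clean = F.softmax(logits_clean, dim=1)
    p_aug1 = F.softmax(logits_aug1, dim=1)
    p_aug2 = F.softmax(logits_aug2, dim=1)
    # log of the mixture distribution, clamped for numerical stability
    log_p_mix = torch.clamp((p_clean + p_aug1 + p_aug2) / 3.0, 1e-7, 1.0).log()
    jsd = (F.kl_div(log_p_mix, p_clean, reduction='batchmean') +
           F.kl_div(log_p_mix, p_aug1, reduction='batchmean') +
           F.kl_div(log_p_mix, p_aug2, reduction='batchmean')) / 3.0
    return loss + alpha * jsd
```
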
Ross Wightman
a435ea1327 Change reduce_bn to distribute_bn, add ability to choose between broadcast and reduce (mean). Add crop_pct arg to allow selecting validation crop while training. 2019-12-19 22:56:54 -08:00
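
The broadcast-vs-reduce choice for BN statistics, sketched: either every rank adopts rank 0's running stats, or the stats are averaged across ranks before eval/checkpointing. The function and buffer-name matching below are assumptions, not necessarily the training script's exact code:

```python
import torch.distributed as dist

def distribute_bn_stats(model, world_size, reduce=False):
    for name, buf in model.named_buffers(recurse=True):
        if 'running_mean' in name or 'running_var' in name:
            if reduce:
                dist.all_reduce(buf, op=dist.ReduceOp.SUM)  # average stats across ranks
                buf /= float(world_size)
            else:
                dist.broadcast(buf, 0)  # everyone adopts rank 0's stats
```
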
Ross Wightman
3bff2b21dc Add support for keeping running bn stats the same across distributed training nodes before eval/save 2019-12-05 22:35:40 -08:00
Ross Wightman
0161de0127 Switch RandomErasing back to on-GPU normal sampling 2019-12-05 22:35:08 -08:00
Ross Wightman
ff421e5e09 New PyTorch trained EfficientNet-B2 weights with my RandAugment impl 2019-12-04 11:09:47 -08:00
Ross Wightman
3bef524f9c Finish with HRNet, weights and models updated. Improve consistency in model classifier/global pool treatment. 2019-11-29 17:56:36 -08:00
Ross Wightman
6ca0828166 Update EfficientNet comments, MobileNetV3 non-TF create fns, fix factory arg checks, bump PyTorch version req to 1.2 2019-11-28 17:43:00 -08:00
Ross Wightman
eccbadca74 Update EfficientNet comments 2019-11-28 17:11:53 -08:00
Ross Wightman
902d32fb16 Renamed gen_efficientnet.py -> efficientnet.py 2019-11-28 17:04:35 -08:00
Ross Wightman
5a0a8de7e3 ResNet updates:
* remove redundant GluonResNet model/blocks and use the code in ResNet for Gluon weights
* change SEModules back to using AdaptiveAvgPool instead of mean, as the PyTorch issue has long been fixed
2019-11-28 17:04:35 -08:00
Ross Wightman
a39cc43374 Bring EfficientNet and MobileNetV3 up to date with my gen-efficientnet repo
* Split MobileNetV3 and EfficientNet model files and put the builder and blocks in their own files (they were getting too large)
* Finalize CondConv EfficientNet variant
* Add the AdvProp weights files and B8 EfficientNet model
* Refine the feature extraction module for EfficientNet and MobileNetV3
2019-11-28 17:04:35 -08:00