Ross Wightman
08553e16b3
Merge branch 'master' into attention
2020-02-14 18:24:21 -08:00
Ross Wightman
fa38f24967
Update SK network configs, add weights for skresnet8 and skresnext50
2020-02-14 15:37:00 -08:00
Ross Wightman
f098fda2ca
Add map_location='cpu' to ModelEma resume, should improve #72
2020-02-12 13:23:56 -08:00
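The fix above comes down to where torch.load materializes the saved tensors. A minimal sketch of the pattern (helper and checkpoint key names are illustrative, not the exact timm code):

    import torch

    def load_ema_checkpoint(ema_model, checkpoint_path):
        # map_location='cpu' keeps restored tensors on the host, so resuming on a
        # machine with a different (or busy) GPU layout doesn't allocate GPU memory
        # for tensors that were saved on another device.
        checkpoint = torch.load(checkpoint_path, map_location='cpu')
        state_dict = checkpoint.get('state_dict_ema', checkpoint)  # key name assumed
        ema_model.load_state_dict(state_dict)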
Ross Wightman
ba15ca47e8
Add ported EfficientNet-L2, B0-B7 NoisyStudent weights from TF TPU
2020-02-12 11:26:38 -08:00
Ross Wightman
5e6dbbaf30
Add CBAM for experimentation
2020-02-10 16:23:09 -08:00
Ross Wightman
d725991870
Remove debug print from ECA module
2020-02-10 16:21:33 -08:00
Ross Wightman
2a7d256fd5
Re-enable mem-efficient/jit activations after torchscript tests
2020-02-10 11:59:36 -08:00
Ross Wightman
f902bcd54c
Layer refactoring continues, ResNet downsample rewrite for proper dilation in 3x3 and avg_pool cases
...
* select_conv2d -> create_conv2d
* added create_attn to create attention module from string/bool/module
* factor padding helpers into own file, use in both conv2d_same and avg_pool2d_same
* add some more test eca resnet variants
* minor tweaks, naming, comments, consistency
2020-02-10 11:55:03 -08:00
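The create_attn helper mentioned above resolves an attention spec into a module. A toy sketch of that pattern, using a simple squeeze-and-excite stand-in (names illustrative, not the actual timm code):

    import torch.nn as nn

    class SqueezeExcite(nn.Module):
        # minimal stand-in attention block so the factory below is runnable
        def __init__(self, channels, rd_ratio=0.25):
            super().__init__()
            rd = max(1, int(channels * rd_ratio))
            self.fc1 = nn.Conv2d(channels, rd, 1)
            self.act = nn.ReLU(inplace=True)
            self.fc2 = nn.Conv2d(rd, channels, 1)
            self.gate = nn.Sigmoid()

        def forward(self, x):
            s = x.mean((2, 3), keepdim=True)
            return x * self.gate(self.fc2(self.act(self.fc1(s))))

    def create_attn(attn, channels):
        # accept a string name, a bool, or a module class; return an instance or None
        if not attn:
            return None
        if isinstance(attn, str):
            attn = {'se': SqueezeExcite}[attn.lower()]
        elif attn is True:
            attn = SqueezeExcite
        return attn(channels)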
Ross Wightman
a99ec4e7d1
A bunch more layer reorg, splitting many layers into own files. Improve torchscript compatibility.
2020-02-09 14:46:28 -08:00
Ross Wightman
13746a33fc
Big move, layer modules and fn to timm/models/layers
2020-02-09 13:13:08 -08:00
Ross Wightman
f54612f648
Merge branch 'select_kernel' into attention
2020-02-09 12:59:24 -08:00
Ross Wightman
4defbbbaa8
Fix module name mistake, start layers sub-package
2020-02-09 12:44:26 -08:00
Ross Wightman
7011cd0902
A little bit of ECA cleanup
2020-02-09 12:41:18 -08:00
Ross Wightman
46471df7b2
Merge pull request #82 from VRandme/eca
...
ECA-Net Efficient Channel Attention
2020-02-09 12:31:05 -08:00
Ross Wightman
d0eb59ef46
Remove unused default_init for EfficientNets, experimenting with fanout calc for #84
2020-02-09 11:33:32 -08:00
Chris Ha
e6a762346a
Implement Adaptive Kernel selection
...
When channel size is given,
calculate adaptive kernel size according to original paper.
Otherwise use the given kernel size (k_size), which defaults to 3
2020-02-09 11:58:03 +09:00
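The rule described above is the one from the ECA paper: derive the 1D conv kernel size from the channel count as k = |log2(C)/gamma + b/gamma| rounded to the nearest odd integer, with gamma=2 and b=1. A short sketch:

    import math

    def eca_kernel_size(channels, gamma=2, beta=1):
        # adaptive kernel size per the ECA paper; callers that don't know the
        # channel count just pass a fixed k_size (default 3) instead
        t = int(abs(math.log2(channels) / gamma + beta / gamma))
        return t if t % 2 else t + 1  # force an odd kernel size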
Ross Wightman
13e8da2b46
SelectKernel split_input works best when input channels split like grouped conv, but output is full width. Disable zero_init for SK nets, seems a bad combo.
2020-02-07 22:42:04 -08:00
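A rough sketch of the split_input idea described above: each kernel path sees only a slice of the input channels (like a grouped conv), but every path still produces the full output width. The selective-kernel attention that weights the paths is omitted here for brevity; this is illustrative, not timm's SelectiveKernelConv.

    import torch
    import torch.nn as nn

    class SplitInputConv(nn.Module):
        def __init__(self, in_chs, out_chs, kernel_sizes=(3, 5)):
            super().__init__()
            assert in_chs % len(kernel_sizes) == 0
            self.split = in_chs // len(kernel_sizes)
            self.paths = nn.ModuleList(
                nn.Conv2d(self.split, out_chs, k, padding=k // 2, bias=False)
                for k in kernel_sizes)

        def forward(self, x):
            xs = torch.split(x, self.split, dim=1)
            # each path consumes its own channel slice, outputs are full width
            return sum(p(xi) for p, xi in zip(self.paths, xs))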
Chris Ha
6db087a1ff
Merge remote-tracking branch 'upstream/master' into eca
2020-02-07 19:36:35 +09:00
Chris Ha
904c618040
Update EcaModule.py
...
Make pylint happy
(commas, unused imports, missed imports)
2020-02-07 19:36:18 +09:00
Chris Ha
db91ba053b
EcaModule(CamelCase)
...
CamelCased EcaModule.
Renamed all instances of ecalayer to EcaModule.
eca_module.py->EcaModule.py
2020-02-07 19:28:07 +09:00
Ross Wightman
5c4991a088
Add PyTorch trained EfficientNet-ES weights from Andrew Lavin
2020-02-06 12:53:55 -08:00
Chris Ha
d04ff95eda
Merge branch 'master' into eca
2020-02-06 22:44:52 +09:00
Chris Ha
d63ae121d5
Clean up eca_module code
...
Functionally similar; adjusted rwightman's version of reshaping and viewing.
Use F.pad for the circular ECA version for cleaner code
2020-02-06 22:44:33 +09:00
Ross Wightman
d66819d1f3
Indentation mistake. Fixes #81
2020-02-04 22:56:00 -08:00
Chris Ha
f87fcd7e88
Implement Eca modules
...
Implement the ECA module by
1. adopting the original eca_module.py into the models folder
2. adding a use_eca layer beside every instance of the SE layer
2020-02-04 23:15:29 +09:00
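For reference, the core of an ECA module is very small: a global average pool, a 1D convolution across the channel dimension, and a sigmoid gate. A minimal sketch (not the file added in this commit):

    import torch.nn as nn

    class Eca(nn.Module):
        def __init__(self, k_size=3):
            super().__init__()
            self.conv = nn.Conv1d(1, 1, kernel_size=k_size, padding=k_size // 2, bias=False)
            self.gate = nn.Sigmoid()

        def forward(self, x):                  # x: (N, C, H, W)
            y = x.mean((2, 3))                 # channel descriptor, (N, C)
            y = self.conv(y.unsqueeze(1))      # 1D conv over channels, (N, 1, C)
            y = self.gate(y).transpose(1, 2)   # (N, C, 1)
            return x * y.unsqueeze(-1)         # broadcast gate over H, W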
Ross Wightman
4808b3c32f
Bump version for PyPi update, fix few out of date README items/mistakes, add README updates for TF EfficientNet-B8 (RandAugment)
2020-02-03 11:44:17 -08:00
Ross Wightman
7d07ebb660
Adding some configs to sknet, incl ResNet50 variants from 'Compounding ... Assembled Techniques' paper and original SKNet50
2020-02-01 23:28:48 -08:00
Ross Wightman
a9d2424fd1
Add separate zero_init_last_bn function to support more block variety without a mess
2020-02-01 22:11:00 -08:00
Ross Wightman
355aa152d5
Just leave it float for now, will look at fp16 later. Remove unused reference code.
2020-02-01 22:11:00 -08:00
Ross Wightman
ef457555d3
DropBlock working on GPU
2020-02-01 22:11:00 -08:00
Ross Wightman
3ff19079f9
Missed nn_ops.py from last commit
2020-02-01 22:11:00 -08:00
Ross Wightman
9f11b4e8a2
Add ConvBnAct layer to parallel integrated SelectKernelConv, add support for DropPath and DropBlock to ResNet base and SK blocks
2020-02-01 22:11:00 -08:00
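DropPath (stochastic depth), mentioned above, drops whole residual branches per sample during training. A common-pattern sketch, not timm's exact layer:

    import torch.nn as nn

    class DropPath(nn.Module):
        def __init__(self, drop_prob=0.):
            super().__init__()
            self.drop_prob = drop_prob

        def forward(self, x):
            if self.drop_prob == 0. or not self.training:
                return x
            keep = 1. - self.drop_prob
            # one Bernoulli draw per sample, broadcast over all remaining dims;
            # divide by keep prob so the expected activation is unchanged
            mask = x.new_empty((x.shape[0],) + (1,) * (x.ndim - 1)).bernoulli_(keep)
            return x * mask / keep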
Ross Wightman
cefc9b7761
Move SelectKernelConv to conv2d_layers and more
...
* always apply attention in SelectKernelConv, leave MixedConv as the no-attention alternative
* make MixedConv torchscript compatible
* refactor first/previous dilation name to make more sense in ResNet* networks
2020-02-01 22:11:00 -08:00
Ross Wightman
9abe610931
Used wrong channel var for split
2020-02-01 22:11:00 -08:00
Ross Wightman
58e28dc7e7
Move Selective Kernel blocks/convs to their own sknet.py file
2020-02-01 22:11:00 -08:00
Ross Wightman
a93bae6dc5
A SelectiveKernelBasicBlock for more experiments
2020-02-01 22:11:00 -08:00
Ross Wightman
ad087b4b17
Missed bias=False in selection conv
2020-02-01 22:11:00 -08:00
Ross Wightman
c8b3d6b81a
Initial impl of Selective Kernel Networks. Very much a WIP.
2020-02-01 22:11:00 -08:00
Ross Wightman
1daa303744
Add support to Dataset for class id mapping file, clean up a bit of old logic. Add results file arg for validation and update script.
2020-02-01 18:07:32 -08:00
Ross Wightman
91534522f9
Add newly added TF ported EfficientNet-B8 weights (RandAugment)
2020-02-01 18:01:14 -08:00
Ross Wightman
12dbc74742
New ResNet50 JSD + RandAugment weights
2020-01-31 10:55:54 -08:00
Ross Wightman
2f41905ba5
Update ResNet50 weights to AugMix trained 78.994 top-1. A few comments re 'tiered_narrow' tn variant.
2020-01-12 17:55:58 -08:00
Ross Wightman
d9a6a9d0af
Merge pull request #74 from rwightman/augmix-jsd
...
AugMix, JSD loss, SplitBatchNorm (Auxiliary BN), and more
2020-01-11 12:04:29 -08:00
Ross Wightman
3eb4a96eda
Update AugMix, JSD, etc comments and references
2020-01-11 12:02:05 -08:00
Ross Wightman
a28117ea46
Add tiered narrow ResNet (tn) and weights for seresnext26tn_32x4d
2020-01-11 11:29:01 -08:00
Ross Wightman
833066b540
A few minor things in SplitBN
2020-01-05 20:07:03 -08:00
Ross Wightman
7547119891
Add SplitBatchNorm. AugMix, Rand/AutoAugment, Split (Aux) BatchNorm, Jensen-Shannon Divergence, RandomErasing all working together
2020-01-05 19:58:59 -08:00
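The auxiliary-BN idea above keeps separate normalization statistics for the clean vs. augmented portions of each batch. An illustrative sketch of such a layer (not the exact timm implementation):

    import torch
    import torch.nn as nn

    class SplitBatchNorm2d(nn.BatchNorm2d):
        def __init__(self, num_features, num_splits=2, **kwargs):
            super().__init__(num_features, **kwargs)
            self.num_splits = num_splits
            self.aux_bn = nn.ModuleList(
                nn.BatchNorm2d(num_features, **kwargs) for _ in range(num_splits - 1))

        def forward(self, x):
            if not self.training:
                return super().forward(x)  # eval uses the main BN stats only
            # first chunk (e.g. clean images) -> main BN, remaining chunks
            # (augmented copies) -> their own auxiliary BNs with separate stats
            splits = torch.chunk(x, self.num_splits, dim=0)
            out = [super().forward(splits[0])]
            out += [bn(s) for bn, s in zip(self.aux_bn, splits[1:])]
            return torch.cat(out, dim=0)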
Ross Wightman
2e955cfd0c
Update RandomErasing with some improved arg names, tweak to aspect range
2020-01-05 14:31:48 -08:00
Ross Wightman
3cc0f91e23
Fix augmix variable name scope overlap, default non-blended mode
2020-01-05 14:27:27 -08:00
Ross Wightman
ec0dd4053a
Add updated RandAugment trained EfficientNet-B0 trained weights from @michaelklachko
2020-01-03 17:18:46 -08:00
Ross Wightman
40fea63ebe
Add checkpoint averaging script. Add headers, shebangs, exec perms to all scripts
2020-01-03 14:57:46 -08:00
Ross Wightman
4666cc9aed
Add --pin-mem arg to enable dataloader pin_memory (showing more benefit in some scenarios now), also add --torchscript arg to validate.py for testing models with jit.script
2020-01-02 16:22:06 -08:00
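A sketch of what the two flags control (stand-in model and dataset, purely illustrative): pin_memory=True allocates page-locked host memory for faster CPU-to-GPU copies, and torch.jit.script compiles the model with TorchScript before validation.

    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    dataset = TensorDataset(torch.randn(8, 3, 224, 224), torch.zeros(8, dtype=torch.long))
    loader = DataLoader(dataset, batch_size=4, num_workers=2, pin_memory=True)

    model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.AdaptiveAvgPool2d(1),
                          nn.Flatten(), nn.Linear(8, 10))
    scripted = torch.jit.script(model.eval())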
Ross Wightman
53001dd292
ResNet / Res2Net additions:
...
* ResNet torchscript compat
* output_stride arg supported to limit network stride via dilations (support for dilation added to Res2Net)
* allow activation layer to be changed via act_layer arg
2020-01-01 17:15:56 -08:00
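Roughly how these options surface through the model factory (argument names as in later timm releases, so treat this as illustrative rather than exact for this commit): output_stride trades stride for dilation in the later stages, and act_layer swaps the activation class used throughout the network.

    import timm
    import torch.nn as nn

    # output_stride=8 dilates the last stages instead of striding them;
    # act_layer replaces the default ReLU in the ResNet blocks
    model = timm.create_model('resnet50', output_stride=8, act_layer=nn.LeakyReLU)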
Ross Wightman
f96b3e5e92
InceptionResNetV2 torchscript compatible
2020-01-01 17:13:37 -08:00
Ross Wightman
19d93fe454
Add selecsls60 weights
2019-12-31 16:49:04 -08:00
Ross Wightman
0062c15fb0
Update checkpoint URLs with modelzoo compatible ones.
2019-12-30 15:59:19 -08:00
Ross Wightman
b5315e66b5
Streamline SelecSLS model without breaking checkpoint compat. Move cfg handling out of model class. Update feature/pooling behaviour to match current.
2019-12-30 15:44:47 -08:00
Ross Wightman
d59a756c16
Run PyCharm autoformat on selecsls and change mixed-case variables and model names to all lower
2019-12-30 14:30:46 -08:00
Ross Wightman
fb3a0f4bb8
Merge pull request #65 from mehtadushy/selecsls
...
Incorporate SelecSLS Models
2019-12-30 14:23:53 -08:00
Ross Wightman
19fc205a4d
Update comments on the new SE-ResNeXt26 models
2019-12-28 17:33:10 -08:00
Ross Wightman
acc3ed2b8c
Add EfficientNet-B3 weights, trained from scratch with RA.
2019-12-28 17:24:15 -08:00
Dushyant Mehta
2404361f62
correct asset paths
2019-12-28 23:32:20 +01:00
Dushyant Mehta
31939311f6
Added SelecSLS models
2019-12-28 23:06:00 +01:00
rwightman
1f4498f217
Add ResNet deep tiered stem and model weights for seresnext26t_32x4d and seresnext26d_32x4d
2019-12-28 11:43:50 -08:00
Dushyant Mehta
32012a44fd
Added SelecSLS model
2019-12-28 20:41:55 +01:00
Ross Wightman
73b78459dc
Add updated RandAugment MixNet-XL weights
2019-12-24 10:08:24 -08:00
Ross Wightman
3afc2a4dc0
Some cleanup/improvements to AugMix impl:
...
* make 'increasing' levels for Contrast, Color, Brightness, Saturation ops
* remove recursion from faster blending mix
* add config string parsing for AugMix
2019-12-20 23:04:11 -08:00
Ross Wightman
232ab7fb12
Working on an implementation of AugMix with JensenShannonDivergence loss that's compatible with my AutoAugment and RandAugment impl
2019-12-20 23:04:11 -08:00
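The Jensen-Shannon consistency loss from the AugMix paper compares the clean and two augmented predictions against their mean distribution. A compact sketch:

    import torch
    import torch.nn.functional as F

    def jsd_loss(logits_clean, logits_aug1, logits_aug2):
        # softmax each view, then penalize KL divergence of each from the mixture
        p_clean, p_aug1, p_aug2 = [F.softmax(l, dim=1) for l in
                                   (logits_clean, logits_aug1, logits_aug2)]
        p_mix = torch.clamp((p_clean + p_aug1 + p_aug2) / 3., 1e-7, 1).log()
        return (F.kl_div(p_mix, p_clean, reduction='batchmean') +
                F.kl_div(p_mix, p_aug1, reduction='batchmean') +
                F.kl_div(p_mix, p_aug2, reduction='batchmean')) / 3.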
Ross Wightman
a435ea1327
Change reduce_bn to distribute_bn, add ability to choose between broadcast and reduce (mean). Add crop_pct arg to allow selecting validation crop while training.
2019-12-19 22:56:54 -08:00
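A sketch of what distributing BN stats involves: before eval or checkpointing, either broadcast rank 0's running buffers to every worker, or all-reduce them to the mean (the 'reduce' option above). Not the exact timm helper:

    import torch.distributed as dist

    def distribute_bn(model, world_size, reduce=False):
        for name, buf in model.named_buffers():
            if 'running_mean' in name or 'running_var' in name:
                if reduce:
                    dist.all_reduce(buf, op=dist.ReduceOp.SUM)
                    buf /= world_size          # mean of all workers' stats
                else:
                    dist.broadcast(buf, src=0) # everyone adopts rank 0's stats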
Ross Wightman
3bff2b21dc
Add support for keeping running bn stats the same across distributed training nodes before eval/save
2019-12-05 22:35:40 -08:00
Ross Wightman
0161de0127
Switch RandomErasing back to on-GPU normal sampling
2019-12-05 22:35:08 -08:00
Ross Wightman
ff421e5e09
New PyTorch trained EfficientNet-B2 weights with my RandAugment impl
2019-12-04 11:09:47 -08:00
Ross Wightman
3bef524f9c
Finish with HRNet, weights and models updated. Improve consistency in model classifier/global pool treatment.
2019-11-29 17:56:36 -08:00
Ross Wightman
6ca0828166
Update EfficientNet comments, MobileNetV3 non-TF create fns, fix factory arg checks, bump PyTorch version req to 1.2
2019-11-28 17:43:00 -08:00
Ross Wightman
eccbadca74
Update EfficientNet comments
2019-11-28 17:11:53 -08:00
Ross Wightman
902d32fb16
Renamed gen_efficientnet.py -> efficientnet.py
2019-11-28 17:04:35 -08:00
Ross Wightman
5a0a8de7e3
ResNet updates:
...
* remove redundant GluonResNet model/blocks and use the code in ResNet for Gluon weights
* change SEModules back to using AdaptiveAvgPool instead of mean, PyTorch issue long fixed
2019-11-28 17:04:35 -08:00
Ross Wightman
a39cc43374
Bring EfficientNet and MobileNetV3 up to date with my gen-efficientnet repo
...
* Split MobileNetV3 and EfficientNet model files and put builder and blocks in own files (getting too large)
* Finalize CondConv EfficientNet variant
* Add the AdvProp weights files and B8 EfficientNet model
* Refine the feature extraction module for EfficientNet and MobileNetV3
2019-11-28 17:04:35 -08:00
Ross Wightman
ad93347548
Initial HRNet classification model commit
2019-11-28 17:00:52 -08:00
Ross Wightman
2393708650
Missed stashing of out_indices in model
2019-11-28 17:00:52 -08:00
Ross Wightman
35e8f0c5e7
Fixup a few comments, add PyTorch version aware Flatten and finish as_sequential for GenEfficientNet
2019-11-28 17:00:52 -08:00
Ross Wightman
7ac6db4543
Missed activations.py
2019-11-28 17:00:52 -08:00
Ross Wightman
506df0e3d0
Add CondConv support for EfficientNet into WIP for GenEfficientNet Feature extraction setup
2019-11-28 17:00:52 -08:00
Ross Wightman
576d360f20
Bring in JIT version of optimized swish activation from gen_efficientnet as default (while working on feature extraction functionality here).
2019-11-22 13:57:45 -08:00
Ross Wightman
7b83e67f77
Pass drop connect arg through to EfficientNet models
2019-11-22 13:27:43 -08:00
Ross Wightman
31453b039e
Update Auto/RandAugment comments, README, more.
...
* Add a weighted choice option for RandAugment
* Adjust magnitude noise/std naming, config
2019-11-22 13:24:52 -08:00
Ross Wightman
4243f076f1
Adding RandAugment to AutoAugment impl, some tweaks to AA included
2019-11-21 21:14:33 -08:00
Ross Wightman
0d58c50fb1
Add TF RandAug weights for B5/B7 EfficientNet models.
2019-10-30 16:49:17 -07:00
Ross Wightman
c099374771
Map pretrained checkpoint to cpu to avoid issue with some pretrained checkpoints still having CUDA tensors. Fixes #42
2019-10-19 17:27:46 -07:00
Ross Wightman
b93fcf0708
Add Facebook Research Semi-Supervised and Semi-Weakly Supervised ResNet model weights.
2019-10-19 17:05:37 -07:00
Ross Wightman
a9eb484835
Add memory efficient Swish impl
2019-10-19 14:48:30 -07:00
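The usual trick for a memory-efficient Swish is a custom autograd Function that recomputes the sigmoid in backward rather than caching the activation output. A sketch along those lines, not timm's exact code:

    import torch

    class SwishFn(torch.autograd.Function):
        @staticmethod
        def forward(ctx, x):
            ctx.save_for_backward(x)       # keep only the input, not the output
            return x * torch.sigmoid(x)

        @staticmethod
        def backward(ctx, grad_output):
            x, = ctx.saved_tensors
            sx = torch.sigmoid(x)
            # d/dx [x * sigmoid(x)] = sigmoid(x) * (1 + x * (1 - sigmoid(x)))
            return grad_output * (sx * (1 + x * (1 - sx)))

    def swish_me(x):
        return SwishFn.apply(x)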
rwightman
d3ba34ee7e
Fix Mobilenet V3 model name for sotabench. Minor res2net cleanup.
2019-09-05 15:47:56 -07:00
Ross Wightman
2680ad14bb
Add Res2Net and DLA to README
2019-09-04 17:38:59 -07:00
rwightman
adbf770f16
Add Res2Net and DLA models w/ pretrained weights. Update sotabench.
2019-09-04 17:06:42 -07:00
Ross Wightman
4002c0d4ce
Fix AutoAugment abs translate calc
2019-09-01 22:07:45 -07:00
Ross Wightman
c06274e5a2
Add note on random selection of magnitude value
2019-09-01 20:32:26 -07:00
Ross Wightman
b750b76f67
More AutoAugment work. Ready to roll...
2019-09-01 16:55:42 -07:00
Ross Wightman
25d2088d9e
Working on auto-augment
2019-08-31 23:09:48 -07:00
Ross Wightman
aff194f42c
Merge pull request #32 from rwightman/opt
...
More optimizer work
2019-08-29 15:26:15 -07:00
Ross Wightman
64966f61f7
Add Nvidia's NovoGrad impl from Jasper (cleaner/faster than current) and Apex Fused optimizers
2019-08-29 15:21:38 -07:00
Ross Wightman
3d9c8a6489
Add support for new AMP checkpointing support w/ amp.state_dict
2019-08-29 15:19:18 -07:00
Ross Wightman
ba3c97c3ad
Some Lookahead cleanup and fixes
2019-08-29 15:14:35 -07:00
Ross Wightman
e9d2ec4d8e
Merge pull request #31 from rwightman/opt
...
Optimizers and more
2019-08-28 00:20:39 -07:00
Ross Wightman
fac58f609a
Add RAdam, NovoGrad, Lookahead, and AdamW optimizers, a few ResNet tweaks and scheduler factory tweak.
...
* Add some of the trendy new optimizers. Decent results but not clearly better than the standards.
* Can create a None scheduler for constant LR
* ResNet defaults to zero_init of last BN in residual
* add resnet50d config
2019-08-28 00:14:10 -07:00
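The zero-init mentioned above sets the scale (gamma) of the last BN in each residual block to zero so the block starts out as an identity mapping. A sketch, with the attribute names assumed for illustration rather than taken from timm:

    import torch.nn as nn

    def zero_init_last_bn(block):
        # 'bn3' (bottleneck) / 'bn2' (basic block) are assumed attribute names
        last_bn = getattr(block, 'bn3', None) or getattr(block, 'bn2', None)
        if isinstance(last_bn, nn.BatchNorm2d):
            nn.init.zeros_(last_bn.weight)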
Ross Wightman
81875d52a6
Update sotabench model list, add Mean-Max pooling DPN variants, disable download progress
2019-08-27 00:07:32 -07:00
Ross Wightman
f37e633e9b
Merge remote-tracking branch 'origin/re-exp' into opt
2019-08-26 14:29:23 -07:00
Ross Wightman
b06dce8d71
Bump version for next push to pypi
2019-08-25 22:32:12 -07:00
Ross Wightman
73fbd97ed4
Add weights for my MixNet-XL creation, include README updates for EdgeTPU models
2019-08-24 19:57:42 -07:00
Ross Wightman
51a2375b0c
Experimenting with a custom MixNet-XL and MixNet-XXL definition
2019-08-24 19:57:42 -07:00
Ross Wightman
9ec6824bab
Finally got around to adding EdgeTPU EfficientNet variant
2019-08-24 13:39:49 -07:00
Ross Wightman
daeaa113e2
Add initial sotabench attempt. Split create_transform out of create_loader. Update requirements.txt
2019-08-12 23:06:19 -07:00
Ross Wightman
66634d2200
Add support to split random erasing blocks into randomly selected number with --recount arg. Fix random selection of aspect ratios.
2019-08-12 16:01:58 -07:00
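The aspect-ratio fix referenced above is typically done by sampling in log space, so that a ratio r and its inverse 1/r are equally likely. A small sketch:

    import math
    import random

    def sample_aspect(ratio_range=(0.3, 3.3)):
        # uniform in log space avoids biasing toward wide (or tall) erase boxes
        log_lo, log_hi = (math.log(r) for r in ratio_range)
        return math.exp(random.uniform(log_lo, log_hi))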
Ross Wightman
6946281fde
Experimenting with random erasing changes
2019-08-12 16:01:58 -07:00
Ross Wightman
aeaaad7304
Merge pull request #24 from rwightman/gluon_xception
...
Port Gluon Aligned Xception models
2019-08-11 23:08:21 -07:00
Ross Wightman
3b4868f6dc
A few more additions to Gluon Xception models to match interface of others.
2019-08-11 23:06:23 -07:00
Ross Wightman
4d505e0785
Add working Gluon Xception-65 model. Some cleanup still needed.
2019-08-10 13:52:01 -07:00
Minqin Chen
4e7a854dd0
Update helpers.py
...
Fixing out of memory error by loading the checkpoint onto the CPU.
2019-08-11 04:21:39 +08:00
Ross Wightman
0c874195db
Update results csv files, bump version for timm pip release
2019-08-05 11:33:17 -07:00
Ross Wightman
4fe2da558c
Add MixNet Small and Large PyTorch native weights (no same padding)
2019-08-02 23:22:48 -07:00
Ross Wightman
e879cf52fa
Update validation scores for new TF EfficientNet weights.
2019-07-31 14:38:55 -07:00
Ross Wightman
77e2e0c4e3
Add new auto-augmentation Tensorflow EfficientNet weights, incl B6 and B7 models. Validation scores still pending but looking good.
2019-07-30 18:31:02 -07:00
Ross Wightman
857f33015a
Add native PyTorch weights for MixNet-Medium with no SAME padding necessary. Remove unused block of code.
2019-07-29 11:59:15 -07:00
Ross Wightman
e7c8a37334
Make min-lr and cooldown-epochs cmdline args, change dash in color_jitter arg for consistency
2019-07-26 09:35:31 -07:00
Ross Wightman
d4debe6597
Update version, results csv files, and move remaining dropbox weights to github
2019-07-25 16:54:44 -07:00
Ross Wightman
dfa9298b4e
Add MixNet ( https://arxiv.org/abs/1907.09595 ) with pretrained weights converted from Tensorflow impl
...
* refactor 'same' convolution and add helper to use MixedConv2d when needed
* improve performance of 'same' padding for cases that can be handled statically
* add support for extra exp, pw, and dw kernel specs with grouping support to decoder/string defs for MixNet
* shuffle some args for a bit more consistency, a little less clutter overall in gen_efficientnet.py
2019-07-25 11:42:01 -07:00
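On the static 'same' padding mentioned above: when kernel size, stride, and dilation are fixed (and the spatial size divides the stride evenly), TF-style SAME padding reduces to a constant that can be baked into a regular Conv2d instead of being computed per forward pass. A sketch of that calculation:

    def same_padding(kernel_size, stride=1, dilation=1):
        # symmetric padding that keeps output size == ceil(input / stride)
        # (exact only when the spatial size is divisible by the stride)
        return ((stride - 1) + dilation * (kernel_size - 1)) // 2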
Ross Wightman
7a92caa560
Add basic image folder style dataset to read directly out of tar files, example in validate.py
2019-07-25 10:51:03 -07:00
Ross Wightman
d6ac5bbc48
EfficientNet and related cleanup
...
* remove folded_bn support and corresponding untrainable tflite ported weights
* combine bn args into dict
* add inplace support to activations and use where possible for reduced mem on large models
2019-07-22 09:29:58 -07:00
Ross Wightman
3d9be78fc6
A bit more ResNet cleanup.
...
* add inplace=True back
* minor comment improvements
* few clarity changes
2019-07-19 16:44:35 -07:00
Ross Wightman
33436fafad
Add weights for ResNeXt50d model
2019-07-19 14:09:10 -07:00
Ross Wightman
e78cd79073
Move ResNet additions for Gluon into main ResNet impl. Add ResNet-26 and ResNet-26d models with weights.
2019-07-14 18:17:35 -07:00
Ross Wightman
6cdf35e670
Add explicit half/fp16 support to loader and validation script
2019-07-05 13:52:25 -07:00
Ross Wightman
a6b2f6eca5
Update README, bump version
2019-07-03 22:48:33 -07:00
Ross Wightman
949b7a81c4
Fix typo in Densenet default resolutions
2019-07-03 22:11:26 -07:00
Ross Wightman
da52fcf78a
Add NASNet-Large model
2019-07-03 22:10:50 -07:00
Ross Wightman
6057496409
Register dpn107
2019-06-30 09:57:06 -07:00
Ross Wightman
3d1a66b6fc
Version 0.1.6
2019-06-30 09:55:23 -07:00
Ross Wightman
a6878b5218
Fix DPN config keys that I broke
2019-06-30 09:54:52 -07:00
Ross Wightman
9b0070edc9
Add two comments back, fix typo
2019-06-29 16:44:25 -07:00
Ross Wightman
188aeae8f4
Bump version 0.1.4
2019-06-29 16:17:54 -07:00
Ross Wightman
c3287aafb3
Slight improvement in EfficientNet-B2 native PyTorch weights
2019-06-29 16:17:29 -07:00
Ross Wightman
b8762cc67d
Model updates. Add my best ResNet50 weights top-1=78.47. Add some other torchvision weights.
...
* Remove some models that don't exist as pretrained and likely never will (se)resnext152
* Add some torchvision weights as tv_ for models that I have added better weights for
* Add wide resnet recently added to torchvision along with resnext101-32x8d
* Add functionality to model registry to allow filtering on pretrained weight presence
2019-06-29 15:50:33 -07:00
Ross Wightman
65a634626f
Switch random erasing to doing normal_() on CPU to avoid instability, remove a debug print
2019-06-29 10:03:13 -07:00
Ross Wightman
c6b32cbe73
A number of tweaks to arguments, epoch handling, config
...
* reorganize train args
* allow resolve_data_config to be used with dict args, not just argparse
* stop incrementing epoch before save, more consistent naming vs csv, etc
* update resume and start epoch handling to match above
* stop auto-incrementing epoch in scheduler
2019-06-28 13:49:20 -07:00
Ross Wightman
9d653b68a2
Make drop_connect rate scaling match official impl. Fixes #14
2019-06-25 09:30:36 -07:00
Ross Wightman
13c19e213d
Add native PyTorch EfficientNet B1 and B2 weights. Not quite where I want them, but hitting the brick wall and moving on to other projects...
2019-06-24 13:12:04 -07:00
Ross Wightman
a0275cfa2f
Fix arg positions in two entrypoint aliases
2019-06-24 08:25:14 -07:00
Ross Wightman
fe59249701
Bump version to 0.1.2
2019-06-23 18:27:30 -07:00
Ross Wightman
171c0b88b6
Add model registry and model listing fns, refactor model_factory/create_model fn
2019-06-23 18:22:16 -07:00
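A toy sketch of the registry pattern behind this commit, simplified well beyond the real code: a decorator records entrypoint functions by name so they can be listed and instantiated from strings.

    import torch.nn as nn

    _model_entrypoints = {}

    def register_model(fn):
        _model_entrypoints[fn.__name__] = fn
        return fn

    def list_models():
        return sorted(_model_entrypoints)

    def create_model(name, **kwargs):
        return _model_entrypoints[name](**kwargs)

    @register_model
    def tiny_net(num_classes=10):
        # stand-in "model" so the registry round-trip is runnable
        return nn.Linear(8, num_classes)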
Ross Wightman
8512436436
Add instagram pretrained ResNeXt models from https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ , update README
2019-06-23 12:29:02 -07:00
Ross Wightman
87b92c528e
Some pretrained URL changes
...
* host some of Cadene's weights on github instead of .fr for speed
* add my old port of ensemble adversarial inception resnet v2
* switch to my TF port of normal inception res v2 and change FC layer back to 'classif' for compat with ens_adv
2019-06-21 13:57:08 -07:00
Ross Wightman
827a3d6010
Add current checkpoints output back to CheckpointSaver (via logger)
2019-06-21 11:57:43 -07:00
Ross Wightman
63961b36a2
Missed pnasnet entrypoint
2019-06-20 23:34:20 -07:00
Ross Wightman
6cc214bd7a
Consistency in model entrypoints
...
* move pretrained entrypoint arg to first pos to be closer to torchvision/hub
* change DPN weight URLS to my github location
2019-06-20 23:29:44 -07:00
Ross Wightman
6fc886acaf
Remove all prints, change most to logging calls, tweak alignment of batch logs, improve setup.py
2019-06-20 17:29:25 -07:00
Ross Wightman
aa4354f466
Big re-org, working towards making pip/module as 'timm'
2019-06-19 17:20:51 -07:00