Commit Graph

26 Commits (965d0a2d363668b7f8d1794e45c52d525bdb6278)

Author SHA1 Message Date
Fredo Guan 81ca323751 Davit update formatting and fix grad checkpointing (#7)
Fixed head to gap->norm->fc as per ConvNeXt, along with an option for norm->gap->fc.
Failed tests were due to CLIP ConvNeXt models; DaViT tests passed.
2023-01-15 14:34:56 -08:00
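The head change above swaps the ordering of global average pooling, norm, and the final linear layer. A minimal PyTorch sketch of the two orderings (the module name and the `norm_first` flag are illustrative, not timm's actual implementation):

```python
import torch
import torch.nn as nn

class ClassifierHead(nn.Module):
    """Illustrative head: default is gap -> norm -> fc (ConvNeXt-style),
    with an optional norm -> gap -> fc ordering."""
    def __init__(self, in_chs: int, num_classes: int, norm_first: bool = False):
        super().__init__()
        self.norm_first = norm_first          # True: norm -> gap -> fc
        self.norm = nn.LayerNorm(in_chs)      # channel-wise LayerNorm
        self.pool = nn.AdaptiveAvgPool2d(1)   # global average pool
        self.fc = nn.Linear(in_chs, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.norm_first:
            # norm -> gap -> fc: normalize the NCHW feature map channel-wise first
            x = self.norm(x.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
            x = self.pool(x).flatten(1)
        else:
            # gap -> norm -> fc (as per ConvNeXt)
            x = self.pool(x).flatten(1)
            x = self.norm(x)
        return self.fc(x)

head = ClassifierHead(768, 1000)
print(head(torch.randn(2, 768, 7, 7)).shape)  # torch.Size([2, 1000])
```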
Ross Wightman 927f031293 Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models 2022-12-06 15:00:06 -08:00
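A quick illustration of that path change, assuming a release after the restructure where the `timm.layers` namespace is available (the old `timm.models.layers` path was kept as a deprecation shim for a while):

```python
# New location after the restructure
from timm.layers import DropPath, trunc_normal_

# Old location (pre-restructure), retained as a deprecated alias:
# from timm.models.layers import DropPath, trunc_normal_
```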
Ross Wightman 1d8d6f6072 Fix two default args in DenseNet blocks... fix #1427 2022-08-25 15:00:35 -07:00
Ross Wightman 0862e6ebae Fix correctness of some group matching regex (no impact on result), some formatting, missed forward_head for resnet 2022-03-19 14:58:54 -07:00
Ross Wightman 372ad5fa0d Significant model refactor and additions:
* All models updated with revised forward_features / forward_head interface
* Vision transformer and MLP based models consistently output sequence from forward_features (pooling or token selection considered part of 'head')
* WIP param grouping interface to allow consistent grouping of parameters for layer-wise decay across all model types
* Add gradient checkpointing support to a significant % of models, especially popular architectures
* Formatting and interface consistency improvements across models
* layer-wise LR decay impl part of optimizer factory w/ scale support in scheduler
* Poolformer and Volo architectures added
2022-02-28 13:56:23 -08:00
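A minimal sketch of the revised interface described in the entry above, assuming a model that supports the forward_features / forward_head split and gradient checkpointing (the model name here is just an example):

```python
import timm
import torch

model = timm.create_model('resnet50', pretrained=False, num_classes=1000)

x = torch.randn(1, 3, 224, 224)
feats = model.forward_features(x)                     # unpooled features (pooling lives in the head)
logits = model.forward_head(feats)                    # pool/norm + classifier
pooled = model.forward_head(feats, pre_logits=True)   # pooled features without the classifier

# Gradient checkpointing, where the architecture supports it
model.set_grad_checkpointing(True)
```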
Ross Wightman abc9ba2544 Transitioning default_cfg -> pretrained_cfg. Improving handling of pretrained_cfg source (HF-Hub, files, timm config, etc). Checkpoint handling tweaks. 2022-01-25 21:54:13 -08:00
Ross Wightman ab49d275de Significant norm update
* ConvBnAct layer renamed -> ConvNormAct and ConvNormActAa for anti-aliased
* Significant update to EfficientNet and MobileNetV3 arch to support NormAct layers and grouped conv (as alternative to depthwise)
* Update RegNet to add Z variant
* Add Pre variant of XceptionAligned that works with NormAct layers
* EvoNorm matches bits_and_tpu branch for merge
2021-12-14 13:48:30 -08:00
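A short usage sketch of the renamed layer, assuming the current `timm.layers` import location and the default BatchNorm + ReLU configuration:

```python
import torch
from timm.layers import ConvNormAct

# conv -> norm -> act in one module (formerly ConvBnAct)
block = ConvNormAct(32, 64, kernel_size=3, stride=2)
out = block(torch.randn(1, 32, 56, 56))
print(out.shape)  # expected torch.Size([1, 64, 28, 28])
```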
Ross Wightman d584e7f617 Support for huggingface hub via create_model and default_cfgs.
* improve consistency of model creation helper fns
* add comments to some of the model helpers
* support passing external default_cfgs so they can be sourced from hub
2021-03-16 22:48:26 -07:00
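Loading from the Hugging Face Hub via create_model uses an `hf_hub:` prefix on the model name; a sketch (the repo id below is a placeholder, not a real checkpoint):

```python
import timm

# 'your-org/your-model' is a placeholder hub repo id
model = timm.create_model('hf_hub:your-org/your-model', pretrained=True)
```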
Ross Wightman b1f1a54de9 More uniform treatment of classifiers across all models, reduce code duplication. 2020-08-03 22:18:24 -07:00
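The uniform classifier treatment means the same accessor/reset methods work across architectures; a brief sketch:

```python
import timm

model = timm.create_model('densenet121', pretrained=False)
fc = model.get_classifier()               # the current classifier module
model.reset_classifier(num_classes=10)    # swap in a fresh 10-class head
model.reset_classifier(num_classes=0)     # remove the classifier (pooled features out)
```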
Ross Wightman 3b9004bef9 Lots of changes to model creation helpers, close to finalizing feature extraction / interfaces 2020-07-17 17:54:26 -07:00
Ross Wightman d23a2697d0 Working on feature extraction, interfaces refined, a number of models working, some in progress. 2020-06-29 18:18:59 -07:00
Ross Wightman a7e8cadd15 Remove pointless densenet configs, add an IABN version of 264 as it makes more sense to try someday... 2020-06-03 17:13:52 -07:00
Ross Wightman e78daf586a better densenet121 and densenetblur121d weights 2020-06-03 13:30:03 -07:00
Ross Wightman eb7653614f Monster commit, activation refactor, VoVNet, norm_act improvements, more
* refactor activations into basic PyTorch, jit-scripted, and memory-efficient custom autograd variants
* implement hard-mish, better grad for hard-swish
* add initial VoVNet V1/V2 impl, fix #151
* VoVNet and DenseNet first models to use NormAct layers (support BatchNormAct2d, EvoNorm, InplaceIABN)
* Wrap IABN for any models that use it
* make more models torchscript compatible (DPN, PNasNet, Res2Net, SelecSLS) and add tests
2020-06-01 17:16:52 -07:00
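The functional forms behind the hard-mish / hard-swish mentions above, as a hedged sketch (these are the commonly used piecewise-linear approximations; see timm's activations module for the exact jit and memory-efficient variants):

```python
import torch

def hard_swish(x: torch.Tensor) -> torch.Tensor:
    # x * ReLU6(x + 3) / 6
    return x * (x + 3).clamp(min=0, max=6) / 6.0

def hard_mish(x: torch.Tensor) -> torch.Tensor:
    # piecewise-linear approximation of mish: 0.5 * x * clamp(x + 2, 0, 2)
    return 0.5 * x * (x + 2).clamp(min=0, max=2)

print(hard_swish(torch.tensor([-4.0, 0.0, 4.0])))  # tensor([0., 0., 4.])
print(hard_mish(torch.tensor([-4.0, 0.0, 4.0])))   # tensor([0., 0., 4.])
```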
Ross Wightman 6441e9cc1b Fix memory_efficient mode for DenseNets. Add AntiAliasing (Blur) support for DenseNets and create one test model. Add lr cycle/mul params to train args. 2020-05-22 16:16:45 -07:00
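A sketch of exercising those two options, assuming create_model forwards the `memory_efficient` kwarg to the DenseNet constructor and that the `densenetblur121d` entrypoint (whose weights appear further down this log) carries the anti-aliased (blur) downsampling:

```python
import timm

# Checkpointed (memory-efficient) dense layers; trades compute for activation memory
model = timm.create_model('densenet121', pretrained=False, memory_efficient=True)

# DenseNet variant with blur / anti-aliasing in its downsampling path
blur_model = timm.create_model('densenetblur121d', pretrained=False)
```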
Ross Wightman 780860d140 Add norm_act factory method, move JIT of norm layers to factory 2020-05-09 22:09:21 -07:00
Ross Wightman 14edacdf9a DenseNet converted to support ABN (norm + act) modules. Experimenting with EvoNorm, IABN 2020-05-09 18:26:41 -07:00
Ross Wightman 022ed001f3 Update DenseNet to latest in Torchvision (torchscript compat, checkpointing, proper init). Start adding enhanced configurability, stem options... 2020-05-07 09:57:48 -07:00
Vyacheslav Shults a7ebe09029 Replace None with nn.Identity() in all models' reset_classifier when a falsy num_classes is given.
Minor code refactoring.
2020-05-06 09:54:03 +03:00
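The pattern this commit standardizes, in a minimal sketch (illustrative, not the exact timm code):

```python
import torch.nn as nn

def build_classifier(num_features: int, num_classes: int) -> nn.Module:
    # A falsy num_classes (0 or None) now yields nn.Identity() instead of None,
    # so forward() can always call self.classifier(x) without a None check.
    if num_classes:
        return nn.Linear(num_features, num_classes)
    return nn.Identity()
```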
Ross Wightman 13746a33fc Big move, layer modules and fn to timm/models/layers 2020-02-09 13:13:08 -08:00
Ross Wightman 3bef524f9c Finish with HRNet, weights and models updated. Improve consistency in model classifier/global pool treatment. 2019-11-29 17:56:36 -08:00
Ross Wightman 949b7a81c4 Fix typo in Densenet default resolutions 2019-07-03 22:11:26 -07:00
Ross Wightman 171c0b88b6 Add model registry and model listing fns, refactor model_factory/create_model fn 2019-06-23 18:22:16 -07:00
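The registry added here is what backs the listing/creation helpers still in use; a quick sketch:

```python
import timm

# List registered model names matching a wildcard
print(timm.list_models('densenet*'))

# Create any registered model by name
model = timm.create_model('densenet121', pretrained=False)
```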
Ross Wightman 6cc214bd7a Consistency in model entrypoints
* move pretrained entrypoint arg to first position to be closer to torchvision/hub
* change DPN weight URLs to my GitHub location
2019-06-20 23:29:44 -07:00
Ross Wightman 6fc886acaf Remove all prints, change most to logging calls, tweak alignment of batch logs, improve setup.py 2019-06-20 17:29:25 -07:00
Ross Wightman aa4354f466 Big re-org, working towards making pip/module as 'timm' 2019-06-19 17:20:51 -07:00