Ross Wightman
c838c4233f
Add typing to reset_classifier() on other models
2024-05-12 11:12:00 -07:00
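A minimal sketch of what a typed reset_classifier() can look like; the exact signature varies per model, and ExampleModel here is illustrative, not timm code:

```python
from typing import Optional
import torch.nn as nn

class ExampleModel(nn.Module):
    def __init__(self, num_features: int = 768, num_classes: int = 1000):
        super().__init__()
        self.num_features = num_features
        self.head = nn.Linear(num_features, num_classes)

    def reset_classifier(self, num_classes: int, global_pool: Optional[str] = None) -> None:
        # swap the classification head; num_classes == 0 yields an identity
        # head for pure feature extraction
        if global_pool is not None:
            self.global_pool = global_pool  # pooling mode, e.g. 'avg'
        self.head = nn.Linear(self.num_features, num_classes) if num_classes > 0 else nn.Identity()
```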
Ross Wightman
d4386219c6
Improve type handling for arange & rel pos embeds: keep calculations in float32 until application (may change to apply in float32 in the future). Prevent arange dtype hijacking by DeepSpeed Zero
2024-01-26 16:35:51 -08:00
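A hedged sketch of the idea: request an explicit float32 dtype from torch.arange so wrappers such as DeepSpeed ZeRO, which patch tensor-creation defaults, cannot silently change it, and keep the relative-position math in float32 until it is applied. Function and variable names are illustrative:

```python
import torch

def rel_pos_deltas(win_size: int) -> torch.Tensor:
    # explicit dtype defeats patched defaults; math stays in float32
    coords = torch.arange(win_size, dtype=torch.float32)
    return coords[None, :] - coords[:, None]

deltas = rel_pos_deltas(7)
bias = deltas.to(torch.float16)  # cast only at the point of application
```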
Fredo Guan
bbe798317f
Update EdgeNeXt to use ClassifierHead as per ConvNeXt (#2051)
...
* Update edgenext.py
2023-12-11 12:17:19 -08:00
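With the shared ClassifierHead, pooling, dropout, and the linear classifier live in one reusable module instead of being re-implemented per model. A usage sketch (the constructor arguments shown are assumptions about the current API):

```python
import torch
from timm.layers import ClassifierHead

head = ClassifierHead(in_features=768, num_classes=1000, pool_type='avg', drop_rate=0.1)
x = torch.randn(2, 768, 7, 7)   # NCHW feature map from the final stage
logits = head(x)                # -> shape (2, 1000)
```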
Ross Wightman
dfaab97d20
More consistency in model arg/kwarg merge handling
2023-11-21 09:48:03 -08:00
Ross Wightman
e4e43190ce
Add typing to all model entrypoint fns, add old cache check env var to builder
2023-05-08 08:52:38 -07:00
Ross Wightman
3386af8c86
Final push to get remaining models using multi-weight pretrained configs, almost all weights on HF hub
2023-04-26 15:52:13 -07:00
Ross Wightman
4d135421a3
Implement patch dropout for eva / vision_transformer, refactor / improve consistency of dropout args across all ViT-based models
2023-04-07 20:27:23 -07:00
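A minimal sketch of the patch dropout technique for ViT-style models: during training, keep a random subset of patch tokens while leaving prefix tokens (e.g. the class token) intact. This is illustrative, not timm's exact PatchDropout implementation:

```python
import torch
import torch.nn as nn

class PatchDropout(nn.Module):
    def __init__(self, prob: float = 0.5, num_prefix_tokens: int = 1):
        super().__init__()
        assert 0.0 <= prob < 1.0
        self.prob = prob
        self.num_prefix_tokens = num_prefix_tokens  # tokens never dropped

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training or self.prob == 0.0:
            return x
        prefix, patches = x[:, :self.num_prefix_tokens], x[:, self.num_prefix_tokens:]
        B, L, C = patches.shape
        keep = max(1, int(L * (1.0 - self.prob)))
        # random per-sample subset of patch indices
        idx = torch.argsort(torch.rand(B, L, device=x.device), dim=1)[:, :keep]
        patches = patches.gather(1, idx.unsqueeze(-1).expand(-1, -1, C))
        return torch.cat((prefix, patches), dim=1)
```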
Ross Wightman
927f031293
Major module / path restructure, timm.models.layers -> timm.layers, add _ prefix to all non model modules in timm.models
2022-12-06 15:00:06 -08:00
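In practice the restructure changes layer imports like so (the old path was kept for a while as a deprecation shim):

```python
from timm.layers import DropPath, trunc_normal_          # new location
# from timm.models.layers import DropPath, trunc_normal_  # old location
```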
Ross Wightman
13565aad50
Add edgenext_base model def & weight link, update to improve ONNX export #1385
2022-08-05 16:58:34 -07:00
Ross Wightman
a45b4bce9a
The x and xx small edgenext models do benefit from a larger test input size
2022-07-08 10:53:27 -07:00
Ross Wightman
a1cb25066e
Add edgenext_small_rw weights trained with a Swin-like recipe. Better than the original 'small' weights but not the recent 'USI' distilled weights.
2022-07-07 22:02:57 -07:00
Ross Wightman
dd9b8f57c4
Add feature_info to edgenext for features_only support, hopefully fix some fx / test errors
2022-07-02 15:20:45 -07:00
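Usage sketch: with feature_info wired up, EdgeNeXt can be created as a feature backbone via features_only (exact channel counts depend on the variant):

```python
import timm
import torch

model = timm.create_model('edgenext_small', features_only=True, pretrained=False)
print(model.feature_info.channels())        # per-stage channel counts
feats = model(torch.randn(1, 3, 256, 256))
print([f.shape for f in feats])             # one feature map per stage
```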
Ross Wightman
6064d16a2d
Add initial EdgeNeXt import. Significant cleanup / reorg (like ConvNeXt). Fix #1320
...
* edgenext refactored for torchscript compat, stage-based organization
* slight refactor of ConvNeXt to match some EdgeNeXt additions
* remove use of the custom LayerNorm layer in ConvNeXt and just use nn.LayerNorm and LayerNorm2d (permute)
2022-07-01 15:18:42 -07:00
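A sketch of the permute-based LayerNorm2d idea referenced above: normalize an NCHW feature map by permuting to NHWC, applying nn.LayerNorm over the channel dim, and permuting back (illustrative; timm ships its own LayerNorm2d):

```python
import torch
import torch.nn as nn

class LayerNorm2d(nn.LayerNorm):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # (N, C, H, W) -> (N, H, W, C), norm over C, then back
        x = x.permute(0, 2, 3, 1)
        x = super().forward(x)
        return x.permute(0, 3, 1, 2)

ln = LayerNorm2d(64)
y = ln(torch.randn(2, 64, 8, 8))  # same shape out
```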