Commit Graph

6 Commits (900d2b508d02db46b573170aa04c7c1d7ec8c152)

Author SHA1 Message Date
Ross Wightman a5a2ad2e48 Fix consistency, testing for forward_head w/ pre_logits, reset_classifier, models with pre_logits size != unpooled feature size
* add test that model supports forward_head(x, pre_logits=True)
* add head_hidden_size attr to all models and set differently from num_features attr when head has hidden layers
* test forward_features() feat dim == model.num_features and pre_logits feat dim == self.head_hidden_size
* more consistency in reset_classifier signature, add typing
* asserts in some heads where pooling cannot be disabled
Fix #2194
2024-06-07 13:53:00 -07:00
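The commit above enforces a contract: `forward_features()` returns features whose dim equals `model.num_features`, while `forward_head(x, pre_logits=True)` returns features of dim `head_hidden_size` (which differs from `num_features` only when the head has hidden layers). A minimal pure-Python sketch of that contract, using a hypothetical `ToyModel` rather than real timm code:

```python
# Hypothetical stand-in illustrating the num_features vs head_hidden_size
# contract tested in the commit; sizes and class are made up, not timm code.
class ToyModel:
    def __init__(self, num_features=768, head_hidden_size=512, num_classes=10):
        self.num_features = num_features          # unpooled backbone feature dim
        self.head_hidden_size = head_hidden_size  # dim after the head's hidden layer
        self.num_classes = num_classes

    def forward_features(self, x):
        # stand-in for the backbone: output dim must equal self.num_features
        return [0.0] * self.num_features

    def forward_head(self, feats, pre_logits=False):
        hidden = [0.0] * self.head_hidden_size    # hidden layer inside the head
        if pre_logits:
            return hidden                         # features just before the classifier
        return [0.0] * self.num_classes           # final logits

model = ToyModel()
feats = model.forward_features(None)
assert len(feats) == model.num_features
assert len(model.forward_head(feats, pre_logits=True)) == model.head_hidden_size
assert len(model.forward_head(feats)) == model.num_classes
```

For models whose head has no hidden layer, `head_hidden_size` would simply equal `num_features`, which is why the commit sets the attribute on all models.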
Thorsten Hempel d4c21b95f4 Update repghost.py 2023-09-15 11:41:56 -07:00
Ross Wightman 5242ba6edc MobileOne and FastViT weights on HF hub, more code cleanup and tweaks, features_only working. Add reparam flag to validate and benchmark, support reparam of all models with fuse(), reparameterize() or switch_to_deploy() methods on modules 2023-08-23 22:50:37 -07:00
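The reparam support described in this commit walks a model's modules and invokes whichever deploy-time hook each one exposes: `fuse()`, `reparameterize()`, or `switch_to_deploy()`. A hedged sketch of that dispatch pattern, with hypothetical stand-in module classes instead of real timm/PyTorch modules:

```python
# Sketch of the reparam dispatch described in the commit message. The block
# classes are hypothetical stand-ins; only the hook-lookup pattern is the point.
class BranchyBlock:
    """Train-time block with parallel branches that can be folded for inference."""
    def __init__(self):
        self.deployed = False

    def reparameterize(self):
        # stand-in for folding parallel branches into a single conv
        self.deployed = True

class PlainBlock:
    """Block with nothing to fuse; left untouched by the pass."""
    pass

def reparameterize_modules(modules):
    # Try each known deploy-time hook in order and call the first one present.
    for m in modules:
        for hook in ('fuse', 'reparameterize', 'switch_to_deploy'):
            fn = getattr(m, hook, None)
            if callable(fn):
                fn()
                break
    return modules

blocks = [BranchyBlock(), PlainBlock(), BranchyBlock()]
reparameterize_modules(blocks)
assert all(b.deployed for b in blocks if isinstance(b, BranchyBlock))
```

Exposing the reparam step behind a flag in validate/benchmark scripts lets eval run on the fused, inference-shaped model without changing training code.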
Ross Wightman 69e0ca2e36 Weights on hf hub, bicubic yields slightly better eval 2023-08-19 16:25:45 -07:00
Chengpeng Chen e7f97cb5ce Fix typos RepGhost models 2023-08-16 14:27:45 +08:00
Chengpeng Chen d1d0193615 Add RepGhost models and weights 2023-08-16 11:54:53 +08:00