Commit Graph

7 Commits (225f4f92b3738322dfe67f67aa5af47d36c91c37)

Author SHA1 Message Date
Ross Wightman a5a2ad2e48 Fix consistency, testing for forward_head w/ pre_logits, reset_classifier, models with pre_logits size != unpooled feature size
* add test that model supports forward_head(x, pre_logits=True)
* add head_hidden_size attr to all models and set differently from num_features attr when head has hidden layers
* test forward_features() feat dim == model.num_features and pre_logits feat dim == self.head_hidden_size
* more consistency in reset_classifier signature, add typing
* asserts in some heads where pooling cannot be disabled
Fix #2194
2024-06-07 13:53:00 -07:00
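For reference, a minimal sketch (not code from the commit) of the invariants this commit standardizes: assuming a timm version that includes the change, `num_features` is the unpooled feature width, `head_hidden_size` is the pre-logits width, and the two differ when the head has hidden layers. `inception_next_tiny` is only an illustrative pick, chosen because its MLP classifier head has a hidden layer.

```python
import torch
import timm

# Any timm model should satisfy these checks after the change; inception_next_tiny
# is illustrative because its MLP head makes head_hidden_size != num_features.
model = timm.create_model('inception_next_tiny', pretrained=False).eval()
x = torch.randn(1, 3, 224, 224)

feats = model.forward_features(x)                         # unpooled feature map
assert feats.shape[1] == model.num_features               # channel dim == num_features

pre_logits = model.forward_head(feats, pre_logits=True)   # pooled, pre-classifier features
assert pre_logits.shape[-1] == model.head_hidden_size     # pre-logits dim == head_hidden_size

model.reset_classifier(num_classes=10)                    # consistent reset_classifier signature
assert model(x).shape == (1, 10)
```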
Ross Wightman c8b2f28096 Fix a few typos, fix fastvit proj_drop, add code link 2023-08-28 21:26:29 -07:00
Ross Wightman 56c285445c Wrong pool size for 384x384 inception_next_base 2023-08-24 18:31:44 -07:00
Ross Wightman af9f56f3bf inception_next dilation support, weights on hf hub, classifier reset / global pool / no head fixes 2023-08-24 18:31:44 -07:00
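An illustrative (not authoritative) usage sketch for what this commit touches: `pretrained=True` resolves the weights now hosted on the Hugging Face Hub, and the dilation support is assumed here to be exposed through timm's usual `output_stride` argument, as it is for other timm convnets.

```python
import timm

# Pretrained weights are pulled from the Hugging Face Hub.
model = timm.create_model('inception_next_base', pretrained=True)

# Dilation, assuming it is exposed via timm's usual output_stride argument:
# later stages use dilated convs instead of striding, so the final feature
# map is downsampled 16x rather than 32x.
dilated = timm.create_model('inception_next_base', pretrained=True, output_stride=16)
```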
Ross Wightman 2d33b9df6c Add features_only support to inception_next 2023-08-24 18:31:44 -07:00
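A brief sketch of the features_only path this commit enables (model name and exact shapes are illustrative): `create_model(..., features_only=True)` returns a backbone that emits per-stage feature maps, with channel counts and strides available from `feature_info`.

```python
import torch
import timm

# features_only=True builds a feature-extraction backbone that returns the
# intermediate feature maps from each stage instead of classification output.
backbone = timm.create_model('inception_next_tiny', pretrained=False, features_only=True)
feats = backbone(torch.randn(1, 3, 224, 224))

for fmap, ch, red in zip(feats, backbone.feature_info.channels(), backbone.feature_info.reduction()):
    # each entry: (1, ch, H, W) with H = W = 224 // red
    print(tuple(fmap.shape), ch, red)
```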
Ross Wightman 3d8d7450ad InceptionNeXt using timm builder, more cleanup 2023-08-24 18:31:44 -07:00
Ross Wightman f4cf9775c3 Adding InceptionNeXt 2023-08-24 18:31:44 -07:00