Commit Graph

7 Commits (9613c7684408c4ca0c4a1448d0972b7ecb3564db)

Author SHA1 Message Date
Ross Wightman a5a2ad2e48 Fix consistency, testing for forward_head w/ pre_logits, reset_classifier, models with pre_logits size != unpooled feature size
* add test that model supports forward_head(x, pre_logits=True)
* add head_hidden_size attr to all models and set differently from num_features attr when head has hidden layers
* test forward_features() feat dim == model.num_features and pre_logits feat dim == self.head_hidden_size
* more consistency in reset_classifier signature, add typing
* asserts in some heads where pooling cannot be disabled
Fix #2194
2024-06-07 13:53:00 -07:00
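
The commit above establishes a contract that applies to hgnet as well as other models: forward_features() returns unpooled features of width num_features, while forward_head(..., pre_logits=True) returns the pre-classifier embedding of width head_hidden_size, which can differ when the head has hidden layers. A minimal sketch of checking that contract, assuming a timm version that includes this commit (so head_hidden_size is present); hgnetv2_b0 is used here only as an example model name:

```python
import timm
import torch

# Any model name works; hgnetv2_b0 is chosen because its head has a hidden
# layer, so head_hidden_size differs from num_features.
model = timm.create_model("hgnetv2_b0", pretrained=False)
model.eval()

x = torch.randn(1, 3, 224, 224)

# Unpooled backbone features: channel dim should equal model.num_features.
feats = model.forward_features(x)
assert feats.shape[1] == model.num_features

# Pre-logits embedding from the head: width should equal head_hidden_size,
# which may differ from num_features when the head has hidden layers.
pre_logits = model.forward_head(feats, pre_logits=True)
assert pre_logits.shape[1] == model.head_hidden_size

# Removing the classifier (num_classes=0) should leave forward() returning
# the same pre-logits embedding width.
model.reset_classifier(0)
out = model(x)
assert out.shape[1] == model.head_hidden_size
```
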
Ross Wightman 31e0dc0a5d Tweak hgnet before merge 2024-02-12 15:00:32 -08:00
方曦 9dbea3bef6 fix cls head in hgnet 2023-12-27 21:26:26 +08:00
SeeFun 56ae8b906d fix reset head in hgnet 2023-12-27 20:11:29 +08:00
SeeFun 6862c9850a fix backward in hgnet 2023-12-27 16:49:37 +08:00
方曦 4aa166de9c Add hgnet ssld weights 2023-10-09 19:14:10 +08:00
方曦 159e91605c Add PP-HGNet and PP-HGNetv2 models 2023-10-09 19:04:58 +08:00