8 Commits

Author SHA1 Message Date
Ross Wightman
a5a2ad2e48 Fix consistency & testing for forward_head w/ pre_logits, reset_classifier, and models where pre_logits size != unpooled feature size
* add test that model supports forward_head(x, pre_logits=True)
* add head_hidden_size attr to all models and set differently from num_features attr when head has hidden layers
* test forward_features() feat dim == model.num_features and pre_logits feat dim == self.head_hidden_size
* more consistency in reset_classifier signature, add typing
* add asserts in heads where pooling cannot be disabled
Fix #2194
2024-06-07 13:53:00 -07:00
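
The commit above pins down a small contract that should now hold across timm models. A minimal sketch of those checks from the caller's side, assuming a model built with timm.create_model (the model name here is an arbitrary example, not one the commit calls out):

    import torch
    import timm

    model = timm.create_model('vit_base_patch16_224', pretrained=False)
    model.eval()
    x = torch.randn(1, 3, 224, 224)

    # forward_features() returns unpooled features; channel dim == model.num_features
    feats = model.forward_features(x)
    assert feats.shape[-1] == model.num_features

    # forward_head(..., pre_logits=True) returns pooled pre-classifier features;
    # their dim is head_hidden_size (== num_features unless the head has hidden layers)
    pre_logits = model.forward_head(feats, pre_logits=True)
    assert pre_logits.shape[-1] == model.head_hidden_size

    # reset_classifier() now has a consistent, typed signature across models
    model.reset_classifier(num_classes=10, global_pool='avg')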
Ross Wightman
7a4e987b9f Hiera weights on hub 2024-05-13 11:43:22 -07:00
Ross Wightman
3e03b2bf3f Fix a few more Hiera API issues 2024-05-12 11:11:45 -07:00
Ross Wightman
211d18d8ac Move norm & pool into Hiera ClassifierHead. Misc fixes, update features_intermediate() naming 2024-05-11 23:37:35 -07:00
Ross Wightman
c6db4043cd Update forward_intermediates for Hiera to have its own forward impl w/ early stopping. Remove return_intermediates bool from forward(). Still an fx issue with None mask arg :( 2024-04-29 17:23:37 -07:00
Ross Wightman
ef147fd2fb Add forward_intermediates API to Hiera for features_only=True support 2024-04-21 11:30:41 -07:00
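
A hedged sketch of what the forward_intermediates API added here (and given early stopping two commits up) looks like to a caller, assuming the usual timm parameter names; the Hiera model name is illustrative:

    import torch
    import timm

    model = timm.create_model('hiera_tiny_224', pretrained=False)
    x = torch.randn(1, 3, 224, 224)

    # Returns the final features plus a list of intermediate feature maps
    final, intermediates = model.forward_intermediates(x)

    # intermediates_only=True skips the final norm/head; stop_early=True lets the
    # forward loop exit once the last requested index has been collected
    feats = model.forward_intermediates(
        x, indices=(0, 1), intermediates_only=True, stop_early=True)

    # The same machinery backs features_only=True feature extraction
    fo = timm.create_model('hiera_tiny_224', features_only=True)
    feature_maps = fo(x)  # list of feature map tensors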
Ross Wightman
d88bed6535 Bit more Hiera fiddling 2024-04-21 09:36:57 -07:00
Ross Wightman
8a54d2a930 WIP Hiera implementation. Fix #2083. Trying to get image size adaptation to work. 2024-04-20 09:47:17 -07:00
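
The image size adaptation this WIP commit is chasing would follow timm's usual pattern of overriding the pretrained input resolution at creation time. A sketch of that intended usage, assuming the adaptation eventually landed for Hiera; this commit itself does not yet guarantee it works:

    import timm

    # img_size is the standard timm override for a model's native input size;
    # supporting it for Hiera's 224px checkpoints is the goal of this WIP commit
    model = timm.create_model('hiera_tiny_224', pretrained=False, img_size=256)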