Commit Graph

20 Commits (bda46f8e6f4bd6463317b6402b8d63c107ceb09a)

Author SHA1 Message Date
Adam J. Stewart 19aaea3c8f Fix nn.Module type hints 2025-01-11 15:09:21 +01:00
Ross Wightman 2b251fb291 Wrap torch checkpoint() fn to default use_reentrant flag to False and allow env var override 2025-01-06 11:28:39 -08:00
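The wrapper pattern referenced in this commit looks roughly like the sketch below: default `use_reentrant` to False and let an environment variable flip it back. The env var name and exact signature here are assumptions for illustration, not copied from the commit.

```python
import os
from typing import Optional

import torch.utils.checkpoint

# Assumed env var name; consult timm's source for the actual override mechanism.
_USE_REENTRANT_DEFAULT = os.environ.get('TIMM_REENTRANT_CKPT', '0').lower() in ('1', 'true', 'yes')

def checkpoint(function, *args, use_reentrant: Optional[bool] = None, **kwargs):
    # Wrap torch.utils.checkpoint.checkpoint() so callers get a
    # non-reentrant default without repeating the flag everywhere.
    if use_reentrant is None:
        use_reentrant = _USE_REENTRANT_DEFAULT
    return torch.utils.checkpoint.checkpoint(
        function, *args, use_reentrant=use_reentrant, **kwargs)
```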
Ross Wightman dc94cca0e5 Remaining Hiera sbb weights uploaded 2024-08-21 10:06:27 -07:00
Ross Wightman 962958723c More Hiera updates. Add forward_intermediates to hieradet/sam2 impl. Make both use same classifier module. Add coarse bool to intermediates. 2024-08-16 11:10:04 -07:00
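A usage sketch of the `forward_intermediates()` API mentioned in this and later commits; the model name is an assumption, and the `coarse` flag's default is not verified here:

```python
import torch
import timm

model = timm.create_model('hiera_small_224', pretrained=False)  # name assumed
x = torch.randn(1, 3, 224, 224)

# Returns final features plus the requested intermediate feature maps.
final, intermediates = model.forward_intermediates(x, indices=(0, 1, 2, 3))
print([t.shape for t in intermediates])
```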
Ross Wightman a50e53d41f Rename global pos embed for Hiera abswin, factor out commonly used vit weight init fns to layers. Add a channels-last ver of normmlp head. 2024-08-15 17:46:36 -07:00
Ross Wightman 2f3fed43b8 Fix hiera init with num_classes=0, fix weight tag names for sbb2 hiera/vit weights, add LayerScale/LayerScale2d to layers 2024-08-15 11:14:38 -07:00
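`LayerScale` multiplies a branch output by a learnable per-channel gamma initialized to a small value; `LayerScale2d` is the NCHW counterpart. A quick sketch:

```python
import torch
from timm.layers import LayerScale, LayerScale2d

ls = LayerScale(dim=96, init_values=1e-5)      # scales the last (channel) dim of token tensors
ls2d = LayerScale2d(dim=96, init_values=1e-5)  # scales the channel dim of NCHW feature maps

tokens = ls(torch.randn(2, 49, 96))
fmap = ls2d(torch.randn(2, 96, 7, 7))
```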
Ross Wightman fee91fdd41 Update Hiera model for abswin, more stable weight init, layer-scale. ImageNet-12k weights for hiera_small_abswin, and two of the sbb vits with improved reg4 init. 2024-08-14 12:22:40 -07:00
Ross Wightman e9ef9424f0 Add a few missing __all__ entries. 2024-08-07 09:35:51 -07:00
Ross Wightman cec70b6779 Merge pull request #2225 from huggingface/small_things (Small things) 2024-07-25 20:29:13 -07:00
Ross Wightman d2240745d3 Fix issue where feature out_indices out of order after wrapping with FeatureGetterNet due to use of set() 2024-07-22 13:33:30 -07:00
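The bug pattern behind this fix: deduplicating `out_indices` with `set()` throws away caller order. An illustrative order-preserving alternative (names are hypothetical, not the actual timm helper):

```python
def unique_preserve_order(indices):
    # Deduplicate while keeping first-seen order; set() iteration order is arbitrary.
    seen = set()
    return [i for i in indices if not (i in seen or seen.add(i))]

print(unique_preserve_order((3, 1, 3)))  # [3, 1], whereas sorted set order would give [1, 3]
```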
Ross Wightman 1a05ed29a1 Add to 'abswin' hiera models for train trials 2024-07-19 11:05:31 -07:00
Ross Wightman 3a8a965891 Implement absolute+window pos embed for hiera, resizable but needs new weights 2024-07-18 21:43:37 -07:00
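Because the absolute component of the pos embed is resizable, an abswin model can plausibly be built at a non-default resolution. A hedged sketch, assuming the variant name and that `img_size` is accepted at creation time like other timm ViT-style models:

```python
import timm

# Both the model name and the img_size override are assumptions for illustration.
model = timm.create_model('hiera_small_abswin_256', pretrained=False, img_size=320)
```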
Ross Wightman a5a2ad2e48 Fix consistency, testing for forward_head w/ pre_logits, reset_classifier, models with pre_logits size != unpooled feature size
* add test that model supports forward_head(x, pre_logits=True)
* add head_hidden_size attr to all models and set differently from num_features attr when head has hidden layers
* test forward_features() feat dim == model.num_features and pre_logits feat dim == self.head_hidden_size
* more consistency in reset_classifier signature, add typing
* asserts in some heads where pooling cannot be disabled
Fix #2194
2024-06-07 13:53:00 -07:00
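A sketch of the consistency contract this commit tests, expressed through the public API (model name assumed; the asserts mirror the test items listed in the commit body):

```python
import torch
import timm

model = timm.create_model('hiera_small_224', pretrained=False)  # name assumed
x = torch.randn(1, 3, 224, 224)

feats = model.forward_features(x)
assert feats.shape[-1] == model.num_features  # unpooled feature width (channels-last output)

pre = model.forward_head(feats, pre_logits=True)
assert pre.shape[-1] == model.head_hidden_size  # == num_features unless the head has hidden layers

model.reset_classifier(num_classes=0)  # consistent signature across models
```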
Ross Wightman 7a4e987b9f Hiera weights on hub 2024-05-13 11:43:22 -07:00
Ross Wightman 3e03b2bf3f Fix a few more hiera API issues 2024-05-12 11:11:45 -07:00
Ross Wightman 211d18d8ac Move norm & pool into Hiera ClassifierHead. Misc fixes, update features_intermediate() naming 2024-05-11 23:37:35 -07:00
Ross Wightman c6db4043cd Update forward_intermediates for hiera to have its own fwd impl w/ early stopping. Remove return_intermediates bool from forward(). Still an fx issue with None mask arg :( 2024-04-29 17:23:37 -07:00
Ross Wightman ef147fd2fb Add forward_intermediates API to Hiera for features_only=True support 2024-04-21 11:30:41 -07:00
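`features_only=True` builds on this API; a usage sketch (model name assumed):

```python
import torch
import timm

model = timm.create_model('hiera_small_224', pretrained=False, features_only=True)
feats = model(torch.randn(1, 3, 224, 224))
print([f.shape for f in feats])        # one feature map per out_index
print(model.feature_info.channels())   # channel count per feature map
print(model.feature_info.reduction())  # stride relative to the input
```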
Ross Wightman d88bed6535 Bit more Hiera fiddling 2024-04-21 09:36:57 -07:00
Ross Wightman 8a54d2a930 WIP Hiera implementation. Fix #2083. Trying to get image size adaptation to work. 2024-04-20 09:47:17 -07:00