Default Branch

a22366e3ce · Merge pull request #2503 from huggingface/beit3_remap_clean · Updated 2025-05-31 07:40:28 +08:00

Branches (divergence from default branch shown as commits behind / ahead)

72858c193c · Add siglip2 compatible naflex encoders. Add support to factorized pos embeds and 'aspect preserving mode' to Flex Embeds. Some more docstrings and typing. · Updated 2025-05-31 07:15:37 +08:00 · 22 behind, 24 ahead

211cf90721 · Imports getting unwieldy in vision_transformer.py · Updated 2025-05-31 06:11:51 +08:00 · 1 behind, 0 ahead · Included

bfa7ec917a · Doing some Claude enabled docstring, type annotation and other cleanup · Updated 2025-05-31 02:49:27 +08:00 · 6 behind, 1 ahead

55e52c45ef · Initial run through remapping beit3 -> vision_transformer.py · Updated 2025-05-30 00:50:17 +08:00 · 23 behind, 6 ahead

88b7ef6035 · Disable dynamic_img_size default on PE models for now · Updated 2025-05-11 06:00:29 +08:00 · 35 behind, 0 ahead · Included

907a32e699 · Check forward_intermediates features against forward_features output · Updated 2025-05-07 03:56:58 +08:00 · 54 behind, 1 ahead

fe353419af · Add local-dir: schema support for model loading (config + weights) from folder · Updated 2025-04-18 01:32:48 +08:00 · 57 behind, 1 ahead

990f618868 · Remove torch_out from onnx export, no point without the export_ fn · Updated 2025-04-16 03:09:33 +08:00 · 59 behind, 2 ahead

382444362a · Fix arg merging of sknet, old seresnet. Fix #2470 · Updated 2025-04-15 00:24:31 +08:00 · 60 behind, 1 ahead

228e080e39 · siglip2 weights on hub, fix forward_intermediates when no prefix tokens (& return prefix selected) · Updated 2025-02-22 04:46:14 +08:00 · 68 behind, 3 ahead

5942f1d492 · Add vit so150m2 weights · Updated 2025-02-15 07:02:32 +08:00 · 71 behind, 1 ahead

5f85f8eefa · Fix comment, add 'stochastic weight decay' idea because why not · Updated 2025-01-31 07:44:02 +08:00 · 80 behind, 3 ahead

93f44d1805 · Try to force numpy<2.0 for torch 1.13 tests, update newest tested torch to 2.5.1 · Updated 2025-01-29 07:42:20 +08:00 · 81 behind, 1 ahead

9c26c959eb · Prep Kron for merge, add detail to attributions note, README. · Updated 2025-01-28 12:59:23 +08:00 · 95 behind, 5 ahead

c5cf0e0049 · Add the 256x256 in1k ft of the so150m, add an alternate so150m def · Updated 2025-01-19 06:04:12 +08:00 · 101 behind, 2 ahead

9bcd7a9f37 · LeViT safetensors load is broken by conversion code that wasn't deactivated · Updated 2025-01-17 03:08:35 +08:00 · 103 behind, 1 ahead

c173886e75 · Merge branch 'main' into caojiaolong-main · Updated 2025-01-09 01:11:50 +08:00 · 126 behind, 0 ahead · Included

1969528296 · Fix dtype log when default (None) is used w/o AMP · Updated 2025-01-08 03:47:22 +08:00 · 129 behind, 0 ahead · Included

155f6e7fea · Update README, few minor fixups. · Updated 2025-01-07 05:09:15 +08:00 · 132 behind, 0 ahead · Included

2bd531e033 · Add 384x384 in12k pretrain and finetune for convnext_nano · Updated 2025-01-01 03:00:44 +08:00 · 141 behind, 1 ahead