Ross Wightman | 464885e135 | See if we can avoid some model / layer pickle issues with the aa attr in ConvNormAct | 2024-12-03 08:02:55 -08:00
Ross Wightman | 5fe5f9d488 | Add a different mnv4 conv-small weight | 2024-12-02 16:14:37 -08:00
Ross Wightman | 303f7691a1 | Add cautious mars, improve test reliability by skipping grad diff for first step | 2024-12-02 11:29:02 -08:00
Ross Wightman | 82e8677690 | Make LaProp weight decay match typical PyTorch 'decoupled' behaviour where it's scaled by LR | 2024-11-29 16:44:43 -08:00
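
For reference, a minimal sketch of what LR-scaled 'decoupled' weight decay means in practice. This is a generic illustration of the AdamW-style convention the commit refers to, not LaProp's actual code; the function and argument names are made up for the example.

```python
import torch

def apply_decoupled_weight_decay(param: torch.Tensor, lr: float, weight_decay: float) -> None:
    # Decoupled decay: shrink the parameter directly, scaled by the learning rate,
    # instead of folding weight_decay * param into the gradient.
    if weight_decay != 0:
        param.mul_(1 - lr * weight_decay)
```
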
Ross Wightman | 886eb77938 | Update README, missed small discrepancy in adafactor min dim update | 2024-11-29 10:57:47 -08:00
Ross Wightman | e3e434bbc4 | To be technically correct, need to check the in-place '_' version of the op | 2024-11-28 15:11:58 -08:00
Ross Wightman | 7c32d3bd82 | Work around _foreach_maximum issue, need scalar other support | 2024-11-28 15:11:58 -08:00
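
A rough sketch of the kind of workaround this could involve, assuming the issue is that some `torch._foreach_maximum_` builds don't accept a Python scalar for `other`. The loop fallback below is illustrative only, not timm's exact fix.

```python
import torch

def maximum_scalar_(tensors, value: float) -> None:
    # Fallback: clamp each tensor to a minimum value in a plain loop, which is
    # elementwise max(t, value) without relying on _foreach_maximum_ scalar support.
    for t in tensors:
        t.clamp_(min=value)
```
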
Ross Wightman | 7cf683628f | Cautious optimizer impl plus some typing cleanup. | 2024-11-28 15:11:58 -08:00
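
A minimal sketch of the "cautious" idea referenced here (per the Cautious Optimizers approach): mask out update components whose sign disagrees with the current gradient, then renormalize so the step magnitude is roughly preserved. Names and the exact rescaling are illustrative, not the timm implementation.

```python
import torch

def cautious_step_(param: torch.Tensor, update: torch.Tensor, grad: torch.Tensor, lr: float) -> None:
    # Keep only update components that point in the same direction as the gradient.
    mask = (update * grad > 0).to(update.dtype)
    # Rescale so the average step size is preserved when many components are masked.
    mask.div_(mask.mean().clamp_(min=1e-3))
    param.add_(update * mask, alpha=-lr)
```
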
Ross Wightman | 4f64ec4e14 | Add guard around 'somewhat' newer torch RAdam / NAdam imports | 2024-11-26 15:10:15 -08:00
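
An illustrative import guard of the kind described, assuming the goal is to fall back gracefully when torch.optim's RAdam / NAdam are unavailable on an older PyTorch; the flag name is made up for the example.

```python
try:
    from torch.optim import NAdam, RAdam
    HAS_TORCH_NADAM_RADAM = True
except ImportError:
    # Older torch: fall back to legacy implementations elsewhere in the codebase.
    NAdam = RAdam = None
    HAS_TORCH_NADAM_RADAM = False
```
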
Ross Wightman | 1ab02a11a1 | Update Adan with newer impl (from original source) that includes multi-tensor fn | 2024-11-26 15:10:15 -08:00
Ross Wightman | a024ab3170 | Replace radam & nadam impl with torch.optim ver, rename legacy adamw, nadam, radam impl in timm. Update optim factory & tests. | 2024-11-26 15:10:15 -08:00
Ross Wightman | 7b54eab807 | Add MARS and LaProp impl, simplified from originals | 2024-11-26 15:10:15 -08:00
Ross Wightman | e5aea357b1 | Update Adopt to include clipping for stability, separate wd so no param decay if update not taken on first step | 2024-11-26 15:10:15 -08:00
Johannes | 093a234d01 | Update torchvision resnet legacy weight urls in resnet.py | 2024-11-26 15:53:54 +01:00
Ross Wightman | 2fcf73e580 | Add mini imagenet info files | 2024-11-25 10:53:28 -08:00
Ross Wightman | 900d2b508d | add mnv4 conv_medium in12k -> in1k ft | 2024-11-22 16:31:45 -08:00
Ross Wightman | 6bcbdbfe41 | CS3-DarkNet Small (Focus) w/ RA4 recipe. Fix #2122 | 2024-11-22 16:31:45 -08:00
Ross Wightman | ae0737f5d0 | Typo | 2024-11-17 13:54:50 -08:00
Ross Wightman | 84049d7f1e | Missed input_size pretrained_cfg metadata for v2 34d @ 384 | 2024-11-17 12:44:08 -08:00
Ross Wightman | b7a4b49ae6 | Add some 384x384 small model weights, 3 variants of mnv4 conv medium on in12k pretrain, and resnetv2-34d on in1k | 2024-11-17 12:14:39 -08:00
Antoine Broyelle | 74196aceda | Add py.typed file as recommended by PEP 561 | 2024-11-14 11:26:00 -08:00
Ross Wightman | e35ea733ab | Fix compiler check for adopt so it doesn't fail on torch >= 2.0 releases that predate .is_compiling() | 2024-11-13 11:24:01 -08:00
Ross Wightman | 0b5264a108 | Missing optimizers in __init__.py, add bind_defaults=False for unit tests | 2024-11-13 10:50:46 -08:00
Ross Wightman | d0161f303a | Small optim factory tweak: default bind_defaults=True for get_optimizer_class | 2024-11-13 10:45:48 -08:00
Ross Wightman | 3bef09f831 | Tweak a few docstrings | 2024-11-13 10:12:31 -08:00
Ross Wightman | 8b9b6824ae | Minor changes, has_eps=False missing for bnb lion | 2024-11-12 20:49:01 -08:00
Ross Wightman | 61305cc26a | Fix adopt descriptions | 2024-11-12 20:49:01 -08:00
Ross Wightman | dde990785e | More fixes for new factory & tests, add back adahessian | 2024-11-12 20:49:01 -08:00
Ross Wightman | 45490ac52f | Post-merge fix for references to old param groups helper fn locations | 2024-11-12 20:49:01 -08:00
Ross Wightman | 53657a31b7 | Try to fix documentation build, add better docstrings to public optimizer api | 2024-11-12 20:49:01 -08:00
Ross Wightman | ee5f6e76bb | A bit of an optimizer overhaul: added an improved factory, list_optimizers, a class helper, and info classes with descriptions and arg configs | 2024-11-12 20:49:01 -08:00
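
A short usage sketch of the reworked factory described above. `list_optimizers`, `get_optimizer_class` (with the `bind_defaults` flag mentioned in the commits above), and `create_optimizer_v2` are the public entry points; the specific optimizer names and hyperparameters below are only illustrative.

```python
import timm
import timm.optim

# Enumerate optimizer names registered with the factory.
print(timm.optim.list_optimizers())

# Fetch a class; bind_defaults=True binds the registry's default kwargs onto it.
adopt_cls = timm.optim.get_optimizer_class('adopt', bind_defaults=True)

# Or build an optimizer directly from a model.
model = timm.create_model('resnet18')
optimizer = timm.optim.create_optimizer_v2(model, opt='adamw', lr=1e-3, weight_decay=0.05)
```
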
Ross Wightman | c1cf8c52b9 | Update adafactor comments / attrib | 2024-11-12 20:49:01 -08:00
Ross Wightman | 94e0560aba | Remove an indent level in init_group for adopt, update optim tests, adopt failing rosenbrock | 2024-11-12 20:49:01 -08:00
Ross Wightman | ff136b8d3a | Fix ADOPT on older PyTorch (tested back to 1.13) | 2024-11-12 20:49:01 -08:00
Ross Wightman | 79abc25f55 | Add ADOPT optimizer | 2024-11-12 20:49:01 -08:00
Ross Wightman | 36a45e5d94 | Improve row/col dim var name | 2024-11-12 20:49:01 -08:00
Ross Wightman | e7b0480381 | Cleanup original adafactor impl, add row/col dim heuristic that works with both conv and linear layers | 2024-11-12 20:49:01 -08:00
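
For context, a rough sketch of the factored second-moment bookkeeping that the row/col heuristic feeds into (a mean-based Adafactor variant). This is a simplified illustration for a plain 2D weight, not timm's implementation, and the heuristic that picks which dims to reduce over for conv layers is omitted.

```python
import torch

def factored_second_moment(grad: torch.Tensor, v_row: torch.Tensor, v_col: torch.Tensor,
                           beta2: float = 0.999, eps: float = 1e-30) -> torch.Tensor:
    # grad assumed 2D (out_dim, in_dim); v_row has shape (out_dim,), v_col has shape (in_dim,).
    sq = grad.square() + eps
    v_row.mul_(beta2).add_(sq.mean(dim=1), alpha=1 - beta2)
    v_col.mul_(beta2).add_(sq.mean(dim=0), alpha=1 - beta2)
    # Rank-1 reconstruction of the full second-moment estimate from row/col statistics.
    return torch.outer(v_row, v_col) / v_row.mean()
```
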
Ross Wightman | 1409ce2dbe | Change eps defaults in adafactor_bv again after some checking | 2024-11-12 20:49:01 -08:00
Ross Wightman | 9d8ccd2ba7 | A bit of lars/lamb cleanup, torch.where supports scalars properly now, make lamb grad clipping optional, clean it up a bit | 2024-11-12 20:49:01 -08:00
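
The torch.where note above refers to newer PyTorch accepting Python scalars for the value arguments. A hedged sketch of the kind of LARS/LAMB trust-ratio expression this simplifies; the variable names and values are illustrative.

```python
import torch

w_norm = torch.tensor(2.0)   # parameter norm (illustrative value)
g_norm = torch.tensor(0.5)   # update/gradient norm (illustrative value)

# With scalar support in torch.where, the fallback branches no longer need
# pre-built ones-like tensors.
trust_ratio = torch.where(w_norm > 0, torch.where(g_norm > 0, w_norm / g_norm, 1.0), 1.0)
```
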
Ross Wightman | 7cfaeced67 | Change adafactor_bv epsilon default | 2024-11-12 20:49:01 -08:00
Ross Wightman | 0b5ae49251 | Remove adafactorbv numpy dep, hack fix for loading optimizer state w/ half prec momentum (need better one) | 2024-11-12 20:49:01 -08:00
Ross Wightman | 19090ea966 | Need to init momentum with correct dtype | 2024-11-12 20:49:01 -08:00
Ross Wightman | 484a88f4b4 | Remove unused beta2 fn, make eps grad^2 handling same across factorized and non-factorized cases | 2024-11-12 20:49:01 -08:00
Ross Wightman | 7c16adca83 | An impl of adafactor as per big vision (scaling vit) changes | 2024-11-12 20:49:01 -08:00
Ross Wightman | 363b043c13 | Extend train epoch schedule by warmup_epochs if warmup_prefix enabled, allows schedule to reach end w/ prefix enabled | 2024-11-08 11:01:11 -08:00
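
A small usage sketch of the warmup_prefix behaviour: with the prefix enabled, the warmup epochs sit in front of the main schedule, so total training epochs need to be extended by warmup_t for the decay to reach its end. The scheduler and values below are illustrative.

```python
import torch
from timm.scheduler import CosineLRScheduler

params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.SGD(params, lr=0.1)

# warmup_prefix=True: 5 warmup epochs precede the 100-epoch cosine decay,
# so training should run for 105 epochs to reach lr_min.
scheduler = CosineLRScheduler(
    optimizer, t_initial=100, lr_min=1e-5,
    warmup_t=5, warmup_lr_init=1e-6, warmup_prefix=True,
)
```
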
Augustin Godinot | 7f0c1b1f30 | Add trust_remote_code argument to ReaderHfds | 2024-11-08 08:16:36 -08:00
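
Illustratively, the new argument would be threaded through from dataset creation down to the Hugging Face datasets reader. A sketch, assuming `create_dataset` forwards extra kwargs to `ReaderHfds`; the dataset name below is a placeholder.

```python
from timm.data import create_dataset

# trust_remote_code is intended for hub datasets that ship custom loading scripts.
dataset = create_dataset(
    'hfds/some-org/some-dataset',   # placeholder name
    split='train',
    trust_remote_code=True,
)
```
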
Wojtek Jasiński | eb94efb218 | Fix pos embed dynamic resampling for EVA | 2024-11-06 16:03:27 -08:00
Wojtek Jasiński | 3c7822c621 | Fix pos embed dynamic resampling for DeiT | 2024-11-06 16:03:27 -08:00
Wojtek Jasiński | 3ae3f44288 | Fix positional embedding resampling for non-square inputs in ViT | 2024-11-06 16:03:27 -08:00
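
For context, a hedged sketch of resampling an absolute position embedding to a non-square grid with timm's helper; the grid sizes and embedding dim are illustrative.

```python
import torch
from timm.layers import resample_abs_pos_embed

# 1 class token + a 14x14 patch grid, embed dim 384 (illustrative shapes).
posemb = torch.randn(1, 1 + 14 * 14, 384)

# Resample to a non-square 12x16 grid, keeping the prefix (class) token untouched.
posemb_new = resample_abs_pos_embed(posemb, new_size=(12, 16), old_size=(14, 14), num_prefix_tokens=1)
```
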
Ross Wightman | d4dde48dd5 | Missed first_conv from resnet18d | 2024-10-31 19:29:53 -07:00