pytorch-image-models/timm/utils
Latest commit: c50004db79 by Ross Wightman, "Allow training w/o validation split set", 2024-01-08
__init__.py: MobileOne and FastViT weights on HF hub, more code cleanup and tweaks, features_only working. Add reparam flag to validate and benchmark, support reparam of all models with fuse(), reparameterize() or switch_to_deploy() methods on modules (2023-08-23)
agc.py
checkpoint_saver.py
clip_grad.py
cuda.py: clip gradients with update (2023-04-19; see the loss scaler sketch below)
decay_batch.py: Add bulk_runner script and updates to benchmark.py and validate.py for better error handling in bulk runs (used for benchmark and validation result runs). Improved batch size decay stepping on retry... (2022-07-18; see the batch-size retry sketch below)
distributed.py: Refactor device handling in scripts, distributed init to be less 'cuda' centric. More device args passed through where needed. (2022-09-23; see the device init sketch below)
jit.py: disable nvfuser for jit te/legacy modes (for PT 1.12+) (2022-07-13)
log.py
metrics.py
misc.py: DaViT update formatting and fix grad checkpointing (#7) (2023-01-15)
model.py: MobileOne and FastViT weights on HF hub, more code cleanup and tweaks, features_only working. Add reparam flag to validate and benchmark, support reparam of all models with fuse(), reparameterize() or switch_to_deploy() methods on modules (2023-08-23; see the reparameterization sketch below)
model_ema.py (see the EMA sketch below)
onnx.py: Add onnx utils and export code, tweak padding and conv2d_same for better dynamic export with recent PyTorch (2023-04-11; see the ONNX export sketch below)
random.py
summary.py: Allow training w/o validation split set (2024-01-08; see the summary sketch below)
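
The cuda.py commit ties gradient clipping into the AMP loss-scaler step. A minimal sketch of one training step using timm.utils.NativeScaler, assuming a CUDA device and the clip_grad / clip_mode / parameters keyword names used by train.py (they may differ between versions):

```python
import torch
import torch.nn.functional as F
import timm
from timm.utils import NativeScaler  # AMP loss scaler wrapper from timm/utils/cuda.py

device = torch.device('cuda')
model = timm.create_model('resnet18', num_classes=10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
loss_scaler = NativeScaler()

x = torch.randn(8, 3, 224, 224, device=device)
y = torch.randint(0, 10, (8,), device=device)

with torch.autocast(device_type='cuda', dtype=torch.float16):
    loss = F.cross_entropy(model(x), y)

optimizer.zero_grad()
# The scaler scales the loss, runs backward, unscales, optionally clips gradients
# ('norm', 'value', or 'agc' modes, backed by clip_grad.py / agc.py), then steps
# the optimizer. Keyword names are assumptions based on how train.py calls it.
loss_scaler(loss, optimizer, clip_grad=1.0, clip_mode='norm', parameters=model.parameters())
```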
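
decay_batch.py backs the batch-size decay-on-retry behaviour used by validate.py and benchmark.py in bulk runs. A hedged sketch of that retry loop, assuming the helpers are named decay_batch_step and check_batch_size_retry (check your timm version for the exact names):

```python
import torch
from timm.utils.decay_batch import decay_batch_step, check_batch_size_retry


def run_with_batch_retry(run_fn, batch_size):
    """Call run_fn(batch_size), stepping the batch size down after OOM-style failures."""
    while batch_size:
        try:
            return run_fn(batch_size)
        except RuntimeError as e:
            error_str = str(e)
            # Only retry for errors that look like CUDA OOM / cudnn failures.
            if not check_batch_size_retry(error_str):
                raise
            if torch.cuda.is_available():
                torch.cuda.empty_cache()
            batch_size = decay_batch_step(batch_size)
            print(f'Error "{error_str}", retrying with batch size {batch_size}')
    raise RuntimeError('No working batch size found')
```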
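
distributed.py holds the less CUDA-centric device and distributed-init handling. A rough sketch of how the training scripts appear to use it, assuming init_distributed_device reads RANK / WORLD_SIZE / LOCAL_RANK from the environment (as set by torchrun) and fills in attributes such as args.distributed, args.world_size, and args.rank; the exact attributes required on args may differ by version:

```python
import argparse
from timm import utils

# Hypothetical minimal arg namespace; the real scripts pass their full argparse args.
parser = argparse.ArgumentParser()
parser.add_argument('--device', default='cuda', help='device to use: cuda, cpu, mps, ...')
parser.add_argument('--local_rank', default=0, type=int)
args = parser.parse_args([])

# Sets up torch.distributed when launched via torchrun, otherwise runs single-process,
# and returns the torch.device this process should use.
device = utils.init_distributed_device(args)
print(f'device={device} distributed={args.distributed} '
      f'rank={args.rank} world_size={args.world_size}')
```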
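
The MobileOne/FastViT commit on model.py adds generic reparameterization support: modules that expose fuse(), reparameterize(), or switch_to_deploy() get folded into their inference form, and validate.py / benchmark.py expose this via a reparam flag. A minimal sketch, assuming the helper is reparameterize_model in timm.utils.model and using mobileone_s0 as an example architecture:

```python
import torch
import timm
from timm.utils.model import reparameterize_model  # helper name assumed from the commit message

# A model family with structural reparameterization (extra train-time branches).
model = timm.create_model('mobileone_s0', pretrained=False)
model.eval()

# Fold branches into deploy form; expected to call fuse() / reparameterize() /
# switch_to_deploy() on any module that defines one of them.
model = reparameterize_model(model)

with torch.no_grad():
    out = model(torch.randn(1, 3, 224, 224))
print(out.shape)  # torch.Size([1, 1000])
```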
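
model_ema.py provides exponential moving averages of model weights; ModelEmaV2 keeps a decayed copy of the parameters that is typically what gets evaluated and checkpointed. A small usage sketch:

```python
import torch
import torch.nn.functional as F
import timm
from timm.utils import ModelEmaV2

model = timm.create_model('resnet18', num_classes=10)
model_ema = ModelEmaV2(model, decay=0.9998)  # the EMA copy lives in model_ema.module
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for _ in range(3):  # toy training steps
    x = torch.randn(4, 3, 224, 224)
    y = torch.randint(0, 10, (4,))
    loss = F.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    model_ema.update(model)  # fold the freshly updated weights into the EMA copy

model_ema.module.eval()  # evaluate / export using the EMA weights
```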
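
onnx.py was added alongside export code, and the same commit tweaks padding and conv2d_same for better dynamic export. The exact signature of the timm helper varies between versions, so this sketch instead shows plain torch.onnx.export of a timm model created with exportable=True, which is the underlying mechanism the helper wraps:

```python
import torch
import timm

# exportable=True selects export-friendly ops where a model offers alternatives.
model = timm.create_model('resnet18', pretrained=False, exportable=True)
model.eval()

example = torch.randn(1, 3, 224, 224)
# Plain torch.onnx.export shown here; timm.utils.onnx adds extra checks and options
# around this call, but its keyword names differ between versions.
torch.onnx.export(
    model, example, 'resnet18.onnx',
    input_names=['input'], output_names=['output'],
    dynamic_axes={'input': {0: 'batch'}, 'output': {0: 'batch'}},  # dynamic batch dim
    opset_version=13,
)
```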
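
The latest commit ("Allow training w/o validation split set") touches summary.py, which writes the per-epoch CSV via update_summary; with no validation split the eval metrics are presumably optional. A hedged sketch, assuming the rough argument order update_summary(epoch, train_metrics, eval_metrics, filename, ...) used by train.py; the exact signature may differ in your version:

```python
from timm.utils import update_summary, get_outdir

# get_outdir creates (and returns) an output directory; update_summary appends a row
# of metrics to summary.csv. Argument names and order here are assumptions.
out_dir = get_outdir('./output', 'train-demo')
train_metrics = {'loss': 1.234}
eval_metrics = None  # no validation split set

update_summary(
    0,                      # epoch
    train_metrics,
    eval_metrics,
    f'{out_dir}/summary.csv',
    write_header=True,
)
```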