Commit Graph

126 Commits (ee8893c8063f6937fec7096e47ba324c206e22b9)

Author SHA1 Message Date
Francisco Massa a8e90967a3
Add option to finetune on larger resolution (#43)
* Add option for finetuning a model

* Fixes

* Keep model in eval mode during finetuning

* Only skip head weights if size mismatch

* Remove finetune-epochs

Might not be needed

* Raise error if distillation + finetune are enabled
2021-01-15 10:13:52 +01:00
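The head-skipping step above ("Only skip head weights if size mismatch") can be sketched as follows. This is a minimal illustration under assumed names, not the repository's actual code; `filter_mismatched_head` and the toy `Sequential` models are hypothetical stand-ins:

```python
import torch

def filter_mismatched_head(checkpoint_state, model):
    """Drop checkpoint entries whose shapes do not match the model.

    When finetuning at a larger resolution, the classifier head (and
    possibly the position embeddings) changes shape; only those
    mismatched weights are skipped, everything else is kept.
    """
    model_state = model.state_dict()
    return {
        name: tensor
        for name, tensor in checkpoint_state.items()
        if name in model_state and model_state[name].shape == tensor.shape
    }

# Hypothetical usage: same backbone, but the head now has 10 classes
# instead of the checkpoint's 5, so only the backbone weights survive.
model = torch.nn.Sequential(torch.nn.Linear(8, 4), torch.nn.Linear(4, 10))
ckpt = torch.nn.Sequential(torch.nn.Linear(8, 4), torch.nn.Linear(4, 5))
state = filter_mismatched_head(ckpt.state_dict(), model)
model.load_state_dict(state, strict=False)
```

Loading with `strict=False` is what allows the filtered (head-less) state dict to be applied without raising on the missing keys.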
Francisco Massa d9932c08b5 Update .gitignore 2021-01-13 15:58:31 -08:00
Francisco Massa 726f061615 Fix lint 2021-01-13 05:45:38 -08:00
Francisco Massa 8eae3269da
Add Knowledge-Distillation (#42)
* Add knowledge distillation

* Bugfix

* Bugfix

* Make names more readable and use single torch.cat call

* Remove criterion.train() in engine

The teacher should stay in eval mode

* Change default argument for teacher-model

* Return the average of classifiers during inference

* Cleanup unused code

* Add docstring for DistillationLoss

* Remove warnings from newer PyTorch

Also switches to a more stable variant: instead of softmax followed by log, use log_softmax directly
2021-01-13 14:19:23 +01:00
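The numerical-stability point in the last commit message can be illustrated with a small sketch of a soft distillation term. This is an assumed, simplified form, not the repository's `DistillationLoss` itself; `soft_distillation_loss` and `tau` are hypothetical names:

```python
import torch
import torch.nn.functional as F

def soft_distillation_loss(student_logits, teacher_logits, tau=1.0):
    """KL distillation term computed from log_softmax directly.

    log_softmax avoids taking the log of probabilities that underflow
    to zero, which is what makes it more stable than softmax + log.
    """
    log_p_student = F.log_softmax(student_logits / tau, dim=-1)
    log_p_teacher = F.log_softmax(teacher_logits / tau, dim=-1)
    # log_target=True lets kl_div consume log-probabilities on both sides
    return F.kl_div(
        log_p_student, log_p_teacher,
        reduction="batchmean", log_target=True,
    ) * (tau ** 2)

student = torch.randn(2, 5)
teacher = torch.randn(2, 5)
loss = soft_distillation_loss(student, teacher)
```

By construction the KL term is non-negative and vanishes when student and teacher logits agree, which makes it easy to sanity-check.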
Francisco Massa 30eb3186da
Add --output_dir to README (#36)
This makes it clearer to users that they need to specify it when running without run_with_submitit, so that the results are saved
2021-01-11 14:23:46 +01:00
Changlin Li b06fcba6bf
Support parallelized evaluation (#24)
* support parallelized evaluation

* remove shuffle arg of the val loader, add a val sampler in the non-dist branch

* replace timm eval sampler with torch sampler

* add logger synchronizing to support parallelized evaluation

* add command line argument dist-eval and warning
2021-01-08 11:05:39 +01:00
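The sampler swap described above (timm's eval sampler replaced by a torch one, gated by a dist-eval flag) can be sketched roughly like this. The helper name and its arguments are hypothetical, not the repository's actual API:

```python
import torch

def build_val_sampler(dataset, distributed, world_size=1, rank=0):
    """Pick an evaluation sampler.

    In distributed mode each process evaluates a shard of the
    validation set (metrics are synchronized afterwards); otherwise a
    plain sequential sampler is used. DistributedSampler pads the
    dataset so every shard has equal length, which can slightly bias
    metrics when len(dataset) is not divisible by world_size -- the
    likely reason the commit above adds a warning for dist-eval.
    """
    if distributed:
        return torch.utils.data.DistributedSampler(
            dataset, num_replicas=world_size, rank=rank, shuffle=False)
    return torch.utils.data.SequentialSampler(dataset)

# Hypothetical usage on a toy "dataset" of 10 items, sharded two ways:
data = list(range(10))
shard0 = list(build_val_sampler(data, True, world_size=2, rank=0))
serial = list(build_val_sampler(data, False))
```

With `shuffle=False` the distributed sampler strides through the indices, so rank 0 of 2 sees every other sample in order.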
Matthijs Douze 38fcfbd863
Merge pull request #27 from fmassa/license-change
Change LICENSE to Apache 2.0
2021-01-08 11:02:40 +01:00
Francisco Massa 0c4b8f60bd Change LICENSE to Apache 2.0 2021-01-08 10:51:58 +01:00
Francisco Massa dcd888df20
Update default data-path (#25)
Points to the new valid location
2021-01-07 22:18:54 +01:00
Zhiyuan Chen 0282d2a175
Remove drop block args (#23)
timm's Vision Transformer does not support drop block
2021-01-07 17:28:41 +01:00
sanjaydatasciencedojo 4e91d2588f
Remove unused libraries (#9) 2020-12-27 21:43:54 +01:00
Hugo Touvron 93ec3bc31e
Update README.md 2020-12-26 15:39:09 +01:00
rv f277626fd3
Update README.md 2020-12-24 17:30:54 -04:00
rv a853fc091b
Update README.md 2020-12-24 17:29:59 -04:00
rv 1010d7ff0b
Update README.md 2020-12-24 14:03:56 -04:00
rv 510f492b42
Update README.md 2020-12-24 14:02:31 -04:00
rv 2aefd8fc86
Update README.md 2020-12-23 22:20:46 -04:00
rv 347345390b
Update README.md 2020-12-23 22:01:24 -04:00
Francisco Massa 0ce0ad3c6f Use main branch instead of master for torchhub 2020-12-23 14:12:50 -08:00
rv d4527cf9a1
Update README.md 2020-12-23 17:52:45 -04:00
rv bca0e06708
Update README.md 2020-12-23 17:43:32 -04:00
rv 67bf427de7
Update README.md 2020-12-23 17:43:15 -04:00
rv 6eacc0e792
Update README.md 2020-12-23 17:42:14 -04:00
rv 72f0506b8b
Update README.md 2020-12-23 17:41:57 -04:00
rv b110ec28fd
Update README.md 2020-12-23 17:41:21 -04:00
Francisco Massa 1d38fa4c37 Initial commit 2020-12-23 10:47:58 -08:00