
FastDistill in FastReID

This project provides a strong distillation method for both embedding and classification training. The feature distillation is adapted from overhaul-distillation (A Comprehensive Overhaul of Feature Distillation, ICCV 2019).
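
The logit-level distillation reported below as "JS Div" matches teacher and student class predictions with a Jensen-Shannon divergence. A minimal PyTorch sketch of such a loss, for illustration only (the temperature value and reduction are assumptions, not taken from this project's configs):

import torch.nn.functional as F

def js_div_loss(student_logits, teacher_logits, temperature=16.0):
    # Soften both class distributions with a temperature (value assumed here).
    p_s = F.softmax(student_logits / temperature, dim=1)
    p_t = F.softmax(teacher_logits.detach() / temperature, dim=1)  # teacher is frozen
    m = 0.5 * (p_s + p_t)
    # F.kl_div(log_q, p) computes KL(p || q), so the two terms below give
    # JS(p_s, p_t) = 0.5 * KL(p_s || m) + 0.5 * KL(p_t || m)
    return 0.5 * (F.kl_div(m.log(), p_s, reduction="batchmean")
                  + F.kl_div(m.log(), p_t, reduction="batchmean"))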

Datasets Preparation

  • DukeMTMC-reID
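
fastreid looks for datasets under the directory given by the FASTREID_DATASETS environment variable (defaulting to datasets/). Assuming the standard DukeMTMC-reID layout, the expected structure is roughly:

datasets/
  DukeMTMC-reID/
    bounding_box_train/
    bounding_box_test/
    query/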

Training and Evaluation

# teacher model training
python3 projects/FastDistill/train_net.py \
--config-file projects/FastDistill/configs/sbs_r101ibn.yml \
--num-gpus 4

# loss distillation
python3 projects/FastDistill/train_net.py \
--config-file projects/FastDistill/configs/kd-sbs_r101ibn-sbs_r34.yaml \
--num-gpus 4 \
MODEL.META_ARCHITECTURE Distiller \
KD.MODEL_CONFIG '("projects/FastDistill/logs/dukemtmc/r101_ibn/config.yaml",)' \
KD.MODEL_WEIGHTS '("projects/FastDistill/logs/dukemtmc/r101_ibn/model_best.pth",)'

# loss+overhaul distillation
python3 projects/FastDistill/train_net.py \
--config-file projects/FastDistill/configs/kd-sbs_r101ibn-sbs_r34.yaml \
--num-gpus 4 \
MODEL.META_ARCHITECTURE DistillerOverhaul \
KD.MODEL_CONFIG '("projects/FastDistill/logs/dukemtmc/r101_ibn/config.yaml",)' \
KD.MODEL_WEIGHTS '("projects/FastDistill/logs/dukemtmc/r101_ibn/model_best.pth",)'
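
The DistillerOverhaul architecture adds feature-level distillation in the style of overhaul-distillation: pre-ReLU teacher features are passed through a margin ReLU and compared to the channel-matched student features with a partial L2 distance. A simplified illustrative sketch, assuming the per-channel margin has already been estimated from the teacher and the student feature has passed through a channel-matching regressor (this is not the project's exact implementation):

import torch

def overhaul_feat_loss(student_feat, teacher_feat_pre_relu, margin):
    # Margin ReLU on the pre-activation teacher feature: positive responses are
    # kept, negative ones are clamped to a per-channel (negative) margin that, in
    # the original method, is estimated from the teacher's BatchNorm statistics.
    t = torch.max(teacher_feat_pre_relu, margin).detach()
    # Partial L2 distance: positions where the student is already below a
    # non-positive target carry no useful signal and are skipped.
    skip = (student_feat <= t) & (t <= 0)
    loss = ((student_feat - t) ** 2).masked_fill(skip, 0.0)
    # Normalizing by batch size is a simplification for this sketch.
    return loss.sum() / student_feat.size(0)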

Experimental Results

Settings

All the experiments are conducted with 4 V100 GPUs.

DukeMTMC-reID

Model                 Rank@1 (%)   mAP (%)
R101_ibn (teacher)    90.66        81.14
R34 (student)         86.31        73.28
JS Div                88.60        77.80
JS Div + Overhaul     88.73        78.25

Contact

This project is conducted by Xingyu Liao and Guan'an Wang (guan.wang0706@gmail.com).