# Stronger Baseline in FastReID

## Training

To train a model, run

    CUDA_VISIBLE_DEVICES=gpus python train_net.py --config-file <config.yaml>

For example, to launch an end-to-end baseline training on the Market1501 dataset with ibn-net on 4 GPUs, execute:

    CUDA_VISIBLE_DEVICES=0,1,2,3 python train_net.py --config-file='configs/sbs_market1501.yml'

## Experimental Results

The stronger baseline uses the following tricks:

  1. Non-local block
  2. GeM pooling (see the sketch after this list)
  3. Circle loss
  4. Freeze backbone training
  5. Cutout data augmentation & Auto Augmentation
  6. Cosine annealing learning rate decay
  7. Soft margin triplet loss (see the sketch after this list)
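
Two of these tricks are compact enough to illustrate directly. Below is a minimal PyTorch sketch of GeM pooling and the soft margin triplet loss; it is an illustration only, not the actual FastReID implementation, and the class and function names here are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GeMPooling(nn.Module):
    """Generalized-mean pooling: ((1/HW) * sum_i x_i^p)^(1/p) over the spatial dims.

    p = 1 recovers global average pooling; large p approaches max pooling.
    """

    def __init__(self, p: float = 3.0, eps: float = 1e-6):
        super().__init__()
        self.p = nn.Parameter(torch.ones(1) * p)  # exponent learned jointly with the backbone
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, H, W) feature map from the backbone
        x = x.clamp(min=self.eps).pow(self.p)
        x = F.adaptive_avg_pool2d(x, output_size=1)  # mean over the H x W grid
        return x.pow(1.0 / self.p)                   # (batch, channels, 1, 1)


def soft_margin_triplet_loss(dist_ap: torch.Tensor, dist_an: torch.Tensor) -> torch.Tensor:
    """Soft margin triplet loss: softplus(d_ap - d_an) = log(1 + exp(d_ap - d_an)),
    which replaces the hard margin of the standard triplet loss."""
    return F.softplus(dist_ap - dist_an).mean()
```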

### Market1501 dataset

| Method | Pretrained | Rank@1 | mAP | mINP |
| --- | --- | --- | --- | --- |
| stronger baseline (ResNet50-ibn) | ImageNet | 95.5 | 88.4 | 65.8 |
| Robust-ReID | ImageNet | 96.2 | 89.7 | - |

### DukeMTMC dataset

| Method | Pretrained | Rank@1 | mAP | mINP |
| --- | --- | --- | --- | --- |
| stronger baseline (ResNet50-ibn) | ImageNet | 91.3 | 81.6 | 47.6 |
| Robust-ReID | ImageNet | 89.8 | 80.3 | - |

### MSMT17 dataset

| Method | Pretrained | Rank@1 | mAP | mINP |
| --- | --- | --- | --- | --- |
| stronger baseline (ResNet50-ibn) | ImageNet | 84.2 | 61.5 | 15.7 |
| ABD-Net | ImageNet | 82.3 | 60.8 | - |