ReID_baseline

A baseline model (with a bottleneck layer) for person ReID, trained with softmax and triplet losses.
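
Both losses are standard PyTorch modules; the sketch below shows how they are commonly combined for ReID. The margin, the plain sum, and the function names are illustrative assumptions, not this repo's exact training code.

import torch.nn as nn

# Illustrative only: combine a softmax (ID classification) loss with a
# triplet loss on the embeddings. Margin and weighting are assumptions.
id_loss = nn.CrossEntropyLoss()                 # softmax over person identities
metric_loss = nn.TripletMarginLoss(margin=0.3)  # pulls positives closer than negatives

def total_loss(logits, labels, anchor, positive, negative):
    # logits: classifier scores for the anchor images
    # anchor/positive/negative: embeddings produced by the backbone
    return id_loss(logits, labels) + metric_loss(anchor, positive, negative)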

We support

  • easy dataset preparation
  • end-to-end training and evaluation
  • highly modular code management

Get Started

The project structure follows the PyTorch-Project-Template guide; you can check each folder's purpose yourself.

  1. cd to the folder where you want to clone this repo

  2. Run git clone https://github.com/L1aoXingyu/reid_baseline.git

  3. Install dependencies:

  4. Prepare dataset

    Create a directory to store ReID datasets under this repo:

    cd reid_baseline
    mkdir data
    
    1. Download the dataset to data/ from http://www.liangzheng.org/Project/project_reid.html
    2. Extract the dataset and rename the folder to market1501. The directory structure should look like this (a small layout check is sketched at the end of this section):
    data/
        market1501/
            bounding_box_test/
            bounding_box_train/
    
  5. Prepare the pretrained model if you don't have it already

    from torchvision import models
    models.resnet50(pretrained=True)
    

    This will automatically download the model to ~/.torch/models/. Set this path in config/defaults.py so it applies to all training runs, or set it in each individual training config file under configs/.
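
As a quick sanity check for step 4, the sketch below verifies that the dataset was extracted into the layout shown above before training. This helper is not part of the repo; the paths mirror the tree in step 4.

from pathlib import Path

# Hypothetical helper, not part of this repo: verify that the dataset from
# step 4 was extracted into the expected layout before starting training.
data_root = Path("data/market1501")
for sub in ("bounding_box_train", "bounding_box_test"):
    if not (data_root / sub).is_dir():
        raise FileNotFoundError(f"missing {data_root / sub}; re-check the dataset preparation step")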

Train

For most of the configuration files we provide, you can run the following command to train:

python3 tools/train.py --config_file='configs/market1501_softmax_bs64.yml'

You can also override cfg parameters on the command line, as follows:

python3 tools/train.py --config_file='configs/market1501_softmax_bs64.yml' INPUT.SIZE_TRAIN '(256, 128)' INPUT.SIZE_TEST '(256, 128)'
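
Configs in this template style are typically yacs CfgNode objects. Below is a minimal sketch of how the command line above (a YAML file plus trailing KEY VALUE overrides) is usually parsed, assuming yacs is the config library; the defaults shown are placeholders for what lives in config/defaults.py, and the actual code in tools/train.py may differ.

import argparse
from yacs.config import CfgNode as CN

# Assumed minimal defaults; the real ones live in config/defaults.py.
cfg = CN()
cfg.INPUT = CN()
cfg.INPUT.SIZE_TRAIN = (384, 128)
cfg.INPUT.SIZE_TEST = (384, 128)

parser = argparse.ArgumentParser()
parser.add_argument("--config_file", default="", type=str)
parser.add_argument("opts", nargs=argparse.REMAINDER)  # trailing KEY VALUE pairs
args = parser.parse_args()

if args.config_file:
    cfg.merge_from_file(args.config_file)  # YAML file overrides the defaults
cfg.merge_from_list(args.opts)             # command-line pairs override both
cfg.freeze()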

Results

[Figure: network architecture]

Numbers are rank-1 accuracy, with mAP in parentheses.

cfg                                                                 | market1501  | cuhk03      | dukemtmc
--------------------------------------------------------------------|-------------|-------------|------------
softmax, size=(384, 128), batch_size=64                             | 92.5 (79.4) | 60.4 (56.1) | 84.6 (68.1)
softmax, size=(256, 128), batch_size=64                             | 92.0 (80.4) | 60.5 (55.5) | 84.1 (68.4)
softmax_triplet, size=(384, 128), batch_size=128 (32 ids x 4 imgs)  | 93.2 (82.5) | -           | 86.4 (73.1)
softmax_triplet, size=(256, 128), batch_size=128 (32 ids x 4 imgs)  | 93.8 (83.2) | 65.9 (61.4) | -