
FastReID

FastReID is a research platform that implements state-of-the-art re-identification algorithms.

Quick Start

The architecture follows the PyTorch-Project-Template guide, so you can check each folder's purpose on your own.

  1. cd to the folder where you want to clone this repo

  2. Run git clone https://github.com/L1aoXingyu/fast-reid.git

  3. Install dependencies:
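    As a minimal sketch, assuming a conda environment and the usual packages for a PyTorch re-id project (the package list below is an assumption, not an official requirements list):

        # Assumed typical dependencies for a PyTorch re-id toolbox;
        # check the repository for the authoritative requirements list.
        conda create -n fastreid python=3.7 -y
        conda activate fastreid
        pip install torch torchvision yacs Cython tensorboard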

  4. Prepare the dataset. Create a directory to store re-id datasets under projects, for example:

    cd fast-reid/projects/StrongBaseline
    mkdir datasets
    
    1. Download the dataset to datasets/ from Baidu Pan or Google Drive
    2. Extract the dataset. The directory structure should look like:
    datasets
        Market-1501-v15.09.15
            bounding_box_test/
            bounding_box_train/
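
    As a sketch of the extraction step (the archive filename is an assumption and depends on the download source):

        # Hypothetical example: extract a downloaded Market-1501 archive
        # into the datasets/ directory created above.
        cd datasets
        unzip Market-1501-v15.09.15.zip
        cd ..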
    
  5. Prepare the pretrained model. If you use the original ResNet, you do not need to do anything. But if you want to use ResNet_ibn, you need to download the pretrained model here, and then you can put it in ~/.cache/torch/checkpoints or anywhere you like (see the sketch below).

    Then set the pretrained model path in configs/baseline_market1501.yml.
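
    As a sketch of the placement step (the checkpoint filename is an assumption; use whatever name the downloaded file has and make sure the config points to it):

        # Hypothetical example: place the downloaded ResNet50-IBN-a weights in
        # torch's checkpoint cache; any other location works as long as the
        # pretrained-model path in configs/baseline_market1501.yml matches it.
        mkdir -p ~/.cache/torch/checkpoints
        mv ~/Downloads/resnet50_ibn_a.pth ~/.cache/torch/checkpoints/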

  6. Compile with Cython to accelerate evaluation:

    cd fastreid/evaluation/rank_cylib; make all
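
With the steps above done, training a baseline can be launched from the project directory; the entry-point script name below is an assumption, so check projects/StrongBaseline for the actual script:

    # Hypothetical example: train the Market-1501 baseline.
    # train_net.py is an assumed script name; adjust to the real entry point.
    cd fast-reid/projects/StrongBaseline
    python3 train_net.py --config-file configs/baseline_market1501.yml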
    

Model Zoo and Baselines

Market1501 dataset

| Method | Pretrained | Rank@1 | mAP | mINP |
| --- | --- | --- | --- | --- |
| BagTricks | ImageNet | 93.6% | 85.1% | 58.1% |
| BagTricks + Ibn-a | ImageNet | 94.8% | 87.3% | 63.5% |
| AGW | ImageNet | 94.9% | 87.4% | 63.1% |

DukeMTMC dataset

| Method | Pretrained | Rank@1 | mAP | mINP |
| --- | --- | --- | --- | --- |
| BagTricks | ImageNet | 86.1% | 75.9% | 38.7% |
| BagTricks + Ibn-a | ImageNet | 89.0% | 78.8% | 43.6% |
| AGW | ImageNet | 88.9% | 79.1% | 43.2% |

MSMT17 dataset

| Method | Pretrained | Rank@1 | mAP | mINP |
| --- | --- | --- | --- | --- |
| BagTricks | ImageNet | 70.4% | 47.5% | 9.6% |
| BagTricks + Ibn-a | ImageNet | 76.9% | 55.0% | 13.5% |
| AGW | ImageNet | 75.6% | 52.6% | 11.9% |