73 Commits

Author SHA1 Message Date
liaoxingyu
5982f90920 support loading various pretrained weights
Summary: Support loading a pretrained model from a custom path. With this function, we can load InfoMin weights.
2020-05-26 14:33:18 +08:00
liaoxingyu
5d4758125d support ResNet34 backbone
Summary: add BasicBlock to support ResNet34
2020-05-26 13:18:09 +08:00
liaoxingyu
84c733fa85 fix: remove prefetcher, put normalizer in model
1. remove the messy data prefetcher, which caused confusion
2. put the normalizer in the model to accelerate training via GPU computation (sketched below)
2020-05-25 23:39:11 +08:00
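A minimal sketch of the idea from the commit above: pixel normalization moves inside the model so it runs on the GPU with the rest of the forward pass. The class name and the ImageNet statistics here are illustrative, not necessarily the repo's exact values.

    import torch
    import torch.nn as nn

    class ModelWithNormalizer(nn.Module):
        def __init__(self, backbone):
            super().__init__()
            self.backbone = backbone
            # register as buffers so they move to the GPU together with the model
            self.register_buffer("pixel_mean", torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1))
            self.register_buffer("pixel_std", torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1))

        def forward(self, images):
            # normalization now happens on the same device as the model
            images = (images - self.pixel_mean) / self.pixel_std
            return self.backbone(images)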
liaoxingyu
94c86579a3 fix(heads): fix bug in reduce head
add neck_feat option from config, and set inplace in LeakyReLU to save memory
2020-05-23 10:41:13 +08:00
liaoxingyu
c21de64166 fix: add linear layer initialization method 2020-05-21 23:59:51 +08:00
liaoxingyu
e990cf3e34 style: fix some typos 2020-05-21 15:55:51 +08:00
liaoxingyu
2ac55a7601 feat: update ROC curve and TPR@FPR metric
support plotting multiple ROC curves from different models
2020-05-20 14:29:33 +08:00
liaoxingyu
e344eae1cc feat: support plotting ROC curve and computing AUC score
The ROC curve and AUC score help with choosing thresholds (see the sketch below).
2020-05-19 20:45:26 +08:00
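A minimal sketch of computing an ROC curve and AUC score from verification scores, assuming scikit-learn is available; the labels and scores below are made up for illustration.

    import numpy as np
    from sklearn.metrics import auc, roc_curve

    labels = np.array([1, 0, 1, 1, 0])              # 1: same identity, 0: different
    scores = np.array([0.9, 0.2, 0.75, 0.6, 0.4])   # similarity scores
    fpr, tpr, thresholds = roc_curve(labels, scores)
    print("AUC:", auc(fpr, tpr))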
liaoxingyu
fd90555e19 feat: add multi-dataset joint training
New feature supporting joint training. Also found some bugs in the combine_all function of datasets/bases.py, which assumes person ids in each dataset have been relabeled from 0 to num_classes.
Another bug appears in msmt17, where trainset and testset person ids both start from 0; the testset ids should be offset by the trainset's num_classes (see the sketch below).
2020-05-18 20:06:04 +08:00
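A hypothetical sketch of the msmt17 fix described above: shift the testset person ids past the trainset id range so the two id spaces do not collide. The function name and the (img_path, pid, camid) sample layout are only illustrative.

    def offset_test_pids(test_samples, num_train_pids):
        # shift every testset pid past the trainset id range
        return [(img, pid + num_train_pids, camid) for img, pid, camid in test_samples]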
liaoxingyu
579a5cf552 fix: fix re-rank typo 2020-05-18 17:05:20 +08:00
liaoxingyu
d63bf5facc fix: add syncBN options in DefaultTrainer 2020-05-16 22:44:53 +08:00
liaoxingyu
b28c0032e8 fix: add monkey-patching to enable syncBN
add a trigger to make syncBN work
2020-05-15 13:33:33 +08:00
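A hypothetical sketch of the monkey-patching idea in the commit above: swap the BatchNorm2d constructor for SyncBatchNorm before the model is built, so every BN layer is created as SyncBN. This is not necessarily the repo's exact trigger; PyTorch also offers nn.SyncBatchNorm.convert_sync_batchnorm(model) as a post-hoc alternative.

    import torch.nn as nn

    def patch_syncbn():
        # after this call, any newly constructed nn.BatchNorm2d is a SyncBatchNorm
        nn.BatchNorm2d = nn.SyncBatchNorm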
liaoxingyu
18a33f7962 feat: add MGN model
support MGN architecture and training config
2020-05-15 11:39:54 +08:00
liaoxingyu
bf18479541 fix: fix syncBN bug 2020-05-14 14:52:37 +08:00
liaoxingyu
0872a32621 feat: add syncBN support 2020-05-14 13:15:09 +08:00
liaoxingyu
0356ef8c5c feat: add SyncBN and GroupNorm support 2020-05-14 11:36:28 +08:00
liaoxingyu
5ae3d4fecf feat: add aqe support in test phase
Query expansion combines the retrieved top-k nearest neighbors with the original query feature;
it improves mAP by a large margin (see the sketch below).
2020-05-13 16:27:22 +08:00
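A minimal sketch of average query expansion as described above, assuming row-wise L2-normalized features; the function name and the value of k are illustrative.

    import torch
    import torch.nn.functional as F

    def aqe(query_feats, gallery_feats, k=5):
        sim = query_feats @ gallery_feats.t()            # cosine similarity, shape (Q, G)
        topk = sim.topk(k, dim=1).indices                # k nearest gallery items per query
        expanded = (query_feats + gallery_feats[topk].sum(dim=1)) / (k + 1)
        return F.normalize(expanded, dim=1)              # new, expanded query features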
liaoxingyu
320010f2ae feat: support re-rank in test phase 2020-05-13 11:47:52 +08:00
Xingyu Liao
01d940bfdd
update vehicle dataset
Summary: Pull Request resolved: #49 

Reviewed By: xingyu liao
2020-05-12 21:44:05 +08:00
liaoxingyu
9addfb0ae2 feat: support visualizing label list
add features to support label list visualization, which can be used
for label correction or for checking the hardest samples
2020-05-12 21:35:33 +08:00
Jinkai Zheng
640c9bfc97
Add small, medium and large vehicle test datasets 2020-05-11 01:34:19 -05:00
Jinkai Zheng
e059b751a6
Add modified vehicle datasets with small, medium and large test datasets 2020-05-11 01:30:56 -05:00
liaoxingyu
9b6fda3830 style: remove title in visualization 2020-05-11 14:12:29 +08:00
liaoxingyu
13bb03eb07 feat: add rank result visualization tools
Update visualization tools to save rank lists with AP metrics, sorted from high to low or vice versa.
To compute AP quickly in the visualizer, modify rank_cylib to return all_AP instead of mAP,
so the results can be computed with Cython.
2020-05-10 23:17:10 +08:00
liaoxingyu
651e6ba9c4 feat: support multiprocess predictor
add AsyncPredictor to support multiprocess feature extraction with a dataloader
2020-05-09 18:23:36 +08:00
liaoxingyu
4be4cacb73 fix: add a simple way to reset the data prefetcher when resuming training
Use the data prefetcher's built-in reset function to reload it rather than
defining a new data prefetcher, which would otherwise introduce problems
in eval-only mode.
2020-05-09 11:58:27 +08:00
liaoxingyu
9fae467adf feat(engine/defaults): add DefaultPredictor to get image reid features
Add a new predictor interface, and modify demo code to predict image features.
2020-05-08 19:24:27 +08:00
liaoxingyu
8ab0bc2455 style(backbone): make parameter-loading logging more elegant 2020-05-08 12:22:06 +08:00
liaoxingyu
0b15ac4e03 feat(hooks&optim): update stochastic weight averging hooks
Update swa method which will do after regular training if you
set this option enabled.
2020-05-08 12:20:04 +08:00
liaoxingyu
afac8aad5d fix(engine): fix preciseBN dataloader bugs
preciseBN needs a data prefetcher, but a DataLoader is currently passed
2020-05-06 14:26:34 +08:00
liaoxingyu
948af64fd1 feat: add swa algorithm
Add SWA and related config options;
if enabled, the model will run SWA after regular training
2020-05-06 10:17:44 +08:00
liaoxingyu
6d96529d4c fix(data): fix resume training bug
Fix the dataset pid dictionary loading bug when resuming training:
the data prefetcher pre-loads a batch of data, which leads to
misalignment between the old pid dict and the updated pid dict.
We can address this problem by redefining a prefetcher in resume_or_load
2020-05-05 23:20:42 +08:00
liaoxingyu
a2dcd7b4ab feat(layers/norm): add ghost batchnorm
add a get_norm function to easily switch normalization between batchnorm, ghost BN and group norm (sketched below)
2020-05-01 09:02:46 +08:00
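A minimal sketch of a get_norm-style dispatcher; the supported keys and the GroupNorm group count are illustrative rather than the repo's exact API, and the custom ghost BN layer is omitted here.

    import torch.nn as nn

    def get_norm(norm: str, out_channels: int) -> nn.Module:
        # pick a normalization layer by name from the config
        if norm == "BN":
            return nn.BatchNorm2d(out_channels)
        if norm == "GN":
            return nn.GroupNorm(32, out_channels)   # assumes out_channels divisible by 32
        if norm == "syncBN":
            return nn.SyncBatchNorm(out_channels)
        raise KeyError(f"Unknown norm type: {norm}")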
liaoxingyu
329764bb60 refactor(heads): move num_classes out from heads
set the num_classes parameter in meta_arch to easily modify the fc layer of different heads
2020-04-29 21:29:48 +08:00
liaoxingyu
d27729c5bb refactor(preciseBN): add display of preciseBN datasets 2020-04-29 21:05:53 +08:00
liaoxingyu
ec19bcc1d3 style(configs): put all config files together
put all config files into one place for easier control,
and add a tools directory for train_net.py, which is almost the same
across different projects
2020-04-29 16:18:54 +08:00
liaoxingyu
e38a799b63 fix(engine/defaults): fix precise bn bug
fix a problem in precise BN, which did not use the precise BN datasets and threw errors
2020-04-29 16:16:54 +08:00
zjk15068083791
2f3f6e3267
Add files via upload 2020-04-27 16:27:44 +08:00
liaoxingyu
5daf322ac6 fix(data/samplers): fix bug in triplet sampler
Drop indices are used to avoid two groups with the same id,
but the case drop indices = 0 was not considered, which leads to
identities[:0] and a non-terminating loop (see the sketch below).
2020-04-27 15:25:29 +08:00
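A hypothetical sketch of the guard: slicing with identities[:-0] yields an empty list and the sampler then loops forever, so the drop is only applied when the count is non-zero. The function and variable names are illustrative, not the repo's exact sampler code.

    def drop_remainder(identities, num_instances):
        # drop leftover identities that do not fill a full group
        drop_indices = len(identities) % num_instances
        if drop_indices:                  # guard: identities[:-0] would be empty
            identities = identities[:-drop_indices]
        return identities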
liaoxingyu
325d9abb76 feat($solver): change scheduler call methods
use the name of the lr scheduler in the config to call it
2020-04-27 15:12:01 +08:00
liaoxingyu
9e3f2c1e7a fix($data/transforms): change augmix augmentation pool
change augmentation_reid to augmentation_all in augmix,
because we found AutoAugment with the ImageNet policy does not harm performance
2020-04-27 15:06:27 +08:00
liaoxingyu
4d3e5fd378 refactor(evaluation): add feature l2 norm in evaluation
Move the L2 normalization from the model's inference function to reid evaluation,
because sometimes we need the original features generated by the model rather than the normalized ones (see the sketch below).
2020-04-27 14:51:39 +08:00
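A minimal sketch of normalizing features at evaluation time instead of inside the model, so the raw features stay available elsewhere; the tensor shapes are illustrative.

    import torch
    import torch.nn.functional as F

    feats = torch.randn(8, 2048)                    # raw features from the model
    feats_normed = F.normalize(feats, p=2, dim=1)   # unit L2 norm per row, used only for evaluation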
liaoxingyu
9910bb9158 fix($modeling/heads): fix targets missing bug
fix a bug where heads returned outputs without targets.
2020-04-27 14:49:58 +08:00
liaoxingyu
2efbc6d371 fix($modeling/heads/bnneck_head): fix heads outputs bug
fix a bug in the heads' outputs which led to no targets being returned.
2020-04-27 11:48:21 +08:00
liaoxingyu
a6bd0371e2 feat($data): add autoaugment
add AutoAugment support with the ImageNet policy and CIFAR10 policy.
modify transforms and config to adapt to this augmentation.
2020-04-27 11:41:12 +08:00
liaoxingyu
8abd3bab03 feat($layers): add new act func
add support for Mish and GELU (Mish is sketched below)
2020-04-24 12:17:00 +08:00
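For reference, a minimal sketch of the Mish activation, defined as x * tanh(softplus(x)); GELU is available directly as nn.GELU in PyTorch.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Mish(nn.Module):
        def forward(self, x):
            # Mish(x) = x * tanh(softplus(x))
            return x * torch.tanh(F.softplus(x))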
liaoxingyu
3984f0c91d refactor($modeling/meta): refactor heads output
remove intermediate variables generated by reid heads to make it more flexible
2020-04-24 12:16:18 +08:00
liaoxingyu
e3ae03cc58 feat($modeling/backbones): add new backbones
add support for OSNet, ResNeXt and ResNeSt backbones
2020-04-24 12:14:56 +08:00
liaoxingyu
b098b194ba refactor($modeling/meta_arch): remove bdb_network 2020-04-21 11:44:29 +08:00
liaoxingyu
6c9af664dc refactor($modeling/meta_arch): remove unused parts
remove unused meta_archs and backbones
2020-04-21 11:42:14 +08:00