Commit Graph

78 Commits (5dfe5375153e9d61aa3025a736870359d96a9516)

Author SHA1 Message Date
liaoxingyu 5dfe537515 update attribute project 2020-09-23 19:45:13 +08:00
liaoxingyu 2f29228086 fix regnet cfg problem 2020-09-23 14:41:44 +08:00
liaoxingyu df823da09a fix typo in bce loss
close #273
2020-09-18 10:39:10 +08:00
liaoxingyu d9a63a959f fix bug in mgn #272
Summary: fix get_norm bug in mgn
2020-09-17 18:17:53 +08:00
liaoxingyu 648198e6e5 add efficientnet support 2020-09-10 11:04:52 +08:00
liaoxingyu 1b84348619 remove `num_splits` in batchnorm
Summary: `num_splits` works for GhostBN, but it's very uncommon
2020-09-10 11:01:07 +08:00
liaoxingyu 4d573b8107 refactor reid head
Summary: merge BNneckHead, LinearHead and ReductionHead into EmbeddingHead
because they are highly similar, and this also paves the way for ClsHead
2020-09-10 10:57:37 +08:00
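A minimal sketch of what a unified embedding head along these lines could look like; the layer choices, argument names such as `embedding_dim`, and the two-output training path are illustrative assumptions, not the project's exact implementation:

```python
import torch
import torch.nn as nn

class EmbeddingHead(nn.Module):
    """Illustrative unified head: global pooling -> optional reduction -> BNNeck -> classifier."""

    def __init__(self, in_feat=2048, embedding_dim=0, num_classes=751):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Optional reduction layer (what a ReductionHead would have provided).
        self.reduction = nn.Conv2d(in_feat, embedding_dim, 1) if embedding_dim > 0 else nn.Identity()
        feat_dim = embedding_dim if embedding_dim > 0 else in_feat
        # BNNeck: batchnorm without a learnable bias in front of the classifier.
        self.bnneck = nn.BatchNorm2d(feat_dim)
        self.bnneck.bias.requires_grad_(False)
        self.classifier = nn.Linear(feat_dim, num_classes, bias=False)

    def forward(self, features):
        pooled = self.reduction(self.pool(features))   # (N, C, 1, 1)
        neck_feat = self.bnneck(pooled).flatten(1)     # (N, C)
        if not self.training:
            return neck_feat                           # inference: return the embedding
        logits = self.classifier(neck_feat)
        return logits, pooled.flatten(1)
```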
liaoxingyu aa5c422606 fix pair-wise circle loss
fix #252
2020-09-09 15:28:52 +08:00
liaoxingyu 53fed7451d feat: support amp training
Summary: support automatic mixed precision training #217
2020-09-02 18:03:12 +08:00
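For reference, AMP training in PyTorch 1.6+ is normally wired up with `torch.cuda.amp`; a minimal training-step sketch (the `model`, `criterion`, and `optimizer` objects are placeholders):

```python
import torch
from torch.cuda.amp import autocast, GradScaler

scaler = GradScaler()  # scales the loss to avoid fp16 gradient underflow

def train_step(model, criterion, optimizer, images, targets):
    optimizer.zero_grad()
    with autocast():                      # run the forward pass in mixed precision
        outputs = model(images)
        loss = criterion(outputs, targets)
    scaler.scale(loss).backward()         # backward on the scaled loss
    scaler.step(optimizer)                # unscales gradients, then optimizer.step()
    scaler.update()                       # adjust the scale factor for the next step
    return loss.detach()
```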
liaoxingyu d00ce8fc3c refactor model arch 2020-09-01 16:14:45 +08:00
liaoxingyu f4305b0964 fix bnneck 2020-08-20 16:21:14 +08:00
liaoxingyu ac8409a7da update for PyTorch 1.6 2020-08-20 15:51:41 +08:00
liaoxingyu 9c667d1a0f add pretrain_path support in backbone 2020-08-14 14:00:26 +08:00
liaoxingyu db6b42da4f update resnest url 2020-08-10 14:18:00 +08:00
liaoxingyu d1c20cbe50 fix pre-trained model bugs
fix lock bugs when downloading the pre-trained model
2020-08-04 15:56:36 +08:00
liaoxingyu 35794621cc remove addmm warning 2020-07-31 16:32:10 +08:00
liaoxingyu 2430b8ed75 pre-trained model bugfix
Fix pre-trained model download bugs and testing bugs in multi-process mode
2020-07-31 10:42:38 +08:00
liaoxingyu 65169b40bd support ddp testing 2020-07-30 20:15:28 +08:00
liaoxingyu 16655448c2 onnx/trt support
Summary: change model pretrain mode and support onnx/TensorRT export
2020-07-29 17:43:39 +08:00
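For the ONNX half of this change, a typical export call looks roughly like the sketch below; the input size, tensor names, and file name are assumptions, and a TensorRT engine would then be built from the resulting `.onnx` file:

```python
import torch

def export_onnx(model, onnx_path="reid_model.onnx", input_size=(1, 3, 256, 128)):
    model.eval()
    dummy = torch.randn(*input_size)          # dummy image batch used for tracing
    torch.onnx.export(
        model, dummy, onnx_path,
        input_names=["images"],
        output_names=["features"],
        opset_version=11,
        dynamic_axes={"images": {0: "batch"}, "features": {0: "batch"}},
    )
```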
liaoxingyu ee634df290 rm `__all__` in resnet 2020-07-17 19:40:25 +08:00
liaoxingyu 5ad22d5d36 update regnet init 2020-07-17 19:35:40 +08:00
liaoxingyu f9539be683 add regnet config file 2020-07-17 19:32:36 +08:00
liaoxingyu 3b57dea49f support regnet backbone 2020-07-17 19:13:45 +08:00
liaoxingyu 3f35eb449d minor update 2020-07-14 11:58:06 +08:00
liaoxingyu f8d468647c add expansion attribute in resnet 2020-07-10 22:40:07 +08:00
liaoxingyu ea8a3cc534 fix typo 2020-07-10 16:26:35 +08:00
liaoxingyu fec7abc461 finish v0.2 ddp training 2020-07-06 16:57:43 +08:00
liaoxingyu 5ae2cff47e fix circle/arcface pred_logits
fix #136
2020-07-06 16:57:03 +08:00
liaoxingyu ecc2b1a790 update naive sampler
Summary: update the naive sampler, which will introduce unbalanced sampling
2020-06-15 20:50:25 +08:00
liaoxingyu 56a1ab4a5d update fast global avgpool
Summary: update fast pool according to https://arxiv.org/pdf/2003.13630.pdf
2020-06-12 16:34:03 +08:00
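The referenced paper implements global average pooling as a view-plus-mean instead of `nn.AdaptiveAvgPool2d(1)`; a sketch of that idea (class name and `flatten` flag are illustrative):

```python
import torch
import torch.nn as nn

class FastGlobalAvgPool2d(nn.Module):
    """Global average pooling via view + mean, in the spirit of arXiv:2003.13630."""

    def __init__(self, flatten=False):
        super().__init__()
        self.flatten = flatten

    def forward(self, x):
        n, c = x.size(0), x.size(1)
        out = x.view(n, c, -1).mean(dim=-1)                    # (N, C)
        return out if self.flatten else out.view(n, c, 1, 1)   # keep 4-D shape if requested
```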
liaoxingyu cbdc01a1c3 update pairwise circle loss
Summary: add pairwise circle loss parameters to the config, and update the pairwise circle loss version
2020-06-10 19:07:29 +08:00
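As background, the pair-wise Circle Loss (Sun et al., CVPR 2020) can be computed from in-batch cosine similarities as sketched below; the function name, masking scheme, and default `margin`/`gamma` values are assumptions rather than this repository's exact code:

```python
import torch
import torch.nn.functional as F

def pairwise_circle_loss(embeddings, labels, margin=0.25, gamma=128.0):
    """Pair-wise Circle Loss over all pairs inside a batch.

    embeddings: (N, D) features, labels: (N,) identity labels.
    margin (m) and gamma are the relaxation and scale hyper-parameters.
    """
    feats = F.normalize(embeddings, dim=1)
    sim = feats @ feats.t()                                  # cosine similarity matrix
    same = labels.view(-1, 1).eq(labels.view(1, -1))
    idx = torch.arange(labels.size(0), device=labels.device)
    upper = idx.view(-1, 1) < idx.view(1, -1)                # count each pair once, drop self-pairs
    s_p, s_n = sim[same & upper], sim[(~same) & upper]

    alpha_p = torch.clamp_min(1 + margin - s_p, 0)           # adaptive re-weighting
    alpha_n = torch.clamp_min(s_n + margin, 0)
    delta_p, delta_n = 1 - margin, margin

    logit_p = -gamma * alpha_p * (s_p - delta_p)
    logit_n = gamma * alpha_n * (s_n - delta_n)
    # softplus(lse_p + lse_n) == log(1 + sum_p exp(.) * sum_n exp(.))
    return F.softplus(torch.logsumexp(logit_p, dim=0) + torch.logsumexp(logit_n, dim=0))
```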
liaoxingyu 3732f94405 update osnet 2020-06-09 14:38:49 +08:00
liaoxingyu 25a7f82df7 change style in baseline 2020-06-05 11:23:11 +08:00
liaoxingyu bc221cb05f fix mgn multi-gpu training problem
Summary: norm_type in pool_reduce will not change when using syncBN
2020-06-05 11:11:50 +08:00
liaoxingyu 94d85fe11c fix convert caffe model problem 2020-06-04 16:39:12 +08:00
liaoxingyu e7156e1cfa fix mgn not registered problem 2020-06-03 11:46:28 +08:00
liaoxingyu c036ac5bdd update reduction head 2020-05-30 16:50:02 +08:00
liaoxingyu 5528d17ace refactor code
Summary: change code style and refactor code, add avgmax pooling layer in gem_pool
2020-05-28 13:49:39 +08:00
liaoxingyu a1cb123cfa fix R101 bottleneck missing problem
Summary: add key 101 in block dict to support R101
2020-05-26 14:48:32 +08:00
liaoxingyu d4b71de3aa switch between soft and hard margin when inf
Summary: Add a mechanism to automatically switch triplet loss from soft margin to hard margin when the loss becomes inf.
2020-05-26 14:36:33 +08:00
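A minimal sketch of such a fallback, assuming the usual soft-margin (`soft_margin_loss`) and hard-margin (`margin_ranking_loss`) triplet formulations; the function name and default margin are illustrative:

```python
import torch
import torch.nn.functional as F

def triplet_loss(dist_ap, dist_an, margin=0.3):
    """Soft-margin triplet loss with an automatic hard-margin fallback.

    dist_ap / dist_an: (N,) anchor-positive and anchor-negative distances.
    """
    y = torch.ones_like(dist_an)
    # Soft margin: log(1 + exp(-(d_an - d_ap))), no explicit margin hyper-parameter.
    loss = F.soft_margin_loss(dist_an - dist_ap, y)
    if torch.isinf(loss) or torch.isnan(loss):
        # Fall back to the classic hard margin: max(0, d_ap - d_an + margin).
        loss = F.margin_ranking_loss(dist_an, dist_ap, y, margin=margin)
    return loss
```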
liaoxingyu 5982f90920 support loading various pretrained weights
Summary: Support loading a pretrained model from a custom path. With this function, we can load infoMin weights.
2020-05-26 14:33:18 +08:00
liaoxingyu 5d4758125d support ResNet34 backbone
Summary: add BasicBlock to support ResNet34
2020-05-26 13:18:09 +08:00
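For context, a standard torchvision-style `BasicBlock` (two 3x3 convs, `expansion = 1`) is what ResNet-18/34 use; a compact sketch:

```python
import torch.nn as nn

class BasicBlock(nn.Module):
    """Two 3x3 convs with an identity shortcut; expansion=1, as in ResNet-18/34."""
    expansion = 1

    def __init__(self, inplanes, planes, stride=1, downsample=None):
        super().__init__()
        self.conv1 = nn.Conv2d(inplanes, planes, 3, stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(planes)
        self.conv2 = nn.Conv2d(planes, planes, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(planes)
        self.relu = nn.ReLU(inplace=True)
        self.downsample = downsample      # projection shortcut when shape changes

    def forward(self, x):
        identity = x if self.downsample is None else self.downsample(x)
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + identity)
```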
liaoxingyu 84c733fa85 fix: remove prefetcher, put normalizer in model
1. remove the messy data prefetcher, which caused confusion
2. put the normalizer in the model to accelerate training via GPU computing
2020-05-25 23:39:11 +08:00
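A minimal sketch of the "normalizer in model" idea, assuming ImageNet-style per-channel statistics and a generic `Baseline` wrapper (both are illustrative, not the repository's exact classes):

```python
import torch
import torch.nn as nn

class Baseline(nn.Module):
    """Sketch: keep image normalization inside the model so it runs on the GPU."""

    def __init__(self, backbone, head,
                 pixel_mean=(123.675, 116.28, 103.53),
                 pixel_std=(58.395, 57.12, 57.375)):
        super().__init__()
        self.backbone, self.head = backbone, head
        # Buffers move with the model (.to()/.cuda()) but are not trained.
        self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(1, -1, 1, 1))
        self.register_buffer("pixel_std", torch.tensor(pixel_std).view(1, -1, 1, 1))

    def forward(self, images):
        images = (images - self.pixel_mean) / self.pixel_std  # normalize on-device
        return self.head(self.backbone(images))
```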
liaoxingyu 94c86579a3 fix(heads): fix bug in reduce head
add neck_feat from config, add inplace in leakyrelu to save memory
2020-05-23 10:41:13 +08:00
liaoxingyu c21de64166 fix: add linear initial method 2020-05-21 23:59:51 +08:00
liaoxingyu 18a33f7962 feat: add MGN model
support MGN architecture and training config
2020-05-15 11:39:54 +08:00
liaoxingyu 0356ef8c5c feat: add SyncBN and GroupNorm support 2020-05-14 11:36:28 +08:00
liaoxingyu 9fae467adf feat(engine/defaults): add DefaultPredictor to get image reid features
Add a new predictor interface, and modify demo code to predict image features.
2020-05-08 19:24:27 +08:00
liaoxingyu 8ab0bc2455 style(backbone): make parameter-loading logging more elegant 2020-05-08 12:22:06 +08:00
liaoxingyu a2dcd7b4ab feat(layers/norm): add ghost batchnorm
add a get_norm function to easily switch normalization between batchnorm, ghost bn and group norm
2020-05-01 09:02:46 +08:00
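A minimal sketch of such a `get_norm` factory plus a ghost batch-norm layer; the string keys, `num_splits` handling, and the fixed 32 groups for GroupNorm are assumptions for illustration:

```python
import torch
import torch.nn as nn

class GhostBatchNorm(nn.BatchNorm2d):
    """Ghost BN: compute batch statistics over `num_splits` smaller chunks of the batch."""

    def __init__(self, num_features, num_splits=2, **kwargs):
        super().__init__(num_features, **kwargs)
        self.num_splits = num_splits

    def forward(self, x):
        if self.training:
            chunks = x.chunk(self.num_splits, dim=0)
            return torch.cat([super(GhostBatchNorm, self).forward(c) for c in chunks], dim=0)
        return super().forward(x)

def get_norm(norm, out_channels, **kwargs):
    """Map a config string to a normalization layer (keys here are illustrative)."""
    return {
        "BN": lambda c: nn.BatchNorm2d(c),
        "GhostBN": lambda c: GhostBatchNorm(c, **kwargs),
        "syncBN": lambda c: nn.SyncBatchNorm(c),
        "GN": lambda c: nn.GroupNorm(32, c),
    }[norm](out_channels)
```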