liaoxingyu | 53fed7451d | feat: support amp training | 2020-09-02 18:03:12 +08:00
    Summary: support automatic mixed precision training #217
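The AMP commit above targets the torch.cuda.amp API that shipped with PyTorch 1.6. A minimal sketch of what such a mixed-precision training step typically looks like; the model, optimizer, and criterion here are stand-ins, not objects from this repository:

```python
import torch
from torch.cuda.amp import GradScaler, autocast

# Hypothetical model/optimizer/criterion, just to make the step runnable.
model = torch.nn.Linear(512, 751).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = torch.nn.CrossEntropyLoss()
scaler = GradScaler()  # scales the loss to avoid fp16 gradient underflow

def amp_train_step(images, targets):
    optimizer.zero_grad()
    with autocast():                      # forward pass runs in mixed precision
        logits = model(images)
        loss = criterion(logits, targets)
    scaler.scale(loss).backward()         # backward on the scaled loss
    scaler.step(optimizer)                # unscales gradients, then optimizer.step()
    scaler.update()                       # adjust the loss scale for the next step
    return loss.detach()
```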
liaoxingyu | d00ce8fc3c | refactor model arch | 2020-09-01 16:14:45 +08:00
liaoxingyu | ac8409a7da | updating for pytorch1.6 | 2020-08-20 15:51:41 +08:00
liaoxingyu | 3f35eb449d | minor update | 2020-07-14 11:58:06 +08:00
liaoxingyu | e81b13798c | change way of loss function | 2020-07-10 16:28:53 +08:00
    Summary: move loss computation from meta_arch to run_step considering distillation loss
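Moving loss computation out of the meta-architecture means the model's forward returns raw outputs and the training step assembles the loss terms, which makes it straightforward to add a distillation term. A rough sketch under that assumption; the Trainer class, loss names, and temperature value are illustrative, not this repository's exact code:

```python
import torch
import torch.nn.functional as F

class Trainer:
    """Illustrative trainer; only the pieces needed to show run_step are kept."""

    def __init__(self, model, optimizer, data_loader, teacher=None):
        self.model = model
        self.optimizer = optimizer
        self._data_loader_iter = iter(data_loader)
        self.teacher = teacher  # optional frozen teacher for distillation

    def run_step(self):
        images, targets = next(self._data_loader_iter)
        logits = self.model(images)                        # meta-arch returns raw outputs
        losses = {"loss_cls": F.cross_entropy(logits, targets)}
        if self.teacher is not None:                       # distillation term lives here too
            with torch.no_grad():
                teacher_logits = self.teacher(images)
            T = 4.0
            losses["loss_kd"] = F.kl_div(
                F.log_softmax(logits / T, dim=1),
                F.softmax(teacher_logits / T, dim=1),
                reduction="batchmean",
            ) * (T * T)
        total_loss = sum(losses.values())
        self.optimizer.zero_grad()
        total_loss.backward()
        self.optimizer.step()
        return {k: v.detach() for k, v in losses.items()}
```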
liaoxingyu | fec7abc461 | finish v0.2 ddp training | 2020-07-06 16:57:43 +08:00
liaoxingyu | 84c733fa85 | fix: remove prefetcher, put normalizer in model | 2020-05-25 23:39:11 +08:00
    1. remove messy data prefetcher which will cause confusion
    2. put normalizer in model to accelerate training via GPU computing
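Keeping the normalizer inside the model lets images reach the GPU unnormalized and be normalized there in one batched operation, instead of per-image on the CPU in the dataloader. A minimal sketch of that idea; the NormalizedModel wrapper and the ImageNet-style mean/std values are assumptions, not copied from this repository:

```python
import torch
import torch.nn as nn

class NormalizedModel(nn.Module):
    def __init__(self, backbone,
                 pixel_mean=(123.675, 116.28, 103.53),
                 pixel_std=(58.395, 57.12, 57.375)):
        super().__init__()
        self.backbone = backbone
        # Registered as buffers so they move to the GPU together with the model.
        self.register_buffer("pixel_mean", torch.tensor(pixel_mean).view(1, 3, 1, 1))
        self.register_buffer("pixel_std", torch.tensor(pixel_std).view(1, 3, 1, 1))

    def forward(self, images):
        # Normalization now runs on the same device as the backbone.
        images = (images - self.pixel_mean) / self.pixel_std
        return self.backbone(images)
```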
liaoxingyu | 6a8961ce48 | 1. upload circle loss and arcface | 2020-04-05 23:54:26 +08:00
    2. finish freeze training
    3. update augmix data augmentation
liaoxingyu | 23bedfce12 | update version0.2 code | 2020-03-25 10:58:26 +08:00
L1aoXingyu | 12957f66aa | Change architecture: | 2020-02-18 21:01:23 +08:00
    1. delete redundant preprocess
    2. add data prefetcher to accelerate data loading
    3. fix minor bug of triplet sampler when only one image for one id
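The triplet-sampler corner case in item 3 is an identity with a single image: drawing K distinct instances for it fails unless the sampler falls back to sampling with replacement. A simplified sketch of such a P x K sampler; the class name and the single-batch __iter__ are illustrative only:

```python
import random
from collections import defaultdict

class NaiveIdentitySampler:
    """Yield indices so a batch holds P identities with K instances each."""

    def __init__(self, labels, batch_size, num_instances):
        self.num_instances = num_instances
        self.pids_per_batch = batch_size // num_instances
        self.index_dic = defaultdict(list)
        for idx, pid in enumerate(labels):
            self.index_dic[pid].append(idx)
        self.pids = list(self.index_dic.keys())

    def __iter__(self):
        # One batch for brevity; a real sampler loops until the epoch is exhausted.
        pids = random.sample(self.pids, self.pids_per_batch)
        batch = []
        for pid in pids:
            idxs = self.index_dic[pid]
            if len(idxs) < self.num_instances:
                # An id with too few (even one) image: sample with replacement
                # instead of failing, which is the corner case mentioned above.
                idxs = random.choices(idxs, k=self.num_instances)
            else:
                idxs = random.sample(idxs, self.num_instances)
            batch.extend(idxs)
        yield from batch
```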
L1aoXingyu | a2f69d0537 | Update StrongBaseline results for market1501 and dukemtmc | 2020-02-11 22:38:40 +08:00
L1aoXingyu | 8a9c0ccfad | Finish first version for fastreid | 2020-02-10 22:13:04 +08:00
L1aoXingyu | db6ed12b14 | Update sampler code | 2020-02-10 07:38:56 +08:00
liaoxingyu | 71950d2c09 | 1. Fix evaluation code | 2020-01-21 20:24:26 +08:00
    2. Finish multi-dataset evaluation
    3. Decouple image preprocess and output postprocess with model forward for DataParallel training
    4. Finish build backbone registry
    5. Fix dataset sampler
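A build-backbone registry is typically a name-to-builder map filled by a decorator, so a config can select the backbone by string. A generic sketch of that pattern, not the repository's actual Registry class; the resnet50 builder and dict-style config are assumptions:

```python
BACKBONE_REGISTRY = {}

def register_backbone(name):
    """Decorator that records a backbone builder under a string key."""
    def wrapper(fn):
        BACKBONE_REGISTRY[name] = fn
        return fn
    return wrapper

@register_backbone("resnet50")
def build_resnet50(cfg):
    from torchvision import models
    return models.resnet50(pretrained=cfg.get("pretrain", False))

def build_backbone(cfg):
    name = cfg["backbone"]
    if name not in BACKBONE_REGISTRY:
        raise KeyError(f"Unknown backbone: {name}")
    return BACKBONE_REGISTRY[name](cfg)

# Usage: backbone = build_backbone({"backbone": "resnet50", "pretrain": True})
```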
liaoxingyu | b761b656f3 | Finish basic training loop and evaluation results | 2020-01-20 21:33:37 +08:00