liaoxingyu
54f96ba78a
fix for lint_python
2021-05-31 17:36:56 +08:00
liaoxingyu
8ab3554958
Support self-distill with EMA-updated model
2021-05-31 17:17:24 +08:00
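For context, self-distillation with an EMA teacher typically keeps a frozen copy of the student whose weights are an exponential moving average of the student's. Below is a minimal sketch assuming PyTorch; the helper names `build_ema_teacher`, `update_ema`, and the `decay` value are illustrative, not fastreid's API.

```python
import copy
import torch

def build_ema_teacher(student: torch.nn.Module) -> torch.nn.Module:
    # The teacher starts as a frozen copy of the student.
    teacher = copy.deepcopy(student)
    for p in teacher.parameters():
        p.requires_grad_(False)
    return teacher

@torch.no_grad()
def update_ema(teacher: torch.nn.Module, student: torch.nn.Module, decay: float = 0.999):
    # Per-parameter update: teacher = decay * teacher + (1 - decay) * student.
    # Buffers (e.g. BN running stats) are often copied directly; omitted for brevity.
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(decay).add_(s_param.detach(), alpha=1.0 - decay)
```

Calling `update_ema(teacher, student)` after each optimizer step keeps the teacher a smoothed, slowly moving version of the student, which the distillation loss can then target.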
Xingyu Liao
883fd4aede
add configurable decorator & linear loss decouple (#441)
Summary: Add a configurable decorator so that `Baseline` can be constructed either as `Baseline(cfg)` or as `Baseline(cfg, heads=heads, ...)` (a sketch follows this entry).
Decouple the linear layer from the loss computation for partial-fc support.
Reviewed By: l1aoxingyu
2021-03-23 12:10:06 +08:00
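A minimal sketch of what such a configurable decorator can look like, assuming a dict-like cfg and a `from_config` classmethod; the cfg-detection heuristic and all names here are illustrative, not fastreid's actual implementation.

```python
import functools

def configurable(init_func):
    """Sketch: let a class be built as Cls(cfg), Cls(cfg, extra=...), or with
    plain keyword arguments. The class supplies a from_config(cfg) classmethod
    that maps the cfg to the real __init__ kwargs."""
    @functools.wraps(init_func)
    def wrapped(self, *args, **kwargs):
        # Simplified heuristic: a single positional argument is treated as the cfg;
        # explicit kwargs (e.g. heads=heads) override whatever from_config produced.
        if len(args) == 1:
            init_kwargs = type(self).from_config(args[0])
            init_kwargs.update(kwargs)
            init_func(self, **init_kwargs)
        else:
            init_func(self, *args, **kwargs)
    return wrapped


class Baseline:
    @configurable
    def __init__(self, *, backbone="resnet50", heads="embedding_head"):
        self.backbone = backbone
        self.heads = heads

    @classmethod
    def from_config(cls, cfg):
        # Map cfg fields to __init__ arguments; cfg access here is illustrative.
        return {"backbone": cfg["backbone"], "heads": cfg["heads"]}


cfg = {"backbone": "resnet50", "heads": "embedding_head"}
model_a = Baseline(cfg)                       # built entirely from cfg
model_b = Baseline(cfg, heads="custom_head")  # cfg with an explicit override
```

The real decorator presumably detects the cfg argument more carefully (e.g. by type), but the calling pattern it enables is the one shown above.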
liaoxingyu
77a91b1204
feat: support multi-teacher kd
Summary: Support multi-teacher knowledge distillation using both logits-based KD and overhaul distillation (a sketch of the logits part follows this entry).
2021-01-29 17:25:31 +08:00
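For reference, the logits branch of multi-teacher KD is commonly an averaged KL divergence against each teacher's softened outputs. A minimal sketch in PyTorch; the function name, temperature, and averaging scheme are assumptions, and the feature-level overhaul part is not shown.

```python
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, temperature=4.0):
    """Average the KL divergence between the student and each frozen teacher,
    using temperature-softened distributions."""
    t = temperature
    log_p_student = F.log_softmax(student_logits / t, dim=1)
    loss = 0.0
    for teacher_logits in teacher_logits_list:
        p_teacher = F.softmax(teacher_logits.detach() / t, dim=1)
        # Scale by t^2 so gradients keep a comparable magnitude across temperatures.
        loss = loss + F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (t * t)
    return loss / len(teacher_logits_list)
```

This term is typically added to the usual ReID classification/triplet losses with a weighting coefficient.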
liaoxingyu
e26182e6ec
make lr warmup by iter
Summary: Change learning-rate warmup to be iteration-based rather than epoch-based, which is more flexible when training for only a few epochs (a sketch follows this entry).
2021-01-22 11:17:21 +08:00
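Iteration-based warmup means the warmup factor is computed from the current iteration rather than the current epoch, so short schedules still get a smooth ramp-up regardless of epoch length. A minimal sketch; the function name and default `warmup_factor` are illustrative, and fastreid's scheduler may differ.

```python
def warmup_lr_factor(cur_iter: int, warmup_iters: int, warmup_factor: float = 0.01) -> float:
    """Linear warmup: the factor grows from warmup_factor to 1.0 over warmup_iters
    iterations, then stays at 1.0."""
    if cur_iter >= warmup_iters:
        return 1.0
    alpha = cur_iter / warmup_iters
    return warmup_factor * (1.0 - alpha) + alpha
```

The base learning rate is multiplied by this factor at every optimizer step during warmup, e.g. `lr = base_lr * warmup_lr_factor(it, warmup_iters)`.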
liaoxingyu
15e1729a27
update fastreid V1.0
2021-01-18 11:36:38 +08:00