Commit Graph

1 Commit (54f96ba78ad087c7110bafe2132a91536f0b5a01)

Author SHA1 Message Date
liaoxingyu 3d1bae9f13 fix triplet loss backward propagation on multi-gpu training (#82)
Summary: `torch.distributed.all_gather` does not propagate gradients back to the gathered tensors, so the triplet loss received no gradient from other GPUs during multi-GPU training; gather with the differentiable `GatherLayer` instead.
2020-09-28 17:16:51 +08:00
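The fix described in the commit message is commonly implemented as a `torch.autograd.Function` that wraps `all_gather` and supplies the missing backward pass by hand. A minimal sketch of such a `GatherLayer` follows; it is a common pattern and is not necessarily identical to the code in this commit. The single-process demo at the bottom (gloo backend, `world_size=1`) only illustrates that gradients flow through the gather:

```python
import os
import torch
import torch.distributed as dist


class GatherLayer(torch.autograd.Function):
    """All-gather that supports backward propagation.

    torch.distributed.all_gather itself is not differentiable, so the
    gradient must be routed back to each rank's input tensor manually.
    """

    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)
        output = [torch.zeros_like(input) for _ in range(dist.get_world_size())]
        dist.all_gather(output, input)
        return tuple(output)

    @staticmethod
    def backward(ctx, *grads):
        (input,) = ctx.saved_tensors
        grad_out = torch.zeros_like(input)
        # Each rank keeps the gradient slice corresponding to the tensor
        # it contributed in forward.
        grad_out[:] = grads[dist.get_rank()]
        return grad_out


# Single-process demo (hypothetical setup, not from the commit itself).
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

x = torch.ones(4, requires_grad=True)
gathered = torch.cat(GatherLayer.apply(x))  # differentiable, unlike a raw dist.all_gather
gathered.sum().backward()
print(x.grad)  # → tensor([1., 1., 1., 1.])
dist.destroy_process_group()
```

In a real multi-GPU run the loss (e.g. a triplet loss computed over features gathered from all ranks) would be built from `GatherLayer.apply(features)` so that every rank's features receive gradient.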