fix the comment of init_weights in ``ConvModule`` (#730)

* polish the comment of init_weights in ConvModule

* polish the comment

Jintao Lin 2020-12-18 14:00:47 +08:00 committed by GitHub
parent 95acffb910
commit 3392a4ea00
1 changed file with 7 additions and 6 deletions


@@ -166,13 +166,14 @@ class ConvModule(nn.Module):
     def init_weights(self):
         # 1. It is mainly for customized conv layers with their own
-        #    initialization manners, and we do not want ConvModule to
-        #    overrides the initialization.
+        #    initialization manners by calling their own ``init_weights()``,
+        #    and we do not want ConvModule to override the initialization.
         # 2. For customized conv layers without their own initialization
-        #    manners, they will be initialized by this method with default
-        #    `kaiming_init`.
-        # 3. For PyTorch's conv layers, they will be initialized anyway by
-        #    their own `reset_parameters` methods.
+        #    manners (that is, they don't have their own ``init_weights()``)
+        #    and PyTorch's conv layers, they will be initialized by
+        #    this method with default ``kaiming_init``.
+        # Note: For PyTorch's conv layers, they will be overwritten by our
+        #    initialization implementation using default ``kaiming_init``.
         if not hasattr(self.conv, 'init_weights'):
             if self.with_activation and self.act_cfg['type'] == 'LeakyReLU':
                 nonlinearity = 'leaky_relu'
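
For context, a minimal runnable sketch of the dispatch the updated comment describes, using only standard PyTorch; ``CustomConv`` and ``init_conv_like_convmodule`` are hypothetical illustration names, not part of mmcv:

import torch.nn as nn


class CustomConv(nn.Conv2d):
    """Hypothetical custom conv layer with its own initialization manner."""

    def init_weights(self):
        nn.init.zeros_(self.weight)  # the layer's own scheme is kept


def init_conv_like_convmodule(conv):
    # Mirrors the ``hasattr`` branch above: a conv layer exposing its own
    # ``init_weights()`` is left untouched (case 1); any other conv layer,
    # including PyTorch's ``nn.Conv2d``, gets kaiming init here, overwriting
    # what ``reset_parameters()`` did in the constructor (case 2 / Note).
    if not hasattr(conv, 'init_weights'):
        nn.init.kaiming_normal_(conv.weight, nonlinearity='relu')
        if conv.bias is not None:
            nn.init.constant_(conv.bias, 0)


init_conv_like_convmodule(nn.Conv2d(3, 8, 3))   # re-initialized with kaiming
init_conv_like_convmodule(CustomConv(3, 8, 3))  # skipped, keeps its own init

Checking ``hasattr(conv, 'init_weights')`` rather than the layer's type keeps the dispatch agnostic to which conv implementations are plugged in, which is why the comment stresses that ConvModule must not override a layer's own ``init_weights()``.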