Unverified commit 3392a4ea authored by Jintao Lin, committed by GitHub

fix the comment of init_weights in ``ConvModule`` (#730)

* polish the comment of init_weights in ConvModule

* polish the comment

* polish the comment
parent 95acffb9
@@ -166,13 +166,14 @@ class ConvModule(nn.Module):

     def init_weights(self):
         # 1. It is mainly for customized conv layers with their own
-        #    initialization manners, and we do not want ConvModule to
-        #    overrides the initialization.
+        #    initialization manners by calling their own ``init_weights()``,
+        #    and we do not want ConvModule to override the initialization.
         # 2. For customized conv layers without their own initialization
-        #    manners, they will be initialized by this method with default
-        #    `kaiming_init`.
-        # 3. For PyTorch's conv layers, they will be initialized anyway by
-        #    their own `reset_parameters` methods.
+        #    manners (that is, they don't have their own ``init_weights()``)
+        #    and PyTorch's conv layers, they will be initialized by
+        #    this method with default ``kaiming_init``.
+        # Note: For PyTorch's conv layers, they will be overwritten by our
+        #    initialization implementation using default ``kaiming_init``.
         if not hasattr(self.conv, 'init_weights'):
             if self.with_activation and self.act_cfg['type'] == 'LeakyReLU':
                 nonlinearity = 'leaky_relu'
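The substance of the fix: the old point 3 implied that PyTorch's built-in conv layers keep the weights set by their own ``reset_parameters``, but since they do not define ``init_weights()``, ``ConvModule`` actually overwrites them with ``kaiming_init``; the new "Note" states this correctly. For reference, here is a minimal standalone sketch of the dispatch logic the revised comment describes: layers that define their own ``init_weights()`` are left untouched, everything else is (re-)initialized with Kaiming init. The names ``CustomConv`` and ``init_conv`` are hypothetical, and plain ``nn.init.kaiming_normal_`` stands in for mmcv's ``kaiming_init`` helper.

```python
import torch.nn as nn


class CustomConv(nn.Conv2d):
    """Hypothetical conv layer with its own initialization manner."""

    def init_weights(self):
        # The dispatch below detects this method and will not override
        # whatever initialization the layer does for itself.
        nn.init.orthogonal_(self.weight)


def init_conv(conv, act_cfg=None):
    # Simplified stand-in for the logic in ConvModule.init_weights():
    # only layers *without* their own ``init_weights()`` are touched.
    if not hasattr(conv, 'init_weights'):
        nonlinearity = 'leaky_relu' if (
            act_cfg is not None and act_cfg.get('type') == 'LeakyReLU'
        ) else 'relu'
        # Stand-in for mmcv's default ``kaiming_init``.
        nn.init.kaiming_normal_(conv.weight, nonlinearity=nonlinearity)
        if conv.bias is not None:
            nn.init.zeros_(conv.bias)


# A plain nn.Conv2d has no ``init_weights()``, so the weights set by its
# own ``reset_parameters`` are overwritten with Kaiming init, exactly as
# the "Note" in the revised comment says; CustomConv is left untouched.
init_conv(nn.Conv2d(3, 16, 3))
init_conv(CustomConv(3, 16, 3))
```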