Unverified Commit 3392a4ea authored by Jintao Lin's avatar Jintao Lin Committed by GitHub

fix the comment of init_weights in ``ConvModule`` (#730)

* polish the comment of init_weights in ConvModule

* polish the comment

* polish the comment
parent 95acffb9
@@ -166,13 +166,14 @@ class ConvModule(nn.Module):
     def init_weights(self):
         # 1. It is mainly for customized conv layers with their own
-        #    initialization manners, and we do not want ConvModule to
-        #    overrides the initialization.
+        #    initialization manners by calling their own ``init_weights()``,
+        #    and we do not want ConvModule to override the initialization.
         # 2. For customized conv layers without their own initialization
-        #    manners, they will be initialized by this method with default
-        #    `kaiming_init`.
-        # 3. For PyTorch's conv layers, they will be initialized anyway by
-        #    their own `reset_parameters` methods.
+        #    manners (that is, they don't have their own ``init_weights()``)
+        #    and PyTorch's conv layers, they will be initialized by
+        #    this method with default ``kaiming_init``.
+        # Note: For PyTorch's conv layers, they will be overwritten by our
+        #    initialization implementation using default ``kaiming_init``.
         if not hasattr(self.conv, 'init_weights'):
             if self.with_activation and self.act_cfg['type'] == 'LeakyReLU':
                 nonlinearity = 'leaky_relu'
...
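The comment change above describes a dispatch rule: ConvModule applies its default ``kaiming_init`` only when the wrapped conv layer does not define its own ``init_weights()``. A minimal pure-Python sketch of that rule, using hypothetical stand-in classes rather than the real mmcv/PyTorch layers:

```python
class PlainConv:
    """Stands in for a PyTorch conv layer: no init_weights() of its own."""
    def __init__(self):
        self.init_by = None


class CustomConv(PlainConv):
    """Stands in for a customized conv layer with its own init_weights()."""
    def init_weights(self):
        self.init_by = 'custom'


def init_weights(conv):
    # Mirrors the ``hasattr`` check in the diff: defer to the layer's own
    # init_weights() when present; otherwise fall back to the default
    # kaiming_init path (represented here by a marker assignment).
    if not hasattr(conv, 'init_weights'):
        conv.init_by = 'kaiming'
    else:
        conv.init_weights()


plain, custom = PlainConv(), CustomConv()
init_weights(plain)   # no init_weights() -> default kaiming path
init_weights(custom)  # has init_weights() -> ConvModule does not override it
print(plain.init_by, custom.init_by)  # kaiming custom
```

This also shows why the old comment's point 3 was misleading: a plain PyTorch-style layer takes the default path and is overwritten by the module's own initialization, rather than being left to ``reset_parameters``.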