@@ -57,7 +56,6 @@ The construction function will do the following:
...
self._aggre_type = aggregator_type
self.norm = norm
self.activation = activation
self._allow_zero_in_degree = allow_zero_in_degree
In the construction function, we first need to set the data dimensions. For
a general PyTorch module, the dimensions are usually the input dimension,
...
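As an illustration of the pattern above (a minimal sketch: the class name and parameter names follow the SAGEConv convention, but this is a stand-in, not the actual DGL source, and the ``torch.nn.Module`` base class is elided so the snippet stands alone), the constructor mainly records dimensions and options:

```python
class SAGEConvSketch:
    """Illustrative stand-in for a DGL NN module constructor.

    A real module would subclass ``torch.nn.Module``; this sketch only
    shows the bookkeeping done in ``__init__``.
    """
    def __init__(self, in_feats, out_feats, aggregator_type,
                 norm=None, activation=None, allow_zero_in_degree=False):
        # Set the data dimensions first.
        self._in_src_feats = self._in_dst_feats = in_feats
        self._out_feats = out_feats
        # Then store the options shown in the assignments above.
        self._aggre_type = aggregator_type
        self.norm = norm
        self.activation = activation
        self._allow_zero_in_degree = allow_zero_in_degree
```

A real constructor would go on to register learnable parameters (e.g. linear layers) sized by these dimensions.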
@@ -115,7 +113,7 @@ DGL NN Module Forward Function
...
In an NN module, the ``forward()`` function does the actual message passing and
computation. Compared with PyTorch's NN module, which usually takes
tensors as parameters, a DGL NN module takes an additional parameter
:class:`dgl.DGLGraph`. The
workload of the ``forward()`` function can be split into three parts:
- Graph checking and graph type specification.
...
@@ -133,37 +131,16 @@ Graph checking and graph type specification
...
    def forward(self, graph, feat):
        with graph.local_scope():
            # Graph checking
            if not self._allow_zero_in_degree:
                if (graph.in_degrees() == 0).any():
                    raise DGLError('There are 0-in-degree nodes in the graph, '
                                   'output for those nodes will be invalid. '
                                   'This is harmful for some applications, '
                                   'causing silent performance regression. '
                                   'Adding self-loop on the input graph by calling '
                                   '`g = dgl.add_self_loop(g)` will resolve the issue. '
                                   'Setting ``allow_zero_in_degree`` to be `True` '
                                   'when constructing this module will suppress the '
                                   'check and let the code run.')

            # Specify graph type then expand input feature according to graph type
            feat_src, feat_dst = expand_as_pair(feat, graph)
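The behavior of ``expand_as_pair`` on a homogeneous graph can be mimicked with a small helper (an illustrative re-implementation, not DGL's actual utility, which also handles heterogeneous graphs and blocks): given a single feature tensor it returns the same object for both source and destination nodes, and given a ``(src, dst)`` pair it returns the pair unchanged.

```python
def expand_as_pair_sketch(feat):
    # Illustrative stand-in for DGL's expand_as_pair on a homogeneous
    # graph (the real helper also handles heterogeneous graphs/blocks).
    if isinstance(feat, tuple):
        # Bipartite input: already a (source, destination) pair.
        return feat
    # Homogeneous input: source and destination share the same features.
    return feat, feat
```

This is why the rest of ``forward()`` can be written once against ``feat_src``/``feat_dst`` regardless of the input graph type.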
**This part of the code is usually shared by all NN modules.**
``forward()`` needs to handle many corner cases on the input that can
lead to invalid values in computing and message passing. One typical
check in conv modules such as :class:`~dgl.nn.pytorch.conv.GraphConv` is to
verify that the input graph has no 0-in-degree nodes. When a node has
0-in-degree, its ``mailbox`` will be empty and the reduce function will
produce all-zero values, which may cause silent regression in model
performance. However, in the :class:`~dgl.nn.pytorch.conv.SAGEConv` module,
the aggregated representation is concatenated with the original node
feature, so the output of ``forward()`` will not be all-zero and no such
check is needed there.
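To make the 0-in-degree issue concrete, the check and the suggested fix can be sketched without DGL (a plain-Python illustration of what ``graph.in_degrees()`` and ``dgl.add_self_loop`` do, not the actual library code):

```python
def in_degrees(num_nodes, edges):
    # edges is a list of (src, dst) pairs; count incoming edges per node.
    deg = [0] * num_nodes
    for _, dst in edges:
        deg[dst] += 1
    return deg

def add_self_loops(num_nodes, edges):
    # Mirror of dgl.add_self_loop: append (v, v) for every node so that
    # no node is left with an empty mailbox during message passing.
    return edges + [(v, v) for v in range(num_nodes)]

edges = [(0, 1), (1, 2)]          # node 0 has no incoming edge
assert 0 in in_degrees(3, edges)  # this is where the module would raise DGLError
fixed = add_self_loops(3, edges)
assert all(d > 0 for d in in_degrees(3, fixed))
```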
A DGL NN module should be reusable across different types of graph input,