Unverified commit 3bd5a9b6, authored by Mufei Li, committed by GitHub

[Doc] Fix Doc (#3777)

* Update

* Update

* Update

* Update
parent 6d9433b0
@@ -10,7 +10,9 @@ API Reference
    dgl.DGLGraph
    dgl.distributed
    dgl.function
-   nn
+   nn-pytorch
+   nn-tensorflow
+   nn-mxnet
    dgl.ops
    dgl.sampling
    dgl.contrib.UnifiedTensor
......
.. _apinn-mxnet:
dgl.nn (MXNet)
================
Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.conv.GraphConv
~dgl.nn.mxnet.conv.RelGraphConv
~dgl.nn.mxnet.conv.TAGConv
~dgl.nn.mxnet.conv.GATConv
~dgl.nn.mxnet.conv.EdgeConv
~dgl.nn.mxnet.conv.SAGEConv
~dgl.nn.mxnet.conv.SGConv
~dgl.nn.mxnet.conv.APPNPConv
~dgl.nn.mxnet.conv.GINConv
~dgl.nn.mxnet.conv.GatedGraphConv
~dgl.nn.mxnet.conv.GMMConv
~dgl.nn.mxnet.conv.ChebConv
~dgl.nn.mxnet.conv.AGNNConv
~dgl.nn.mxnet.conv.NNConv
Dense Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.conv.DenseGraphConv
~dgl.nn.mxnet.conv.DenseSAGEConv
~dgl.nn.mxnet.conv.DenseChebConv
Global Pooling Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.glob.SumPooling
~dgl.nn.mxnet.glob.AvgPooling
~dgl.nn.mxnet.glob.MaxPooling
~dgl.nn.mxnet.glob.SortPooling
~dgl.nn.mxnet.glob.GlobalAttentionPooling
~dgl.nn.mxnet.glob.Set2Set
Heterogeneous Learning Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.HeteroGraphConv
Utility Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.utils.Sequential
.. _apinn-pytorch:
dgl.nn (PyTorch)
================
Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.conv.GraphConv
~dgl.nn.pytorch.conv.EdgeWeightNorm
~dgl.nn.pytorch.conv.RelGraphConv
~dgl.nn.pytorch.conv.TAGConv
~dgl.nn.pytorch.conv.GATConv
~dgl.nn.pytorch.conv.GATv2Conv
~dgl.nn.pytorch.conv.EGATConv
~dgl.nn.pytorch.conv.EdgeConv
~dgl.nn.pytorch.conv.SAGEConv
~dgl.nn.pytorch.conv.SGConv
~dgl.nn.pytorch.conv.APPNPConv
~dgl.nn.pytorch.conv.GINConv
~dgl.nn.pytorch.conv.GatedGraphConv
~dgl.nn.pytorch.conv.GMMConv
~dgl.nn.pytorch.conv.ChebConv
~dgl.nn.pytorch.conv.AGNNConv
~dgl.nn.pytorch.conv.NNConv
~dgl.nn.pytorch.conv.AtomicConv
~dgl.nn.pytorch.conv.CFConv
~dgl.nn.pytorch.conv.DotGatConv
~dgl.nn.pytorch.conv.TWIRLSConv
~dgl.nn.pytorch.conv.TWIRLSUnfoldingAndAttention
~dgl.nn.pytorch.conv.GCN2Conv
Dense Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.conv.DenseGraphConv
~dgl.nn.pytorch.conv.DenseSAGEConv
~dgl.nn.pytorch.conv.DenseChebConv
Global Pooling Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.glob.SumPooling
~dgl.nn.pytorch.glob.AvgPooling
~dgl.nn.pytorch.glob.MaxPooling
~dgl.nn.pytorch.glob.SortPooling
~dgl.nn.pytorch.glob.WeightAndSum
~dgl.nn.pytorch.glob.GlobalAttentionPooling
~dgl.nn.pytorch.glob.Set2Set
~dgl.nn.pytorch.glob.SetTransformerEncoder
~dgl.nn.pytorch.glob.SetTransformerDecoder
Score Modules for Link Prediction and Knowledge Graph Completion
-----------------------------------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.link.EdgePredictor
~dgl.nn.pytorch.link.TransE
~dgl.nn.pytorch.link.TransR
Heterogeneous Learning Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.HeteroGraphConv
~dgl.nn.pytorch.HeteroLinear
~dgl.nn.pytorch.HeteroEmbedding
~dgl.nn.pytorch.TypedLinear
Utility Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.utils.Sequential
~dgl.nn.pytorch.utils.WeightBasis
~dgl.nn.pytorch.factory.KNNGraph
~dgl.nn.pytorch.factory.SegmentedKNNGraph
~dgl.nn.pytorch.utils.JumpingKnowledge
~dgl.nn.pytorch.sparse_emb.NodeEmbedding
~dgl.nn.pytorch.explain.GNNExplainer
.. _apinn-tensorflow:
dgl.nn (TensorFlow)
===================
Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.tensorflow.conv.GraphConv
~dgl.nn.tensorflow.conv.RelGraphConv
~dgl.nn.tensorflow.conv.GATConv
~dgl.nn.tensorflow.conv.SAGEConv
~dgl.nn.tensorflow.conv.ChebConv
~dgl.nn.tensorflow.conv.SGConv
~dgl.nn.tensorflow.conv.APPNPConv
~dgl.nn.tensorflow.conv.GINConv
Global Pooling Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.tensorflow.glob.SumPooling
~dgl.nn.tensorflow.glob.AvgPooling
~dgl.nn.tensorflow.glob.MaxPooling
~dgl.nn.tensorflow.glob.SortPooling
~dgl.nn.tensorflow.glob.GlobalAttentionPooling
Heterogeneous Learning Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.tensorflow.HeteroGraphConv
.. _apinn:
dgl.nn
==========
PyTorch
----------------------------------------
Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.pytorch.conv
.. automodule:: dgl.nn.pytorch.conv
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
GraphConv
EdgeWeightNorm
RelGraphConv
TAGConv
GATConv
GATv2Conv
EGATConv
EdgeConv
SAGEConv
SGConv
APPNPConv
GINConv
GatedGraphConv
GMMConv
ChebConv
AGNNConv
NNConv
AtomicConv
CFConv
DotGatConv
TWIRLSConv
TWIRLSUnfoldingAndAttention
GCN2Conv
Dense Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
DenseGraphConv
DenseSAGEConv
DenseChebConv
Global Pooling Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.pytorch.glob
.. automodule:: dgl.nn.pytorch.glob
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
SumPooling
AvgPooling
MaxPooling
SortPooling
WeightAndSum
GlobalAttentionPooling
Set2Set
SetTransformerEncoder
SetTransformerDecoder
Score Modules for Link Prediction and Knowledge Graph Completion
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.pytorch.link
.. automodule:: dgl.nn.pytorch.link
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
EdgePredictor
TransE
TransR
Heterogeneous Learning Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.pytorch
.. automodule:: dgl.nn.pytorch
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
HeteroGraphConv
HeteroLinear
HeteroEmbedding
TypedLinear
Utility Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~utils.Sequential
~utils.WeightBasis
~factory.KNNGraph
~factory.SegmentedKNNGraph
~utils.JumpingKnowledge
~sparse_emb.NodeEmbedding
~explain.GNNExplainer
TensorFlow
----------------------------------------
Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.tensorflow.conv
.. automodule:: dgl.nn.tensorflow.conv
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
GraphConv
RelGraphConv
GATConv
SAGEConv
ChebConv
SGConv
APPNPConv
GINConv
Global Pooling Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.tensorflow.glob
.. automodule:: dgl.nn.tensorflow.glob
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
SumPooling
AvgPooling
MaxPooling
SortPooling
GlobalAttentionPooling
Heterogeneous Learning Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.tensorflow
.. automodule:: dgl.nn.tensorflow
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
HeteroGraphConv
MXNet
----------------------------------------
Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.mxnet.conv
.. automodule:: dgl.nn.mxnet.conv
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
GraphConv
RelGraphConv
TAGConv
GATConv
EdgeConv
SAGEConv
SGConv
APPNPConv
GINConv
GatedGraphConv
GMMConv
ChebConv
AGNNConv
NNConv
Dense Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
DenseGraphConv
DenseSAGEConv
DenseChebConv
Global Pooling Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.mxnet.glob
.. automodule:: dgl.nn.mxnet.glob
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
SumPooling
AvgPooling
MaxPooling
SortPooling
GlobalAttentionPooling
Set2Set
Heterogeneous Learning Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.mxnet
.. automodule:: dgl.nn.mxnet
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
HeteroGraphConv
Utility Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~utils.Sequential
@@ -44,7 +44,9 @@ Welcome to Deep Graph Library Tutorials and Documentation
    api/python/dgl.distributed
    api/python/dgl.function
    api/python/dgl.geometry
-   api/python/nn
+   api/python/nn-pytorch
+   api/python/nn-tensorflow
+   api/python/nn-mxnet
    api/python/nn.functional
    api/python/dgl.ops
    api/python/dgl.optim
......
@@ -50,11 +50,7 @@ __all__ = [
 ]

 def update_graph_structure(g, data_dict, copy_edata=True):
-    r"""
-    Description
-    -----------
-    Update the structure of a graph.
+    r"""Update the structure of a graph.

     Parameters
     ----------
@@ -93,12 +89,7 @@ def update_graph_structure(g, data_dict, copy_edata=True):
     return new_g

 class BaseTransform:
-    r"""
-    Description
-    -----------
-    An abstract class for writing transforms.
-    """
+    r"""An abstract class for writing transforms."""

     def __call__(self, g):
         raise NotImplementedError
@@ -106,11 +97,7 @@ class BaseTransform:
         return self.__class__.__name__ + '()'

 class AddSelfLoop(BaseTransform):
-    r"""
-    Description
-    -----------
-    Add self-loops for each node in the graph and return a new graph.
+    r"""Add self-loops for each node in the graph and return a new graph.

     For heterogeneous graphs, self-loops are added only for edge types with same
     source and destination node types.
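In edge-list terms, the transform amounts to the following sketch (pure Python; the `add_self_loop` helper and the drop-then-add handling of existing self-loops are illustrative assumptions, not the DGL implementation):

```python
def add_self_loop(num_nodes, edges):
    # Drop any existing self-loops, then add exactly one (v, v) per node,
    # so the result has no duplicate self-loop edges.
    kept = [(u, v) for (u, v) in edges if u != v]
    return kept + [(v, v) for v in range(num_nodes)]
```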
@@ -210,11 +197,7 @@ class AddSelfLoop(BaseTransform):
         return g

 class RemoveSelfLoop(BaseTransform):
-    r"""
-    Description
-    -----------
-    Remove self-loops for each node in the graph and return a new graph.
+    r"""Remove self-loops for each node in the graph and return a new graph.

     For heterogeneous graphs, this operation only applies to edge types with same
     source and destination node types.
@@ -246,11 +229,7 @@ class RemoveSelfLoop(BaseTransform):
     (tensor([1]), tensor([2]))
     """
     def transform_etype(self, c_etype, g):
-        r"""
-        Description
-        -----------
-        Transform the graph corresponding to a canonical edge type.
+        r"""Transform the graph corresponding to a canonical edge type.

         Parameters
         ----------
@@ -275,11 +254,7 @@ class RemoveSelfLoop(BaseTransform):
         return g

 class AddReverse(BaseTransform):
-    r"""
-    Description
-    -----------
-    Add a reverse edge :math:`(i,j)` for each edge :math:`(j,i)` in the input graph and
+    r"""Add a reverse edge :math:`(i,j)` for each edge :math:`(j,i)` in the input graph and
     return a new graph.

     For a heterogeneous graph, it adds a "reverse" edge type for each edge type
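For the homogeneous case, the operation is a one-liner on an edge list (a sketch; the `add_reverse` name is illustrative):

```python
def add_reverse(edges):
    # For every edge (src, dst), append its reverse (dst, src) after the
    # original edges, preserving the original order.
    return edges + [(dst, src) for (src, dst) in edges]
```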
@@ -343,11 +318,7 @@ class AddReverse(BaseTransform):
         self.sym_new_etype = sym_new_etype

     def transform_symmetric_etype(self, c_etype, g, data_dict):
-        r"""
-        Description
-        -----------
-        Transform the graph corresponding to a symmetric canonical edge type.
+        r"""Transform the graph corresponding to a symmetric canonical edge type.

         Parameters
         ----------
@@ -366,11 +337,7 @@ class AddReverse(BaseTransform):
         data_dict[c_etype] = (src, dst)

     def transform_asymmetric_etype(self, c_etype, g, data_dict):
-        r"""
-        Description
-        -----------
-        Transform the graph corresponding to an asymmetric canonical edge type.
+        r"""Transform the graph corresponding to an asymmetric canonical edge type.

         Parameters
         ----------
@@ -389,11 +356,7 @@ class AddReverse(BaseTransform):
         })

     def transform_etype(self, c_etype, g, data_dict):
-        r"""
-        Description
-        -----------
-        Transform the graph corresponding to a canonical edge type.
+        r"""Transform the graph corresponding to a canonical edge type.

         Parameters
         ----------
@@ -434,11 +397,7 @@ class AddReverse(BaseTransform):
         return new_g

 class ToSimple(BaseTransform):
-    r"""
-    Description
-    -----------
-    Convert a graph to a simple graph without parallel edges and return a new graph.
+    r"""Convert a graph to a simple graph without parallel edges and return a new graph.

     Parameters
     ----------
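Deduplicating parallel edges can be sketched in a few lines of pure Python (the `to_simple` helper below is illustrative and keeps the first copy of each edge, ignoring the aggregator option the real class supports):

```python
def to_simple(edges):
    # Keep the first occurrence of each (src, dst) pair; later parallel
    # copies of the same pair are dropped.
    seen = set()
    out = []
    for e in edges:
        if e not in seen:
            seen.add(e)
            out.append(e)
    return out
```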
@@ -496,11 +455,7 @@ class ToSimple(BaseTransform):
         aggregator=self.aggregator)

 class LineGraph(BaseTransform):
-    r"""
-    Description
-    -----------
-    Return the line graph of the input graph.
+    r"""Return the line graph of the input graph.

     The line graph :math:`L(G)` of a given graph :math:`G` is a graph where
     the nodes in :math:`L(G)` correspond to the edges in :math:`G`. For a pair
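The construction can be sketched over an edge list, where node ``k`` of the line graph is edge ``k`` of the input (a pure-Python illustration; the `line_graph` helper and its handling of self-pairs are assumptions, not DGL's implementation):

```python
def line_graph(edges, backtracking=True):
    # Connect edge a to edge b when a's destination equals b's source.
    # With backtracking=False, skip the pair where b is exactly the
    # reversal of a (the "backtracking" walk u -> v -> u).
    out = []
    for a, (u1, v1) in enumerate(edges):
        for b, (u2, v2) in enumerate(edges):
            if a == b or v1 != u2:
                continue
            if not backtracking and (u1, v1) == (v2, u2):
                continue
            out.append((a, b))
    return out
```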
@@ -553,11 +508,7 @@ class LineGraph(BaseTransform):
         return functional.line_graph(g, backtracking=self.backtracking, shared=True)

 class KHopGraph(BaseTransform):
-    r"""
-    Description
-    -----------
-    Return the graph whose edges connect the :math:`k`-hop neighbors of the original graph.
+    r"""Return the graph whose edges connect the :math:`k`-hop neighbors of the original graph.

     This module only works for homogeneous graphs.
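Conceptually this is the boolean :math:`k`-th power of the adjacency matrix; a small sketch (the `khop_edges` helper is illustrative, and unlike the real `khop_graph` it ignores walk multiplicities):

```python
def khop_edges(num_nodes, edges, k):
    # Connect (u, w) in the k-hop graph when a walk of exactly k edges
    # leads from u to w.
    adj = {u: set() for u in range(num_nodes)}
    for u, v in edges:
        adj[u].add(v)
    reach = {u: {u} for u in range(num_nodes)}
    for _ in range(k):
        reach = {u: {w for v in targets for w in adj[v]}
                 for u, targets in reach.items()}
    return sorted((u, w) for u in reach for w in reach[u])
```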
@@ -585,13 +536,10 @@ class KHopGraph(BaseTransform):
         return functional.khop_graph(g, self.k)

 class AddMetaPaths(BaseTransform):
-    r"""
-    Description
-    -----------
-    Add new edges to an input graph based on given metapaths, as described in
-    `Heterogeneous Graph Attention Network <https://arxiv.org/abs/1903.07293>`__. Formally,
-    a metapath is a path of the form
+    r"""Add new edges to an input graph based on given metapaths, as described in
+    `Heterogeneous Graph Attention Network <https://arxiv.org/abs/1903.07293>`__.
+
+    Formally, a metapath is a path of the form

     .. math::
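A metapath edge set is the relational composition of the per-hop edge sets; a minimal sketch (pure Python, the `metapath_edges` helper is illustrative, not DGL API):

```python
def metapath_edges(relations):
    # `relations` holds one (src, dst) edge list per hop of the metapath.
    # Compose them: keep (u, w) whenever consecutive hops can be chained
    # through a shared intermediate node v.
    pairs = set(relations[0])
    for hop in relations[1:]:
        by_src = {}
        for v, w in hop:
            by_src.setdefault(v, set()).add(w)
        pairs = {(u, w) for (u, v) in pairs for w in by_src.get(v, ())}
    return sorted(pairs)

# Author-paper-author metapath on a toy graph: two authors of one paper.
ap = [('a1', 'p1'), ('a2', 'p1')]
pa = [('p1', 'a1'), ('p1', 'a2')]
```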
@@ -655,11 +603,7 @@ class AddMetaPaths(BaseTransform):
         return new_g

 class Compose(BaseTransform):
-    r"""
-    Description
-    -----------
-    Create a transform composed of multiple transforms in sequence.
+    r"""Create a transform composed of multiple transforms in sequence.

     Parameters
     ----------
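Composition is plain sequential function application; a stand-alone sketch (the `ComposeSketch` name and the integer "graphs" in the test are illustrative, not DGL API):

```python
class ComposeSketch:
    # Hold a list of callables and apply them left to right; each takes a
    # graph and returns a (possibly new) graph.
    def __init__(self, transforms):
        self.transforms = transforms

    def __call__(self, g):
        for transform in self.transforms:
            g = transform(g)
        return g
```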
@@ -692,12 +636,8 @@ class Compose(BaseTransform):
         return self.__class__.__name__ + '([\n' + ',\n'.join(args) + '\n])'

 class GCNNorm(BaseTransform):
-    r"""
-    Description
-    -----------
-    Apply symmetric adjacency normalization to an input graph and save the result edge weights,
-    as described in `Semi-Supervised Classification with Graph Convolutional Networks
+    r"""Apply symmetric adjacency normalization to an input graph and save the result edge
+    weights, as described in `Semi-Supervised Classification with Graph Convolutional Networks
     <https://arxiv.org/abs/1609.02907>`__.

     For a heterogeneous graph, this only applies to symmetric canonical edge types, whose source
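For an unweighted homogeneous graph, the symmetric normalization assigns edge :math:`(u, v)` the weight :math:`1/\sqrt{d_u d_v}`; a sketch (the `gcn_norm` helper is illustrative and skips the edge-weight and self-loop options of the real transform):

```python
import math

def gcn_norm(num_nodes, edges):
    # Weight edge (u, v) by 1 / sqrt(out_deg(u) * in_deg(v)), the symmetric
    # normalization of the GCN paper, for an unweighted graph.
    out_deg = [0] * num_nodes
    in_deg = [0] * num_nodes
    for u, v in edges:
        out_deg[u] += 1
        in_deg[v] += 1
    return [1.0 / math.sqrt(out_deg[u] * in_deg[v]) for u, v in edges]
```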
@@ -768,15 +708,12 @@ class GCNNorm(BaseTransform):
         return g

 class PPR(BaseTransform):
-    r"""
-    Description
-    -----------
-    Apply personalized PageRank (PPR) to an input graph for diffusion, as introduced in
+    r"""Apply personalized PageRank (PPR) to an input graph for diffusion, as introduced in
     `The pagerank citation ranking: Bringing order to the web
-    <http://ilpubs.stanford.edu:8090/422/>`__. A sparsification will be applied to the
-    weighted adjacency matrix after diffusion. Specifically, edges whose weight is below
-    a threshold will be dropped.
+    <http://ilpubs.stanford.edu:8090/422/>`__.
+
+    A sparsification will be applied to the weighted adjacency matrix after diffusion.
+    Specifically, edges whose weight is below a threshold will be dropped.

     This module only works for homogeneous graphs.
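The PPR diffusion matrix :math:`S = \alpha (I - (1-\alpha)\hat{A})^{-1}` can be approximated by fixed-point iteration; a dense pure-Python sketch (the `ppr_diffusion` helper and the column-stochastic normalization are simplifying assumptions, not DGL's implementation, and sparsification is omitted):

```python
def ppr_diffusion(adj, alpha=0.15, iters=100):
    # Iterate S <- alpha * I + (1 - alpha) * A_hat @ S, which converges to
    # S = alpha * (I - (1 - alpha) * A_hat)^-1.
    n = len(adj)
    # Column-normalize the adjacency matrix so A_hat is column-stochastic.
    col = [sum(adj[i][j] for i in range(n)) or 1.0 for j in range(n)]
    a_hat = [[adj[i][j] / col[j] for j in range(n)] for i in range(n)]
    s = [[alpha if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(iters):
        a_s = [[sum(a_hat[i][k] * s[k][j] for k in range(n))
                for j in range(n)] for i in range(n)]
        s = [[alpha * (i == j) + (1 - alpha) * a_s[i][j]
              for j in range(n)] for i in range(n)]
    return s
```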
@@ -819,11 +756,7 @@ class PPR(BaseTransform):
         self.avg_degree = avg_degree

     def get_eps(self, num_nodes, mat):
-        r"""
-        Description
-        -----------
-        Get the threshold for graph sparsification.
+        r"""Get the threshold for graph sparsification.
         """
         if self.eps is None:
             # Infer from self.avg_degree
@@ -884,13 +817,10 @@ def is_bidirected(g):
 # pylint: disable=C0103
 class HeatKernel(BaseTransform):
-    r"""
-    Description
-    -----------
-    Apply heat kernel to an input graph for diffusion, as introduced in
+    r"""Apply heat kernel to an input graph for diffusion, as introduced in
     `Diffusion kernels on graphs and other discrete structures
     <https://www.ml.cmu.edu/research/dap-papers/kondor-diffusion-kernels.pdf>`__.

     A sparsification will be applied to the weighted adjacency matrix after diffusion.
     Specifically, edges whose weight is below a threshold will be dropped.
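The heat-kernel diffusion :math:`e^{t(A - I)}` can be sketched with a truncated matrix-exponential series (dense pure Python; the `heat_kernel` helper is illustrative and omits normalization and sparsification):

```python
import math

def heat_kernel(adj, t=2.0, terms=30):
    # Truncated series exp(t * (A - I)) = sum_k t^k (A - I)^k / k!.
    n = len(adj)
    m = [[adj[i][j] - (i == j) for j in range(n)] for i in range(n)]  # A - I
    result = [[float(i == j) for j in range(n)] for i in range(n)]    # k = 0 term
    power = [row[:] for row in result]                                # (A - I)^k
    fact = 1.0
    for k in range(1, terms):
        power = [[sum(power[i][l] * m[l][j] for l in range(n))
                  for j in range(n)] for i in range(n)]
        fact *= k
        coef = t ** k / fact
        result = [[result[i][j] + coef * power[i][j]
                   for j in range(n)] for i in range(n)]
    return result

# A single node with no edges: the kernel reduces to exp(-t).
hk = heat_kernel([[0.0]], t=1.0)
```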
@@ -935,11 +865,7 @@ class HeatKernel(BaseTransform):
         self.avg_degree = avg_degree

     def get_eps(self, num_nodes, mat):
-        r"""
-        Description
-        -----------
-        Get the threshold for graph sparsification.
+        r"""Get the threshold for graph sparsification.
         """
         if self.eps is None:
             # Infer from self.avg_degree
@@ -983,14 +909,11 @@ class HeatKernel(BaseTransform):
         return new_g

 class GDC(BaseTransform):
-    r"""
-    Description
-    -----------
-    Apply graph diffusion convolution (GDC) to an input graph, as introduced in
-    `Diffusion Improves Graph Learning <https://www.in.tum.de/daml/gdc/>`__. A sparsification
-    will be applied to the weighted adjacency matrix after diffusion. Specifically, edges whose
-    weight is below a threshold will be dropped.
+    r"""Apply graph diffusion convolution (GDC) to an input graph, as introduced in
+    `Diffusion Improves Graph Learning <https://www.in.tum.de/daml/gdc/>`__.
+
+    A sparsification will be applied to the weighted adjacency matrix after diffusion.
+    Specifically, edges whose weight is below a threshold will be dropped.

     This module only works for homogeneous graphs.
@@ -1033,12 +956,7 @@ class GDC(BaseTransform):
         self.avg_degree = avg_degree

     def get_eps(self, num_nodes, mat):
-        r"""
-        Description
-        -----------
-        Get the threshold for graph sparsification.
-        """
+        r"""Get the threshold for graph sparsification."""
         if self.eps is None:
             # Infer from self.avg_degree
             if self.avg_degree > num_nodes:
@@ -1079,11 +997,7 @@ class GDC(BaseTransform):
         return new_g

 class NodeShuffle(BaseTransform):
-    r"""
-    Description
-    -----------
-    Randomly shuffle the nodes.
+    r"""Randomly shuffle the nodes.

     Example
     -------
@@ -1116,11 +1030,7 @@ class NodeShuffle(BaseTransform):
 # pylint: disable=C0103
 class DropNode(BaseTransform):
-    r"""
-    Description
-    -----------
-    Randomly drop nodes, as described in
+    r"""Randomly drop nodes, as described in
     `Graph Contrastive Learning with Augmentations <https://arxiv.org/abs/2010.13902>`__.

     Parameters
@@ -1166,11 +1076,7 @@ class DropNode(BaseTransform):
 # pylint: disable=C0103
 class DropEdge(BaseTransform):
-    r"""
-    Description
-    -----------
-    Randomly drop edges, as described in
+    r"""Randomly drop edges, as described in
     `DropEdge: Towards Deep Graph Convolutional Networks on Node Classification
     <https://arxiv.org/abs/1907.10903>`__ and `Graph Contrastive Learning with Augmentations
     <https://arxiv.org/abs/2010.13902>`__.
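The core of edge dropping is an independent Bernoulli trial per edge; a minimal sketch on an edge list (the `drop_edge` helper and seeded RNG are illustrative, not DGL's implementation):

```python
import random

def drop_edge(edges, p=0.5, seed=0):
    # Keep each edge independently with probability 1 - p.
    rng = random.Random(seed)
    return [e for e in edges if rng.random() >= p]
```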
@@ -1214,11 +1120,7 @@ class DropEdge(BaseTransform):
         return g

 class AddEdge(BaseTransform):
-    r"""
-    Description
-    -----------
-    Randomly add edges, as described in `Graph Contrastive Learning with Augmentations
+    r"""Randomly add edges, as described in `Graph Contrastive Learning with Augmentations
     <https://arxiv.org/abs/2010.13902>`__.

     Parameters
......