OpenDAS / dgl · Commits

Unverified commit 3bd5a9b6, authored Feb 25, 2022 by Mufei Li, committed by GitHub on Feb 25, 2022
Parent: 6d9433b0

[Doc] Fix Doc (#3777)

* Update * Update * Update * Update

Showing 7 changed files with 268 additions and 393 deletions (+268, -393)
docs/source/api/python/index.rst          +3    -1
docs/source/api/python/nn-mxnet.rst       +74   -0
docs/source/api/python/nn-pytorch.rst     +107  -0
docs/source/api/python/nn-tensorflow.rst  +45   -0
docs/source/api/python/nn.rst             +0    -257
docs/source/index.rst                     +3    -1
python/dgl/transforms/module.py           +36   -134
docs/source/api/python/index.rst

@@ -10,7 +10,9 @@ API Reference
    dgl.DGLGraph
    dgl.distributed
    dgl.function
-   nn
+   nn-pytorch
+   nn-tensorflow
+   nn-mxnet
    dgl.ops
    dgl.sampling
    dgl.contrib.UnifiedTensor
docs/source/api/python/nn-mxnet.rst (new file, 0 → 100644)
.. _apinn-mxnet:
dgl.nn (MXNet)
================
Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.conv.GraphConv
~dgl.nn.mxnet.conv.RelGraphConv
~dgl.nn.mxnet.conv.TAGConv
~dgl.nn.mxnet.conv.GATConv
~dgl.nn.mxnet.conv.EdgeConv
~dgl.nn.mxnet.conv.SAGEConv
~dgl.nn.mxnet.conv.SGConv
~dgl.nn.mxnet.conv.APPNPConv
~dgl.nn.mxnet.conv.GINConv
~dgl.nn.mxnet.conv.GatedGraphConv
~dgl.nn.mxnet.conv.GMMConv
~dgl.nn.mxnet.conv.ChebConv
~dgl.nn.mxnet.conv.AGNNConv
~dgl.nn.mxnet.conv.NNConv
Dense Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.conv.DenseGraphConv
~dgl.nn.mxnet.conv.DenseSAGEConv
~dgl.nn.mxnet.conv.DenseChebConv
Global Pooling Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.glob.SumPooling
~dgl.nn.mxnet.glob.AvgPooling
~dgl.nn.mxnet.glob.MaxPooling
~dgl.nn.mxnet.glob.SortPooling
~dgl.nn.mxnet.glob.GlobalAttentionPooling
~dgl.nn.mxnet.glob.Set2Set
Heterogeneous Learning Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.HeteroGraphConv
Utility Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.mxnet.utils.Sequential
docs/source/api/python/nn-pytorch.rst (new file, 0 → 100644)
.. _apinn-pytorch:
dgl.nn (PyTorch)
================
Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.conv.GraphConv
~dgl.nn.pytorch.conv.EdgeWeightNorm
~dgl.nn.pytorch.conv.RelGraphConv
~dgl.nn.pytorch.conv.TAGConv
~dgl.nn.pytorch.conv.GATConv
~dgl.nn.pytorch.conv.GATv2Conv
~dgl.nn.pytorch.conv.EGATConv
~dgl.nn.pytorch.conv.EdgeConv
~dgl.nn.pytorch.conv.SAGEConv
~dgl.nn.pytorch.conv.SGConv
~dgl.nn.pytorch.conv.APPNPConv
~dgl.nn.pytorch.conv.GINConv
~dgl.nn.pytorch.conv.GatedGraphConv
~dgl.nn.pytorch.conv.GMMConv
~dgl.nn.pytorch.conv.ChebConv
~dgl.nn.pytorch.conv.AGNNConv
~dgl.nn.pytorch.conv.NNConv
~dgl.nn.pytorch.conv.AtomicConv
~dgl.nn.pytorch.conv.CFConv
~dgl.nn.pytorch.conv.DotGatConv
~dgl.nn.pytorch.conv.TWIRLSConv
~dgl.nn.pytorch.conv.TWIRLSUnfoldingAndAttention
~dgl.nn.pytorch.conv.GCN2Conv
Dense Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.conv.DenseGraphConv
~dgl.nn.pytorch.conv.DenseSAGEConv
~dgl.nn.pytorch.conv.DenseChebConv
Global Pooling Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.glob.SumPooling
~dgl.nn.pytorch.glob.AvgPooling
~dgl.nn.pytorch.glob.MaxPooling
~dgl.nn.pytorch.glob.SortPooling
~dgl.nn.pytorch.glob.WeightAndSum
~dgl.nn.pytorch.glob.GlobalAttentionPooling
~dgl.nn.pytorch.glob.Set2Set
~dgl.nn.pytorch.glob.SetTransformerEncoder
~dgl.nn.pytorch.glob.SetTransformerDecoder
Score Modules for Link Prediction and Knowledge Graph Completion
----------------------------------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.link.EdgePredictor
~dgl.nn.pytorch.link.TransE
~dgl.nn.pytorch.link.TransR
Heterogeneous Learning Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.HeteroGraphConv
~dgl.nn.pytorch.HeteroLinear
~dgl.nn.pytorch.HeteroEmbedding
~dgl.nn.pytorch.TypedLinear
Utility Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.pytorch.utils.Sequential
~dgl.nn.pytorch.utils.WeightBasis
~dgl.nn.pytorch.factory.KNNGraph
~dgl.nn.pytorch.factory.SegmentedKNNGraph
~dgl.nn.pytorch.utils.JumpingKnowledge
~dgl.nn.pytorch.sparse_emb.NodeEmbedding
~dgl.nn.pytorch.explain.GNNExplainer
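Most of the conv layers listed above are variants of message passing with neighborhood aggregation. As a rough, framework-free sketch of what a layer like GraphConv computes, namely :math:`H' = \hat{D}^{-1/2}(A + I)\hat{D}^{-1/2} H W` from Kipf & Welling, and not DGL's actual implementation (the function name and shapes here are illustrative):

```python
import numpy as np

# Hypothetical sketch, not DGL's code: symmetrically normalized neighbor
# averaging with self-loops, followed by a linear projection.
def graph_conv(adj, feats, weight):
    a_hat = adj + np.eye(adj.shape[0])        # add self-loops: A + I
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D^{-1/2}
    return d_inv_sqrt @ a_hat @ d_inv_sqrt @ feats @ weight

adj = np.array([[0., 1.], [1., 0.]])          # one undirected edge
feats = np.array([[1.], [3.]])
out = graph_conv(adj, feats, np.eye(1))       # both nodes average to 2.0
```

The actual DGL layers additionally handle sparse graphs, batching, and trainable weights; this only shows the propagation rule.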
docs/source/api/python/nn-tensorflow.rst (new file, 0 → 100644)
.. _apinn-tensorflow:
dgl.nn (TensorFlow)
===================
Conv Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.tensorflow.conv.GraphConv
~dgl.nn.tensorflow.conv.RelGraphConv
~dgl.nn.tensorflow.conv.GATConv
~dgl.nn.tensorflow.conv.SAGEConv
~dgl.nn.tensorflow.conv.ChebConv
~dgl.nn.tensorflow.conv.SGConv
~dgl.nn.tensorflow.conv.APPNPConv
~dgl.nn.tensorflow.conv.GINConv
Global Pooling Layers
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~dgl.nn.tensorflow.glob.SumPooling
~dgl.nn.tensorflow.glob.AvgPooling
~dgl.nn.tensorflow.glob.MaxPooling
~dgl.nn.tensorflow.glob.SortPooling
~dgl.nn.tensorflow.glob.GlobalAttentionPooling
Heterogeneous Learning Modules
----------------------------------------
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
   ~dgl.nn.tensorflow.HeteroGraphConv
docs/source/api/python/nn.rst (deleted, 100644 → 0)
.. _apinn:
dgl.nn
==========
PyTorch
----------------------------------------
Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.pytorch.conv
.. automodule:: dgl.nn.pytorch.conv
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
GraphConv
EdgeWeightNorm
RelGraphConv
TAGConv
GATConv
GATv2Conv
EGATConv
EdgeConv
SAGEConv
SGConv
APPNPConv
GINConv
GatedGraphConv
GMMConv
ChebConv
AGNNConv
NNConv
AtomicConv
CFConv
DotGatConv
TWIRLSConv
TWIRLSUnfoldingAndAttention
GCN2Conv
Dense Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
DenseGraphConv
DenseSAGEConv
DenseChebConv
Global Pooling Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.pytorch.glob
.. automodule:: dgl.nn.pytorch.glob
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
SumPooling
AvgPooling
MaxPooling
SortPooling
WeightAndSum
GlobalAttentionPooling
Set2Set
SetTransformerEncoder
SetTransformerDecoder
Score Modules for Link Prediction and Knowledge Graph Completion
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.pytorch.link
.. automodule:: dgl.nn.pytorch.link
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
EdgePredictor
TransE
TransR
Heterogeneous Learning Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.pytorch
.. automodule:: dgl.nn.pytorch
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
HeteroGraphConv
HeteroLinear
HeteroEmbedding
TypedLinear
Utility Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~utils.Sequential
~utils.WeightBasis
~factory.KNNGraph
~factory.SegmentedKNNGraph
~utils.JumpingKnowledge
~sparse_emb.NodeEmbedding
~explain.GNNExplainer
TensorFlow
----------------------------------------
Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.tensorflow.conv
.. automodule:: dgl.nn.tensorflow.conv
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
GraphConv
RelGraphConv
GATConv
SAGEConv
ChebConv
SGConv
APPNPConv
GINConv
Global Pooling Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.tensorflow.glob
.. automodule:: dgl.nn.tensorflow.glob
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
SumPooling
AvgPooling
MaxPooling
SortPooling
GlobalAttentionPooling
Heterogeneous Learning Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.tensorflow
.. automodule:: dgl.nn.tensorflow
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
HeteroGraphConv
MXNet
----------------------------------------
Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.mxnet.conv
.. automodule:: dgl.nn.mxnet.conv
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
GraphConv
RelGraphConv
TAGConv
GATConv
EdgeConv
SAGEConv
SGConv
APPNPConv
GINConv
GatedGraphConv
GMMConv
ChebConv
AGNNConv
NNConv
Dense Conv Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
DenseGraphConv
DenseSAGEConv
DenseChebConv
Global Pooling Layers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.mxnet.glob
.. automodule:: dgl.nn.mxnet.glob
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
SumPooling
AvgPooling
MaxPooling
SortPooling
GlobalAttentionPooling
Set2Set
Heterogeneous Learning Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. currentmodule:: dgl.nn.mxnet
.. automodule:: dgl.nn.mxnet
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
HeteroGraphConv
Utility Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autosummary::
:toctree: ../../generated/
:nosignatures:
:template: classtemplate.rst
~utils.Sequential
docs/source/index.rst

@@ -44,7 +44,9 @@ Welcome to Deep Graph Library Tutorials and Documentation
    api/python/dgl.distributed
    api/python/dgl.function
    api/python/dgl.geometry
-   api/python/nn
+   api/python/nn-pytorch
+   api/python/nn-tensorflow
+   api/python/nn-mxnet
    api/python/nn.functional
    api/python/dgl.ops
    api/python/dgl.optim
python/dgl/transforms/module.py
@@ -50,11 +50,7 @@ __all__ = [
 ]

 def update_graph_structure(g, data_dict, copy_edata=True):
-    r"""
-    Description
-    -----------
-    Update the structure of a graph.
+    r"""Update the structure of a graph.

     Parameters
     ----------
@@ -93,12 +89,7 @@ def update_graph_structure(g, data_dict, copy_edata=True):
     return new_g

 class BaseTransform:
-    r"""
-    Description
-    -----------
-    An abstract class for writing transforms.
-    """
+    r"""An abstract class for writing transforms."""

     def __call__(self, g):
         raise NotImplementedError
@@ -106,11 +97,7 @@ class BaseTransform:
         return self.__class__.__name__ + '()'

 class AddSelfLoop(BaseTransform):
-    r"""
-    Description
-    -----------
-    Add self-loops for each node in the graph and return a new graph.
+    r"""Add self-loops for each node in the graph and return a new graph.

     For heterogeneous graphs, self-loops are added only for edge types with same
     source and destination node types.
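The semantics the new docstring describes are simple on a homogeneous edge list; a minimal framework-free sketch (the function name is illustrative, not DGL's code):

```python
# Hedged sketch of AddSelfLoop's behavior, not DGL's implementation:
# append one (v, v) edge per node and return a new edge list.
def add_self_loops(edges, num_nodes):
    return list(edges) + [(v, v) for v in range(num_nodes)]

g = add_self_loops([(0, 1)], num_nodes=3)
# g is [(0, 1), (0, 0), (1, 1), (2, 2)]
```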
@@ -210,11 +197,7 @@ class AddSelfLoop(BaseTransform):
         return g

 class RemoveSelfLoop(BaseTransform):
-    r"""
-    Description
-    -----------
-    Remove self-loops for each node in the graph and return a new graph.
+    r"""Remove self-loops for each node in the graph and return a new graph.

     For heterogeneous graphs, this operation only applies to edge types with same
     source and destination node types.
@@ -246,11 +229,7 @@ class RemoveSelfLoop(BaseTransform):
     (tensor([1]), tensor([2]))
     """

     def transform_etype(self, c_etype, g):
-        r"""
-        Description
-        -----------
-        Transform the graph corresponding to a canonical edge type.
+        r"""Transform the graph corresponding to a canonical edge type.

         Parameters
         ----------
@@ -275,11 +254,7 @@ class RemoveSelfLoop(BaseTransform):
         return g

 class AddReverse(BaseTransform):
-    r"""
-    Description
-    -----------
-    Add a reverse edge :math:`(i,j)` for each edge :math:`(j,i)` in the input graph and
+    r"""Add a reverse edge :math:`(i,j)` for each edge :math:`(j,i)` in the input graph and
     return a new graph.

     For a heterogeneous graph, it adds a "reverse" edge type for each edge type
@@ -343,11 +318,7 @@ class AddReverse(BaseTransform):
         self.sym_new_etype = sym_new_etype

     def transform_symmetric_etype(self, c_etype, g, data_dict):
-        r"""
-        Description
-        -----------
-        Transform the graph corresponding to a symmetric canonical edge type.
+        r"""Transform the graph corresponding to a symmetric canonical edge type.

         Parameters
         ----------
@@ -366,11 +337,7 @@ class AddReverse(BaseTransform):
         data_dict[c_etype] = (src, dst)

     def transform_asymmetric_etype(self, c_etype, g, data_dict):
-        r"""
-        Description
-        -----------
-        Transform the graph corresponding to an asymmetric canonical edge type.
+        r"""Transform the graph corresponding to an asymmetric canonical edge type.

         Parameters
         ----------
@@ -389,11 +356,7 @@ class AddReverse(BaseTransform):
         })

     def transform_etype(self, c_etype, g, data_dict):
-        r"""
-        Description
-        -----------
-        Transform the graph corresponding to a canonical edge type.
+        r"""Transform the graph corresponding to a canonical edge type.

         Parameters
         ----------
@@ -434,11 +397,7 @@ class AddReverse(BaseTransform):
         return new_g

 class ToSimple(BaseTransform):
-    r"""
-    Description
-    -----------
-    Convert a graph to a simple graph without parallel edges and return a new graph.
+    r"""Convert a graph to a simple graph without parallel edges and return a new graph.

     Parameters
     ----------
@@ -496,11 +455,7 @@ class ToSimple(BaseTransform):
             aggregator=self.aggregator)

 class LineGraph(BaseTransform):
-    r"""
-    Description
-    -----------
-    Return the line graph of the input graph.
+    r"""Return the line graph of the input graph.

     The line graph :math:`L(G)` of a given graph :math:`G` is a graph where
     the nodes in :math:`L(G)` correspond to the edges in :math:`G`. For a pair
@@ -553,11 +508,7 @@ class LineGraph(BaseTransform):
         return functional.line_graph(g, backtracking=self.backtracking, shared=True)

 class KHopGraph(BaseTransform):
-    r"""
-    Description
-    -----------
-    Return the graph whose edges connect the :math:`k`-hop neighbors of the original graph.
+    r"""Return the graph whose edges connect the :math:`k`-hop neighbors of the original graph.

     This module only works for homogeneous graphs.
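The :math:`k`-hop construction the docstring describes amounts to taking the :math:`k`-th power of the adjacency matrix: :math:`(A^k)_{uv} \neq 0` iff a walk of exactly :math:`k` edges leads from :math:`u` to :math:`v`. A pure-Python sketch of that idea (illustrative only, not DGL's `khop_graph`):

```python
# Hedged sketch, not DGL's implementation: enumerate k-hop edges via A**k.
def khop_edges(edges, num_nodes, k):
    adj = [[0] * num_nodes for _ in range(num_nodes)]
    for u, v in edges:
        adj[u][v] = 1
    reach = adj
    for _ in range(k - 1):  # multiply k - 1 more times to get A**k
        reach = [[sum(reach[i][m] * adj[m][j] for m in range(num_nodes))
                  for j in range(num_nodes)] for i in range(num_nodes)]
    return [(i, j) for i in range(num_nodes)
            for j in range(num_nodes) if reach[i][j]]

# Directed path 0 -> 1 -> 2: the only 2-hop edge is (0, 2).
e = khop_edges([(0, 1), (1, 2)], num_nodes=3, k=2)
```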
@@ -585,13 +536,10 @@ class KHopGraph(BaseTransform):
         return functional.khop_graph(g, self.k)

 class AddMetaPaths(BaseTransform):
-    r"""
-    Description
-    -----------
-    Add new edges to an input graph based on given metapaths, as described in
-    `Heterogeneous Graph Attention Network <https://arxiv.org/abs/1903.07293>`__. Formally,
-    a metapath is a path of the form
+    r"""Add new edges to an input graph based on given metapaths, as described in
+    `Heterogeneous Graph Attention Network <https://arxiv.org/abs/1903.07293>`__.
+
+    Formally, a metapath is a path of the form

     .. math::
@@ -655,11 +603,7 @@ class AddMetaPaths(BaseTransform):
         return new_g

 class Compose(BaseTransform):
-    r"""
-    Description
-    -----------
-    Create a transform composed of multiple transforms in sequence.
+    r"""Create a transform composed of multiple transforms in sequence.

     Parameters
     ----------
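Compose is a standard callable-chaining pattern; a minimal framework-free sketch of the same idea (class and toy usage are illustrative, not DGL's code, which chains graph transforms rather than integers):

```python
# Hedged sketch of the Compose pattern, not DGL's implementation:
# apply a sequence of callables to an input, left to right.
class Compose:
    def __init__(self, transforms):
        self.transforms = transforms

    def __call__(self, g):
        for t in self.transforms:
            g = t(g)
        return g

# Toy usage with integers standing in for graphs: (3 + 1) * 2 = 8.
pipeline = Compose([lambda x: x + 1, lambda x: x * 2])
result = pipeline(3)
```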
@@ -692,12 +636,8 @@ class Compose(BaseTransform):
         return self.__class__.__name__ + '([\n' + ',\n'.join(args) + '\n])'

 class GCNNorm(BaseTransform):
-    r"""
-    Description
-    -----------
-    Apply symmetric adjacency normalization to an input graph and save the result edge weights,
-    as described in `Semi-Supervised Classification with Graph Convolutional Networks
+    r"""Apply symmetric adjacency normalization to an input graph and save the result edge
+    weights, as described in `Semi-Supervised Classification with Graph Convolutional Networks
     <https://arxiv.org/abs/1609.02907>`__.

     For a heterogeneous graph, this only applies to symmetric canonical edge types, whose source
@@ -768,15 +708,12 @@ class GCNNorm(BaseTransform):
         return g

 class PPR(BaseTransform):
-    r"""
-    Description
-    -----------
-    Apply personalized PageRank (PPR) to an input graph for diffusion, as introduced in
-    `The pagerank citation ranking: Bringing order to the web
-    <http://ilpubs.stanford.edu:8090/422/>`__. A sparsification will be applied to the
-    weighted adjacency matrix after diffusion. Specifically, edges whose weight is below
-    a threshold will be dropped.
+    r"""Apply personalized PageRank (PPR) to an input graph for diffusion, as introduced in
+    `The pagerank citation ranking: Bringing order to the web
+    <http://ilpubs.stanford.edu:8090/422/>`__.
+
+    A sparsification will be applied to the weighted adjacency matrix after diffusion.
+    Specifically, edges whose weight is below a threshold will be dropped.

     This module only works for homogeneous graphs.
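The PPR diffusion plus sparsification that the docstring describes can be written in closed form as :math:`S = \alpha (I - (1-\alpha)T)^{-1}` for a transition matrix :math:`T`, with small entries of :math:`S` then zeroed out. A dense numpy sketch of that formula (parameter names and defaults are illustrative, not DGL's implementation):

```python
import numpy as np

# Hedged sketch of PPR diffusion with thresholding, not DGL's code.
def ppr_diffusion(adj, alpha=0.15, eps=1e-4):
    t = adj / adj.sum(axis=0, keepdims=True)  # column-stochastic transition
    n = adj.shape[0]
    s = alpha * np.linalg.inv(np.eye(n) - (1 - alpha) * t)
    s[s < eps] = 0.0                          # sparsification step
    return s

s = ppr_diffusion(np.array([[0., 1.], [1., 0.]]))
```

Because `t` is column-stochastic, each column of the diffused matrix sums to 1 before any entries are dropped.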
@@ -819,11 +756,7 @@ class PPR(BaseTransform):
         self.avg_degree = avg_degree

     def get_eps(self, num_nodes, mat):
-        r"""
-        Description
-        -----------
-        Get the threshold for graph sparsification.
+        r"""Get the threshold for graph sparsification.
         """
         if self.eps is None:
             # Infer from self.avg_degree
@@ -884,13 +817,10 @@ def is_bidirected(g):
 # pylint: disable=C0103
 class HeatKernel(BaseTransform):
-    r"""
-    Description
-    -----------
-    Apply heat kernel to an input graph for diffusion, as introduced in
+    r"""Apply heat kernel to an input graph for diffusion, as introduced in
     `Diffusion kernels on graphs and other discrete structures
     <https://www.ml.cmu.edu/research/dap-papers/kondor-diffusion-kernels.pdf>`__.

     A sparsification will be applied to the weighted adjacency matrix after diffusion.
     Specifically, edges whose weight is below a threshold will be dropped.
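Heat-kernel diffusion centers on a matrix exponential; one common form is :math:`\exp(t(T - I))` for a transition matrix :math:`T`. The sketch below approximates it with a truncated Taylor series and then drops small entries, mirroring the sparsification the docstring mentions (illustrative only, not DGL's implementation, whose exact formulation may differ):

```python
import numpy as np

# Hedged sketch of heat-kernel diffusion with thresholding, not DGL's code.
def heat_diffusion(adj, t=1.0, eps=1e-4, terms=30):
    trans = adj / adj.sum(axis=0, keepdims=True)
    n = adj.shape[0]
    m = t * (trans - np.eye(n))
    s, term = np.eye(n), np.eye(n)
    for k in range(1, terms):   # exp(M) = sum_k M**k / k!
        term = term @ m / k
        s = s + term
    s[s < eps] = 0.0            # sparsification step
    return s

s = heat_diffusion(np.array([[0., 1.], [1., 0.]]))
```

For this two-node graph the exact result is :math:`\tfrac{1}{2}(1 + e^{-2})` on the diagonal, which the truncated series matches to well under 1e-6.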
@@ -935,11 +865,7 @@ class HeatKernel(BaseTransform):
         self.avg_degree = avg_degree

     def get_eps(self, num_nodes, mat):
-        r"""
-        Description
-        -----------
-        Get the threshold for graph sparsification.
+        r"""Get the threshold for graph sparsification.
         """
         if self.eps is None:
             # Infer from self.avg_degree
@@ -983,14 +909,11 @@ class HeatKernel(BaseTransform):
         return new_g

 class GDC(BaseTransform):
-    r"""
-    Description
-    -----------
-    Apply graph diffusion convolution (GDC) to an input graph, as introduced in
-    `Diffusion Improves Graph Learning <https://www.in.tum.de/daml/gdc/>`__. A sparsification
-    will be applied to the weighted adjacency matrix after diffusion. Specifically, edges whose
-    weight is below a threshold will be dropped.
+    r"""Apply graph diffusion convolution (GDC) to an input graph, as introduced in
+    `Diffusion Improves Graph Learning <https://www.in.tum.de/daml/gdc/>`__.
+
+    A sparsification will be applied to the weighted adjacency matrix after diffusion.
+    Specifically, edges whose weight is below a threshold will be dropped.

     This module only works for homogeneous graphs.
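The `get_eps` helpers touched in these hunks infer a sparsification threshold when none is given. One natural rule, sketched here purely as an illustration (function name and selection rule are assumptions, not necessarily DGL's exact logic), is to keep roughly `avg_degree` entries per node by taking the `(avg_degree * num_nodes)`-th largest diffused weight as the cutoff:

```python
# Hypothetical sketch of inferring eps from a target average degree.
def infer_eps(weights, num_nodes, avg_degree):
    flat = sorted((w for row in weights for w in row), reverse=True)
    keep = min(avg_degree * num_nodes, len(flat))
    return flat[keep - 1]   # cutoff keeping ~avg_degree entries per node

w = [[0.9, 0.1], [0.4, 0.6]]
eps = infer_eps(w, num_nodes=2, avg_degree=1)   # keeps the top 2 of 4 weights
```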
@@ -1033,12 +956,7 @@ class GDC(BaseTransform):
         self.avg_degree = avg_degree

     def get_eps(self, num_nodes, mat):
-        r"""
-        Description
-        -----------
-        Get the threshold for graph sparsification.
-        """
+        r"""Get the threshold for graph sparsification."""
         if self.eps is None:
             # Infer from self.avg_degree
             if self.avg_degree > num_nodes:
@@ -1079,11 +997,7 @@ class GDC(BaseTransform):
         return new_g

 class NodeShuffle(BaseTransform):
-    r"""
-    Description
-    -----------
-    Randomly shuffle the nodes.
+    r"""Randomly shuffle the nodes.

     Example
     -------
@@ -1116,11 +1030,7 @@ class NodeShuffle(BaseTransform):
 # pylint: disable=C0103
 class DropNode(BaseTransform):
-    r"""
-    Description
-    -----------
-    Randomly drop nodes, as described in
+    r"""Randomly drop nodes, as described in
     `Graph Contrastive Learning with Augmentations <https://arxiv.org/abs/2010.13902>`__.

     Parameters
@@ -1166,11 +1076,7 @@ class DropNode(BaseTransform):
 # pylint: disable=C0103
 class DropEdge(BaseTransform):
-    r"""
-    Description
-    -----------
-    Randomly drop edges, as described in
+    r"""Randomly drop edges, as described in
     `DropEdge: Towards Deep Graph Convolutional Networks on Node Classification
     <https://arxiv.org/abs/1907.10903>`__ and `Graph Contrastive Learning with Augmentations
     <https://arxiv.org/abs/2010.13902>`__.
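The DropEdge augmentation described here removes each edge independently with probability `p`; a minimal sketch of that semantics on a plain edge list (illustrative, not DGL's tensor-based implementation):

```python
import random

# Hedged sketch of DropEdge, not DGL's code: keep each edge with
# probability 1 - p, using an explicit RNG for reproducibility.
def drop_edge(edges, p, rng=None):
    rng = rng or random.Random(0)
    return [e for e in edges if rng.random() >= p]

kept = drop_edge([(0, 1), (1, 2), (2, 3)], p=0.0)  # p=0 keeps everything
```

With `p=1.0` every edge is removed, since `random()` is always strictly below 1.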
@@ -1214,11 +1120,7 @@ class DropEdge(BaseTransform):
         return g

 class AddEdge(BaseTransform):
-    r"""
-    Description
-    -----------
-    Randomly add edges, as described in `Graph Contrastive Learning with Augmentations
+    r"""Randomly add edges, as described in `Graph Contrastive Learning with Augmentations
     <https://arxiv.org/abs/2010.13902>`__.

     Parameters