"src/vscode:/vscode.git/clone" did not exist on "cb5e3489c0694107295bc70804de1559e97cdb89"
Unverified Commit ef78d675 authored by Jinjing Zhou, committed by GitHub

Fix docs (#2073)

* remove mxnet tutorial

* remove sse

* fix docs
parent 28deee4d
@@ -35,18 +35,18 @@ predecessors (or *neighbors* if the graph is undirected) of :math:`v` on graph
For instance, to perform a message passing for updating the red node in
the following graph:
-.. figure:: https://i.imgur.com/xYPtaoy.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_0.png
:alt: Imgur
Imgur
One needs to aggregate the node features of its neighbors, shown as
green nodes:
-.. figure:: https://i.imgur.com/OuvExp1.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_1.png
:alt: Imgur
Imgur
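As a hedged sketch (the toy graph, the feature name, and the mean reducer below are assumptions, not part of the guide), one round of such neighbor aggregation in DGL could look like:

.. code:: python

    import dgl
    import dgl.function as fn
    import torch

    # A toy directed graph; each edge (u, v) carries a message from u to v,
    # so nodes 0, 1 and 2 are the in-neighbors of node 3.
    g = dgl.graph((torch.tensor([0, 1, 2]), torch.tensor([3, 3, 3])))
    g.ndata['h'] = torch.randn(g.num_nodes(), 5)   # input node features

    # Gather the neighbors' features and average them into a new feature.
    g.update_all(fn.copy_u('h', 'm'), fn.mean('m', 'h_agg'))
    print(g.ndata['h_agg'][3])                     # aggregated features for node 3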
Neighborhood sampling with pencil and paper
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -76,10 +76,10 @@ Finding the message passing dependency
Consider using a 2-layer GNN to compute the output of seed node 8,
colored red, in the following graph:
-.. figure:: https://i.imgur.com/xYPtaoy.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_2.png
:alt: Imgur
Imgur
By the formulation:
@@ -107,10 +107,10 @@ We can tell from the formulation that to compute
:math:`\boldsymbol{h}_8^{(2)}` we need messages from nodes 4, 5, 7 and 11
(colored green) along the edges visualized below.
-.. figure:: https://i.imgur.com/Gwjz05H.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_3.png
:alt: Imgur
Imgur
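A hedged sketch of extracting this kind of one-layer dependency graph with :func:`dgl.in_subgraph` (the random graph below is a stand-in for the one in the figure):

.. code:: python

    import dgl
    import torch

    g = dgl.rand_graph(12, 40)    # a random stand-in graph
    seed = torch.tensor([8])      # the red output node

    # Keep every node of ``g`` but only the edges that point into the seed.
    dependency = dgl.in_subgraph(g, seed)
    print(dependency.num_nodes(), dependency.num_edges())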
This graph contains all the nodes in the original graph but only the
edges necessary for message passing to the given output nodes. We call
@@ -149,10 +149,10 @@ bipartite-structured graph that only contains the necessary input nodes
and output nodes a *block*. The following figure shows the block of the
second GNN layer for node 8.
-.. figure:: https://i.imgur.com/stB2UlR.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_4.png
:alt: Imgur
Imgur
Note that the output nodes also appear in the input nodes. The reason is
that representations of output nodes from the previous layer are needed
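A hedged sketch (stand-in graph, arbitrary seed) of building such a block with :func:`dgl.to_block` and inspecting its input and output nodes:

.. code:: python

    import dgl
    import torch

    g = dgl.rand_graph(12, 40)             # stand-in graph
    seed = torch.tensor([8])

    frontier = dgl.in_subgraph(g, seed)    # all nodes, only edges entering node 8
    block = dgl.to_block(frontier, seed)   # bipartite block for this layer

    # The output (destination) nodes are listed first among the input (source)
    # nodes, which is why output nodes also appear on the input side.
    print(block.srcdata[dgl.NID])
    print(block.dstdata[dgl.NID])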
@@ -234,10 +234,10 @@ destination of an edge in the frontier.
For example, consider the following frontier
-.. figure:: https://i.imgur.com/g5Ptbj7.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_5.png
:alt: Imgur
Imgur
where the red and green nodes (i.e. nodes 4, 5, 7, 8, and 11) are all
nodes that are destinations of an edge. Then the following code will
......
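A hedged illustration of building such a frontier and checking that all of its edge destinations lie in the chosen node set (the graph and fan-out below are assumptions):

.. code:: python

    import dgl
    import torch

    g = dgl.rand_graph(12, 40)               # stand-in graph
    seeds = torch.tensor([4, 5, 7, 8, 11])   # the red and green nodes above

    # Sample all in-edges of the seeds; the frontier keeps every node of ``g``.
    frontier = dgl.sampling.sample_neighbors(g, seeds, -1)

    # Every destination endpoint of the frontier's edges lies in the seed set.
    src, dst = frontier.edges()
    print(torch.unique(dst))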
@@ -26,17 +26,17 @@ passing.
The following animation shows what the computation looks like (note
that for every layer, only the first three minibatches are drawn).
-.. figure:: https://i.imgur.com/rr1FG7S.gif
+.. figure:: https://data.dgl.ai/asset/image/guide_6_6_0.gif
:alt: Imgur
Imgur
Implementing Offline Inference
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Consider the two-layer GCN we have mentioned in Section 6.5.1. The way
to implement offline inference still involves using
-```MultiLayerFullNeighborSampler`` <https://todo>`__, but sampling for
+:class:`~dgl.dataloading.neighbor.MultiLayerFullNeighborSampler`, but sampling for
only one layer at a time. Note that offline inference is implemented as
a method of the GNN module because the computation on one layer depends
on how messages are aggregated and combined as well.
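A condensed sketch of this pattern is shown below; the PyTorch backend, the ``GraphConv`` layers, and all class and argument names are illustrative assumptions rather than the guide's exact code:

.. code:: python

    import dgl
    import dgl.nn as dglnn
    import torch
    import torch.nn as nn

    class TwoLayerGCN(nn.Module):
        def __init__(self, in_size, hidden_size, out_size):
            super().__init__()
            self.layers = nn.ModuleList([
                dglnn.GraphConv(in_size, hidden_size, allow_zero_in_degree=True),
                dglnn.GraphConv(hidden_size, out_size, allow_zero_in_degree=True)])
            self.hidden_size, self.out_size = hidden_size, out_size

        def inference(self, g, x, batch_size, device):
            """Offline inference: compute one layer at a time over all nodes."""
            sampler = dgl.dataloading.MultiLayerFullNeighborSampler(1)
            for i, layer in enumerate(self.layers):
                size = self.out_size if i == len(self.layers) - 1 else self.hidden_size
                y = torch.zeros(g.num_nodes(), size)
                dataloader = dgl.dataloading.NodeDataLoader(
                    g, torch.arange(g.num_nodes()), sampler,
                    batch_size=batch_size, shuffle=False, drop_last=False)
                for input_nodes, output_nodes, blocks in dataloader:
                    block = blocks[0].to(device)
                    h = layer(block, x[input_nodes].to(device))
                    y[output_nodes] = h.cpu()
                x = y                          # this layer's output feeds the next one
            return y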
......
@@ -25,10 +25,10 @@ process continues until we reach the input. This iterative process
builds the dependency graph starting from the output and working
backwards to the input, as the figure below shows:
-.. figure:: https://i.imgur.com/Y0z0qcC.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_0_0.png
:alt: Imgur
Imgur
With this, one can reduce the workload and computation resources
required to train a GNN on a large graph.
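A hedged sketch of this workflow with DGL's neighborhood sampling data loaders (the graph, fan-outs, and batch size below are arbitrary assumptions):

.. code:: python

    import dgl
    import torch

    g = dgl.rand_graph(1000, 5000)    # stand-in graph
    train_nids = torch.arange(100)    # stand-in training seeds

    # Sample a fixed number of neighbors for each of the two GNN layers.
    sampler = dgl.dataloading.MultiLayerNeighborSampler([10, 10])
    dataloader = dgl.dataloading.NodeDataLoader(
        g, train_nids, sampler, batch_size=32, shuffle=True, drop_last=False)

    for input_nodes, output_nodes, blocks in dataloader:
        # ``blocks`` holds one dependency graph (block) per GNN layer, built
        # from the output nodes backwards to the required input nodes.
        pass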
......