OpenDAS / dgl · Commits

Commit ef78d675 (unverified)
Authored Aug 20, 2020 by Jinjing Zhou; committed by GitHub on Aug 20, 2020

Fix docs (#2073)

* remove mxnet tutorial
* remove sse
* fix docs
parent 28deee4d

Showing 7 changed files with 17 additions and 17 deletions (+17 -17)
docs/_deprecate/5_giant_graph/1_sampling_mx.py  +0 -0
docs/_deprecate/5_giant_graph/2_giant.py        +0 -0
docs/_deprecate/5_giant_graph/README.txt        +0 -0
docs/_deprecate/8_sse_mx.py                     +0 -0
docs/source/guide/minibatch-custom-sampler.rst  +12 -12
docs/source/guide/minibatch-inference.rst       +3 -3
docs/source/guide/minibatch.rst                 +2 -2
tutorials/models/5_giant_graph/1_sampling_mx.py → docs/_deprecate/5_giant_graph/1_sampling_mx.py  (file moved)
tutorials/models/5_giant_graph/2_giant.py → docs/_deprecate/5_giant_graph/2_giant.py  (file moved)
tutorials/models/5_giant_graph/README.txt → docs/_deprecate/5_giant_graph/README.txt  (file moved)
tutorials/models/1_gnn/8_sse_mx.py → docs/_deprecate/8_sse_mx.py  (file moved)
docs/source/guide/minibatch-custom-sampler.rst

@@ -35,18 +35,18 @@ predecessors (or *neighbors* if the graph is undirected) of :math:`v` on graph

 For instance, to perform message passing to update the red node in the
 following graph:

-.. figure:: https://i.imgur.com/xYPtaoy.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_0.png
    :alt: Imgur

    Imgur

 One needs to aggregate the node features of its neighbors, shown as green
 nodes:

-.. figure:: https://i.imgur.com/OuvExp1.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_1.png
    :alt: Imgur

    Imgur

 Neighborhood sampling with pencil and paper
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
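The hunk above only swaps the figure URLs; the surrounding guide text describes updating the red node by aggregating the features of its (green) neighbors. A minimal sketch of that full-neighborhood aggregation in DGL, assuming a made-up toy graph and feature name (not part of this commit):

.. code:: python

    import dgl
    import dgl.function as fn
    import torch

    # A small directed graph with random node features 'h'.
    g = dgl.graph(([0, 1, 2, 3], [1, 2, 3, 0]))
    g.ndata['h'] = torch.randn(4, 5)

    # One round of message passing: each node averages the features of
    # its in-neighbors, i.e. the "green nodes" pointing to it in the figure.
    g.update_all(fn.copy_u('h', 'm'), fn.mean('m', 'h_new'))
    print(g.ndata['h_new'].shape)  # torch.Size([4, 5])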
@@ -76,10 +76,10 @@ Finding the message passing dependency

 Consider computing the output of the seed node 8 (colored red) in the
 following graph with a 2-layer GNN:

-.. figure:: https://i.imgur.com/xYPtaoy.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_2.png
    :alt: Imgur

    Imgur

 By the formulation:
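The formulation itself is collapsed in the diff view. For context, a hedged paraphrase of the standard two-layer message-passing recursion (my notation, not necessarily the guide's exact formula) is:

.. math::

    \boldsymbol{h}_v^{(l)} = \sigma\!\left(W^{(l)} \cdot
    \mathrm{AGG}\left(\left\{\boldsymbol{h}_u^{(l-1)} : u \in \mathcal{N}(v)\right\}\right)\right),
    \qquad l = 1, 2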
@@ -107,10 +107,10 @@ We can tell from the formulation that to compute

 :math:`\boldsymbol{h}_8^{(2)}` we need messages from nodes 4, 5, 7 and
 11 (colored green) along the edges visualized below.

-.. figure:: https://i.imgur.com/Gwjz05H.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_3.png
    :alt: Imgur

    Imgur

 This graph contains all the nodes in the original graph but only the
 edges necessary for message passing to the given output nodes. We call
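The guide calls such an edge-restricted graph a *frontier*. A hedged sketch of how a frontier for seed node 8 can be obtained with ``dgl.sampling.sample_neighbors`` (the toy graph below only imitates the figure; it is not from the commit):

.. code:: python

    import dgl
    import torch

    # A stand-in for the 13-node graph in the figure (edge list is illustrative).
    src = torch.tensor([1, 2, 3, 6, 9, 4, 5, 7, 11, 12])
    dst = torch.tensor([4, 4, 5, 7, 7, 8, 8, 8, 8, 11])
    g = dgl.graph((src, dst), num_nodes=13)

    # fanout=-1 keeps every in-edge of the seed node(s): the frontier has all
    # nodes of g but only the edges needed for the seeds' next-layer output.
    frontier = dgl.sampling.sample_neighbors(g, torch.tensor([8]), -1)
    print(frontier.num_nodes(), frontier.num_edges())  # 13, 4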
@@ -149,10 +149,10 @@ bipartite-structured graph that only contains the necessary input nodes

 and output nodes a *block*. The following figure shows the block of the
 second GNN layer for node 8.

-.. figure:: https://i.imgur.com/stB2UlR.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_4.png
    :alt: Imgur

    Imgur

 Note that the output nodes also appear in the input nodes. The reason is
 that representations of output nodes from the previous layer are needed
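A hedged sketch of converting a frontier into such a block with ``dgl.to_block`` (again illustrative, not from the commit). Note how the destination (output) node reappears among the source (input) nodes, as the paragraph above explains:

.. code:: python

    import dgl
    import torch

    g = dgl.graph((torch.tensor([4, 5, 7, 11]), torch.tensor([8, 8, 8, 8])),
                  num_nodes=13)
    frontier = dgl.sampling.sample_neighbors(g, torch.tensor([8]), -1)

    # Bipartite-structured block whose destination side is just seed node 8.
    block = dgl.to_block(frontier, torch.tensor([8]))
    print(block.dstdata[dgl.NID])  # tensor([8])
    print(block.srcdata[dgl.NID])  # e.g. tensor([ 8,  4,  5,  7, 11]); includes 8 itself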
@@ -234,10 +234,10 @@ destination of an edge in the frontier.

 For example, consider the following frontier:

-.. figure:: https://i.imgur.com/g5Ptbj7.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_4_5.png
    :alt: Imgur

    Imgur

 where the red and green nodes (i.e. nodes 4, 5, 7, 8, and 11) are all
 nodes that are destinations of an edge. Then the following code will
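The code that follows this sentence is collapsed in the diff view. As a rough stand-in (an assumption on my part, not the document's own snippet), the destination nodes of a frontier's edges can be listed like this:

.. code:: python

    import dgl
    import torch

    # A toy frontier whose edges end at nodes 4, 5, 7, 8, and 11,
    # mirroring the red and green nodes in the figure.
    frontier = dgl.graph((torch.tensor([1, 2, 3, 6, 9, 4, 5, 7, 11]),
                          torch.tensor([4, 4, 5, 7, 7, 8, 8, 8, 8])),
                         num_nodes=13)

    # Unique destination endpoints of the frontier's edges.
    _, dst = frontier.edges()
    print(torch.unique(dst))  # tensor([ 4,  5,  7,  8, 11])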
docs/source/guide/minibatch-inference.rst

@@ -26,17 +26,17 @@ passing.

 The following animation shows what the computation looks like (note
 that for every layer only the first three minibatches are drawn).

-.. figure:: https://i.imgur.com/rr1FG7S.gif
+.. figure:: https://data.dgl.ai/asset/image/guide_6_6_0.gif
    :alt: Imgur

    Imgur

 Implementing Offline Inference
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 Consider the two-layer GCN we have mentioned in Section 6.5.1. The way
 to implement offline inference still involves using
-```MultiLayerFullNeighborSampler`` <https://todo>`__, but sampling for
+:class:`~dgl.dataloading.neighbor.MultiLayerFullNeighborSampler`, but sampling for
 only one layer at a time. Note that offline inference is implemented as
 a method of the GNN module because the computation on one layer depends
 on how messages are aggregated and combined as well.
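The passage above describes layer-wise offline (full-neighbor) inference. A hedged sketch of that pattern using the ``dgl.dataloading`` classes named in the added line (the layer list, output sizes, and device handling are assumptions, not the guide's exact code):

.. code:: python

    import dgl
    import torch

    def offline_inference(layers, out_dims, g, feats, batch_size, device='cpu'):
        """Compute every node's representation one GNN layer at a time,
        taking all neighbors into account (no sampling noise)."""
        x = feats
        with torch.no_grad():
            for layer, out_dim in zip(layers, out_dims):
                y = torch.zeros(g.num_nodes(), out_dim)
                sampler = dgl.dataloading.MultiLayerFullNeighborSampler(1)
                dataloader = dgl.dataloading.NodeDataLoader(
                    g, torch.arange(g.num_nodes()), sampler,
                    batch_size=batch_size, shuffle=False, drop_last=False)
                for input_nodes, output_nodes, blocks in dataloader:
                    block = blocks[0].to(device)
                    h = layer(block, x[input_nodes].to(device))
                    y[output_nodes] = h.cpu()
                x = y  # output of this layer feeds the next one
        return x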
docs/source/guide/minibatch.rst

@@ -25,10 +25,10 @@ process continues until we reach the input. This iterative process

 builds the dependency graph starting from the output and working
 backwards to the input, as the figure below shows:

-.. figure:: https://i.imgur.com/Y0z0qcC.png
+.. figure:: https://data.dgl.ai/asset/image/guide_6_0_0.png
    :alt: Imgur

    Imgur

 With this, one can save the workload and computation resources for
 training a GNN on a large graph.
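For reference, a hedged sketch of a minibatch training loop in which a neighbor sampler builds exactly this output-to-input dependency graph (the toy graph, model, and sizes are illustrative assumptions):

.. code:: python

    import dgl
    import dgl.nn as dglnn
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Toy stand-in for a large graph.
    g = dgl.rand_graph(1000, 5000)
    g.ndata['feat'] = torch.randn(1000, 16)
    g.ndata['label'] = torch.randint(0, 4, (1000,))
    train_nids = torch.arange(1000)

    class TwoLayerSAGE(nn.Module):
        def __init__(self, in_dim, hid_dim, out_dim):
            super().__init__()
            self.conv1 = dglnn.SAGEConv(in_dim, hid_dim, 'mean')
            self.conv2 = dglnn.SAGEConv(hid_dim, out_dim, 'mean')

        def forward(self, blocks, x):
            x = F.relu(self.conv1(blocks[0], x))
            return self.conv2(blocks[1], x)

    model = TwoLayerSAGE(16, 32, 4)
    opt = torch.optim.Adam(model.parameters())

    # Sample 10 neighbors per layer; each minibatch's blocks encode the
    # dependency graph built backwards from the output (seed) nodes.
    sampler = dgl.dataloading.MultiLayerNeighborSampler([10, 10])
    dataloader = dgl.dataloading.NodeDataLoader(
        g, train_nids, sampler, batch_size=256, shuffle=True, drop_last=False)

    for input_nodes, output_nodes, blocks in dataloader:
        x = g.ndata['feat'][input_nodes]    # features of all required input nodes
        y = g.ndata['label'][output_nodes]  # labels of the seed nodes only
        loss = F.cross_entropy(model(blocks, x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()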