OpenDAS / dgl · Commit 3c922899 (unverified)

[GraphBolt] remove num_workers=0 from doc (#6803)

Authored Dec 21, 2023 by Rhett Ying; committed by GitHub on Dec 21, 2023.
Parent: fd980d53

Showing 7 changed files with 12 additions and 12 deletions (+12 −12)
docs/source/guide/minibatch-custom-sampler.rst   +2 −2
docs/source/guide/minibatch-edge.rst             +3 −3
docs/source/guide/minibatch-inference.rst        +1 −1
docs/source/guide/minibatch-link.rst             +2 −2
docs/source/guide/minibatch-node.rst             +2 −2
docs/source/guide/minibatch-parallelism.rst      +1 −1
tutorials/multi/2_node_classification.py         +1 −1
docs/source/guide/minibatch-custom-sampler.rst

@@ -48,7 +48,7 @@ To use this sampler with :class:`~dgl.graphbolt.DataLoader`:
     datapipe = datapipe.customized_sample_neighbor(g, [10, 10])  # 2 layers.
     datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)
     for data in dataloader:
         input_features = data.node_features["feat"]

@@ -93,7 +93,7 @@ can be used on heterogeneous graphs:
         feature, node_feature_keys={"user": ["feat"], "item": ["feat"]}
     )
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)
     for data in dataloader:
         input_features = {
docs/source/guide/minibatch-edge.rst

@@ -39,7 +39,7 @@ edges(namely, node pairs) in the training set instead of the nodes.
     #
     datapipe = gb.NeighborSampler(datapipe, g, [10, 10])
     datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)

 Iterating over the DataLoader will yield :class:`~dgl.graphbolt.MiniBatch`
 which contains a list of specially created graphs representing the computation

@@ -92,7 +92,7 @@ You can use :func:`~dgl.graphbolt.exclude_seed_edges` alongside with
     datapipe = datapipe.transform(exclude_seed_edges)
     datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)

 Adapt your model for minibatch training

@@ -273,7 +273,7 @@ only difference is that the train_set is now an instance of
         feature, node_feature_keys={"item": ["feat"], "user": ["feat"]}
     )
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)

 Things become a little different if you wish to exclude the reverse edges on
 heterogeneous graphs. On heterogeneous graphs, reverse edges
docs/source/guide/minibatch-inference.rst

@@ -48,7 +48,7 @@ only one layer at a time.
     datapipe = datapipe.sample_neighbor(g, [-1])  # 1 layer.
     datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)

 Note that offline inference is implemented as a method of the GNN module
docs/source/guide/minibatch-link.rst

@@ -28,7 +28,7 @@ The whole data loader pipeline is as follows:
     datapipe = datapipe.transform(gb.exclude_seed_edges)
     datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)

 For the details about the builtin uniform negative sampler please see

@@ -213,7 +213,7 @@ only difference is that you need to give edge types for feature fetching.
         node_feature_keys={"user": ["feat"], "item": ["feat"]}
     )
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)

 If you want to give your own negative sampling function, just inherit from the
 :class:`~dgl.graphbolt.NegativeSampler` class and override the
docs/source/guide/minibatch-node.rst

@@ -51,7 +51,7 @@ putting the list of generated MFGs onto GPU.
     #
     datapipe = gb.NeighborSampler(datapipe, g, [10, 10])
     datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)

 Iterating over the DataLoader will yield :class:`~dgl.graphbolt.MiniBatch`

@@ -216,7 +216,7 @@ of node types to node IDs.
         feature, node_feature_keys={"author": ["feat"], "paper": ["feat"]}
     )
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)

 The training loop is almost the same as that of homogeneous graphs, except for
 the implementation of ``compute_loss`` that will take in two
docs/source/guide/minibatch-parallelism.rst

@@ -22,7 +22,7 @@ generate a minibatch, including:
     datapipe = datapipe.transform(gb.exclude_seed_edges)
     datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)

 All these stages are implemented in separate
 `IterableDataPipe <https://pytorch.org/data/main/torchdata.datapipes.iter.html>`__
tutorials/multi/2_node_classification.py

@@ -119,7 +119,7 @@ def create_dataloader(
     datapipe = datapipe.sample_neighbor(graph, [10, 10, 10])
     datapipe = datapipe.fetch_feature(features, node_feature_keys=["feat"])
     datapipe = datapipe.copy_to(device)
-    dataloader = gb.DataLoader(datapipe, num_workers=0)
+    dataloader = gb.DataLoader(datapipe)
     return dataloader
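Every hunk above makes the same mechanical change: the explicit `num_workers=0` argument is dropped, so `gb.DataLoader(datapipe)` falls back to its default worker setting. The chained-datapipe style these docs use (each stage wraps the previous one, and the loader just iterates the final pipe) can be sketched without DGL using a toy stand-in class; `ToyPipe` and `toy_dataloader` below are illustrative names only, not GraphBolt APIs:

```python
# Minimal sketch of the chained-datapipe pattern from the docs above.
# ToyPipe / toy_dataloader are hypothetical stand-ins, NOT GraphBolt APIs.
class ToyPipe:
    def __init__(self, items, transforms=None):
        self.items = list(items)
        self.transforms = list(transforms or [])

    def transform(self, fn):
        # Each stage returns a new pipe wrapping the previous one,
        # mirroring datapipe chaining (sample -> fetch_feature -> copy_to).
        return ToyPipe(self.items, self.transforms + [fn])

    def __iter__(self):
        for item in self.items:
            for fn in self.transforms:
                item = fn(item)
            yield item


def toy_dataloader(pipe):
    # Analogue of gb.DataLoader(datapipe): consume the pipe with default
    # settings; no explicit num_workers argument is needed.
    return iter(pipe)


pipe = ToyPipe(range(3)).transform(lambda x: x * 10).transform(lambda x: x + 1)
print(list(toy_dataloader(pipe)))  # [1, 11, 21]
```

The design point the commit reflects is that the loader's defaults are sensible, so tutorial code need not pin `num_workers` at all.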