Unverified commit 3c922899 authored by Rhett Ying, committed by GitHub

[GraphBolt] remove num_workers=0 from doc (#6803)
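The explicit ``num_workers=0`` argument is dropped from the documented examples; presumably it matches the DataLoader's default (mirroring ``torch.utils.data.DataLoader``), so the two calls below should behave identically. A minimal before/after sketch under that assumption:

    # Before: explicit default.
    dataloader = gb.DataLoader(datapipe, num_workers=0)
    # After: num_workers is assumed to default to 0.
    dataloader = gb.DataLoader(datapipe)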

parent fd980d53
@@ -48,7 +48,7 @@ To use this sampler with :class:`~dgl.graphbolt.DataLoader`:
datapipe = datapipe.customized_sample_neighbor(g, [10, 10]) # 2 layers.
datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
for data in dataloader:
input_features = data.node_features["feat"]
@@ -93,7 +93,7 @@ can be used on heterogeneous graphs:
feature, node_feature_keys={"user": ["feat"], "item": ["feat"]}
)
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
for data in dataloader:
input_features = {
@@ -39,7 +39,7 @@ edges (namely, node pairs) in the training set instead of the nodes.
# datapipe = gb.NeighborSampler(datapipe, g, [10, 10])
datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
Iterating over the DataLoader will yield :class:`~dgl.graphbolt.MiniBatch`
which contains a list of specially created graphs representing the computation
@@ -92,7 +92,7 @@ You can use :func:`~dgl.graphbolt.exclude_seed_edges` alongside
datapipe = datapipe.transform(exclude_seed_edges)
datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
Adapt your model for minibatch training
@@ -273,7 +273,7 @@ only difference is that the train_set is now an instance of
feature, node_feature_keys={"item": ["feat"], "user": ["feat"]}
)
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
Things become a little different if you wish to exclude the reverse
edges on heterogeneous graphs. On heterogeneous graphs, reverse edges
@@ -48,7 +48,7 @@ only one layer at a time.
datapipe = datapipe.sample_neighbor(g, [-1]) # 1 layer.
datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
Note that offline inference is implemented as a method of the GNN module
@@ -28,7 +28,7 @@ The whole data loader pipeline is as follows:
datapipe = datapipe.transform(gb.exclude_seed_edges)
datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
For the details about the built-in uniform negative sampler, please see
@@ -213,7 +213,7 @@ only difference is that you need to specify edge types for feature fetching.
node_feature_keys={"user": ["feat"], "item": ["feat"]}
)
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
If you want to provide your own negative sampling function, just inherit from the
:class:`~dgl.graphbolt.NegativeSampler` class and override the
@@ -51,7 +51,7 @@ putting the list of generated MFGs onto the GPU.
# datapipe = gb.NeighborSampler(datapipe, g, [10, 10])
datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
Iterating over the DataLoader will yield :class:`~dgl.graphbolt.MiniBatch`
@@ -216,7 +216,7 @@ of node types to node IDs.
feature, node_feature_keys={"author": ["feat"], "paper": ["feat"]}
)
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
The training loop is almost the same as that of homogeneous graphs,
except for the implementation of ``compute_loss`` that will take in two
@@ -22,7 +22,7 @@ generate a minibatch, including:
datapipe = datapipe.transform(gb.exclude_seed_edges)
datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
All these stages are implemented in separate
`IterableDataPipe <https://pytorch.org/data/main/torchdata.datapipes.iter.html>`__
@@ -119,7 +119,7 @@ def create_dataloader(
datapipe = datapipe.sample_neighbor(graph, [10, 10, 10])
datapipe = datapipe.fetch_feature(features, node_feature_keys=["feat"])
datapipe = datapipe.copy_to(device)
-dataloader = gb.DataLoader(datapipe, num_workers=0)
+dataloader = gb.DataLoader(datapipe)
return dataloader
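For reference, a self-contained sketch of the pipeline pattern these docs repeat, using the simplified call; ``train_set``, ``g``, ``feature``, and ``device`` are placeholders for objects constructed elsewhere in the guides, and ``gb.ItemSampler`` is assumed as the batching entry point:

    import dgl.graphbolt as gb

    # Batch seed items from the training set.
    datapipe = gb.ItemSampler(train_set, batch_size=1024, shuffle=True)
    # Sample a 2-layer neighborhood with fanout 10 per layer.
    datapipe = datapipe.sample_neighbor(g, [10, 10])
    # Attach the "feat" node feature to each minibatch.
    datapipe = datapipe.fetch_feature(feature, node_feature_keys=["feat"])
    # Move each minibatch to the target device.
    datapipe = datapipe.copy_to(device)
    # num_workers is omitted; 0 is assumed to be the default.
    dataloader = gb.DataLoader(datapipe)
    for data in dataloader:
        input_features = data.node_features["feat"]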