Unverified Commit 43fb73db authored by Rhett Ying, committed by GitHub

[GraphBolt] update doc page about to_dgl() (#6768)

parent 358db43a
@@ -51,7 +51,6 @@ To use this sampler with :class:`~dgl.graphbolt.DataLoader`:
     dataloader = gb.DataLoader(datapipe, num_workers=0)
     for data in dataloader:
-        data = data.to_dgl()
         input_features = data.node_features["feat"]
         output_labels = data.labels
         output_predictions = model(data.blocks, input_features)
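For context, a minimal sketch of how this loop reads after the change, with the ``datapipe`` construction filled in. The ``gb.ItemSampler``, ``sample_neighbor``, and ``fetch_feature`` steps, as well as the ``train_set``, ``graph``, ``features``, and ``model`` names, are assumptions about the surrounding pipeline and are not part of this hunk.

.. code:: python

    import dgl.graphbolt as gb

    # Assumed pipeline: sample a mini-batch of seed items, expand it with
    # neighbor sampling, and fetch the "feat" node feature for each batch.
    datapipe = gb.ItemSampler(train_set, batch_size=1024, shuffle=True)
    datapipe = datapipe.sample_neighbor(graph, fanouts=[10, 10])
    datapipe = datapipe.fetch_feature(features, node_feature_keys=["feat"])
    dataloader = gb.DataLoader(datapipe, num_workers=0)

    for data in dataloader:
        # MiniBatch fields are used directly; no to_dgl() conversion step.
        input_features = data.node_features["feat"]
        output_labels = data.labels
        output_predictions = model(data.blocks, input_features)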
@@ -97,7 +96,6 @@ can be used on heterogeneous graphs:
     dataloader = gb.DataLoader(datapipe, num_workers=0)
     for data in dataloader:
-        data = data.to_dgl()
         input_features = {
             ntype: data.node_features[(ntype, "feat")]
             for ntype in data.blocks[0].srctypes
...
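Similarly, a hedged sketch of the heterogeneous loop body after the change. The ``"item"`` node type used for labels and predictions, and the shape of ``data.labels`` as a per-node-type dictionary, are assumptions used only to complete the example; the hunk itself stops at the feature dictionary.

.. code:: python

    for data in dataloader:
        # Gather per-node-type input features directly from the MiniBatch.
        input_features = {
            ntype: data.node_features[(ntype, "feat")]
            for ntype in data.blocks[0].srctypes
        }
        # Assumed: labels and predictions are read for the "item" node type.
        output_labels = data.labels["item"]
        output_predictions = model(data.blocks, input_features)["item"]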
@@ -43,13 +43,11 @@ edges(namely, node pairs) in the training set instead of the nodes.
 Iterating over the DataLoader will yield :class:`~dgl.graphbolt.MiniBatch`
 which contains a list of specially created graphs representing the computation
-dependencies on each layer. In order to train with DGL, you need to convert them
-to :class:`~dgl.graphbolt.DGLMiniBatch`. Then you can access the
-*message flow graphs* (MFGs).
+dependencies on each layer. You can access the *message flow graphs* (MFGs) via
+``mini_batch.blocks``.

 .. code:: python

     mini_batch = next(iter(dataloader))
-    mini_batch = mini_batch.to_dgl()
     print(mini_batch.blocks)

 .. note::
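As a small illustration of what ``mini_batch.blocks`` holds, the sketch below iterates over the MFGs and prints their sizes. The attributes used (``num_src_nodes``, ``num_dst_nodes``, ``num_edges``) are standard DGL block methods rather than anything introduced by this hunk.

.. code:: python

    mini_batch = next(iter(dataloader))
    # One bipartite message flow graph (MFG) per GNN layer, input layer first.
    for layer, block in enumerate(mini_batch.blocks):
        print(layer, block.num_src_nodes(), block.num_dst_nodes(), block.num_edges())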
@@ -182,7 +180,6 @@ their incident node representations.
     opt = torch.optim.Adam(model.parameters())
     for data in dataloader:
-        data = data.to_dgl()
         blocks = data.blocks
         x = data.edge_features("feat")
         y_hat = model(data.blocks, x, data.positive_node_pairs)
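For completeness, a hedged sketch of how the loop body above is typically finished. The objective (binary cross-entropy against ``data.labels``, assumed to carry 0/1 targets for the sampled pairs) is an assumption, not part of this hunk.

.. code:: python

    import torch.nn.functional as F

    for data in dataloader:
        blocks = data.blocks
        x = data.edge_features("feat")
        y_hat = model(data.blocks, x, data.positive_node_pairs)
        # Assumed objective; the doc hunk stops before the loss computation.
        loss = F.binary_cross_entropy_with_logits(y_hat, data.labels.float())
        opt.zero_grad()
        loss.backward()
        opt.step()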
@@ -317,7 +314,6 @@ dictionaries of node types and predictions here.
     opt = torch.optim.Adam(model.parameters())
     for data in dataloader:
-        data = data.to_dgl()
         blocks = data.blocks
         x = data.edge_features(("user:like:item", "feat"))
         y_hat = model(data.blocks, x, data.positive_node_pairs)
...