Unverified commit 53ea03ad authored by Lingfan Yu, committed by GitHub

[Doc] Fix doc bug and warning (#213)

* fix dgmg tutorial indentation

* bug in incidence_matrix docstring

* remove print
parent 0f121eeb
@@ -1511,17 +1511,23 @@ class DGLGraph(object):
         value indicating whether the edge is incident to the node
         or not.
-        There are three types of an incidence matrix `I`:
+        There are three types of an incidence matrix :math:`I`:
         * "in":
-          - I[v, e] = 1 if e is the in-edge of v (or v is the dst node of e);
-          - I[v, e] = 0 otherwise.
+          - :math:`I[v, e] = 1` if e is the in-edge of v (or v is the dst node of e);
+          - :math:`I[v, e] = 0` otherwise.
         * "out":
-          - I[v, e] = 1 if e is the out-edge of v (or v is the src node of e);
-          - I[v, e] = 0 otherwise.
+          - :math:`I[v, e] = 1` if e is the out-edge of v (or v is the src node of e);
+          - :math:`I[v, e] = 0` otherwise.
         * "both":
-          - I[v, e] = 1 if e is the in-edge of v;
-          - I[v, e] = -1 if e is the out-edge of v;
-          - I[v, e] = 0 otherwise (including self-loop).
+          - :math:`I[v, e] = 1` if e is the in-edge of v;
+          - :math:`I[v, e] = -1` if e is the out-edge of v;
+          - :math:`I[v, e] = 0` otherwise (including self-loop).

         Parameters
         ----------
......
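The three conventions described in the docstring above can be sketched in plain Python. This is only an illustration of the definitions, not DGL's implementation; the standalone `incidence_matrix` helper below is hypothetical, whereas DGL exposes this functionality as a `DGLGraph` method.

```python
def incidence_matrix(num_nodes, edges, typestr="both"):
    # Dense |V| x |E| matrix following the docstring's conventions.
    # edges is a list of (src, dst) pairs; column e describes edge e.
    I = [[0] * len(edges) for _ in range(num_nodes)]
    for e, (src, dst) in enumerate(edges):
        if typestr == "in":
            I[dst][e] = 1          # e is an in-edge of dst
        elif typestr == "out":
            I[src][e] = 1          # e is an out-edge of src
        elif typestr == "both":
            if src != dst:         # self-loops stay zero in "both" mode
                I[dst][e] = 1      # in-edge contribution
                I[src][e] = -1     # out-edge contribution
        else:
            raise ValueError("typestr must be 'in', 'out', or 'both'")
    return I

# Edges (0, 1), (1, 2) and a self-loop (2, 2):
print(incidence_matrix(3, [(0, 1), (1, 2), (2, 2)], "both"))
# -> [[-1, 0, 0], [1, -1, 0], [0, 1, 0]]
```

Note how the self-loop column stays all-zero under "both", matching the "otherwise (including self-loop)" clause.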
@@ -336,10 +336,8 @@ model.train()
 for epoch in range(n_epochs):
     optimizer.zero_grad()
     logits = model.forward(g)
-    print("after forward")
     loss = F.cross_entropy(logits[train_idx], labels[train_idx])
     loss.backward()
-    print("after backward")
     optimizer.step()
......
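The loss line in this loop evaluates cross-entropy only on the rows selected by `train_idx`. A plain-Python sketch of that masked loss (illustrative only; the tutorial itself relies on `torch.nn.functional.cross_entropy`):

```python
import math

def masked_cross_entropy(logits, labels, train_idx):
    # Mean negative log-likelihood over the selected (training) rows only,
    # mirroring F.cross_entropy(logits[train_idx], labels[train_idx]).
    total = 0.0
    for i in train_idx:
        scores = logits[i]
        log_z = math.log(sum(math.exp(s) for s in scores))  # log-partition
        total += log_z - scores[labels[i]]
    return total / len(train_idx)

logits = [[2.0, 0.0], [0.0, 2.0], [1.0, 1.0]]  # one score row per node
labels = [0, 1, 0]
# Only nodes 0 and 1 contribute to the loss; node 2 is held out.
loss = masked_cross_entropy(logits, labels, train_idx=[0, 1])
```

Gradients then flow only through the selected rows, which is what makes semi-supervised training on a partially labeled graph work.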
@@ -55,6 +55,7 @@ g.add_edges([2, 0], [0, 2])  # Add edges (2, 0), (0, 2)
 # with different sizes, topologies, node types, edge types, and the possibility
 # of multigraphs. Besides, the same graph can be generated in many different
 # orders. Regardless, the generative process entails a few steps:
+#
 # - Encode a changing graph,
 # - Perform actions stochastically,
 # - Collect error signals and optimize the model parameters (if we are training)
......
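The three steps listed in that comment can be sketched as a minimal loop. All names below (`ToyModel`, `encode`, `act`, `generate`) are hypothetical stand-ins, not the DGMG tutorial's actual API:

```python
class ToyModel:
    """Stand-in for a graph-generative model (hypothetical API)."""
    def encode(self, graph):
        return len(graph)                    # trivially "encode" the graph so far
    def act(self, state):
        # Stop once three nodes exist; otherwise add a node.
        action = "stop" if state >= 3 else "add-node"
        return action, -0.1                  # fake log-probability of the action

def generate(model, max_steps=10, training=False):
    graph, losses = [], []
    for _ in range(max_steps):
        state = model.encode(graph)          # 1. encode the changing graph
        action, logp = model.act(state)      # 2. perform an action stochastically
        if action == "stop":
            break
        graph.append(action)
        if training:
            losses.append(-logp)             # 3. collect error signals
    return graph, sum(losses)

graph, loss = generate(ToyModel(), training=True)
# graph -> ['add-node', 'add-node', 'add-node']
```

The key design point is that encoding, acting, and loss collection repeat on every step, because each action changes the graph that the next step must encode.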