Unverified Commit fa343873 authored by KoyamaSohei, committed by GitHub

Fix docs on GraphSAGE normalization (#3711)


Co-authored-by: Jinjing Zhou <VoVAllen@users.noreply.github.com>
parent 0767c5fc
@@ -24,7 +24,7 @@ class SAGEConv(nn.Block):
         h_{i}^{(l+1)} &= \sigma \left(W \cdot \mathrm{concat}
         (h_{i}^{l}, h_{\mathcal{N}(i)}^{l+1}) \right)
-        h_{i}^{(l+1)} &= \mathrm{norm}(h_{i}^{l})
+        h_{i}^{(l+1)} &= \mathrm{norm}(h_{i}^{(l+1)})
     Parameters
     ----------
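For context, the full update rule around the corrected line reads as follows. The aggregate step is reconstructed from the standard GraphSAGE formulation used in these docstrings and is not part of this diff; the point of the fix is that norm is applied to the freshly computed h_{i}^{(l+1)}, not to the input h_{i}^{l}:

```latex
% Reconstruction of the surrounding docstring math; the aggregate line
% is context from the standard GraphSAGE formulation, not part of this diff.
\begin{align}
h_{\mathcal{N}(i)}^{(l+1)} &= \mathrm{aggregate}
  \left(\{h_{j}^{l}, \forall j \in \mathcal{N}(i)\}\right) \\
h_{i}^{(l+1)} &= \sigma \left(W \cdot \mathrm{concat}
  (h_{i}^{l}, h_{\mathcal{N}(i)}^{l+1}) \right) \\
h_{i}^{(l+1)} &= \mathrm{norm}(h_{i}^{(l+1)})
\end{align}
```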
@@ -24,7 +24,7 @@ class SAGEConv(nn.Module):
         h_{i}^{(l+1)} &= \sigma \left(W \cdot \mathrm{concat}
         (h_{i}^{l}, h_{\mathcal{N}(i)}^{l+1}) \right)
-        h_{i}^{(l+1)} &= \mathrm{norm}(h_{i}^{l})
+        h_{i}^{(l+1)} &= \mathrm{norm}(h_{i}^{(l+1)})
     If a weight tensor on each edge is provided, the aggregation becomes:
@@ -23,7 +23,7 @@ class SAGEConv(layers.Layer):
         h_{i}^{(l+1)} &= \sigma \left(W \cdot \mathrm{concat}
         (h_{i}^{l}, h_{\mathcal{N}(i)}^{l+1}) \right)
-        h_{i}^{(l+1)} &= \mathrm{norm}(h_{i}^{l})
+        h_{i}^{(l+1)} &= \mathrm{norm}(h_{i}^{(l+1)})
     Parameters
     ----------
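As a minimal sketch of what the corrected docstring means for users: DGL's SAGEConv accepts a `norm` callable, and that callable is applied to the layer's output features h_{i}^{(l+1)}. The toy graph, feature sizes, and L2 normalizer below are illustrative assumptions, not part of this commit:

```python
# Sketch assuming DGL's PyTorch backend: `norm` normalizes the *updated*
# features h_i^{(l+1)}, as the corrected docstring states.
import dgl
import torch
import torch.nn.functional as F
from dgl.nn import SAGEConv

g = dgl.graph(([0, 1, 2], [1, 2, 0]))   # toy 3-node cycle (illustrative)
feat = torch.randn(3, 4)                # h^{l}: 3 nodes, 4 input features

# The norm callable receives the layer output and L2-normalizes each row.
conv = SAGEConv(4, 8, aggregator_type='mean',
                norm=lambda h: F.normalize(h, p=2, dim=-1))
h_new = conv(g, feat)                   # h^{(l+1)}, rows normalized last
print(h_new.norm(dim=-1))               # ~1.0 for every node
```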