OpenDAS / dgl · Commit f45178c3

[Doc] Generative models, edit for readability (#1033)

Edit pass for grammar and style

Authored Dec 01, 2019 by John Andrilla; committed Dec 02, 2019 by Minjie Wang
Parent: 65a904c5

1 changed file with 9 additions and 9 deletions:
tutorials/models/3_generative_model/README.txt (+9 −9)
@@ -4,18 +4,18 @@ Generative models
 ==================
 * **DGMG** `[paper] <https://arxiv.org/abs/1803.03324>`__ `[tutorial]
-  <3_generative_model/5_dgmg.html>`__ `[code]
+  <3_generative_model/5_dgmg.html>`__ `[PyTorch code]
   <https://github.com/dmlc/dgl/tree/master/examples/pytorch/dgmg>`__:
-  this model belongs to the important family that deals with structural
-  generation. DGMG is interesting because its state-machine approach is the
-  most general. It is also very challenging because, unlike Tree-LSTM, every
+  This model belongs to the family that deals with structural
+  generation. Deep generative models of graphs (DGMG) uses a state-machine approach.
+  It is also very challenging because, unlike Tree-LSTM, every
   sample has a dynamic, probability-driven structure that is not available
-  before training. We are able to progressively leverage intra- and
+  before training. You can progressively leverage intra- and
   inter-graph parallelism to steadily improve the performance.
-* **JTNN** `[paper] <https://arxiv.org/abs/1802.04364>`__ `[code]
+* **JTNN** `[paper] <https://arxiv.org/abs/1802.04364>`__ `[PyTorch code]
   <https://github.com/dmlc/dgl/tree/master/examples/pytorch/jtnn>`__:
-  unlike DGMG, this paper generates molecular graphs using the framework of
-  variational auto-encoder. Perhaps more interesting is its approach to build a
-  structure hierarchically, in the case of molecular, with junction tree as
+  This network generates molecular graphs using the framework of
+  variational auto-encoder. The junction tree neural network (JTNN) builds
+  structure hierarchically. In the case of molecular graphs, it uses a junction tree as
   the middle scaffolding.
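The state-machine approach mentioned in the DGMG entry can be sketched in a few lines: generation alternates between an "add node?" decision and a run of "add edge?" decisions until the model chooses to stop. The sketch below is an illustration only, not DGL's API: the function name is made up, and the learned Bernoulli probabilities that real DGMG conditions on the partial graph embedding are replaced here by fixed thresholds.

```python
import random

def generate_graph(max_nodes=6, seed=1):
    """Toy DGMG-style state machine.

    Real DGMG conditions each decision on a learned embedding of the
    partially built graph; here every decision is a fixed-probability
    coin flip, purely to show the control flow.
    """
    rng = random.Random(seed)
    nodes, edges = [], set()
    while len(nodes) < max_nodes and rng.random() < 0.8:  # "add node?" decision
        v = len(nodes)
        nodes.append(v)
        while v > 0 and rng.random() < 0.5:               # "add edge?" decision
            u = rng.randrange(v)                          # pick an earlier node as endpoint
            edges.add((u, v))
    return nodes, edges

nodes, edges = generate_graph()
```

Because the structure emerges decision by decision, each sample can have a different size and topology, which is exactly why batching DGMG requires the intra- and inter-graph parallelism the tutorial discusses.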