Commit 7784f066 authored by John Andrilla, committed by Minjie Wang

[Doc] Small graphs readme.txt, edit pass (#1030)

* [Doc] Small graphs readme.txt, edit pass

Edit for grammar and style. Should this be an .rst or .txt?

* Update README.txt
parent fff3dd95
 .. _tutorials2-index:

-Dealing with many small graphs
+Batching many small graphs
 ==============================

 * **Tree-LSTM** `[paper] <https://arxiv.org/abs/1503.00075>`__ `[tutorial]
   <2_small_graph/3_tree-lstm.html>`__ `[code]
   <2_small_graph/3_tree-lstm.html>`__ `[PyTorch code]
   <https://github.com/dmlc/dgl/blob/master/examples/pytorch/tree_lstm>`__:
-  sentences of natural languages have inherent structures, which are thrown
+  Sentences have inherent structures that are thrown
   away by treating them simply as sequences. Tree-LSTM is a powerful model
-  that learns the representation by leveraging prior syntactic structures
-  (e.g. parse-tree). The challenge to train it well is that simply by padding
-  a sentence to the maximum length no longer works, since trees of different
+  that learns the representation by using prior syntactic structures such as a parse-tree.
+  The challenge in training is that simply by padding
+  a sentence to the maximum length no longer works. Trees of different
   sentences have different sizes and topologies. DGL solves this problem by
-  throwing the trees into a bigger "container" graph, and use message-passing
-  to explore maximum parallelism. The key API we use is batching.
+  adding the trees to a bigger container graph, and then using message-passing
+  to explore maximum parallelism. Batching is a key API for this.
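
A minimal sketch of the batching idea described above, assuming a recent DGL release with the ``dgl.graph``, ``dgl.batch``, and ``dgl.unbatch`` APIs; the two toy trees are made up for illustration::

    import dgl
    import torch

    # Two small trees with different sizes and topologies,
    # given as (source, destination) edge lists.
    tree1 = dgl.graph((torch.tensor([1, 2]), torch.tensor([0, 0])))        # 3 nodes, 2 edges
    tree2 = dgl.graph((torch.tensor([1, 2, 3]), torch.tensor([0, 0, 1])))  # 4 nodes, 3 edges

    # dgl.batch merges the small graphs into one larger "container" graph,
    # so message passing runs over every tree in parallel.
    bg = dgl.batch([tree1, tree2])
    print(bg.num_nodes(), bg.num_edges())  # 7 5

    # The batched graph can be split back into its components afterwards.
    tree1_back, tree2_back = dgl.unbatch(bg)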