.. _tutorials4-index:


Revisit classic models from a graph perspective
-------------------------------------------------------

* **Capsule** `[paper] <https://arxiv.org/abs/1710.09829>`__ `[tutorial]
  <4_old_wines/2_capsule.html>`__ `[PyTorch code]
  <https://github.com/dmlc/dgl/tree/master/examples/pytorch/capsule>`__:
  This computer vision model has two key ideas. First, it enhances the feature
  representation by using a vector (called a *capsule*) instead of a scalar.
  Second, it replaces max-pooling with dynamic routing, which assigns each
  lower-level capsule to one or several higher-level capsules via
  non-parametric message-passing. The tutorial shows how the latter can be
  implemented with DGL APIs, sketched briefly below.
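
  As a rough illustration of that formulation, here is a minimal sketch of a
  dynamic-routing loop written with DGL's ``update_all``/``apply_edges`` APIs.
  It is not the tutorial's implementation: the ``dynamic_routing`` and
  ``squash`` helpers, the capsule counts, and the random prediction vectors
  below are illustrative assumptions.

  .. code:: python

     import torch
     import dgl
     import dgl.function as fn

     def squash(s, dim=-1):
         # Capsule non-linearity: keeps a vector's orientation while
         # mapping its length into [0, 1).
         sq = (s ** 2).sum(dim=dim, keepdim=True)
         return (sq / (1.0 + sq)) * s / torch.sqrt(sq + 1e-8)

     def dynamic_routing(u_hat, n_lower, n_higher, n_iters=3):
         # u_hat: (n_lower * n_higher, d) prediction vectors, one per edge,
         # ordered so that edge i * n_higher + j links lower capsule i to
         # higher capsule j. Lower capsules are nodes 0..n_lower-1; higher
         # capsules come after them.
         src = torch.arange(n_lower).repeat_interleave(n_higher)
         dst = torch.arange(n_higher).repeat(n_lower) + n_lower
         g = dgl.graph((src, dst), num_nodes=n_lower + n_higher)
         g.edata['u_hat'] = u_hat
         b = torch.zeros(n_lower, n_higher)  # routing logits, no parameters
         for _ in range(n_iters):
             # Coupling coefficients: softmax over the higher capsules.
             c = torch.softmax(b, dim=1).reshape(-1, 1)
             # Message passing: s_j = sum_i c_ij * u_hat_ij, then squash.
             g.edata['m'] = c * g.edata['u_hat']
             g.update_all(fn.copy_e('m', 'm'), fn.sum('m', 's'))
             g.ndata['v'] = squash(g.ndata['s'])
             # Agreement update: b_ij += <u_hat_ij, v_j>.
             g.apply_edges(fn.e_dot_v('u_hat', 'v', 'agree'))
             b = b + g.edata['agree'].reshape(n_lower, n_higher)
         return g.ndata['v'][n_lower:]  # higher-level capsule outputs

     # Route 8 lower-level capsules into 4 higher-level ones (toy sizes).
     v_out = dynamic_routing(torch.randn(8 * 4, 16), n_lower=8, n_higher=4)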

* **Transformer** `[paper] <https://arxiv.org/abs/1706.03762>`__ `[tutorial] <4_old_wines/7_transformer.html>`__ 
  `[PyTorch code] <https://github.com/dmlc/dgl/tree/master/examples/pytorch/transformer>`__ and **Universal Transformer** 
  `[paper] <https://arxiv.org/abs/1807.03819>`__ `[tutorial] <4_old_wines/7_transformer.html>`__
  `[PyTorch code] <https://github.com/dmlc/dgl/tree/master/examples/pytorch/transformer/modules/act.py>`__:
  These two models replace recurrent neural networks (RNNs) with several layers of
  multi-head attention to encode and discover structure among the tokens of a
  sentence. Like dynamic routing, these attention mechanisms can be formulated
  as graph operations with message-passing; a sketch follows below.
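
  To make that correspondence concrete, here is a minimal sketch of one
  self-attention layer over a fully connected token graph, using DGL's
  ``apply_edges``, ``edge_softmax``, and ``update_all``. It is not the
  tutorial's implementation: the ``graph_self_attention`` helper, the head
  count, and the toy input below are illustrative, and a full Transformer
  layer would add an output projection, masking, residual connections, and
  layer normalization.

  .. code:: python

     import torch
     import dgl
     import dgl.function as fn
     from dgl.nn.functional import edge_softmax

     def graph_self_attention(x, n_heads=4):
         # x: (n_tokens, d_model). One attention layer over a fully
         # connected token graph; an edge j -> i means token i attends
         # to token j.
         n, d = x.shape
         d_head = d // n_heads
         src = torch.arange(n).repeat(n)
         dst = torch.arange(n).repeat_interleave(n)
         g = dgl.graph((src, dst), num_nodes=n)

         q_proj = torch.nn.Linear(d, d)
         k_proj = torch.nn.Linear(d, d)
         v_proj = torch.nn.Linear(d, d)
         g.ndata['q'] = q_proj(x).view(n, n_heads, d_head)
         g.ndata['k'] = k_proj(x).view(n, n_heads, d_head)
         g.ndata['v'] = v_proj(x).view(n, n_heads, d_head)

         # Score each edge: scaled dot product of the key at the source
         # node with the query at the destination node.
         g.apply_edges(fn.u_dot_v('k', 'q', 'score'))
         score = g.edata['score'] / d_head ** 0.5
         # Attention weights: softmax over each token's incoming edges.
         g.edata['a'] = edge_softmax(g, score)
         # Message passing: per-head weighted sum of the value vectors.
         g.update_all(fn.u_mul_e('v', 'a', 'm'), fn.sum('m', 'out'))
         return g.ndata['out'].reshape(n, d)

     out = graph_self_attention(torch.randn(10, 64))  # 10 tokens, d_model=64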