.. DGL documentation master file, created by
   sphinx-quickstart on Fri Oct  5 14:18:01 2018.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Overview of DGL
===============

Deep Graph Library (DGL) is a Python package built for the easy implementation
of the graph neural network model family on top of existing deep learning
frameworks (e.g., PyTorch, MXNet, and Gluon).

DGL reduces the implementation of a graph neural network to declaring a set
of *functions* (or *modules*, in PyTorch terminology). In addition, DGL
provides:

* Versatile control over message passing, ranging from low-level operations
  such as sending along selected edges and receiving on specific nodes, to
  high-level control such as graph-wide feature updates (see the sketch below).
* Transparent speed optimization with automatic batching of computations and
  sparse matrix multiplication.
* Seamless integration with existing deep learning frameworks.
* Easy and friendly interfaces for node/edge feature access and graph
  structure manipulation.
* Good scalability to graphs with tens of millions of vertices.
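
As a small illustration of the items above, the sketch below performs one
graph-wide feature update with DGL's built-in message and reduce functions.
This is only a sketch: it assumes a recent DGL release with the PyTorch
backend, and the exact names (e.g. ``dgl.graph``, ``fn.copy_u``) differ
slightly in older versions (which used ``DGLGraph`` and ``fn.copy_src``).

.. code-block:: python

    import torch
    import dgl
    import dgl.function as fn

    # A small directed graph with 4 nodes and edges 0->1, 1->2, 2->3.
    g = dgl.graph(([0, 1, 2], [1, 2, 3]), num_nodes=4)

    # Node (and edge) features are plain tensors attached to the graph.
    g.ndata['h'] = torch.ones(4, 1)

    # One graph-wide update: every edge copies its source feature as the
    # message, and every node sums the messages it receives.
    g.update_all(fn.copy_u('h', 'm'), fn.sum('m', 'h_new'))
    print(g.ndata['h_new'])  # node 0 has no in-edges, so its entry is 0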

To begin with, we have prototyped 10 models across various domains:
semi-supervised learning on graphs (with potentially billions of nodes and
edges), generative models of graphs, and previously difficult-to-parallelize
tree-based models such as TreeLSTM. We also implement some conventional models
in DGL from a new, graph-centric perspective that yields simpler code.

Relationship of DGL to other frameworks
---------------------------------------
DGL is designed to be compatible with, and agnostic to, existing tensor
frameworks. It provides a backend adapter interface that allows easy porting
to any tensor-based, autograd-enabled framework. Currently, our prototype
works with MXNet/Gluon and PyTorch.
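
In practice, the backend is typically selected through the ``DGLBACKEND``
environment variable before DGL is first imported; the snippet below is a
minimal sketch of that workflow (see :doc:`install/backend` for the
authoritative instructions).

.. code-block:: python

    # Sketch: select the tensor backend before the first "import dgl".
    # Assumes the DGLBACKEND environment variable is honored, as described
    # in the backend installation guide.
    import os
    os.environ['DGLBACKEND'] = 'pytorch'   # or 'mxnet'

    import dgl  # tensor operations now dispatch to the chosen framework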

Get Started
-----------

.. toctree::
   :maxdepth: 1
   :caption: Get Started
   :hidden:
   :glob:

   install/index
   install/backend

Follow the :doc:`instructions <install/index>` to install DGL. :doc:`DGL at a
glance <tutorials/basics/1_first>` is the best place to get started. Each
tutorial is accompanied by a runnable Python script and a Jupyter notebook,
both of which can be downloaded.

.. ================================================================================================
   (start) MANUALLY INCLUDE THE GENERATED TUTORIALS/BASIC/INDEX.RST HERE TO EMBED THE EXAMPLES
   ================================================================================================

.. raw:: html

    <div class="sphx-glr-thumbcontainer">

.. only:: html

    .. figure:: /tutorials/basics/images/thumb/sphx_glr_1_first_thumb.png

        :ref:`sphx_glr_tutorials_basics_1_first.py`

.. raw:: html

    </div>


.. toctree::
   :hidden:

   /tutorials/basics/1_first

.. raw:: html

    <div class="sphx-glr-thumbcontainer">

.. only:: html

    .. figure:: /tutorials/basics/images/thumb/sphx_glr_2_basics_thumb.png

        :ref:`sphx_glr_tutorials_basics_2_basics.py`

.. raw:: html

    </div>


.. toctree::
   :hidden:

   /tutorials/basics/2_basics

.. raw:: html

    <div class="sphx-glr-thumbcontainer">

.. only:: html

    .. figure:: /tutorials/basics/images/thumb/sphx_glr_3_pagerank_thumb.png

        :ref:`sphx_glr_tutorials_basics_3_pagerank.py`

.. raw:: html

    </div>

.. toctree::
   :hidden:

   /tutorials/basics/3_pagerank

.. raw:: html

    </div>

.. raw:: html

    <div class="sphx-glr-thumbcontainer">

.. only:: html

    .. figure:: /tutorials/basics/images/thumb/sphx_glr_4_batch_thumb.png

        :ref:`sphx_glr_tutorials_basics_4_batch.py`

.. raw:: html

    </div>

.. toctree::
   :hidden:

   /tutorials/basics/4_batch

.. raw:: html

    <div class="sphx-glr-thumbcontainer">

.. only:: html

    .. figure:: /tutorials/hetero/images/thumb/sphx_glr_1_basics_thumb.png

        :ref:`sphx_glr_tutorials_hetero_1_basics.py`

.. raw:: html

    </div>

.. toctree::
   :hidden:

   /tutorials/hetero/1_basics

.. raw:: html

    <div style='clear:both'></div>

.. ================================================================================================
   (end) MANUALLY INCLUDE THE GENERATED TUTORIALS/BASIC/INDEX.RST HERE TO EMBED THE EXAMPLES
   ================================================================================================

Learning DGL through examples
-----------------------------

The model tutorials are categorized based on the way they utilize DGL APIs.

* :ref:`Graph neural networks and their variants <tutorials1-index>`: Learn how to use DGL to train
  popular **GNN models** on a single input graph.
* :ref:`Dealing with many small graphs <tutorials2-index>`: Learn how to **batch** many
  graph samples for maximum efficiency (see the sketch after this list).
* :ref:`Generative models <tutorials3-index>`: Learn how to deal with **dynamically-changing graphs**.
* :ref:`Old (new) wines in new bottle <tutorials4-index>`: Learn how to combine DGL with tensor-based
  frameworks in a flexible way, and explore a new, graph-based perspective on traditional models.
* :ref:`Training on giant graphs <tutorials5-index>`: Learn how to train graph neural networks
  on giant graphs.
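
For a taste of the batching workflow mentioned above, merging many small
graph samples into a single batched graph takes only a few lines. This is a
minimal sketch assuming a recent DGL release; older versions construct graphs
via ``dgl.DGLGraph`` instead of ``dgl.graph``.

.. code-block:: python

    import dgl

    g1 = dgl.graph(([0, 1], [1, 2]))        # a 3-node path graph
    g2 = dgl.graph(([0, 1, 2], [1, 2, 0]))  # a 3-node cycle

    # Batch the samples into a single graph with 6 nodes and 5 edges;
    # message passing on the batched graph runs over all samples at once.
    bg = dgl.batch([g1, g2])
    print(bg.num_nodes(), bg.num_edges())   # 6 5

    # Per-sample graphs can be recovered with dgl.unbatch(bg).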

Or go through all of them :doc:`here <tutorials/models/index>`.

.. toctree::
   :maxdepth: 1
   :caption: Features
   :hidden:
   :glob:

   features/builtin
   features/nn

.. toctree::
   :maxdepth: 3
   :caption: Model Tutorials
   :hidden:
   :glob:

   tutorials/models/index

.. toctree::
   :maxdepth: 2
   :caption: API Reference
   :glob:

   api/python/index

.. toctree::
   :maxdepth: 1
   :caption: Developer Notes
   :hidden:
   :glob:

   contribute
   developer/ffi

.. toctree::
   :maxdepth: 1
   :caption: Misc
   :hidden:
   :glob:

   faq
   env_var
   resources

Free software
-------------
DGL is free software; you can redistribute it and/or modify it under the terms
of the Apache License 2.0. We welcome contributions.
Join us on `GitHub <https://github.com/dmlc/dgl>`_ and check out our
:doc:`contribution guidelines <contribute>`.

History
-------
The prototype of DGL was started in early spring 2018 at NYU Shanghai by Prof. `Zheng
Zhang <https://shanghai.nyu.edu/academics/faculty/directory/zheng-zhang>`_ and
Quan Gan. Serious development began when `Minjie
<https://jermainewang.github.io/>`_, `Lingfan <https://cs.nyu.edu/~lingfan/>`_
and Prof. `Jinyang Li <http://www.news.cs.nyu.edu/~jinyang/>`_ from NYU's
systems group joined, flanked by a team of student volunteers at NYU Shanghai,
Fudan and other universities (Yu, Zihao, Murphy, Allen, Qipeng, Qi, Hao), as
well as early adopters at the CILVR lab (Jake Zhao). Development accelerated
when the AWS MXNet science team joined forces, with Da Zheng, Alex Smola, Haibin
Lin, Chao Ma and a number of others. For full credit, see `here
<https://www.dgl.ai/ack>`_.

Index
-----
* :ref:`genindex`