.. FairScale documentation master file, created by
   sphinx-quickstart on Tue Sep  8 16:19:17 2020.
   You can adapt this file completely to your liking,
   but it should at least contain the root `toctree`
   directive.

Welcome to FairScale's documentation!
=====================================

*FairScale* is a PyTorch extension library for high performance and
large scale training on one or across multiple machines/nodes. This
library extends basic PyTorch capabilities while adding new
experimental ones.


Components
----------

* Parallelism:
   * `Pipeline parallelism <../../en/latest/api/nn/pipe.html>`_

* Sharded training:
   * `Optimizer state sharding <../../en/latest/api/optim/oss.html>`_ (see the sketch below the component list)
   * `Sharded grad scaler - automatic mixed precision <../../en/latest/api/optim/grad_scaler.html>`_
   * `Sharded distributed data parallel <../../en/latest/api/nn/sharded_ddp.html>`_

* Optimization at scale:
   * `AdaScale SGD <../../en/latest/api/optim/adascale.html>`_

* GPU memory optimization:
   * `Activation checkpointing wrapper <../../en/latest/api/nn/misc/checkpoint_activations.html>`_


* `Tutorials <../../en/latest/tutorials/index.html>`_
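
As a quick taste of the API, below is a minimal sketch of optimizer state
sharding with ``fairscale.optim.OSS``. It assumes a ``torch.distributed``
process group has already been initialized (e.g. via ``torchrun``) and uses a
toy model and random data purely for illustration; see the linked API pages
and tutorials for complete examples.

.. code-block:: python

    import torch
    from fairscale.optim.oss import OSS

    # Assumes the default process group is already initialized,
    # e.g. torch.distributed.init_process_group(backend="nccl") under torchrun.
    model = torch.nn.Linear(256, 256).cuda()

    # OSS wraps a regular optimizer class and shards its state
    # across the ranks in the process group.
    optimizer = OSS(params=model.parameters(), optim=torch.optim.SGD, lr=0.01)

    for _ in range(10):
        optimizer.zero_grad()
        inputs = torch.randn(32, 256, device="cuda")  # toy batch
        loss = model(inputs).sum()
        loss.backward()
        optimizer.step()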


.. warning::
    This library is under active development.
    Please be mindful and create an
    `issue <https://github.com/facebookresearch/fairscale/issues>`_
    if you run into any trouble or have a suggestion.

.. toctree::
   :maxdepth: 5
   :caption: Contents:
   :hidden:

   tutorials/index
   api/index


Reference
=========

:ref:`genindex` | :ref:`modindex` | :ref:`search`