Commit 93d115c6 authored by Min Xu, committed by GitHub

update readme (#439)

parent 506d6209
@@ -10,20 +10,22 @@ FairScale is a PyTorch extension library for high performance and large scale training
FairScale supports:
* Parallelism:
  * Pipeline parallelism (`fairscale.nn.pipe`)
  * Asynchronous Pipeline parallelism (`fairscale.nn.async_pipe`)
  * Mixture of experts (`fairscale.nn.moe.moe_layer`)
  * Model Parallelism (`fairscale.nn.model_parallel.layers`)
  * _experimental_ AmpNet (`fairscale.experimental.nn.ampnet_pipe`)
* Sharded training:
  * Optimizer state sharding (`fairscale.optim.OSS`)
  * Sharded Data Parallel (SDP) (`fairscale.nn.ShardedDataParallel`)
  * Fully Sharded Data Parallel (FSDP) (`fairscale.nn.FullyShardedDataParallel`)
* Optimization at scale:
  * AdaScale SGD (`fairscale.optim.AdaScale`)
* GPU memory optimization:
  * Activation checkpointing wrapper (`fairscale.nn.misc.checkpoint_wrapper`)
  * _experimental_ CPU offloaded model (`fairscale.experimental.nn.offload.OffloadModel`)
* GPU speed optimization:
  * Sharded grad scaler - automatic mixed precision (`fairscale.optim.grad_scaler`)
## Requirements
# Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the BSD license found in the
# LICENSE file in the root directory of this source tree.