Commit 01464726 authored by Jiezhong Qiu

Merge remote-tracking branch 'origin/master' into bias

parents 3d8e8a43 ed9277f9
@@ -22,8 +22,16 @@ Fast MoE contains a set of PyTorch customized operators, including both C and
Python components. Use `python setup.py install` to easily install and enjoy
using Fast MoE for training.
-The distributed expert feature is enabled by default. If you want to disable
-it, pass environment variable `USE_NCCL=0` to the setup script.
+The distributed expert feature is disabled by default. If you want to enable
+it, pass environment variable `USE_NCCL=1` to the setup script.
+Note that an extra NCCL developer package is needed, and it has to be
+consistent with the NCCL version of your PyTorch, which can be inspected by
+running `torch.cuda.nccl.version()`. The [official PyTorch docker image]() is
+recommended, as the environment there is already well set up. Otherwise, you
+can visit the [download link of all NCCL
+versions](https://developer.nvidia.com/nccl/nccl-legacy-downloads) to get the
+NCCL package that matches your setup.
## Usage
......
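To make the version check mentioned above concrete, here is a minimal sketch of how one might inspect the NCCL version that PyTorch was built against before installing the matching NCCL developer package; note that the exact return format varies across PyTorch releases.

```python
# Minimal sketch: inspect the NCCL version bundled with PyTorch, so the
# NCCL developer package installed on the system can be matched to it
# before building with `USE_NCCL=1 python setup.py install`.
import torch

# Older PyTorch releases return an int such as 2708 (meaning 2.7.8);
# newer ones return a (major, minor, patch) tuple.
print(torch.cuda.nccl.version())
```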
@@ -10,6 +10,7 @@ cxx_flags = [
ext_libs = []
if os.environ.get('USE_NCCL', '0') == '1':
    cxx_flags.append('-DMOE_USE_NCCL')
+    ext_libs.append('nccl')
if __name__ == '__main__':
......
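For context on how this switch affects the build, below is a minimal `setup.py` sketch in the spirit of the diff above. The package name `fastmoe`, the extension name `fmoe_cuda`, and the source list are assumptions for illustration only; just the `USE_NCCL`/`MOE_USE_NCCL` handling mirrors the diff.

```python
# Hypothetical minimal setup.py sketch showing how the USE_NCCL switch in
# the diff above can gate the NCCL compile flag and library; the package
# name, extension name, and source list here are illustrative only.
import os

from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

cxx_flags = []
ext_libs = []

# Distributed experts are opt-in: build with `USE_NCCL=1 python setup.py install`.
if os.environ.get('USE_NCCL', '0') == '1':
    cxx_flags.append('-DMOE_USE_NCCL')  # enables NCCL code paths in the C++ sources
    ext_libs.append('nccl')             # links against the NCCL developer package

if __name__ == '__main__':
    setup(
        name='fastmoe',
        ext_modules=[
            CUDAExtension(
                name='fmoe_cuda',
                sources=['cuda/moe.cpp'],  # illustrative source file
                extra_compile_args={'cxx': cxx_flags},
                libraries=ext_libs,
            ),
        ],
        cmdclass={'build_ext': BuildExtension},
    )
```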