Commit 069da0e8 authored by heheda's avatar heheda

update readme: enable NCCL by default

parent 5a5dfa18
...
@@ -25,8 +25,8 @@
 FastMoE contains a set of PyTorch customized operators, including both C and
 Python components. Use `python setup.py install` to easily install and enjoy
 using FastMoE for training.
-The distributed expert feature is disabled by default. If you want to enable
-it, pass environment variable `USE_NCCL=1` to the setup script.
+The distributed expert feature is enabled by default. If you want to disable
+it, pass environment variable `USE_NCCL=0` to the setup script.
 Note that an extra NCCL developer package is needed; it has to be consistent
 with your PyTorch's NCCL version, which can be inspected by running
...
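The diff above changes the build instructions: NCCL-backed distributed experts are now on by default, and `USE_NCCL=0` opts out. A minimal sketch of the two build invocations (the exact inspection command for PyTorch's NCCL version is truncated in the diff; `torch.cuda.nccl.version()` is shown here as one plausible way and is an assumption, not taken from the source):

```shell
# Default build: distributed expert feature (NCCL) is enabled.
python setup.py install

# Opt out: pass USE_NCCL=0 as an environment variable to the setup script.
USE_NCCL=0 python setup.py install

# Assumption: one way to inspect the NCCL version PyTorch was built against.
python -c "import torch; print(torch.cuda.nccl.version())"
```

The `VAR=value command` form sets the variable only for that one invocation, so the opt-out does not leak into later builds in the same shell.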