Unverified Commit 23d1aa66 authored by Rick Ho, committed by GitHub

add slack link to readme

parent 38d34c9c
FastMoE
===
[Release note](docs/release-note.md) | [中文 Readme](docs/readme-cn.md) | [Slack workspace](https://join.slack.com/t/fastmoe/shared_invite/zt-mz0ai6ol-ggov75D62YsgHfzShw8KYw)
## Introduction
An easy-to-use but efficient implementation of the Mixture of Experts (MoE)
@@ -95,4 +97,6 @@ FastMoE's model parallel requires sophisticated parallel strategies that neither P
Megatron-LM provides. The `fmoe.DistributedGroupedDataParallel` module is
introduced to replace PyTorch's DDP module.
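
Below is a minimal sketch of what that replacement might look like in training code. It is not the official example: the layer constructor and its arguments (`FMoETransformerMLP`, `num_expert`, `d_model`, `d_hidden`) are assumptions for illustration, so check them against the FastMoE documentation before use.

```python
# Sketch only: swapping PyTorch's DDP for fmoe.DistributedGroupedDataParallel.
# The FMoETransformerMLP arguments (num_expert, d_model, d_hidden) are assumed
# names for illustration; verify them against the FastMoE docs.
import torch
import torch.distributed as dist
from fmoe import FMoETransformerMLP, DistributedGroupedDataParallel

dist.init_process_group(backend="nccl")

# A single MoE feed-forward layer stands in for a full Transformer model here.
model = FMoETransformerMLP(num_expert=4, d_model=1024, d_hidden=4096).cuda()

# Wrap with FastMoE's DDP replacement so that expert parameters, which are
# sharded across ranks, are synchronized over the appropriate communication
# groups instead of being naively all-reduced as torch.nn.parallel.DDP would.
model = DistributedGroupedDataParallel(model)

x = torch.randn(8, 1024, device="cuda")
model(x).sum().backward()
# Gradient synchronization across data-parallel ranks may require an explicit
# call on the wrapped module; consult the FastMoE documentation for details.
```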
## Troubleshooting / Discussion
If you have any problem using FastMoE, or you are interested in getting involved in developing FastMoE, feel free to join our [Slack channel](https://join.slack.com/t/fastmoe/shared_invite/zt-mz0ai6ol-ggov75D62YsgHfzShw8KYw).