Unverified Commit d43a671a authored by Frank Lee's avatar Frank Lee Committed by GitHub

Hotfix/tutorial readme index (#1922)

* [tutorial] removed tutorial index in readme

parent 24cbee0e
# Auto-Parallelism with ResNet
## Prepare Dataset
......
# Multi-dimensional Parallelism with Colossal-AI
## Install Titans Model Zoo
......
# Comparison of Large Batch Training Optimization
## Prepare Dataset
......
# Fine-tuning and Serving for OPT from Hugging Face
# Sequence Parallelism with BERT
In this example, we implement BERT with sequence parallelism. Sequence parallelism splits the input tensor and intermediate activations along the sequence dimension. This method achieves better memory efficiency and allows us to train with a larger batch size and a longer sequence length.
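The core idea can be illustrated with a toy sketch (this is not the library's implementation; `split_along_sequence`, the tensor shapes, and the rank/world-size values are all hypothetical):

```python
import torch

def split_along_sequence(activation: torch.Tensor, world_size: int, rank: int) -> torch.Tensor:
    """Return this rank's shard of an activation of shape (batch, seq_len, hidden),
    partitioned along dim 1 (the sequence dimension)."""
    chunks = torch.chunk(activation, world_size, dim=1)
    return chunks[rank]

# Full activation: batch 4, sequence length 512, hidden size 768.
x = torch.randn(4, 512, 768)
# After splitting across 4 ranks, each rank holds only 512 / 4 = 128 tokens' activations.
local = split_along_sequence(x, world_size=4, rank=0)
```

Because every rank stores only `seq_len / world_size` tokens' worth of activations, the per-device activation memory shrinks proportionally, which is what allows the larger batch sizes and longer sequences mentioned above.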
...@@ -140,4 +140,3 @@ machine setting.
launch_from_slurm` or `colossalai.launch_from_openmpi` as it is easier to use SLURM and OpenMPI
to start multiple processes over multiple nodes. If you have your own launcher, you can fall back
to the default `colossalai.launch` function.
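For the fallback path, a minimal sketch of what a custom launcher needs to provide, assuming it exports the standard `torch.distributed` rendezvous variables (the helper name and `CONFIG` are illustrative, not part of the library):

```python
import os

def dist_env_from_launcher():
    """Collect the rendezvous info a custom launcher must export.

    RANK, WORLD_SIZE, MASTER_ADDR, and MASTER_PORT are the standard
    torch.distributed environment variables; a custom launcher is
    expected to set them on every spawned process.
    """
    return dict(
        rank=int(os.environ["RANK"]),
        world_size=int(os.environ["WORLD_SIZE"]),
        host=os.environ["MASTER_ADDR"],
        port=int(os.environ["MASTER_PORT"]),
    )

# With these values, initialization falls back to the generic entry point
# (CONFIG stands in for your Colossal-AI config):
#   env = dist_env_from_launcher()
#   colossalai.launch(config=CONFIG, **env, backend="nccl")
```

The SLURM and OpenMPI helpers do this bookkeeping for you, which is why they are the recommended route on managed clusters.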