# Comparison of Large Batch Training Optimization

## Table of contents

- [Overview](#-overview)
- [Quick Start](#-quick-start)

## 📚 Overview

This example lets you quickly try out the large batch training optimization provided by Colossal-AI. A synthetic dataset is used throughout, so you don't need to prepare any data. You can try out the `Lamb` and `Lars` optimizers from Colossal-AI with the following code.

```python
from colossalai.nn.optimizer import Lamb, Lars
```
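
For example, the optimizers can be dropped into a regular PyTorch training setup. The snippet below is a minimal sketch only: the learning rates are placeholders, and any additional keyword arguments accepted by the Colossal-AI implementations (e.g. `weight_decay`, `momentum`) are not shown, so refer to the Colossal-AI documentation for the exact signatures.

```python
import torch.nn as nn

from colossalai.nn.optimizer import Lamb, Lars

# A toy model, just to have parameters to optimize.
model = nn.Linear(128, 10)

# Both optimizers follow the usual torch.optim constructor pattern:
# pass the parameters to optimize plus a learning rate.
# (The learning rates below are placeholders, not tuned values.)
optimizer = Lars(model.parameters(), lr=0.1)
# or:
# optimizer = Lamb(model.parameters(), lr=2e-3)
```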

## 🚀 Quick Start

1. Install PyTorch.

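A typical installation looks like the command below; this is only a generic example, so pick the build matching your CUDA version from the official PyTorch installation page.

```bash
# install PyTorch (choose the build matching your CUDA setup on pytorch.org)
pip install torch
```
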
2. Install the dependencies.

```bash
pip install -r requirements.txt
```

3. Run the training scripts with synthetic data.

```bash
# run on 4 GPUs
# run with lars
colossalai run --nproc_per_node 4 train.py --config config.py --optimizer lars

# run with lamb
colossalai run --nproc_per_node 4 train.py --config config.py --optimizer lamb
```