- 04 Sep, 2023 1 commit
flybird11111 authored
* [shardformer] fix opt test hanging
* [shardformer] add bert finetune example
* [shardformer] fix epoch change
* [shardformer] broadcast add pp group
* [shardformer] zero1+pp and the corresponding tests (#4517): finish pp+zero1, update test_shard_vit.py
* [shardformer] fix overlap bug, add overlap as an option in shardconfig; support overlap for bert, chatglm, and bloom (#4516)
* [shardformer] fix emerged bugs after updating transformers (#4526)
* [shardformer] add overlap support for gpt2, remove unused code (#4535)
* [shardformer] support pp+tp+zero1 tests (#4531)
* [shardformer] fix submodule replacement bug when enabling pp (#4544)
* [shardformer] support sharded optimizer checkpointIO of HybridParallelPlugin (#4540): implement sharded optimizer saving and greedy loading, param group loading, add pp+zero and optimizer tests, arrange checkpointIO utils, fix gemini sharding state_dict, add verbose option, add loading of master params, fix typehint, fix master/working mapping in fp16 amp
* [shardformer] bert finetune fix; add all_reduce operation to loss
* [shardformer] make compatible with pytree
* [shardformer] disable tp
* [shardformer] add 3d plugin to ci test
* [shardformer] update num_microbatches to None; update microbatch size; update assert
* rebase feature/shardformer; update pipeline and scheduler
* misc: fix tests, remove prints

Co-authored-by: Jianghai <72591262+CjhHa1@users.noreply.github.com>
Co-authored-by: Bin Jia <45593998+FoolPlayer@users.noreply.github.com>
Co-authored-by: Baizhou Zhang <eddiezhang@pku.edu.cn>
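For context on how the pp+tp+zero1 configuration and the sharded optimizer checkpointing (#4540) above are driven, here is a minimal sketch against the Booster API of a ColossalAI build from around this commit. The parallel degrees, learning rate, model choice, and checkpoint paths are illustrative assumptions, not the commit's actual test settings:

```python
import colossalai
import torch
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin
from transformers import BertForSequenceClassification

# assumes launch via torchrun with 4 processes (tp_size * pp_size = 4);
# 2023-era builds still required the config argument
colossalai.launch_from_torch(config={})

plugin = HybridParallelPlugin(
    tp_size=2,           # tensor parallel degree
    pp_size=2,           # pipeline parallel degree
    zero_stage=1,        # the pp+tp+zero1 combination tested in #4531
    num_microbatches=4,  # per the log above, None is also accepted
    precision="fp16",
)
booster = Booster(plugin=plugin)

model = BertForSequenceClassification.from_pretrained("bert-base-uncased")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model, optimizer, *_ = booster.boost(model, optimizer)

# finetuning loop elided; with pp_size > 1 it goes through
# booster.execute_pipeline(...) rather than a plain forward/backward

# sharded checkpointing from #4540: each rank writes its own shard
booster.save_optimizer(optimizer, "ckpt_optim", shard=True)
booster.save_model(model, "ckpt_model", shard=True)
```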
- 26 Jul, 2023 1 commit
binmakeswell authored
- 17 Jul, 2023 1 commit
binmakeswell authored
- 28 Jun, 2023 1 commit
digger yu authored
- 26 Jun, 2023 1 commit
Baizhou Zhang authored
- 19 Jun, 2023 1 commit
LuGY authored
- 12 Jun, 2023 1 commit
Baizhou Zhang authored
- 08 Jun, 2023 2 commits
digger yu authored
Baizhou Zhang authored
- 07 Jun, 2023 2 commits
Liu Ziming authored
* modify torch version requirement to adapt to torch 2.0
* modify palm example using new booster API
* roll back
* fix port
* polish
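The "new booster API" mentioned above replaces the older colossalai.initialize flow. A rough sketch of the pattern, assuming a mid-2023 build; the plugin choice, stand-in model, and hyperparameters here are illustrative, not the palm example's actual settings:

```python
import colossalai
import torch
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin
from colossalai.nn.optimizer import HybridAdam

colossalai.launch_from_torch(config={})

plugin = GeminiPlugin(placement_policy="cpu")  # illustrative choice
booster = Booster(plugin=plugin)

model = torch.nn.Linear(512, 512)  # stand-in for the palm model
optimizer = HybridAdam(model.parameters(), lr=1e-4)
model, optimizer, *_ = booster.boost(model, optimizer)

inputs = torch.randn(8, 512).cuda()
loss = model(inputs).sum()
booster.backward(loss, optimizer)  # replaces loss.backward() under the booster API
optimizer.step()
optimizer.zero_grad()
```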
wukong1992 authored
- 30 May, 2023 1 commit
jiangmingyan authored
* [example] update gemini examples
- 24 May, 2023 1 commit
digger yu authored
* fix typo colossalai/autochunk auto_parallel amp
* fix typo colossalai/auto_parallel nn utils etc.
* fix typo colossalai/auto_parallel autochunk fx/passes etc.
* fix typo docs/
* change placememt_policy to placement_policy in docs/ and examples/
- 18 May, 2023 1 commit
binmakeswell authored
- 26 Apr, 2023 1 commit
digger-yu authored
* Fixed several spelling errors under colossalai
* Fix the spelling error in colossalai and docs directory
* Carefully changed the spelling error under the example folder
* Update runtime_preparation_pass.py: revert autograft to autograd
* Update search_chunk.py: utile to until
* Update check_installation.py: change misteach to mismatch in line 91
* Update 1D_tensor_parallel.md: revert to perceptron
* Update 2D_tensor_parallel.md: revert to perceptron in line 73
* Update 2p5D_tensor_parallel.md: revert to perceptron in line 71
* Update 3D_tensor_parallel.md: revert to perceptron in line 80
* Update README.md: revert to resnet in line 42
* Update reorder_graph.py: revert to indice in line 7
* Update p2p.py: revert to megatron in line 94
* Update initialize.py: revert to torchrun in line 198
* Update routers.py: change to detailed in line 63
* Update routers.py: change to detailed in line 146
* Update README.md: revert random number in line 402
- 14 Apr, 2023 1 commit
binmakeswell authored
- 07 Apr, 2023 2 commits
mandoxzhang authored
* update roberta example
* modify conflict & update roberta
mandoxzhang authored
* update roberta example
- 06 Apr, 2023 1 commit
Frank Lee authored
* [test] added spawn decorator
* polish code
- 04 Apr, 2023 2 commits
ver217 authored
* [zero] update legacy import
* [zero] update examples
* [example] fix opt tutorial
* [example] fix import
ver217 authored
* [zero] refactor low-level zero folder structure
* [zero] refactor gemini folder structure
* [zero] refactor legacy zero import path
* [zero] fix legacy zero import path
* [zero] remove useless import
* [zero] fix test import path
* [zero] fix test
* [zero] fix circular import
* [zero] update import
- 23 Mar, 2023 1 commit
Yan Fang authored
- 21 Mar, 2023 1 commit
Zihao authored
* add auto-offload feature
* polish code
* fix syn offload runtime pass bug
* add offload example
* fix offload testing bug
* fix example testing bug
- 09 Mar, 2023 2 commits
binmakeswell authored
Tomek authored
- 08 Mar, 2023 1 commit
ramos authored
Co-authored-by: poe <poe@nemoramo>
- 07 Mar, 2023 1 commit
Ziyue Jiang authored
* add alpa dp split
* use fwd+bwd instead of fwd only

Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
- 27 Feb, 2023 2 commits
github-actions[bot] authored
Co-authored-by: github-actions <github-actions@github.com>
binmakeswell authored
- 22 Feb, 2023 2 commits
Alex_996 authored
Fix typos, `6.7 -> 6.7b`
dawei-wang authored
Fix hpcaitech/ColossalAI#2851
- 20 Feb, 2023 1 commit
Jiarui Fang authored
- 15 Feb, 2023 1 commit
cloudhuang authored
- 09 Feb, 2023 1 commit
Jiatong (Julius) Han authored
* [tutorial] polish readme.md
* [example] update README.md
- 31 Jan, 2023 1 commit
HELSON authored
- 30 Jan, 2023 1 commit
HELSON authored
- 28 Jan, 2023 1 commit
HELSON authored
* [zero] add strict ddp mode for chunk init
* [gemini] update gpt example
- 20 Jan, 2023 1 commit
HELSON authored
* [zero] add strict ddp mode
* [polish] add comments for strict ddp mode
* [zero] fix test error
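For reference, a rough sketch of enabling strict DDP mode, assuming the GeminiDDP signature of this period; the import path and exact argument set shifted across releases, so treat this as an approximation rather than the commit's code:

```python
import torch
from colossalai.zero import GeminiDDP  # post-refactor path; older builds used colossalai.nn.parallel

model = torch.nn.Linear(128, 128)

# strict_ddp_mode=True keeps every parameter fully replicated when chunks
# are initialized, so Gemini behaves like plain DDP instead of sharding
model = GeminiDDP(
    model,
    device=torch.device("cuda"),
    placement_policy="cpu",
    strict_ddp_mode=True,
)
```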
- 18 Jan, 2023 2 commits
Jiarui Fang authored
jiaruifang authored