- 04 Jul, 2023 30 commits
-
Frank Lee authored
* [shardformer] refactored some doc and api
* polish code
-
Frank Lee authored
-
Frank Lee authored
-
Frank Lee authored
-
Frank Lee authored
-
Kun Lin authored
* first version of vit shardformer
* keep vit
* update
* vit shard: add vitattention, vitlayer
* update num head shard param
* finish test for vit
* add new_model_class & postprocess
* add vit readme
* delete old files & fix the conflict
* fix sth
-
jiangmingyan authored
* [shardformer] shardformer support opt models
* [shardformer] shardformer support opt models, fix
* [shardformer] shardformer support opt models, fix
* [shardformer] shardformer support opt models, fix
-
Frank Lee authored
-
Frank Lee authored
* [test] fixed tests that failed due to dtensor change
* polish code
-
FoolPlayer authored
* add layernorm to bert
* add layernorm test
* add layernorm test with load state dict
* add use_mixedfusedLN in shard config
* refactor policy to support fused_layernorm
-
Frank Lee authored
-
FoolPlayer authored
* add linearconv1d test
* add linearconv1d test
-
Frank Lee authored
* [shardformer] support module saving and loading
* polish code
-
FoolPlayer authored
* support kit use for bert test
* support kit test for gpt2
-
Frank Lee authored
-
Frank Lee authored
* [shardformer] adapted T5 and LLaMa test to use kit
* polish code
-
FoolPlayer authored
* add gpt2 test and layer class refactor
* add dropout in gpt2 policy
-
Frank Lee authored
-
Frank Lee authored
-
FoolPlayer authored
* fix bert downstream with new api
* remove comment line
-
FoolPlayer authored
-
Frank Lee authored
* [shardformer] refactored embedding and dropout to parallel module
* polish code
-
FoolPlayer authored
-
Frank Lee authored
* [shardformer] integrated linear 1D with dtensor
* polish code
-
Frank Lee authored
-
FoolPlayer authored
* add dist dropout in model
* update docstring and bert policy with dropout
* refactor basepolicy and sharded, update bert
* update format
* update gpt2 policy
* update bert policy
* remove unused code
* update readme for new policy usage
* add downstream model of bert
* remove unused code
-
wukong1992 authored
test t5
-
wukong1992 authored
adjust layer attr
-
FoolPlayer authored
* fix bug in slicer, add slicer unit test
* add dropout test
* use pid as dropout seed
* update dropout test with local pattern
* add todo
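The "use pid as dropout seed" commit touches a classic tensor-parallelism detail: dropout layers on different ranks must draw different masks, otherwise the sharded activations are dropped identically on every worker and the regularization is weakened. A minimal, library-independent sketch of the idea in NumPy (the `rank_dropout` helper and its `(rank, step)` seeding scheme are illustrative assumptions, not ColossalAI's actual implementation):

```python
import numpy as np

def rank_dropout(x, p, rank, step):
    """Inverted dropout whose mask depends on the process rank.

    Seeding the generator with (rank, step) gives each tensor-parallel
    worker its own mask for the same training step, while keeping the
    mask reproducible (e.g. for recomputation in the backward pass).
    """
    rng = np.random.default_rng((rank, step))
    mask = rng.random(x.shape) >= p
    # Scale kept activations by 1/(1-p) so the expected value is unchanged.
    return np.where(mask, x / (1.0 - p), 0.0), mask
```

In a real setup the seed would typically come from the process's rank in the tensor-parallel group (or, as in the commit message, the process id) so replicas holding different shards decorrelate their masks.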
-
FoolPlayer authored
* add bert align test, fix dist loss bug
* forward and backward align
* add ignore index
* add shardformer CI
* add gather_output optional for user in shardconfig
* update readme with optional gather_output
* add dist crossentropy loss test, remove unused files
* remove unused file
* remove unused file
* rename the file
* polish code
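The dist crossentropy loss mentioned above is what lets a model with a vocab-sharded output layer compute its loss without first gathering full logits. A rough single-process sketch of the underlying math (the function names and the list-of-shards interface are illustrative assumptions; a real tensor-parallel implementation replaces the Python-level reductions with all-reduce collectives):

```python
import numpy as np

def shard_logsumexp_stats(logits_shard):
    """Per-shard statistics for a vocab-parallel softmax:
    the local max and the local sum of exp(logit - local_max)."""
    m = logits_shard.max(axis=-1, keepdims=True)
    s = np.exp(logits_shard - m).sum(axis=-1, keepdims=True)
    return m, s

def vocab_parallel_cross_entropy(shards, targets, vocab_offsets):
    """Cross-entropy over logits split along the vocab dimension.

    `shards` is a list of [batch, shard_vocab] arrays and `vocab_offsets`
    gives the first vocab index owned by each shard.  The two `max`/`sum`
    reductions below are exactly what an all-reduce would compute.
    """
    stats = [shard_logsumexp_stats(s) for s in shards]
    global_max = np.max([m for m, _ in stats], axis=0)
    # Rescale each shard's partial sum to the global max, then combine.
    denom = sum(s * np.exp(m - global_max) for m, s in stats)
    log_z = np.log(denom) + global_max                  # log of the partition sum
    # Pick each sample's target logit from whichever shard owns that index.
    target_logit = np.empty(targets.shape[0])
    for shard, off in zip(shards, vocab_offsets):
        mask = (targets >= off) & (targets < off + shard.shape[-1])
        target_logit[mask] = shard[mask, targets[mask] - off]
    return log_z.squeeze(-1) - target_logit             # per-sample loss
```

Only two scalars per row (the running max and the rescaled sum) ever need to cross ranks, which is why this is much cheaper than gathering the full vocabulary dimension.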
-
- 25 Jun, 2023 1 commit
-
Baizhou Zhang authored
-
- 19 Jun, 2023 1 commit
-
github-actions[bot] authored
Co-authored-by: github-actions <github-actions@github.com>
-
- 16 Jun, 2023 2 commits
-
Frank Lee authored
-
Baizhou Zhang authored
-
- 15 Jun, 2023 2 commits
-
Wenhao Chen authored
* feat: make optimizer optional in Booster.boost
* test: skip unet test if diffusers version > 0.10.2
-
Baizhou Zhang authored
-
- 09 Jun, 2023 2 commits
- 08 Jun, 2023 1 commit
-
Frank Lee authored
-
- 05 Jun, 2023 1 commit
-
Hongxin Liu authored
* [bf16] add bf16 support for fused adam (#3844)
* [bf16] fused adam kernel support bf16
* [test] update fused adam kernel test
* [test] update fused adam test
* [bf16] cpu adam and hybrid adam optimizers support bf16 (#3860)
* [bf16] implement mixed precision mixin and add bf16 support for low level zero (#3869)
* [bf16] add mixed precision mixin
* [bf16] low level zero optim support bf16
* [test] update low level zero test
* [test] fix low level zero grad acc test
* [bf16] add bf16 support for gemini (#3872)
* [bf16] gemini support bf16
* [test] update gemini bf16 test
* [doc] update gemini docstring
* [bf16] add bf16 support for plugins (#3877)
* [bf16] add bf16 support for legacy zero (#3879)
* [zero] init context support bf16
* [zero] legacy zero support bf16
* [test] add zero bf16 test
* [doc] add bf16 related docstring for legacy zero
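The reason bf16 shows up across all these optimizers and plugins: bfloat16 keeps float32's 8-bit exponent but only 8 explicit mantissa bits, so it preserves float32's range (no loss scaling needed, unlike fp16) while trading away precision. A small NumPy illustration of the format's rounding behavior (this `to_bf16` helper merely emulates bf16 bit-by-bit; it is not part of the library):

```python
import numpy as np

def to_bf16(x):
    """Emulate bfloat16 by keeping the top 16 bits of each float32,
    with round-to-nearest-even on the 16 dropped mantissa bits."""
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    # Rounding bias: 0x7FFF, plus 1 if the lowest kept bit is set (ties-to-even).
    rounded = bits + 0x7FFF + ((bits >> 16) & 1)
    return (rounded & 0xFFFF0000).view(np.float32)
```

The round trip makes the trade-off visible: values like 1e38 that would overflow fp16 survive in bf16, while fine-grained differences below about 1 part in 256 are rounded away.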
-