1. 30 Aug, 2023 2 commits
    • [shardformer] support pp+tp+zero1 tests (#4531) · ec18fc73
      flybird11111 authored
      * [shardformer] fix opt test hanging
      
      * fix
      
      * test
      
      * fix test
      
      * remove print
      
      * add fix
      
      * [shardformer] pp+tp+zero1
      
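The pp+tp+zero1 stack tested above layers ZeRO stage 1 on top of pipeline and tensor parallelism: each data-parallel rank keeps optimizer state only for its own shard of the parameters, cutting optimizer memory by the DP degree. A minimal sketch of that owner-shard bookkeeping (hypothetical illustration, not Colossal-AI code; `partition_params` is an invented helper):

```python
def partition_params(num_params: int, dp_size: int) -> list[range]:
    """Split parameter indices into one contiguous shard per DP rank (ZeRO-1 style)."""
    base, rem = divmod(num_params, dp_size)
    shards, start = [], 0
    for rank in range(dp_size):
        # Spread the remainder over the first `rem` ranks so shard sizes differ by at most 1.
        size = base + (1 if rank < rem else 0)
        shards.append(range(start, start + size))
        start += size
    return shards

shards = partition_params(num_params=10, dp_size=4)
# Every parameter is owned by exactly one rank.
assert sum(len(s) for s in shards) == 10
```

In the real plugin each rank would then update only its shard and all-gather the refreshed parameters afterwards.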
    • [shardformer] fix opt test hanging (#4521) · d367b887
      flybird11111 authored
      * [shardformer] fix opt test hanging
      
      * fix
      
      * test
      
      * fix test
      
      * remove print
      
      * add fix
  2. 29 Aug, 2023 2 commits
  3. 28 Aug, 2023 2 commits
  4. 25 Aug, 2023 2 commits
    • [shardformer] support sharded checkpoint IO for models of HybridParallelPlugin (#4506) · 44eab2b2
      Baizhou Zhang authored
      * add APIs
      
      * implement save_sharded_model
      
      * add test for hybrid checkpointio
      
      * implement naive loading for sharded model
      
      * implement efficient sharded model loading
      
      * open a new file for hybrid checkpoint_io
      
      * small fix
      
      * fix circular importing
      
      * fix docstring
      
      * arrange arguments and apis
      
      * small fix
    • [shardformer] opt fix. (#4514) · de8a65ba
      flybird11111 authored
      * [shardformer] chatglm support sequence parallel
      
      * fix
      
      * [shardformer] jit fused fix
      
      * activate checks
      
      * [Test] test ci
      
      * fix
  5. 24 Aug, 2023 1 commit
    • [shardformer] vit/llama/t5 ignore the sequence parallelism flag and some fix. (#4498) · 3353e55c
      flybird11111 authored
      * [shardformer] chatglm support sequence parallel
      
      * fix
      
      * [shardformer] jit fused fix
      
      * activate checks
  6. 23 Aug, 2023 1 commit
  7. 22 Aug, 2023 3 commits
  8. 21 Aug, 2023 1 commit
  9. 18 Aug, 2023 4 commits
    • [shardformer] Pipeline/whisper (#4456) · 8739aa7f
      Jianghai authored
      * add some base tests and policies
      
      * finish whisper base model
      
      * add conditional generation
      
      * finish basic tests
      
      * whisper
      
      * finish whisper
      
      * del useless whisper test
      
      * fix
      
      * add argmin to replace
      
      * finish revision
    • [shardformer] bert support sequence parallel. (#4455) · a27e0bb4
      flybird11111 authored
      * [shardformer] bert support sequence parallel
      
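Sequence parallelism, as in the bert and bloom commits here, splits activations along the sequence dimension so each rank holds `seq_len / world_size` positions; an all-gather restores the full sequence wherever an op (such as attention) needs every position. A toy stand-in for that split/gather round trip (hypothetical sketch, plain lists instead of tensors):

```python
def split_sequence(tokens: list, world_size: int) -> list[list]:
    """Give each rank a contiguous slice of the sequence dimension."""
    assert len(tokens) % world_size == 0, "sequence length must divide evenly"
    chunk = len(tokens) // world_size
    return [tokens[r * chunk:(r + 1) * chunk] for r in range(world_size)]

def all_gather(chunks: list[list]) -> list:
    """Reassemble the full sequence from every rank's slice."""
    return [t for chunk in chunks for t in chunk]

chunks = split_sequence(list(range(8)), world_size=4)
assert all_gather(chunks) == list(range(8))  # round trip restores the sequence
```

The payoff is that per-rank activation memory for the sharded regions drops by the world size.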
    • [shardformer] bloom support sequence parallel (#4465) · 0ecd71e0
      flybird11111 authored
      [shardformer] bloom support sequence parallel
    • [shardformer/sequence parallel] support gpt2 seq parallel with pp/dp/tp (#4460) · 7c8be770
      Bin Jia authored
      * support gpt2 seq parallel with pp/dp/tp
      
      * fix a bug when waiting for stream done
      
      * delete unused gpt2_seq file
  10. 16 Aug, 2023 5 commits
  11. 15 Aug, 2023 17 commits
    • [shardformer] fix import · 5d4efdf5
      ver217 authored
    • [shardformer] fix embedding · 73a4144b
      ver217 authored
    • [misc] update requirements · 92230226
      ver217 authored
    • [misc] resolve code factor issues (#4433) · 172f7fa3
      Hongxin Liu authored
    • [shardformer] update bloom/llama/vit/chatglm tests (#4420) · 328a791d
      flybird11111 authored
      [shardformer] update bloom/llama/vit/chatglm tests
      
      [shardformer] update opt tests
    • [shardformer] update t5 tests for using all optimizations. (#4407) · 108e54a0
      flybird11111 authored
      * [shardformer] gpt2 tests fix
      
      [shardformer] test all optimizations (#4399)
      
      * [shardformer]update t5 to use all optimizations
    • [shardformer] update tests for all optimization (#4413) · 1edc9b5f
      flybird11111 authored
      [shardformer] update tests for all optimization
    • [shardformer] rewrite tests for opt/bloom/llama/vit/chatglm (#4395) · 7711bd52
      Baizhou Zhang authored
      * rewrite opt tests
      
      * rewrite llama tests
      
      * rewrite bloom & vit tests
      
      * rewrite chatglm tests
      
      * fix LinearCol for classifiers
      
      * add judge for other tp layers, fix lazy init in util
    • [shardformer] fix, test gpt2 for AMP+TP (#4403) · 21e0a42f
      flybird11111 authored
      * [shardformer] gpt2 tests fix
      
      [shardformer] test all optimizations (#4399)
      
    • [pipeline] rewrite bert tests and fix some bugs (#4409) · 7596e9ae
      Jianghai authored
      * add pipeline policy and bert forward to be done
      
      * add bertmodel pipeline forward and make tests
      
      * add Bert_Policy and test for policy
      
      * update formatting
      
      * update the code
      
      * fix bugs
      
      * fix name conflict
      
      * add bloom model and policy ,revise the base class of policy
      
      * revise
      
      * revision
      
      * add bert_for_pretraining
      
      * add bert_for_pretraining forward and policy
      
      * fix typos
      
      * cancel warning
      
      * change the immediate output to default dict
      
      * change the default output of get_shared_params
      
      * rewrite bert test
      
      * fix some bugs
      
      * del pipeline tests
      
      * del useless print
      
      * rewrite data repeats
    • [shardformer] test all optimizations (#4399) · d2cd48e0
      flybird1111 authored
      [shardformer] test all optimizations
      
    • [shardformer] update shardformer to use flash attention 2 (#4392) · 7a3dfd0c
      flybird1111 authored
      * cherry-pick flash attention 2
      
      * [shardformer] update shardformer to use flash attention 2
      
      [shardformer] update shardformer to use flash attention 2, fix
      
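The flash attention work running through these commits rests on the online-softmax trick: attention for a query is computed over the keys in tiles, tracking a running max and running sum so the full score row never has to be materialized. A plain-Python stand-in for one query (hypothetical sketch, not the fused CUDA kernel; scalar values stand in for value vectors):

```python
import math

def naive_attention(scores: list[float], values: list[float]) -> float:
    """Reference: softmax over all scores at once, then a weighted sum."""
    m = max(scores)
    w = [math.exp(s - m) for s in scores]
    return sum(wi * vi for wi, vi in zip(w, values)) / sum(w)

def online_attention(scores: list[float], values: list[float], tile: int = 2) -> float:
    """Same result, but processing the keys tile by tile with a running max."""
    m, denom, acc = float("-inf"), 0.0, 0.0
    for i in range(0, len(scores), tile):
        s_t, v_t = scores[i:i + tile], values[i:i + tile]
        m_new = max(m, max(s_t))
        scale = math.exp(m - m_new)  # rescale old partial sums to the new max
        denom = denom * scale + sum(math.exp(s - m_new) for s in s_t)
        acc = acc * scale + sum(math.exp(s - m_new) * v for s, v in zip(s_t, v_t))
        m = m_new
    return acc / denom

scores, values = [0.1, 2.0, -1.0, 0.5], [1.0, 2.0, 3.0, 4.0]
assert abs(naive_attention(scores, values) - online_attention(scores, values)) < 1e-9
```

The tiled form is what lets the kernel keep everything in fast on-chip memory, which is where the speedup comes from.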
    • [pipeline] rewrite t5 tests & support multi-tensor transmitting in pipeline (#4388) · ed4c4484
      Baizhou Zhang authored
      * fix remaining t5 bugs/rewrite t5 tests
      
      * fix multi-tensor communication in pipeline
      
      * rearrange test_config
      
      * fix keyerror in sync_shared_params
      
      * fix get_held_layers & Randomnizer, complete t5 tests
      
      * erase printing
      
      * fix get_held_layers through modifying _release_unheld_layers
      
      * fix _get_recursive_held_layers bug
    • [Shardformer] Merge flash attention branch to pipeline branch (#4362) · 906426cb
      flybird1111 authored
      
      
      * [shardformer] supported flash attention test dependency (#4158)
      
      * [shardformer] fix flash attention utils test (#4180)
      
      * [shardformer] opt support flash attention (#4163)
      
      * [shardformer] move to modeling
      
      * [shardformer] add performance benchmark of shardformer (#4175)
      
      * [shardformer] benchmark fix
      
      * [shardformer] llama support flash attention (#4185)
      
      * [shardformer] Move the import statement for xformer outside the forward function.
      
      * [shardformer] gpt2 support flash attention. (#4191)
      
      * [shardformer] bloom support flash attention (#4188)
      
      * [shardformer] add assert to sequence length
      
      * [shardformer] fix
      
      * [shardformer] bert support flash attention. (#4206)
      
      * [shardformer] t5 support flash attention. (#4216)
      
      * fix typo
      
      * [shardformer] support 'paddedcausal' type of attention mask in ColoAttention. (#4215)
      
      * added padded causal attn mask type for ColoAttention
      
      * [shardformer]t5 flash attention fix (#4239)
      
      * [shardformer] update gpt2 to use coloattention. (#4234)
      
      * [shardformer] update gpt2
      
      * [shardformer] update opt and llama to use coloattention. (#4226)
      
      * [shardformer] shardformer support jit fused operator. (#4236)
      
      * [shardformer] bloom support jit fused operator
      
      * [shardformer] t5 support jit fused operator
      
      * [shardformer] add roadmap of flash attention
      
      * [shardformer] add type hint to 'self' param of forward
      
      * [shardformer] merge feature/shardformer-models branch to feature/flash-attention-shardformer branch. (#4290)
      
      * Feature/vit support (#4182)
      
      * [shardformer] added tests
      
      * [shardformer] vit test finish and support
      
      * fix attention dropout
      
      * [shardformer] support SAM (#4231)
      
      * 1.support sam 2.add fused qkv for nn.Linear
      
      * update utils support set element in list
      
      * overwrite SamVisionAttention forward to use DropoutForParallelInput
      
      * remove unused code
      
      * [shardformer] support whisper (#4212)
      
      * support whisper
      
      * fix bug in vocabembedding
      
      * support downstream model of whisper
      
      * update readme
      
      * Feature/chatglm (#4240)
      
      * [shardformer] added tests
      
      * [shardformer] vit test finish and support
      
      * [shardformer] chatglm ready
      
      * import chatglm
      
      * [shardformer] add test kit in model zoo for chatglm
      
      * [shardformer] add first version of policy of chatglm
      
      * [shardformer] polish chatglm code
      
      * [shardformer] polish code
      
      * [shardformer] support chatglm without layernorm
      
      * [shardformer] chatglm shard without mlp sharding
      
      * [shardformer] delete some file
      
      * [shardformer] ChatGLM support layernorm sharding
      
      * [shardformer] register without auto policy
      
      * [shardformer] pre-commit check files
      
      * [shardformer] fix chatglm configuration with pre-commit
      
      ---------
      Co-authored-by: Kun Lin <81014421+klhhhhh@users.noreply.github.com>
      Co-authored-by: FoolPlayer <45593998+FoolPlayer@users.noreply.github.com>
      
      * [shardformer] whisper support flash attention (#4301)
      
      * [shardformer] whisper support jit operator
      
      ---------
      Co-authored-by: Kun Lin <81014421+klhhhhh@users.noreply.github.com>
      Co-authored-by: FoolPlayer <45593998+FoolPlayer@users.noreply.github.com>
      
      * [shardformer] sam support flash attention (#4316)
      
      ---------
      Co-authored-by: Kun Lin <81014421+klhhhhh@users.noreply.github.com>
      Co-authored-by: FoolPlayer <45593998+FoolPlayer@users.noreply.github.com>
      
      * [shardformer] merge blip2/chatglm  (#4321)
      
      * [shardformer] support ChatGLMForConditionalGeneration & add fusedlayernorm for vit
      
      * [shardformer] support Blip2 (#4243)
      
      * support base blip2
      
      * add support for downstream blip2 model
      
      * update readme
      
      * add forward injection
      
      * skip not compatible models test
      
      * fix test for gemini and low_level_zero_plugin
      
      ---------
      Co-authored-by: Kun Lin <81014421+klhhhhh@users.noreply.github.com>
      Co-authored-by: FoolPlayer <45593998+FoolPlayer@users.noreply.github.com>
      Co-authored-by: klhhhhh <1412841649@qq.com>
      
      * [shardformer] blip2 support flash attention and jit operator (#4325)
      
      ---------
      Co-authored-by: Kun Lin <81014421+klhhhhh@users.noreply.github.com>
      Co-authored-by: FoolPlayer <45593998+FoolPlayer@users.noreply.github.com>
      Co-authored-by: klhhhhh <1412841649@qq.com>
      
      * [shardformer] chatglm support flash attention and jit operator (#4330)
      
      * Feature/vit support (#4182)
      
      * [shardformer] added tests
      
      * [shardformer] vit test finish and support
      
      * fix attention dropout
      
      * [shardformer] support SAM (#4231)
      
      * 1.support sam 2.add fused qkv for nn.Linear
      
      * update utils support set element in list
      
      * overtwrite SamVisionAttention foward to use DropoutForParallelInput
      
      * remove unused code
      
      * [shardformer] support whisper (#4212)
      
      * support whisper
      
      * fix bug in vocabembedding
      
      * support downstream model of whisper
      
      * update readme
      
      * Feature/chatglm (#4240)
      
      * [shardformer] added tests
      
      * [shardformer] vit test finish and support
      
      * [shardformer] chatglm ready
      
      * import chatglm
      
      * [shardformer] add test kit in model zoo for chatglm
      
      * [sharformer] add first version of policy of chatglm
      
      * [shardformer] polish chatglm code
      
      * [shardformer] polish code
      
      * [shardformer] support chatglm without layernorm
      
      * [shardformer] chatglm shard without mlp sharding
      
      * [shardformer] delete some file
      
      * [shardformer] ChatGLM support layernorm sharding
      
      * [shardformer] register without auto policy
      
      * [shardformer] pre-commit check files
      
      * [shardformer] fix chatglm configuration with pre-commit
      
      * [shardformer] added tests
      
      * [shardformer] vit test finish and support
      
      * import chatglm
      
      * [shardformer] add test kit in model zoo for chatglm
      
      * [sharformer] add first version of policy of chatglm
      
      * [shardformer] polish chatglm code
      
      * [shardformer] polish code
      
      * [shardformer] support chatglm without layernorm
      
      * [shardformer] delete some file
      
      * [shardformer] ChatGLM support layernorm sharding
      
      * [shardformer] register without auto policy
      
      * [shardformer] pre-commit check files
      
      * [shardformer] support ChatGLMForConditionalGeneration & add fusedlayernorm for vit
      
      * [shardformer] support Blip2 (#4243)
      
      * support base blip2
      
      * add support for downstream blip2 model
      
      * update readme
      
      * add forward injection
      
      * skip tests for incompatible models
      
      * fix test for gemini and low_level_zero_plugin
      
      * [shardformer] chatglm support flash attention and jit operator
      
      ---------
      Co-authored-by: Kun Lin <81014421+klhhhhh@users.noreply.github.com>
      Co-authored-by: FoolPlayer <45593998+FoolPlayer@users.noreply.github.com>
      Co-authored-by: klhhhhh <1412841649@qq.com>
      
      * [shardformer] vit support flash attention and jit operator (#4334)
      
      * Feature/vit support (#4182)
      
      * [shardformer] added tests
      
      * [shardformer] vit test finish and support
      
      * fix attention dropout
      
      * [shardformer] support SAM (#4231)
      
      * 1. support sam 2. add fused qkv for nn.Linear
      
      * update utils support set element in list
      
      * overwrite SamVisionAttention forward to use DropoutForParallelInput
      
      * remove unused code
      
      * [shardformer] support whisper (#4212)
      
      * support whisper
      
      * fix bug in vocabembedding
      
      * support downstream model of whisper
      
      * update readme
      
      * Feature/chatglm (#4240)
      
      * [shardformer] added tests
      
      * [shardformer] vit test finish and support
      
      * [shardformer] chatglm ready
      
      * import chatglm
      
      * [shardformer] add test kit in model zoo for chatglm
      
      * [shardformer] add first version of policy of chatglm
      
      * [shardformer] polish chatglm code
      
      * [shardformer] polish code
      
      * [shardformer] support chatglm without layernorm
      
      * [shardformer] chatglm shard without mlp sharding
      
      * [shardformer] delete some file
      
      * [shardformer] ChatGLM support layernorm sharding
      
      * [shardformer] register without auto policy
      
      * [shardformer] pre-commit check files
      
      * [shardformer] fix chatglm configuration with pre-commit
      
      * [shardformer] support ChatGLMForConditionalGeneration & add fusedlayernorm for vit
      
      * [shardformer] support Blip2 (#4243)
      
      * support base blip2
      
      * add support for downstream blip2 model
      
      * update readme
      
      * add forward injection
      
      * skip tests for incompatible models
      
      * fix test for gemini and low_level_zero_plugin
      
      * [shardformer] vit support flash attention and jit operator
      
      ---------
      Co-authored-by: Kun Lin <81014421+klhhhhh@users.noreply.github.com>
      Co-authored-by: FoolPlayer <45593998+FoolPlayer@users.noreply.github.com>
      Co-authored-by: klhhhhh <1412841649@qq.com>
      
      * [pipeline] merge flash attention branch
      
      * [pipeline] fix conflict
      
      * Merge branch 'feature/pipeline' into feature/pipeline
      
      * activate checks
      
      * fix flash attention tests
      
      * gemini ignore whisper
      
      * fix vit
      
      * fix xformers import handle
      
      ---------
      Co-authored-by: Frank Lee <somerlee.9@gmail.com>
      Co-authored-by: Kun Lin <81014421+klhhhhh@users.noreply.github.com>
      Co-authored-by: FoolPlayer <45593998+FoolPlayer@users.noreply.github.com>
      Co-authored-by: klhhhhh <1412841649@qq.com>
      906426cb
    • Jianghai's avatar
      [pipeline] add chatglm (#4363) · a88e9225
      Jianghai authored
      * add pipeline policy and bert forward to be done
      
      * add bertmodel pipeline forward and make tests
      
      * add Bert_Policy and test for policy
      
      * update formatting
      
      * update formatting
      
      * update the code
      
      * fix bugs
      
      * fix name conflict
      
      * add bloom model and policy, revise the base class of policy
      
      * revise
      
      * revision
      
      * add bert_for_pretraining
      
      * add bert_for_pretraining forward and policy
      
      * fix typos
      
      * cancel warning
      
      * change the immediate output to default dict
      
      * change the default output of get_shared_params
      
      * add chatglm
      
      * add
      
      * chatglm
      
      * finish chatglm
      
      * deletes
      
      * fix rmsnorm
      
      * chatglm
      
      * fix chatglm shard
      
      * init
      a88e9225
    • Baizhou Zhang's avatar
      [shardformer] add util functions for shardformer tests/fix sync_shared_param (#4366) · b1feeced
      Baizhou Zhang authored
      * add util functions for shardformer tests & rewrite gpt2 test
      
      * fix shared_params & embedding/merging
      
      * fix precision
      b1feeced
    • Bin Jia's avatar
      [test] Hotfix/fix some model test and refactor check util api (#4369) · 5c6f1831
      Bin Jia authored
      * fix llama test
      
      * fix test bug of bert, blip2, bloom, gpt2
      
      * fix llama test
      
      * fix opt test
      
      * fix sam test
      
      * fix t5 test
      
      * fix vit test
      
      * fix whisper test
      
      * polish code
      
      * adjust allclose parameter
      
      * Add mistakenly deleted code
      
      * adjust allclose
      
      * change loss function for some base model
      5c6f1831