1. 10 May, 2023 2 commits
  2. 09 May, 2023 1 commit
    • [booster] fix no_sync method (#3709) · 6552cbf8
      Hongxin Liu authored
      * [booster] fix no_sync method
      
      * [booster] add test for ddp no_sync
      
      * [booster] fix merge
      
      * [booster] update unit test
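For context, `no_sync` is the context manager that disables gradient all-reduce on intermediate micro-batches during gradient accumulation. A toy, framework-free sketch of the control flow (illustrative only, not Colossal-AI's actual implementation):

```python
from contextlib import contextmanager

class ToyDDP:
    """Toy stand-in for a DDP wrapper; it only records whether each
    backward call would have synchronized (all-reduced) gradients."""
    def __init__(self):
        self.require_sync = True
        self.synced = []  # (step, did_sync) records

    @contextmanager
    def no_sync(self):
        # Temporarily disable gradient synchronization, like DDP.no_sync.
        old = self.require_sync
        self.require_sync = False
        try:
            yield
        finally:
            self.require_sync = old

    def backward(self, step):
        # Real DDP all-reduces gradients here unless sync is disabled.
        self.synced.append((step, self.require_sync))

ddp = ToyDDP()
for step in range(4):
    if step % 2 == 0:           # intermediate micro-batch: skip all-reduce
        with ddp.no_sync():
            ddp.backward(step)
    else:                       # last micro-batch: synchronize
        ddp.backward(step)

print(ddp.synced)  # [(0, False), (1, True), (2, False), (3, True)]
```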
  3. 08 May, 2023 1 commit
  4. 05 May, 2023 3 commits
    • [booster] refactor all dp fashion plugins (#3684) · d0915f54
      Hongxin Liu authored
      * [booster] add dp plugin base
      
      * [booster] inherit dp plugin base
      
      * [booster] refactor unit tests
    • [CI] Update test_sharded_optim_with_sync_bn.py (#3688) · b49020c1
      digger-yu authored
      fix spelling error in line 23:
      change `cudnn_determinstic=True` to `cudnn_deterministic=True`
    • [booster] gemini plugin support shard checkpoint (#3610) · 307894f7
      jiangmingyan authored
      * gemini plugin add shard checkpoint save/load
      
      * gemini plugin support shard checkpoint
      
      * [API Refactoring]gemini plugin support shard checkpoint
      
      ---------
      Co-authored-by: luchen <luchen@luchendeMBP.lan>
      Co-authored-by: luchen <luchen@luchendeMacBook-Pro.local>
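For context on what shard checkpointing does: the state dict is split into size-capped shard files plus an index mapping each weight name to its shard. A hypothetical, framework-free sketch (the filename pattern and byte-size bookkeeping are illustrative, not the plugin's actual format):

```python
def shard_state_dict(state_dict, max_shard_size):
    """Split a (name -> n_bytes) state dict into shards no larger than
    max_shard_size bytes, plus an index mapping each key to its shard
    file. Sizes are plain ints here; a real implementation would use
    tensor numel * element_size and serialize each shard to disk."""
    shards, current, current_size = [], {}, 0
    for name, size in state_dict.items():
        if current and current_size + size > max_shard_size:
            shards.append(current)
            current, current_size = {}, 0
        current[name] = size
        current_size += size
    if current:
        shards.append(current)
    index = {
        name: f"model-{i:05d}-of-{len(shards):05d}.bin"
        for i, shard in enumerate(shards, 1)
        for name in shard
    }
    return shards, index

params = {"embed.weight": 40, "layer1.weight": 30,
          "layer2.weight": 30, "head.weight": 25}
shards, index = shard_state_dict(params, max_shard_size=64)
print(len(shards), index["head.weight"])  # 3 model-00003-of-00003.bin
```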
  5. 26 Apr, 2023 2 commits
    • [gemini] accelerate inference (#3641) · 50793b35
      Hongxin Liu authored
      * [gemini] support don't scatter after inference
      
      * [chat] update colossalai strategy
      
      * [chat] fix opt benchmark
      
      * [chat] update opt benchmark
      
      * [gemini] optimize inference
      
      * [test] add gemini inference test
      
      * [chat] fix unit test ci
      
      * [chat] fix ci
      
      * [chat] skip checkpoint test
    • [booster] add low level zero plugin (#3594) · 4b3240cb
      Hongxin Liu authored
      * [booster] add low level zero plugin
      
      * [booster] fix gemini plugin test
      
      * [booster] fix precision
      
      * [booster] add low level zero plugin test
      
      * [test] fix booster plugin test oom
      
      * [test] fix googlenet and inception output trans
      
      * [test] fix diffuser clip vision model
      
      * [test] fix torchaudio_wav2vec2_base
      
      * [test] fix low level zero plugin test
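For context, low-level ZeRO partitions optimizer state so that each rank owns only a 1/world_size slice of the parameters. A toy sketch of the ownership idea (round-robin for simplicity; the real plugin balances by parameter size and handles the communication):

```python
def partition_params(param_ids, world_size):
    """ZeRO-style partitioning: each rank keeps optimizer state for only
    its own slice of the parameters, so total optimizer memory is split
    world_size ways instead of replicated on every rank."""
    return {rank: param_ids[rank::world_size] for rank in range(world_size)}

owners = partition_params(list(range(10)), world_size=4)
# every parameter is owned by exactly one rank
assert sorted(p for ps in owners.values() for p in ps) == list(range(10))
print(owners[0])  # [0, 4, 8]
```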
  6. 17 Apr, 2023 1 commit
  7. 12 Apr, 2023 1 commit
    • [gemini] gemini supports lazy init (#3379) · 152239bb
      Hongxin Liu authored
      * [gemini] fix nvme optimizer init
      
      * [gemini] gemini supports lazy init
      
      * [gemini] add init example
      
      * [gemini] add fool model
      
      * [zero] update gemini ddp
      
      * [zero] update init example
      
      * add chunk method
      
      * [lazyinit] fix lazy tensor tolist
      
      * [gemini] fix buffer materialization
      
      * [misc] remove useless file
      
      * [booster] update gemini plugin
      
      * [test] update gemini plugin test
      
      * [test] fix gemini plugin test
      
      * [gemini] fix import
      
      * [lazyinit] use new metatensor
      
      * [lazyinit] fix __set__ method
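For context, lazy initialization records how a module would be built and defers the (potentially huge) allocation until materialization, so a model can be described before memory is committed. A minimal, framework-free sketch of the deferral pattern (class and method names are illustrative):

```python
class LazyModule:
    """Minimal sketch of lazy initialization: remember the constructor
    and its arguments, allocate only when materialize() is called."""
    def __init__(self, factory, *args, **kwargs):
        self._factory, self._args, self._kwargs = factory, args, kwargs
        self._value = None

    @property
    def materialized(self):
        return self._value is not None

    def materialize(self):
        # Idempotent: the expensive allocation happens at most once.
        if self._value is None:
            self._value = self._factory(*self._args, **self._kwargs)
        return self._value

# Describe a large buffer without allocating it yet.
lazy = LazyModule(lambda n: [0.0] * n, 1_000)
assert not lazy.materialized
buf = lazy.materialize()
assert lazy.materialized and len(buf) == 1_000
```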
  8. 06 Apr, 2023 3 commits
  9. 04 Apr, 2023 3 commits
    • [autoparallel] integrate auto parallel feature with new tracer (#3408) · ffcdbf0f
      YuliangLiu0306 authored
      * [autoparallel] integrate new analyzer in module level
      
      * unify the profiling method
      
      * polish
      
      * fix no codegen bug
      
      * fix pass bug
      
      * fix liveness test
      
      * polish
    • [checkpoint] refactored the API and added safetensors support (#3427) · 1beb85cc
      Frank Lee authored
      * [checkpoint] refactored the API and added safetensors support
      
      * polish code
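For context, a safetensors file is laid out as an 8-byte little-endian header length, a JSON header with per-tensor dtype/shape/offset metadata, then the raw tensor bytes. A toy writer over plain bytes objects illustrating that layout (use the safetensors library in practice):

```python
import json, struct

def save_safetensors_like(tensors):
    """Sketch of the safetensors layout: u64 header length (LE), JSON
    header mapping names to dtype/shape/data_offsets, then the raw byte
    buffer. Toy version: 'tensors' maps names to bytes objects."""
    header, blobs, offset = {}, [], 0
    for name, data in tensors.items():
        header[name] = {"dtype": "U8", "shape": [len(data)],
                        "data_offsets": [offset, offset + len(data)]}
        blobs.append(data)
        offset += len(data)
    header_bytes = json.dumps(header).encode()
    return struct.pack("<Q", len(header_bytes)) + header_bytes + b"".join(blobs)

blob = save_safetensors_like({"w": b"\x01\x02", "b": b"\x03"})
(header_len,) = struct.unpack("<Q", blob[:8])
header = json.loads(blob[8:8 + header_len])
print(header["b"]["data_offsets"])  # [2, 3]
```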
    • [zero] reorganize zero/gemini folder structure (#3424) · 26b7aac0
      ver217 authored
      * [zero] refactor low-level zero folder structure
      
      * [zero] fix legacy zero import path
      
      * [zero] remove useless import
      
      * [zero] refactor gemini folder structure
      
      * [zero] refactor legacy zero import path
      
      * [zero] refactor gemini folder structure
      
      * [zero] refactor legacy zero import path
      
      * [zero] fix test import path
      
      * [zero] fix test
      
      * [zero] fix circular import
      
      * [zero] update import
  10. 03 Apr, 2023 1 commit
  11. 31 Mar, 2023 2 commits
    • [booster] implement Gemini plugin (#3352) · 5f2e34e6
      ver217 authored
      * [booster] add gemini plugin
      
      * [booster] update docstr
      
      * [booster] gemini plugin add coloparam convertor
      
      * [booster] fix coloparam convertor
      
      * [booster] fix gemini plugin device
      
      * [booster] add gemini plugin test
      
      * [booster] gemini plugin ignore sync bn
      
      * [booster] skip some model
      
      * [booster] modify test world size
      
      * [booster] skip test
    • [moe] add checkpoint for moe models (#3354) · 1a1d68b0
      HELSON authored
      * [moe] add checkpoint for moe models
      
      * [hotfix] fix bugs in unit test
  12. 30 Mar, 2023 1 commit
  13. 27 Mar, 2023 1 commit
  14. 24 Mar, 2023 2 commits
  15. 23 Mar, 2023 2 commits
    • [api] implemented the checkpoint io module (#3205) · cd142fbe
      Frank Lee authored
      * [api] implemented the checkpoint io module
      
      * polish code
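For context, a checkpoint IO module puts save and load behind one interface so different backends (plain torch files, sharded files, safetensors) can be swapped freely. A hypothetical minimal backend showing the shape of such an API (JSON instead of tensor files; all names are illustrative):

```python
import json, os, tempfile

class JsonCheckpointIO:
    """Hypothetical minimal checkpoint-IO backend: same save/load
    surface a real module would expose, but backed by JSON so the
    round trip is easy to see. Real backends write tensor files."""
    def save(self, state, path):
        with open(path, "w") as f:
            json.dump(state, f)

    def load(self, path):
        with open(path) as f:
            return json.load(f)

io = JsonCheckpointIO()
path = os.path.join(tempfile.mkdtemp(), "ckpt.json")
io.save({"step": 7, "lr": 0.1}, path)
restored = io.load(path)
print(restored)  # {'step': 7, 'lr': 0.1}
```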
    • [lazyinit] combine lazy tensor with dtensor (#3204) · f8289d42
      ver217 authored
      * [lazyinit] lazy tensor add distribute
      
      * [lazyinit] refactor distribute
      
      * [lazyinit] add test dist lazy init
      
      * [lazyinit] add verbose info for dist lazy init
      
      * [lazyinit] fix rnn flatten weight op
      
      * [lazyinit] polish test
      
      * [lazyinit] fix lazy tensor data setter
      
      * [lazyinit] polish test
      
      * [lazyinit] fix clean
      
      * [lazyinit] make materialize inplace
      
      * [lazyinit] refactor materialize
      
      * [lazyinit] refactor test distribute
      
      * [lazyinit] fix requires_grad
      
      * [lazyinit] fix tolist after materialization
      
      * [lazyinit] refactor distribute module
      
      * [lazyinit] polish docstr
      
      * [lazyinit] polish lazy init context
      
      * [lazyinit] temporarily skip test
      
      * [lazyinit] polish test
      
      * [lazyinit] add docstr
  16. 22 Mar, 2023 2 commits
  17. 21 Mar, 2023 2 commits
  18. 20 Mar, 2023 4 commits
  19. 17 Mar, 2023 2 commits
    • [lazyinit] add correctness verification (#3147) · 6ae8ed04
      ver217 authored
      * [lazyinit] fix shared module
      
      * [tests] add lazy init test utils
      
      * [tests] add torchvision for lazy init
      
      * [lazyinit] fix pre op fn
      
      * [lazyinit] handle legacy constructor
      
      * [tests] refactor lazy init test models
      
      * [tests] refactor lazy init test utils
      
      * [lazyinit] fix ops don't support meta
      
      * [tests] lazy init test timm models
      
      * [lazyinit] fix set data
      
      * [lazyinit] handle apex layers
      
      * [tests] lazy init test transformers models
      
      * [tests] lazy init test torchaudio models
      
      * [lazyinit] fix import path
      
      * [tests] lazy init test torchrec models
      
      * [tests] update torch version in CI
      
      * [tests] revert torch version in CI
      
      * [tests] skip lazy init test
    • [booster] implemented mixed precision class (#3151) · ed192905
      Frank Lee authored
      * [booster] implemented mixed precision class
      
      * polish code
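For context, fp16 mixed precision hinges on dynamic loss scaling: scale the loss before backward so small gradients survive fp16, unscale before the update, and back off the scale when an overflow appears. A toy, framework-free sketch of that loop (function names and factors are illustrative):

```python
def scaled_backward_step(loss, grads_fn, scale, growth_factor=2.0, backoff=0.5):
    """Toy dynamic loss scaling, the core trick behind fp16 training:
    backward on scale * loss, unscale the gradients, and shrink the
    scale (skipping the step) if inf/nan shows up."""
    grads = grads_fn(loss * scale)            # backward on the scaled loss
    unscaled = [g / scale for g in grads]     # unscale before the update
    overflow = any(g != g or g in (float("inf"), float("-inf"))
                   for g in unscaled)         # g != g detects NaN
    new_scale = scale * backoff if overflow else scale * growth_factor
    return (None if overflow else unscaled), new_scale

# gradient of 0.5 * loss w.r.t. a scalar loss is 0.5 * d(loss)
grads, scale = scaled_backward_step(2.0, lambda l: [l * 0.5], scale=1024.0)
print(grads, scale)  # [1.0] 2048.0
```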
  20. 15 Mar, 2023 4 commits