1. 19 Jun, 2023 1 commit
  2. 16 Jun, 2023 2 commits
  3. 15 Jun, 2023 2 commits
  4. 09 Jun, 2023 2 commits
  5. 08 Jun, 2023 1 commit
  6. 05 Jun, 2023 2 commits
    • [bf16] add bf16 support (#3882) · ae02d4e4
      Hongxin Liu authored
      * [bf16] add bf16 support for fused adam (#3844)
      
      * [bf16] fused adam kernel support bf16
      
      * [test] update fused adam kernel test
      
      * [test] update fused adam test
      
      * [bf16] cpu adam and hybrid adam optimizers support bf16 (#3860)
      
      * [bf16] implement mixed precision mixin and add bf16 support for low level zero (#3869)
      
      * [bf16] add mixed precision mixin
      
      * [bf16] low level zero optim support bf16
      
      * [test] update low level zero test
      
      * [test] fix low level zero grad acc test
      
      * [bf16] add bf16 support for gemini (#3872)
      
      * [bf16] gemini support bf16
      
      * [test] update gemini bf16 test
      
      * [doc] update gemini docstring
      
      * [bf16] add bf16 support for plugins (#3877)
      
      * [bf16] add bf16 support for legacy zero (#3879)
      
      * [zero] init context support bf16
      
      * [zero] legacy zero support bf16
      
      * [test] add zero bf16 test
      
      * [doc] add bf16 related docstring for legacy zero
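The PR above threads bf16 through the fused/CPU/hybrid Adam kernels, low-level ZeRO, Gemini, and the plugins. As background on what the dtype itself does, here is a dependency-free sketch of bf16 rounding (the helper name is illustrative, not ColossalAI API): bfloat16 keeps float32's sign bit and all 8 exponent bits but only the top 7 mantissa bits, so it preserves float32's dynamic range while giving up precision.

```python
import struct

def round_to_bf16(x: float) -> float:
    """Round a float32 value to bfloat16 precision, returned as a float.

    bf16 is the top 16 bits of a float32: sign, 8 exponent bits, 7 mantissa
    bits.  (NaN/inf handling is omitted for brevity.)
    """
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    # round-to-nearest-even on the low 16 bits, then drop them
    bits = (bits + 0x7FFF + ((bits >> 16) & 1)) & 0xFFFF0000
    return struct.unpack("<f", struct.pack("<I", bits))[0]

# The ulp at 1.0 is 2**-7, so smaller increments round away:
print(round_to_bf16(1.0 + 2**-8))   # 1.0  (halfway case rounds to even)
print(round_to_bf16(3.14159265))    # 3.140625
```

This is why bf16 optimizers rarely need loss scaling (the exponent range matches fp32) but may keep fp32 master weights for the lost mantissa precision.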
    • [lazy] refactor lazy init (#3891) · dbb32692
      Hongxin Liu authored
      * [lazy] remove old lazy init
      
      * [lazy] refactor lazy init folder structure
      
      * [lazy] fix lazy tensor deepcopy
      
      * [test] update lazy init test
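The refactor above concerns lazy initialization: deferring parameter allocation until the runtime knows where each tensor should live. The core idea can be sketched in plain Python (this `LazyTensor` is an illustrative stand-in, not the ColossalAI class):

```python
class LazyTensor:
    """Record how to build a tensor instead of allocating it immediately.

    materialize() runs the recorded factory exactly once and caches the
    result, so memory is only allocated when (and where) it is needed.
    """
    def __init__(self, factory, *args, **kwargs):
        self._factory = factory
        self._args = args
        self._kwargs = kwargs
        self._value = None

    def materialize(self):
        if self._value is None:
            self._value = self._factory(*self._args, **self._kwargs)
        return self._value

calls = []
lazy = LazyTensor(lambda n: calls.append(n) or [0.0] * n, 4)
print(len(calls))                  # 0: nothing allocated yet
buf = lazy.materialize()
print(len(calls), buf)             # 1 [0.0, 0.0, 0.0, 0.0]
assert lazy.materialize() is buf   # cached: the factory ran only once
```

Deepcopy support (one of the fixes listed above) matters because a copied lazy tensor must carry its own recorded factory rather than share the cached value.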
  7. 23 May, 2023 2 commits
  8. 19 May, 2023 1 commit
  9. 18 May, 2023 1 commit
    • [plugin] torch ddp plugin supports sharded model checkpoint (#3775) · 5452df63
      Hongxin Liu authored
      * [plugin] torch ddp plugin add save sharded model
      
      * [test] fix torch ddp ckpt io test
      
      * [test] fix low level zero plugin test
      
      * [test] add debug info
      
      * [test] remove debug info
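Sharded model checkpointing, as added above, splits a state dict across several files so no single file exceeds a size budget, plus an index that maps each weight to its shard. The packing logic common to such formats can be sketched as follows (function and parameter names are illustrative, not the plugin's API):

```python
def shard_state_dict(param_sizes, max_shard_bytes):
    """Greedily pack parameters into shards of at most max_shard_bytes.

    param_sizes maps parameter name -> size in bytes.  Returns the shards
    (each a list of names) and an index mapping name -> shard number, which
    is what a loader consults to find each weight.
    """
    shards, index, current = [[]], {}, 0
    for name, size in param_sizes.items():
        if current + size > max_shard_bytes and shards[-1]:
            shards.append([])   # start a new shard; an oversized param gets its own
            current = 0
        shards[-1].append(name)
        index[name] = len(shards) - 1
        current += size
    return shards, index

shards, index = shard_state_dict(
    {"embed.weight": 700, "fc1.weight": 400, "fc1.bias": 50, "fc2.weight": 400},
    max_shard_bytes=1000,
)
print(shards)  # [['embed.weight'], ['fc1.weight', 'fc1.bias', 'fc2.weight']]
```

Each shard would then be saved as its own file, with the index serialized alongside so loading can be done shard by shard instead of materializing one huge file.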
  10. 15 May, 2023 3 commits
  11. 11 May, 2023 1 commit
    • [CI] fix typo with tests/ etc. (#3727) · 1f73609a
      digger-yu authored
      * fix spelling error with examples/comminity/
      
      * fix spelling error with tests/
      
      * fix some spelling error with tests/ colossalai/ etc.
      
      * fix spelling error with tests/ etc. date:2023.5.10
  12. 10 May, 2023 2 commits
  13. 09 May, 2023 1 commit
    • [booster] fix no_sync method (#3709) · 6552cbf8
      Hongxin Liu authored
      * [booster] fix no_sync method
      
      * [booster] add test for ddp no_sync
      
      * [booster] fix merge
      
      * [booster] update unit test
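`no_sync` is the DDP context manager that skips gradient all-reduce on the local accumulation steps of gradient accumulation. Why skipping is safe can be shown arithmetically: averaging across workers once, after local summation, equals averaging after every micro-batch. A dependency-free sketch with scalar "gradients" (names are illustrative):

```python
def sync_once(worker_grads):
    """Sum micro-batch grads locally (as under model.no_sync()), then do a
    single averaging all-reduce on the last step."""
    local_sums = [sum(grads) for grads in worker_grads]
    return sum(local_sums) / len(local_sums)

def sync_every_step(worker_grads):
    """Naive version: an averaging all-reduce after every micro-batch."""
    n_workers = len(worker_grads)
    n_steps = len(worker_grads[0])
    return sum(
        sum(w[t] for w in worker_grads) / n_workers for t in range(n_steps)
    )

grads = [[0.1, 0.4, -0.2], [0.3, 0.0, 0.5]]  # 2 workers x 3 micro-batches
print(abs(sync_once(grads) - sync_every_step(grads)) < 1e-12)  # True
```

The result is identical up to floating-point rounding, but `sync_once` performs one communication round instead of one per micro-batch, which is the whole point of `no_sync`.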
  14. 08 May, 2023 1 commit
  15. 05 May, 2023 3 commits
    • [booster] refactor all dp fashion plugins (#3684) · d0915f54
      Hongxin Liu authored
      * [booster] add dp plugin base
      
      * [booster] inherit dp plugin base
      
      * [booster] refactor unit tests
    • [CI] Update test_sharded_optim_with_sync_bn.py (#3688) · b49020c1
      digger-yu authored
      fix spelling error in line 23:
      change "cudnn_determinstic=True" to "cudnn_deterministic=True"
    • [booster] gemini plugin support shard checkpoint (#3610) · 307894f7
      jiangmingyan authored
      
      
      * gemini plugin add shard checkpoint save/load
      
      * gemini plugin support shard checkpoint
      
      * [API Refactoring]gemini plugin support shard checkpoint
      
      ---------
      Co-authored-by: luchen <luchen@luchendeMBP.lan>
      Co-authored-by: luchen <luchen@luchendeMacBook-Pro.local>
  16. 26 Apr, 2023 2 commits
    • [gemini] accelerate inference (#3641) · 50793b35
      Hongxin Liu authored
      * [gemini] support don't scatter after inference
      
      * [chat] update colossalai strategy
      
      * [chat] fix opt benchmark
      
      * [chat] update opt benchmark
      
      * [gemini] optimize inference
      
      * [test] add gemini inference test
      
      * [chat] fix unit test ci
      
      * [chat] fix ci
      
      * [chat] skip checkpoint test
    • [booster] add low level zero plugin (#3594) · 4b3240cb
      Hongxin Liu authored
      * [booster] add low level zero plugin
      
      * [booster] fix gemini plugin test
      
      * [booster] fix precision
      
      * [booster] add low level zero plugin test
      
      * [test] fix booster plugin test oom
      
      * [test] fix googlenet and inception output trans
      
      * [test] fix diffuser clip vision model
      
      * [test] fix torchaudio_wav2vec2_base
      
      * [test] fix low level zero plugin test
  17. 17 Apr, 2023 1 commit
  18. 12 Apr, 2023 1 commit
    • [gemini] gemini supports lazy init (#3379) · 152239bb
      Hongxin Liu authored
      * [gemini] fix nvme optimizer init
      
      * [gemini] gemini supports lazy init
      
      * [gemini] add init example
      
      * [gemini] add fool model
      
      * [zero] update gemini ddp
      
      * [zero] update init example
      
      * add chunk method
      
      * [lazyinit] fix lazy tensor tolist
      
      * [gemini] fix buffer materialization
      
      * [misc] remove useless file
      
      * [booster] update gemini plugin
      
      * [test] update gemini plugin test
      
      * [test] fix gemini plugin test
      
      * [gemini] fix import
      
      * [lazyinit] use new metatensor
      
      * [lazyinit] fix __set__ method
  19. 06 Apr, 2023 3 commits
  20. 04 Apr, 2023 3 commits
    • [autoparallel] integrate auto parallel feature with new tracer (#3408) · ffcdbf0f
      YuliangLiu0306 authored
      * [autoparallel] integrate new analyzer in module level
      
      * unify the profiling method
      
      * polish
      
      * fix no codegen bug
      
      * fix pass bug
      
      * fix liveness test
      
      * polish
    • [checkpoint] refactored the API and added safetensors support (#3427) · 1beb85cc
      Frank Lee authored
      * [checkpoint] refactored the API and added safetensors support
      
      * polish code
    • [zero] reorganize zero/gemini folder structure (#3424) · 26b7aac0
      ver217 authored
      * [zero] refactor low-level zero folder structure
      
      * [zero] fix legacy zero import path
      
      * [zero] remove useless import
      
      * [zero] refactor gemini folder structure
      
      * [zero] refactor legacy zero import path
      
      * [zero] fix test import path
      
      * [zero] fix test
      
      * [zero] fix circular import
      
      * [zero] update import
  21. 03 Apr, 2023 1 commit
  22. 31 Mar, 2023 2 commits
    • [booster] implement Gemini plugin (#3352) · 5f2e34e6
      ver217 authored
      * [booster] add gemini plugin
      
      * [booster] update docstr
      
      * [booster] gemini plugin add coloparam convertor
      
      * [booster] fix coloparam convertor
      
      * [booster] fix gemini plugin device
      
      * [booster] add gemini plugin test
      
      * [booster] gemini plugin ignore sync bn
      
      * [booster] skip some model
      
      * [booster] modify test world size
      
      * [booster] skip test
    • [moe] add checkpoint for moe models (#3354) · 1a1d68b0
      HELSON authored
      * [moe] add checkpoint for moe models
      
      * [hotfix] fix bugs in unit test
  23. 30 Mar, 2023 1 commit
  24. 27 Mar, 2023 1 commit