- 24 Feb, 2023 1 commit

  Frank Lee authored
- 23 Feb, 2023 3 commits

  Jiatong (Julius) Han authored
    * Remove math.prod dependency
    * Fix style
    * Fix style
    Co-authored-by: Jiatong Han <jiatong.han@u.nus.edu>

  YuliangLiu0306 authored

  YuliangLiu0306 authored
    * [autoparallel] find repeat blocks
    * polish
    * polish
    * polish
- 22 Feb, 2023 9 commits

  BlueRum authored

  junxu authored

  HELSON authored
    * [hotfix] fix chunk size can not be divided
    * [hotfix] use numpy for python3.8

  Alex_996 authored
    Fix typos, `6.7 -> 6.7b`

  dawei-wang authored
    Fix hpcaitech/ColossalAI#2851

  Boyuan Yao authored
    * [autoparallel] non spmd meta information generator
    * [autoparallel] patch meta information for non spmd nodes

  Boyuan Yao authored
    * [autoparallel] patch meta information of torch.where
    * [autoparallel] pre-commit modified

  Boyuan Yao authored
    * [autoparallel] tanh meta information
    * [autoparallel] remove redundant code
    * [autoparallel] patch meta information of torch.nn.Dropout

  BlueRum authored
    * [chatgpt] fix train_rm bug with lora
    * [chatgpt] support colossalai strategy to train rm
    * fix pre-commit
    * fix pre-commit 2
    * [chatgpt] fix rm eval typo
    * fix rm eval
    * fix pre commit
    * add support of saving ckpt in examples
    * fix single-gpu save
- 21 Feb, 2023 4 commits

  Zheng Zeng authored

  Frank Lee authored
    * [cli] handled version check exceptions
    * polish code

  BlueRum authored
    * [chatgpt] fix train_rm bug with lora
    * [chatgpt] support colossalai strategy to train rm
    * fix pre-commit
    * fix pre-commit 2
    * [chatgpt] fix rm eval typo
    * fix rm eval
    * fix pre commit

  Frank Lee authored
    * [triton] added copyright information for flash attention
    * polish code
- 20 Feb, 2023 8 commits

  Boyuan Yao authored
    * [autoparallel] tensor related meta information prototype
    * [autoparallel] tensor related meta information
    * [autoparallel] tensor related meta information
    * [autoparallel] tensor related meta information
    * [autoparallel] tensor related meta information

  github-actions[bot] authored
    Co-authored-by: github-actions <github-actions@github.com>

  Haofan Wang authored
    * add lora
    * format

  ver217 authored
    * [chatgpt] add test checkpoint
    * [chatgpt] test checkpoint use smaller model

  Michelle authored

  mickogoin authored
    Fixed typo on line 285 from "defualt" to "default"

  Marco Rodrigues authored

  Jiarui Fang authored
- 19 Feb, 2023 1 commit

  YuliangLiu0306 authored
    * [hotfix] fix autoparallel zh docs
    * polish
    * polish
- 18 Feb, 2023 3 commits

  YuliangLiu0306 authored
    * [hotfix] add copyright for solver and device mesh
    * add readme
    * add alpa license
    * polish

  LuGY authored
    * [CI/CD] fix nightly release CD running on forked repo
    * fix misunderstanding of dispatch
    * remove some build condition, enable notify even when release failed

  Boyuan Yao authored
    * [autoparallel] rotor solver refactor
    * [autoparallel] rotor solver refactor
- 17 Feb, 2023 8 commits

  binmakeswell authored
    * [doc] update OPT serving
    * [doc] update OPT serving

  HELSON authored

  ver217 authored
    * [chatgpt] add save/load checkpoint sample code
    * [chatgpt] add save/load checkpoint readme
    * [chatgpt] refactor save/load checkpoint readme

  ver217 authored
    * [chatgpt] strategy add prepare method
    * [chatgpt] refactor examples
    * [chatgpt] refactor strategy.prepare
    * [chatgpt] support save/load checkpoint
    * [chatgpt] fix unwrap actor
    * [chatgpt] fix unwrap actor

  Boyuan Yao authored
    * [autoparallel] embedding metainfo
    * [autoparallel] fix function name in test_activation_metainfo
    * [autoparallel] undo changes in activation metainfo and related tests

  Boyuan Yao authored

  Fazzie-Maqianli authored

  Nikita Shulga authored
    * Don't use `torch._six`
      This is a private API which is gone after https://github.com/pytorch/pytorch/pull/94709
    * Update common.py
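The `torch._six` commit above tracks an upstream PyTorch change: `torch._six` was a private Python-2/3 compatibility shim that pytorch/pytorch#94709 removed. A hedged sketch of the typical migration (the helper below is hypothetical, not the code touched in this repo):

```python
# torch._six members are usually replaced with standard-library equivalents:
#   from torch._six import inf             ->  from math import inf
#   from torch._six import string_classes  ->  str
# Sketch under those assumptions; the actual commit edits common.py.
from math import inf

def grad_norm_kind(norm_type):
    # Guard that previously compared against torch._six.inf:
    # float("inf") selects the max-norm branch in clipping code.
    return "max" if norm_type == inf else "p-norm"
```

Switching the import keeps older call sites working on PyTorch 2.0 without pinning to a pre-2.0 release.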
- 16 Feb, 2023 3 commits

  ver217 authored

  binmakeswell authored
    * [doc] update OPT serving link
    * [doc] update example and OPT serving link
    * [doc] update example and OPT serving link

  Frank Lee authored