- 01 Mar, 2024 1 commit
  - Camille Zhong authored
- 29 Feb, 2024 3 commits
  - binmakeswell authored
  - binmakeswell authored
  - Frank Lee authored
- 28 Feb, 2024 1 commit
  - Tong Li authored
- 27 Feb, 2024 4 commits
  - flybird11111 authored
    * gather llama logits
    * fix
  - Frank Lee authored
  - QinLuo authored
  - Hongxin Liu authored
- 26 Feb, 2024 1 commit
  - Hongxin Liu authored
- 20 Feb, 2024 1 commit
  - Stephan Kölker authored
- 19 Feb, 2024 4 commits
  - CZYCW authored
    Co-authored-by: binmakeswell <binmakeswell@gmail.com>
  - Frank Lee authored
  - yixiaoer authored
  - Hongxin Liu authored
    * [llama] refactor inference example to fit sft
    * [llama] fix training script to fit gemini
    * [llama] fix inference script
- 08 Feb, 2024 4 commits
  - Hongxin Liu authored
  - Frank Lee authored
  - Frank Lee authored
    [llama] support npu for Colossal-LLaMA-2
  - ver217 authored
- 07 Feb, 2024 6 commits
  - Hongxin Liu authored
  - Hongxin Liu authored
  - Hongxin Liu authored
  - Hongxin Liu authored
    * [moe] add mixtral block for single expert
    * [moe] mixtral block fwd support uneven ep
    * [moe] mixtral block bwd support uneven ep
    * [moe] add mixtral moe layer
    * [moe] simplify replace
    * [moe] support save sharded mixtral
    * [moe] support load sharded mixtral
    * [moe] support save sharded optim
    * [moe] integrate moe manager into plug
    * [moe] fix optimizer load
    * [moe] fix mixtral layer
  - Hongxin Liu authored
    * [moe] top2 allow uneven input
    * [moe] update capacity computing
    * [moe] remove debug info
    * [moe] update capacity computing
    * [moe] update capacity computing
  - Xuanlei Zhao authored
- 06 Feb, 2024 4 commits
  - Hongxin Liu authored
    * [llama] fix memory issue
    * [llama] add comment
  - Hongxin Liu authored
  - Hongxin Liu authored
  - Camille Zhong authored
- 05 Feb, 2024 4 commits
  - Camille Zhong authored
  - Hongxin Liu authored
  - Hongxin Liu authored
    * [llama] update training script
    * [doc] polish docstr
  - Hongxin Liu authored
    * [plugin] refactor prepare dataloader
    * [plugin] update train script
- 04 Feb, 2024 1 commit
  - Hongxin Liu authored
    * [gemini] fix param op hook when output is tuple
    * [gemini] fix param op hook
- 02 Feb, 2024 1 commit
  - Wenhao Chen authored
    * fix: remove unnecessary assert
    * test: add more 3d plugin tests
    * fix: add warning
- 01 Feb, 2024 2 commits
  - Hongxin Liu authored
    * [checkpointio] fix hybrid parallel optim checkpoint
    * [extension] fix cuda extension
    * [checkpointio] fix gemini optimizer checkpoint
    * polish code
  - YeAnbang authored
    * fix script
    * fix script
    * fix chat nan
    * fix chat nan
- 31 Jan, 2024 1 commit
  - Frank Lee authored
- 30 Jan, 2024 2 commits