- 06 Apr, 2023 1 commit

Frank Lee authored
* [test] added spawn decorator
* polish code
* polish code
* polish code
* polish code
* polish code
* polish code
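The spawn decorator mentioned above launches a test once per rank, each in its own process. A minimal sketch of the pattern, with hypothetical names and signature; the real ColossalAI helper wraps `torch.multiprocessing.spawn` and also handles distributed setup, ports, and teardown:

```python
import multiprocessing as mp

# fork keeps the locally decorated function usable in children; POSIX-only.
_ctx = mp.get_context("fork")

def spawn(nprocs):
    """Hypothetical test decorator: run the wrapped test once per rank."""
    def decorator(fn):
        def wrapper():
            procs = [_ctx.Process(target=fn, args=(rank, nprocs))
                     for rank in range(nprocs)]
            for p in procs:
                p.start()
            for p in procs:
                p.join()
            # exit code 0 means that rank's assertions passed
            return [p.exitcode for p in procs]
        return wrapper
    return decorator

@spawn(nprocs=2)
def run_check(rank, world_size):
    # each rank executes this body in a separate process
    assert 0 <= rank < world_size
```

Calling `run_check()` then spawns both ranks and reports their exit codes.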
- 30 Mar, 2023 1 commit

YuliangLiu0306 authored
* [autoparallel] adapt autoparallel with new analyzer
* fix all node handler tests
* polish
* polish
- 24 Mar, 2023 1 commit

YuliangLiu0306 authored
* [hotfix] skip torchaudio tracing test
* fix lazy init test issue
- 22 Mar, 2023 1 commit

YuliangLiu0306 authored
* pass gpt trace and meta_prop
* pass t5 trace and meta_prop
* [FX] refactor experimental tracer and adapt it with hf models
* pass all mainstream model zoo
* fix CI
* fix CI
* fix CI
* fix CI
* fix CI
* fix CI
* fix CI
* fix CI
* skip tests
* fix CI
* using packaging version
* polish
- 20 Mar, 2023 3 commits
- 15 Mar, 2023 4 commits

YuliangLiu0306 authored

ver217 authored
* [tests] model zoo add torchaudio models
* [tests] refactor torchaudio wavernn
* [tests] refactor fx torchaudio tests

Frank Lee authored

Frank Lee authored
* [test] added torchvision models to test model zoo
* polish code
* polish code
* polish code
* polish code
* polish code
* polish code
- 14 Mar, 2023 2 commits
- 07 Mar, 2023 1 commit

YuliangLiu0306 authored
* [hotfix] skip auto checkpointing tests
* fix test name issue
- 02 Dec, 2022 2 commits

Ziyue Jiang authored
* use Topo class to rewrite DAG
* polish code
* polish code
* polish code
* add comment
* add else to unended if
Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>

YuliangLiu0306 authored
- 25 Nov, 2022 1 commit

Ziyue Jiang authored
* add DAG to split_module
* add comment
* add test case for DAG
* remove print
Co-authored-by: Ziyue Jiang <ziyue.jiang@gmail.com>
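The DAG added to split_module records which pipeline partition consumes which partition's outputs, so stages can be scheduled in dependency order. A minimal, hypothetical sketch of that bookkeeping (the partition names and the `uses` encoding are made up for illustration, not the actual torch.fx data structures):

```python
from collections import defaultdict, deque

def build_dag(uses):
    """uses: dict mapping each partition to the set of partitions whose
    outputs it reads (every partition must appear as a key)."""
    succ = defaultdict(set)            # partition -> partitions depending on it
    indeg = {p: 0 for p in uses}       # number of unmet dependencies
    for part, deps in uses.items():
        for dep in deps:
            if part not in succ[dep]:
                succ[dep].add(part)
                indeg[part] += 1
    return succ, indeg

def topo_order(uses):
    """Kahn's algorithm: emit partitions once all their inputs are ready."""
    succ, indeg = build_dag(uses)
    ready = deque(sorted(p for p, d in indeg.items() if d == 0))
    order = []
    while ready:
        p = ready.popleft()
        order.append(p)
        for q in sorted(succ[p]):
            indeg[q] -= 1
            if indeg[q] == 0:
                ready.append(q)
    assert len(order) == len(uses), "cycle detected: not a DAG"
    return order

# submod_1 reads submod_0's output; submod_2 reads both
deps = {"submod_0": set(),
        "submod_1": {"submod_0"},
        "submod_2": {"submod_0", "submod_1"}}
```

Here `topo_order(deps)` yields the stages in a valid execution order.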
- 10 Nov, 2022 3 commits

Jiarui Fang authored

Jiarui Fang authored

Frank Lee authored
- 08 Nov, 2022 2 commits

Super Daniel authored
* [fx] add a symbolic_trace api.
* [fx] fix import errors.
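A symbolic_trace api records the operations a forward pass performs rather than computing values. This toy sketch is not the torch.fx or ColossalAI implementation; it only illustrates the underlying proxy-recording trick, where operator overloads append nodes to a graph:

```python
class Proxy:
    """Stand-in value that records every operation applied to it."""
    def __init__(self, name, graph):
        self.name, self.graph = name, graph

    def _record(self, op, other):
        node = f"v{len(self.graph)}"
        rhs = other.name if isinstance(other, Proxy) else repr(other)
        self.graph.append((node, op, self.name, rhs))
        return Proxy(node, self.graph)

    def __add__(self, other): return self._record("add", other)
    def __mul__(self, other): return self._record("mul", other)

def symbolic_trace(fn):
    """Run fn on a Proxy and return the recorded graph plus the output node."""
    graph = []
    out = fn(Proxy("x", graph))
    return graph, out.name

def forward(x):
    return x * 2 + 1

graph, out = symbolic_trace(forward)
# graph now lists each recorded op instead of computed values
```

The real tracer handles modules, control flow, and many more operators, but the core mechanism is the same recording proxy.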
Jiarui Fang authored

- 04 Nov, 2022 1 commit

YuliangLiu0306 authored
- 02 Nov, 2022 1 commit

Jiarui Fang authored
- 01 Nov, 2022 2 commits

YuliangLiu0306 authored
* [autoparallel] refactor tracer to fix bias addition issue
* [fx] support module with bias addition
* create bias_addition_module
* refactor file structure
* polish code
* fix unit test

Super Daniel authored
* [autoparallel] first move.
* [autoparallel] add solver rotor.
* [autoparallel] add ckpt solvers.
* [autoparallel] modify codegen.
* [fx] fix annotation in test.
* [fx] remove check.
* [autoparallel] polish docstring.
* [fx] refactor MetaTensor.
- 26 Oct, 2022 1 commit

Super Daniel authored
* [fx] change memory.py to memory_utils.py.
* [fx] add shard utils.
* [fx] fix import.
* [fx] check code style.
* [fx] add comment.
* [autoparallel] first move.
* [fx] add time computations.
- 20 Oct, 2022 1 commit

Super Daniel authored
* [fx] test tracer on diffuser modules.
* [fx] shorter seq_len.
* Update requirements-test.txt
- 19 Oct, 2022 1 commit

Super Daniel authored
* [fx/profiler] add test.
* [fx] fix file names.
* [fx] add docstring and comment.
* [fx] polish profiler.py.
* [fx] fix import errors.
* [fx] fix profiler.
* [fx] fix names.
- 18 Oct, 2022 1 commit

Super Daniel authored
[fx/meta/rpc] move _meta_registration.py to fx folder / register fx functions with compatibility checks / remove color debug (#1710)
* [fx] move meta registration
* [fx] fix tests.
* [fx] fix test.
* [fx] fix.
* [meta] refactor meta registration.py.
* [fx] add compatibility descriptions.
* [fx] polish import.
* [fx] add a decorator.
* [fx] fix tests.
* [fx] remove print.
* [fx] edit raise error.
* [fx] edit raise error.
* [fx] add type hint.
* [fx] fix import in experimental.
* [rpc] remove color debug.
* [meta] fix naming.
- 12 Oct, 2022 1 commit

Boyuan Yao authored
- 11 Oct, 2022 1 commit

Super Daniel authored
* [fx/profiler] modify data_ptr into uuid for all tensors.
* [fx] modify uuid.
* [fx/profiler] tune performance on GPT-2.
* [fx] updates.
* [fx] debug.
* [fx] debug.
* [fx] cuda.
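Keying profiler records by a raw `data_ptr` breaks when a freed storage's address is reused by a later tensor; switching to a uuid gives every tensor a stable, process-unique identity. A hedged sketch of the idea (`FakeTensor` and the attribute name are stand-ins for illustration, not the real profiler code):

```python
import uuid

def tensor_uid(t):
    """Lazily attach a process-unique id to a tensor-like object.

    Unlike data_ptr(), this id is never recycled when memory is freed
    and reused, so profiler records cannot collide across tensors.
    """
    if not hasattr(t, "_profiler_uuid"):
        t._profiler_uuid = uuid.uuid4()
    return t._profiler_uuid

class FakeTensor:  # stand-in for torch.Tensor in this sketch
    pass

a, b = FakeTensor(), FakeTensor()
```

Repeated lookups on the same object return the same id, while distinct objects get distinct ids.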
- 03 Oct, 2022 1 commit

Boyuan Yao authored
* [autoparallel] add rotor c version
* [fx] remove metainfoprop in rotor solver
* [autoparallel] modify C code format
* [autoparallel] remove build.py
* [autoparallel] fix C extension build
* [autoparallel] add C solver consistency test
* [autoparallel] remove some unused imports
* [autoparallel] refactor rotor solver code
* [autoparallel] replace print with colossalai logger
* [autoparallel] ranks fixed
- 27 Sep, 2022 2 commits

Frank Lee authored

Boyuan Yao authored
* [fx] fix offload codegen test
* [fx] modify typing
- 23 Sep, 2022 2 commits

Boyuan Yao authored
* [fx] modify offload codegen
* [fx] remove repeated hook definitions
* [fx] modify offload test

Super Daniel authored
* [fx] tuned the meta info and rotor solver.
* [fx] remove import.
* [fx] remove import.
* [fx] remove import.
* [fx] tune the meta calculations.
* [fx] polish comments.
* [fx] remove assertions.
* [fx] modify test cases.
* [fx] modify test cases.
* [fx] optimize import.
* [fx
- 20 Sep, 2022 1 commit

Boyuan Yao authored
* [fx] add pofo algorithm
* [fx] Add pofo solver
* [fx] code refactor
* [fx] fix test_linearize import
- 14 Sep, 2022 2 commits

Boyuan Yao authored
* [fx] add input activation offload to codegen
* [fx] modify unit test
* [fx] remove two skips in torch11
* [fx] use all_input_nodes instead of _input_nodes

Super Daniel authored
* [fx] add some comment and docstrings.
* [fx] add dataflow analysis for an autograd graph.
* add interpretation for graph analysis.
* [fx] before doing save_tensor_hooks.
* [fx] provide an accurate estimation of memory except for GPT-2.
* [fx] provide an accurate estimation of memory except for GPT-2.
* [fx] provide an accurate estimation of memory except for GPT-2.
* [fx] a very accurate version on GPT-2.
* [fx] refactor code.
* [fx] remove redundant inplace=True.
* [fx] refactor code.
* [fx] refactor code.
* [fx] refactor code.
* [fx] dive into backward memory.
* [fx] fix variable names in ckpt_solvers and unskip tests.
* [fx] commit my changes.
* [fx] restore skips.
* [fx] restore skips.
* [fx] change stage into phase.
* [fx] change stage into phase.
* [fx] change stage into phase.
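The dataflow analysis in this last commit estimates memory by walking the graph in execution order, allocating each node's output, and freeing a tensor after its last consumer runs; the running maximum is the peak-memory estimate. A simplified, hypothetical sketch (node sizes and edges below are illustrative values, and a real autograd graph also needs saved-tensor and in-place handling that this omits):

```python
def peak_memory(nodes, size, users):
    """nodes: execution order; size: bytes of each node's output;
    users: node -> list of consumer nodes (nodes absent as keys are outputs)."""
    # last_use[m] = index of the final consumer of m's output
    last_use = {}
    for i, n in enumerate(nodes):
        for producer, consumers in users.items():
            if n in consumers:
                last_use[producer] = i
    live, peak = 0, 0
    for i, n in enumerate(nodes):
        live += size[n]                      # allocate this node's output
        peak = max(peak, live)
        for m in nodes[:i + 1]:
            if last_use.get(m) == i:         # last consumer just ran
                live -= size[m]              # free m's output
    return peak

# a feeds b and c; b feeds c; c's output stays live (a graph output)
estimate = peak_memory(["a", "b", "c"],
                       {"a": 4, "b": 2, "c": 1},
                       {"a": ["b", "c"], "b": ["c"]})
```

With these made-up sizes, all three outputs are simultaneously live at node c, so the estimate is their sum.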