Compression doc structure refactor (#2676)
* init SAPruner
* separate SAPruner from other one-shot pruners
* update
* fix model params issue
* make the process runnable
* show evaluation result in example
* sort the sparsities and scale it
* fix rescale issue
* fix scale issue; add pruning history
* record the actual total sparsity
* fix sparsity 0/1 problem
* revert unneeded modification
* revert unneeded modification
* fix 0 pruning weights problem
* save pruning history in csv file
* fix typo
* remove check perm in Makefile
* use os path
* save config list in json format
* update analyze py; update docker
* update
* update analyze
* update log info in compressor
* init NetAdapt Pruner
* refine examples
* update
* fine tune
* update
* fix quote issue
* add code for imagenet integrity
* update
* use datasets.ImageNet
* update
* update
* add channel pruning in SAPruner; refine example
* update NetAdaptPruner; add dependency constraint in SAPruner (beta)
* update
* update
* update
* fix zero division problem
* fix typo
* update
* fix naive issue of NetAdaptPruner
* fix data issue for no-dependency modules
* add cifar10 vgg16 example
* update
* update
* fix folder creation issue; change lr for vgg exp
* update
* add save model arg
* fix model copy issue
* init related weights calc
* update analyze file
* NetAdaptPruner: use fine-tuned weights after each iteration; fix modules_wrapper iteration issue
* consider channel/filter cross pruning
* NetAdapt: consider previous op when calculating total sparsity
* update
* use customized vgg
* add performance comparison plot
* fix NetAdaptPruner mask copy issue
* add resnet18 example
* fix example issue
* update experiment data
* fix bool arg parsing issue
* update
* init ADMMPruner
* ADMMPruner: update
* ADMMPruner: finish v1.0
* ADMMPruner: refine
* update
* AutoCompress init
* AutoCompress: update
* AutoCompressPruner: fix issues
* add test for auto pruners
* add doc for auto pruners
* fix link in md
* remove irrelevant files
* Clean code
* code clean
* fix pylint issue
* fix pylint issue
* rename ADMM & AutoCompress param
* use abs link in doc
* reorder import to fix import issue: autocompress relies on speedup
* refine doc
* NetAdaptPruner: decay pruning step
* take changes from testing branch
* refine
* fix typo
* ADMMPruner: check base_algo together with config schema
* fix broken link
* doc refine
* ADMM: refine
* refine doc
* refine doc
* refine doc
* refine doc
* refine doc
* refine doc
* update
* update
* refactor AGP doc
* update
* fix optimizer issue
* fix comments: typo, rename AGP_Pruner
* fix torch.nn.Module issue; refine SA docstring
* fix typo
Co-authored-by: Yuge Zhang <scottyugochang@gmail.com>