1. 11 Aug, 2020 1 commit
  2. 31 Jul, 2020 2 commits
    • Compression doc structure refactor (#2676) · 41312de5
      Guoxin authored
      
      
      * init sapruner
      
      * separate sapruners from other one-shot pruners
      
      * update
      
      * fix model params issue
      
      * make the process runnable
      
      * show evaluation result in example
      
      * sort the sparsities and scale them
      
      * fix rescale issue
      
      * fix scale issue; add pruning history
      
      * record the actual total sparsity
      
      * fix sparsity 0/1 problem
      
      * revert useless modif
      
      * revert useless modif
      
      * fix 0 pruning weights problem
      
      * save pruning history in csv file
      
      * fix typo
      
      * remove check perm in Makefile
      
      * use os path
      
      * save config list in json format
      
      * update analyze py; update docker
      
      * update
      
      * update analyze
      
      * update log info in compressor
      
      * init NetAdapt Pruner
      
      * refine examples
      
      * update
      
      * fine tune
      
      * update
      
      * fix quote issue
      
      * add code for imagenet integrity
      
      * update
      
      * use datasets.ImageNet
      
      * update
      
      * update
      
      * add channel pruning in SAPruner; refine example
      
      * update net_adapt pruner; add dependency constraint in sapruner(beta)
      
      * update
      
      * update
      
      * update
      
      * fix zero division problem
      
      * fix typo
      
      * update
      
      * fix naive issue of NetAdaptPruner
      
      * fix data issue for no-dependency modules
      
      * add cifar10 vgg16 example
      
      * update
      
      * update
      
      * fix folder creation issue; change lr for vgg exp
      
      * update
      
      * add save model arg
      
      * fix model copy issue
      
      * init related weights calc
      
      * update analyze file
      
      * NetAdaptPruner: use fine-tuned weights after each iteration; fix modules_wrapper iteration issue
      
      * consider channel/filter cross pruning
      
      * NetAdapt: consider previous op when calc total sparsity
      
      * update
      
      * use customized vgg
      
      * add performance comparison plt
      
      * fix netadaptPruner mask copy issue
      
      * add resnet18 example
      
      * fix example issue
      
      * update experiment data
      
      * fix bool arg parsing issue
      
      * update
      
      * init ADMMPruner
      
      * ADMMPruner: update
      
      * ADMMPruner: finish v1.0
      
      * ADMMPruner: refine
      
      * update
      
      * AutoCompress init
      
      * AutoCompress: update
      
      * AutoCompressPruner: fix issues:
      
      * add test for auto pruners
      
      * add doc for auto pruners
      
      * fix link in md
      
      * remove irrelevant files
      
      * Clean code
      
      * code clean
      
      * fix pylint issue
      
      * fix pylint issue
      
      * rename admm & autoCompress param
      
      * use abs link in doc
      
      * reorder import to fix import issue: autocompress relies on speedup
      
      * refine doc
      
      * NetAdaptPruner: decay pruning step
      
      * take changes from testing branch
      
      * refine
      
      * fix typo
      
      * ADMMPruner: check base_algo together with config schema
      
      * fix broken link
      
      * doc refine
      
      * ADMM:refine
      
      * refine doc
      
      * refine doc
      
      * refine doc
      
      * refine doc
      
      * refine doc
      
      * refine doc
      
      * update
      
      * update
      
      * refactor AGP doc
      
      * update
      
      * fix optimizer issue
      
      * fix comments: typo, rename AGP_Pruner
      
      * fix torch.nn.Module issue; refine SA docstring
      
      * fix typo
      Co-authored-by: Yuge Zhang <scottyugochang@gmail.com>
      41312de5
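      The PR above adds several automatic pruners to NNI's compression toolkit (SimulatedAnnealingPruner, NetAdaptPruner, ADMMPruner, AutoCompressPruner). For orientation, here is a minimal sketch of how such a pruner is typically driven: a model, a config_list of sparsity targets, and an evaluation function. The import path, class name, and constructor arguments are assumptions based on the NNI 1.x compression API, not taken verbatim from this PR.

      ```python
      # Minimal sketch, assuming the NNI 1.x compression API; names below are assumptions.
      from torchvision.models import vgg16
      from nni.compression.torch import SimulatedAnnealingPruner  # assumed import path


      def evaluator(model):
          """Placeholder evaluator: should return validation accuracy of `model`."""
          # A real evaluator would run the model over a held-out dataset and return a float.
          return 0.0


      model = vgg16(num_classes=10)

      # Prune 50% of the weights in every Conv2d layer.
      config_list = [{'sparsity': 0.5, 'op_types': ['Conv2d']}]

      pruner = SimulatedAnnealingPruner(model, config_list, evaluator=evaluator)
      pruned_model = pruner.compress()

      # Export the pruned weights and the binary masks for later speedup.
      pruner.export_model(model_path='pruned_vgg16.pth', mask_path='mask_vgg16.pth')
      ```

      The same construct/compress/export pattern applies to the other pruners added here, each with its own pruner-specific keyword arguments.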
    • Yuge Zhang
  3. 29 Jul, 2020 1 commit
  4. 27 Jul, 2020 1 commit
  5. 24 Jul, 2020 2 commits
  6. 17 Jul, 2020 2 commits
  7. 16 Jul, 2020 2 commits
  8. 08 Jul, 2020 8 commits
  9. 07 Jul, 2020 5 commits
  10. 01 Jul, 2020 1 commit
  11. 30 Jun, 2020 5 commits
    • Auto pruners (#2490) · f5caa193
      Guoxin authored
      f5caa193
    • Add flops and params counter (#2535) · a3b0bd7d
      colorjam authored
      a3b0bd7d
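      The counter added in #2535 reports a model's FLOPs and parameter count. A hedged usage sketch, assuming the utility is exposed as count_flops_params under nni.compression.torch.utils.counter (module path and signature are assumptions):

      ```python
      # Hedged sketch: module path and function signature are assumptions, not verified here.
      from torchvision.models import resnet18
      from nni.compression.torch.utils.counter import count_flops_params  # assumed location

      model = resnet18()
      # Count FLOPs and parameters for a single 3x224x224 input.
      flops, params = count_flops_params(model, (1, 3, 224, 224))
      print(f'FLOPs: {flops}, parameters: {params}')
      ```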
    • chicm-ms · 19eabd69
    • Reuse OpenPAI jobs to run multiple trials (#2521) · 0b9d6ce6
      Chi Song authored
      Designed a new interface to support reusable training services; currently it applies only to OpenPAI and is disabled by default.

      Replace trial_keeper.py with trial_runner.py; the trial runner holds an environment, receives commands from the nni manager to run or stop a trial, and returns events to the nni manager.
      Add a trial dispatcher, which inherits from the original training service interface. It is used to share as much code as possible across all training services while staying isolated from any particular one.
      Add an EnvironmentService interface to manage environments, including starting/stopping an environment and refreshing environment status.
      Add a command channel on both the nni manager and trial runner sides; it supports different ways to pass messages between them. Currently supported channels are file and web sockets. Supported commands from the nni manager are start, kill trial, and send new parameters; from the runner they are initialized (to support channels that don't know which runner connected), trial end, stdout (new type, including metrics as before), version check (new type), and gpu info (new type). A minimal sketch of such a command channel appears after this entry.
      Add a storage service that wraps a storage backend (NFS, Azure storage, and so on) behind standard file operations.
      Partially support running multiple trials in parallel on the runner side; this is not yet supported on the trial dispatcher side.
      Other minor changes:

      Add log_level to the training service UT, so that the UT can show debug-level logs.
      Expose the platform in the start info.
      Add RouterTrainingService to keep the original OpenPAI training service and support dynamic IoC binding.
      Add more GPU info for future use, including total/free/used GPU memory and GPU type.
      Make some license information consistent.
      Fix async/await problems with Array.forEach, which does not actually support async callbacks.
      Fix IT errors when downloading data, caused by my #2484.
      Accelerate some run-loop patterns by reducing sleep time.
      0b9d6ce6
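      To make the command-channel description above concrete, here is a small illustrative sketch of the idea: commands are typed messages serialized as JSON and dispatched by type on the receiving side. It is written in Python for illustration (the real nni manager and trial runner are TypeScript), and the message shape and command names are loose paraphrases of the list above, not the actual NNI wire format.

      ```python
      # Illustrative only: a toy command channel in the spirit of the description above.
      # NNI's real manager/runner channel is TypeScript and its wire format differs.
      import json
      from dataclasses import dataclass
      from typing import Callable, Dict


      @dataclass
      class Command:
          type: str      # e.g. "start_trial", "kill_trial", "send_parameters"
          payload: dict  # command-specific data


      def encode(cmd: Command) -> str:
          return json.dumps({"type": cmd.type, "payload": cmd.payload})


      def decode(raw: str) -> Command:
          obj = json.loads(raw)
          return Command(type=obj["type"], payload=obj["payload"])


      class Runner:
          """Toy runner: maps incoming command types to handlers, as a trial runner would."""

          def __init__(self):
              self.handlers: Dict[str, Callable[[dict], None]] = {
                  "start_trial": self.start_trial,
                  "kill_trial": self.kill_trial,
              }

          def on_message(self, raw: str) -> None:
              cmd = decode(raw)
              handler = self.handlers.get(cmd.type)
              if handler is None:
                  raise ValueError(f"unknown command type: {cmd.type}")
              handler(cmd.payload)

          def start_trial(self, payload: dict) -> None:
              print("starting trial", payload.get("trial_id"))

          def kill_trial(self, payload: dict) -> None:
              print("killing trial", payload.get("trial_id"))


      # Example exchange: the "manager" side encodes a command, the runner decodes and handles it.
      runner = Runner()
      runner.on_message(encode(Command("start_trial", {"trial_id": "abc123"})))
      ```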
    • Add OpEvo example (#2549) · 6de15707
      gxiaotian authored
      6de15707
  12. 29 Jun, 2020 6 commits
    • Add docs for nniignore and improve docs of training service (#2561) · 25c4c3b5
      Yuge Zhang authored
      * Improve docs
      
      * All using # to start
      
      * uploaded -> excluded
      
      * Set versionCheck default to true
      
      * Add .nniignore example
      
      * Rename training service docs
      
      * Update training service docs
      
      * Fix step numbering
      
      * Resolve comments
      
      * Fix typo
      
      * Resolve comments
      
      * Fix broken link
      25c4c3b5
    • QuanluZhang · 3d57dd73
    • format tuner and accessor README (#2588) · 813e4115
      Tab Zhang authored
      813e4115
    • Tab Zhang · ab640634
    • format nnictl tutorial (#2599) · 5a805e36
      Tab Zhang authored
      5a805e36
    • NAS Benchmark (#2578) · b91aba39
      Yuge Zhang authored
      * Adding NAS Benchmark (201)
      
      * Add missing endline
      
      * Update script
      
      * Draft for NAS-Bench-101
      
      * Update NAS-Bench-101
      
      * Update constants
      
      * Add API
      
      * Update API
      
      * Fix typo
      
      * Draft for NDS
      
      * Fix issues in storing loss
      
      * Fix cell_spec problem
      
      * Finalize NDS
      
      * Update time consumption
      
      * Add nds query function
      
      * Update documentation for NAS-Bench-101
      
      * Reformat generators
      
      * Add NAS-Bench-201 docs
      
      * Unite constant names
      
      * Update docstring
      
      * Update docstring
      
      * Update rst
      
      * Update scripts
      
      * Add git as dependency
      
      * Apt update
      
      * Update installation scripts
      
      * Fix dependency for pipeline
      
      * Fix NDS script
      
      * Fix NAS-Bench-201 installation
      
      * Add example notebook
      
      * Correct latency dimension
      
      * shortcuts -> query
      
      * Change run -> trial, ComputedStats -> TrialStats
      
      * ipynb needs re-generation
      
      * Fix NAS rst
      
      * Fix documentation and pylint
      
      * Fix pylint
      
      * Add pandoc as dependency
      
      * Update pandoc dependency
      
      * Fix documentation broken link
      b91aba39
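      The benchmark query API added in this PR lets you look up recorded trial statistics for a given architecture. Below is a hedged sketch of a NAS-Bench-201 query, assuming the function query_nb201_trial_stats and the edge-keyed architecture dict used in NNI's benchmark documentation (module path, function name, and arguments are assumptions; check the docs added here for the exact API).

      ```python
      # Hedged sketch of querying a NAS benchmark for trial stats.
      # Module path, function name, and argument names are assumptions, not verified here.
      from nni.nas.benchmarks.nasbench201 import query_nb201_trial_stats  # assumed import

      # A NAS-Bench-201 cell is described by the operation chosen on each of its six edges.
      arch = {
          '0_1': 'avg_pool_3x3',
          '0_2': 'conv_1x1',
          '1_2': 'skip_connect',
          '0_3': 'conv_1x1',
          '1_3': 'skip_connect',
          '2_3': 'skip_connect',
      }

      # Query every recorded trial of this architecture trained for 200 epochs on cifar100.
      for trial in query_nb201_trial_stats(arch, 200, 'cifar100'):
          print(trial)
      ```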
  13. 27 Jun, 2020 1 commit
  14. 24 Jun, 2020 1 commit
  15. 23 Jun, 2020 1 commit
  16. 19 Jun, 2020 1 commit