Automatic Model Pruning using NNI Tuners
========================================

It is convenient to implement automatic model pruning by combining NNI model compression with NNI tuners.

First, model compression with NNI
---------------------------------

You can easily compress a model with NNI compression. Taking pruning as an example, you can prune a pretrained model with ``L2FilterPruner`` like this:

.. code-block:: python

   from nni.algorithms.compression.pytorch.pruning import L2FilterPruner
   config_list = [{ 'sparsity': 0.5, 'op_types': ['Conv2d'] }]
   pruner = L2FilterPruner(model, config_list)
   pruner.compress()

The ``Conv2d`` op_type refers to the module types defined in :githublink:`default_layers.py <nni/compression/pytorch/default_layers.py>` for PyTorch.

Therefore ``{ 'sparsity': 0.5, 'op_types': ['Conv2d'] }`` means that **all layers of the specified op_types will be compressed with the same 0.5 sparsity**. When ``pruner.compress()`` is called, the model is compressed with masks; after that you can fine-tune the model as usual, and the **pruned (masked) weights will not be updated**.
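
For instance, a fine-tuning step after ``pruner.compress()`` could look like the following minimal sketch; ``train_loader``, ``criterion``, the optimizer settings and the file paths are illustrative assumptions rather than part of the snippet above:

.. code-block:: python

   import torch

   # minimal fine-tuning sketch: the pruner's module wrappers apply the masks in
   # the forward pass, so the pruned weights stay inactive during training
   optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
   model.train()
   for epoch in range(5):
       for data, target in train_loader:
           optimizer.zero_grad()
           loss = criterion(model(data), target)
           loss.backward()
           optimizer.step()

   # export the fine-tuned weights together with the binary masks
   pruner.export_model(model_path='pruned_model.pth', mask_path='mask.pth')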

Then, make this automatic
-------------------------

The previous example manually chose ``L2FilterPruner`` and pruned with a fixed sparsity. Different sparsities and different pruners may have different effects on different models, and this selection process can be automated with NNI tuners.

First, modify our code with just a few lines:

.. code-block:: python

   import nni
   from nni.algorithms.compression.pytorch.pruning import *

   # receive the hyper-parameters (sparsity, pruner and model) chosen by the tuner
   params = nni.get_next_parameter()
   sparsity = params['sparsity']
   pruner_name = params['pruner']
   model_name = params['model']

   model, pruner = get_model_pruner(model_name, pruner_name, sparsity)
   pruner.compress()

   train(model)  # your code for fine-tuning the model
   acc = test(model)  # test the fine-tuned model
   nni.report_final_result(acc)  # report the accuracy back to the tuner

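Note that ``get_model_pruner`` is a helper you write yourself, not an NNI API. A minimal sketch of such a helper, assuming torchvision models, is shown below; some pruners (e.g. ``slim`` or ``apoz``) may additionally require an optimizer, trainer and criterion, which are omitted here for brevity:

.. code-block:: python

   from torchvision import models
   from nni.algorithms.compression.pytorch.pruning import (
       SlimPruner, L2FilterPruner, FPGMPruner, ActivationAPoZRankFilterPruner
   )

   # hypothetical mapping from search-space names to NNI pruner classes
   PRUNERS = {
       'slim': SlimPruner,
       'l2filter': L2FilterPruner,
       'fpgm': FPGMPruner,
       'apoz': ActivationAPoZRankFilterPruner,
   }

   def get_model_pruner(model_name, pruner_name, sparsity):
       # build a pretrained model and wrap it with the chosen pruner;
       # SlimPruner prunes BatchNorm scaling factors, the others prune Conv2d filters
       model = getattr(models, model_name)(pretrained=True)
       op_types = ['BatchNorm2d'] if pruner_name == 'slim' else ['Conv2d']
       config_list = [{'sparsity': sparsity, 'op_types': op_types}]
       pruner = PRUNERS[pruner_name](model, config_list)
       return model, pruner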

Then, define a ``config`` file in YAML to automatically tune the model, the pruning algorithm, and the sparsity:

.. code-block:: yaml

   searchSpace:
     sparsity:
       _type: choice
       _value: [0.25, 0.5, 0.75]
     pruner:
       _type: choice
       _value: ['slim', 'l2filter', 'fpgm', 'apoz']
     model:
       _type: choice
       _value: ['vgg16', 'vgg19']
   trainingService:
     platform: local
   trialCodeDirectory: .
   trialCommand: python3 basic_pruners_torch.py --nni
   trialConcurrency: 1
   trialGpuNumber: 0
   tuner:
     name: grid

The full example can be found :githublink:`here <examples/model_compress/pruning/config.yml>`.

Finally, start the search via

.. code-block:: bash

   nnictl create -c config.yml