## Supported algorithms
NNI now supports the NAS algorithms listed below, and more are being added. Users can reproduce an algorithm or run it on their own datasets. We also encourage users to implement other algorithms with the [NNI API](#use-nni-api), to benefit more people.
Note: these algorithms run standalone without nnictl, and they support PyTorch only.
### Dependencies
* Install latest NNI
* PyTorch 1.2+
* git
### DARTS
The main algorithmic contribution of [DARTS: Differentiable Architecture Search][3] is a novel method for differentiable network architecture search based on bilevel optimization.
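The core mechanism is easy to illustrate in plain PyTorch. The snippet below is a minimal, first-order sketch of the idea only (dummy data, a stand-in loss, and arbitrary sizes; it is not NNI's implementation): each candidate operation is weighted by softmax-normalized architecture parameters, and the architecture parameters and the ordinary weights are updated alternately on validation and training data.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Candidate operations for one edge of the network (arbitrary choices for illustration).
candidate_ops = nn.ModuleList([
    nn.Conv2d(8, 8, 3, padding=1),
    nn.Conv2d(8, 8, 5, padding=2),
    nn.Identity(),
])
alpha = nn.Parameter(torch.zeros(len(candidate_ops)))  # architecture parameters

def mixed_op(x):
    # Continuous relaxation: a softmax-weighted sum of all candidate operations.
    weights = F.softmax(alpha, dim=0)
    return sum(w * op(x) for w, op in zip(weights, candidate_ops))

w_optim = torch.optim.SGD(candidate_ops.parameters(), lr=0.01)  # operation weights w
a_optim = torch.optim.Adam([alpha], lr=3e-4)                    # architecture parameters alpha

for step in range(5):
    train_x, val_x = torch.randn(2, 4, 8, 16, 16)  # dummy training / validation batches
    # Upper level: update alpha on the validation objective (stand-in loss here).
    a_optim.zero_grad()
    mixed_op(val_x).pow(2).mean().backward()
    a_optim.step()
    # Lower level: update the operation weights on the training objective.
    w_optim.zero_grad()
    mixed_op(train_x).pow(2).mean().backward()
    w_optim.step()

print("architecture weights:", F.softmax(alpha, dim=0).tolist())
```

After search, DARTS keeps the operation with the largest architecture weight on each edge and retrains the resulting discrete architecture.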
#### Usage
```bash
# In case NNI code is not cloned. If the code is cloned already, ignore this line and enter the code folder.
git clone https://github.com/Microsoft/nni.git

# search the best architecture
cd examples/nas/darts
python3 search.py
```
### P-DARTS

[Progressive Differentiable Architecture Search: Bridging the Depth Gap between Search and Evaluation](https://arxiv.org/abs/1904.12760) is based on [DARTS](#darts). Its main algorithmic contribution is an efficient approach that lets the depth of the searched architecture grow gradually during the training procedure.
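A purely illustrative sketch of that progressive schedule follows (the stage sizes and the random "scores" below are made up; a real run performs a DARTS-style search at each stage, and the paper also prunes the candidate operation set between stages to keep deeper searches affordable):

```python
import random

# Depth (number of cells) grows across stages while the candidate operation set shrinks.
stages = [(5, 5), (11, 3), (17, 2)]   # (cells in the searched network, ops kept afterwards)
candidates = ["max_pool_3x3", "avg_pool_3x3", "skip_connect", "sep_conv_3x3",
              "sep_conv_5x5", "dil_conv_3x3", "dil_conv_5x5", "none"]

random.seed(0)
for num_cells, keep in stages:
    # Stand-in for searching a supernet of `num_cells` cells over `candidates`:
    # here each remaining operation simply gets a random score.
    scores = {op: random.random() for op in candidates}
    candidates = sorted(candidates, key=scores.get, reverse=True)[:keep]
    print(f"after searching with {num_cells} cells, keep: {candidates}")
```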
#### Usage
```bash
# In case NNI code is not cloned. If the code is cloned already, ignore this line and enter the code folder.
git clone https://github.com/Microsoft/nni.git

# search the best architecture
cd examples/nas/pdarts
python3 search.py

# train the best architecture; the process is the same as for DARTS.
```
## Use NNI API
The programming interface for designing and searching a model is often needed in two scenarios.
1. When designing a neural network, there may be multiple operation choices for a layer, sub-model, or connection, and it is undetermined which one or which combination performs best. So an easy way to express the candidate layers or sub-models is needed.
2. When applying NAS to a neural network, a unified way to express the architecture search space is needed, so that trial code does not have to be updated for different search algorithms.
The NNI proposed API is [here](https://github.com/microsoft/nni/tree/master/src/sdk/pynni/nni/nas/pytorch), and [here](https://github.com/microsoft/nni/tree/master/examples/nas/darts) is an example NAS implementation based on the proposed interface.
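As a rough illustration of the first scenario, the sketch below declares candidate operations and candidate connections with the `LayerChoice` and `InputChoice` mutables from the linked API. Treat it as a sketch only: the import path and signatures are assumed from that API and may differ between NNI versions, and a search algorithm (a mutator or trainer, such as the DARTS trainer in the linked example) still has to be applied to the model to decide which candidates are active.

```python
import torch.nn as nn
# Assumed import path, following the linked API; it may differ across NNI versions.
from nni.nas.pytorch import mutables

class Cell(nn.Module):
    """A building block whose operation and skip connection are left to the search algorithm."""
    def __init__(self, channels):
        super().__init__()
        # Scenario 1: several candidate operations for the same layer.
        self.op = mutables.LayerChoice([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.MaxPool2d(3, stride=1, padding=1),
        ])
        # Scenario 2: choose one of two earlier tensors as a skip input.
        self.skip = mutables.InputChoice(n_candidates=2, n_chosen=1)

    def forward(self, x_prev, x):
        out = self.op(x)
        connection = self.skip([x_prev, x])
        return out if connection is None else out + connection
```

Because only the candidates are declared here, the same trial code can be driven by different search algorithms, which is the point of the second scenario above.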
All types of sampling strategies and their parameters are listed here:
* Which means the variable value is a value like `round(exp(normal(mu, sigma)) / q) * q` (see the sketch after this list)
* Suitable for a discrete variable with respect to which the objective is smooth and gets smoother with the size of the variable, which is bounded from one side.
* Type for [Neural Architecture Search Space][1]. The value is also a dictionary, whose key-value pairs give the name and the search space, respectively, of each mutable_layer.
* For now, users can only use this type of search space with annotation, which means there is no need to define a JSON file for the search space, since it will be automatically generated according to the annotation in trial code.
* The following HPO tuners can be adapted to tune this search space: TPE, Random, Anneal, Evolution, Grid Search, Hyperband and BOHB.
* For detailed usage, please refer to [General NAS Interfaces][1].
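To make the `qlognormal` formula above concrete, here is a small illustrative snippet (the values of `mu`, `sigma`, and `q` are arbitrary, not defaults from NNI):

```python
import numpy as np

# Sample a few qlognormal values with arbitrary demonstration parameters.
mu, sigma, q = 0.0, 1.0, 2
rng = np.random.default_rng(0)
samples = np.round(np.exp(rng.normal(mu, sigma, size=5)) / q) * q
print(samples)  # non-negative multiples of q, e.g. 0., 2., 4., ...
```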
## Search Space Types Supported by Each Tuner
Known Limitations:
* Only the Random Search, TPE, Anneal, and Evolution tuners support nested search space.
* Nested search space is not supported in the "Hyper Parameter" visualization yet; this enhancement is being considered in [#1110](https://github.com/microsoft/nni/issues/1110), and any suggestions, discussions, or contributions are warmly welcomed.