Unverified Commit c78c523b authored by xuehui, committed by GitHub

Update doc (#438)

* update readme in ga_squad

* fix typo

* Update README.md

* Update README.md

* Update README.md

* fix path

* update README reference

* fix bug in config file about batch tuner
parent da76b6b0
...@@ -87,7 +87,7 @@ You can refer to [here](NNICTLDOC.md) for more usage guide of *nnictl* command l
The experiment is now running. NNI provides a WebUI for you to view experiment progress, control your experiment, and use some other appealing features. The WebUI is opened by default by `nnictl create`.
## Read more
* [Tuners supported in the latest NNI release](./HowToChooseTuner.md)
* [Overview](Overview.md)
* [Installation](InstallNNI_Ubuntu.md)
* [Use command line tool nnictl](NNICTLDOC.md)
......
...@@ -40,7 +40,7 @@ _Usage_:
[Random Search for Hyper-Parameter Optimization][2] shows that Random Search can be surprisingly simple and effective. We suggest using Random Search as a baseline when you have no knowledge about the prior distribution of hyper-parameters.
_Suggested scenario_: Random search is suggested when each trial does not take too long (e.g., each trial can be completed quickly, or early stopped by the assessor), and you have enough computation resources, or when you want to uniformly explore the search space. Random Search can be considered a baseline search algorithm.
_Usage_:
```
...@@ -52,11 +52,19 @@ _Usage_:
<a name="Anneal"></a> <a name="Anneal"></a>
**Anneal** **Anneal**
This simple annealing algorithm begins by sampling from the prior, but tends over time to sample from points closer and closer to the best ones observed. This algorithm is a simple variation on random search that leverages smoothness in the response surface. The annealing rate is not adaptive.
_Suggested scenario_: Anneal is suggested when each trial does not take too long and you have enough computation resources (much the same as Random Search), or when the variables in the search space can be sampled from some prior distribution.
_Usage_:
```
# config.yaml
tuner:
  builtinTunerName: Anneal
  classArgs:
    # choice: maximize, minimize
    optimize_mode: maximize
```
<a name="Evolution"></a> <a name="Evolution"></a>
**Naive Evolution** **Naive Evolution**
...@@ -80,7 +88,7 @@ _Usage_:
[SMAC][4] is based on Sequential Model-Based Optimization (SMBO). It adapts the most prominent previously used model class (Gaussian stochastic process models) and introduces the model class of random forests to SMBO, in order to handle categorical parameters. The SMAC supported by NNI is a wrapper on [the SMAC3 GitHub repo][5].
Note that SMAC on NNI only supports a subset of the types in [search space spec](./SearchSpaceSpec.md), including `choice`, `randint`, `uniform`, `loguniform`, and `quniform(q=1)`.
_Suggested scenario_: Similar to TPE, SMAC is also a black-box tuner which can be tried in various scenarios, and is suggested when computation resources are limited. It is optimized for discrete hyperparameters, and thus is suggested when most of your hyperparameters are discrete.
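For reference, a minimal configuration sketch (assuming the same `config.yml` layout as the Anneal example above; as noted in the example configs, SMAC should be installed through nnictl before use):

```
# a minimal sketch; layout follows the Anneal example above
# SMAC should be installed through nnictl before it can be selected here
tuner:
  builtinTunerName: SMAC
  classArgs:
    # choice: maximize, minimize
    optimize_mode: maximize
```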
...@@ -97,7 +105,7 @@ _Usage_:
<a name="Batch"></a> <a name="Batch"></a>
**Batch tuner** **Batch tuner**
Batch tuner allows users to simply provide several configurations (i.e., choices of hyper-parameters) for their trial code. After finishing all the configurations, the experiment is done. Batch tuner only supports the type `choice` in [search space spec](./SearchSpaceSpec.md).
_Suggested scenario_: If the configurations you want to try have been decided, you can list them in the search space file (using `choice`) and run them using the batch tuner.
...@@ -108,11 +116,29 @@ _Usage_:
builtinTunerName: BatchTuner
```
Note that the search space supported by BatchTuner looks like the following:
```
{
    "combine_params":
    {
        "_type" : "choice",
        "_value" : [{"optimizer": "Adam", "learning_rate": 0.00001},
                    {"optimizer": "Adam", "learning_rate": 0.0001},
                    {"optimizer": "Adam", "learning_rate": 0.001},
                    {"optimizer": "SGD", "learning_rate": 0.01},
                    {"optimizer": "SGD", "learning_rate": 0.005},
                    {"optimizer": "SGD", "learning_rate": 0.0002}]
    }
}
```
The search space file includes the high-level key `combine_params`. The type of the parameters in the search space must be `choice`, and the `_value` must include all the combined parameter values.
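To tie the two pieces together, a minimal experiment configuration sketch (the file name `search_space.json` is an assumption; `searchSpacePath` is the config field that points the experiment at its search space file):

```
# a minimal sketch; search_space.json is assumed to contain the
# combine_params example above
searchSpacePath: search_space.json
tuner:
  builtinTunerName: BatchTuner
```

Each entry in `_value` is then run as one trial, and the experiment finishes once all of them have been tried.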
<a name="Grid"></a> <a name="Grid"></a>
**Grid Search** **Grid Search**
Grid Search performs an exhaustive search over a manually specified subset of the hyperparameter space defined in the search space file.
Note that the only acceptable types of search space are `choice`, `quniform`, `qloguniform`. **The number `q` in `quniform` and `qloguniform` has a special meaning (different from the spec in [search space spec](./SearchSpaceSpec.md)). It means the number of values that will be sampled evenly from the range `low` to `high`.**
_Suggested scenario_: It is suggested when the search space is small and it is feasible to exhaustively sweep the whole search space.
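A minimal configuration sketch (assuming the same `config.yml` layout as the examples above; note that under the interpretation of `q` described here, a `quniform` entry with `_value` `[0.1, 0.5, 5]` would mean five values sampled evenly between 0.1 and 0.5):

```
# a minimal sketch; Grid Search exhaustively sweeps the search space,
# so no classArgs are assumed to be needed here
tuner:
  builtinTunerName: GridSearch
```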
...@@ -126,7 +152,7 @@ _Usage_:
<a name="Hyperband"></a> <a name="Hyperband"></a>
**Hyperband** **Hyperband**
[Hyperband][6] tries to use limited resources to explore as many configurations as possible and find the promising ones that lead to the final result. The basic idea is to generate many configurations, run them for a small number of STEPs to find the promising ones, and then train those promising ones further to select the best among them. More details can be found [here](../src/sdk/pynni/nni/hyperband_advisor/README.md).
_Suggested scenario_: It is suggested when you have limited computation resources but a relatively large search space. It performs well in scenarios where the intermediate result (e.g., accuracy) reflects, to some extent, how good or bad the final result (e.g., accuracy) will be.
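As a rough sketch of how Hyperband might be enabled (it is configured as an advisor rather than a tuner; the `R` and `eta` values below are illustrative assumptions, so check the linked README for the exact fields):

```
# a minimal sketch; R (the maximum STEPs given to a trial) and eta are illustrative
advisor:
  builtinAdvisorName: Hyperband
  classArgs:
    # choice: maximize, minimize
    optimize_mode: maximize
    R: 60
    eta: 3
```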
......
...@@ -32,7 +32,7 @@ NNI provides a set of examples in the package to get you familiar with the above
**Trial** in NNI is an individual attempt at applying a set of parameters on a model.
### **Tuner**
**Tuner** in NNI is an implementation of the Tuner API for a specific tuning algorithm. [Read more about the Tuners supported in the latest NNI release](HowToChooseTuner.md)
### **Assessor**
**Assessor** in NNI is an implementation of the Assessor API for optimizing the execution of an experiment.
......
...@@ -41,7 +41,7 @@
* Support [OpenPAI](https://github.com/Microsoft/pai) (aka pai) Training Service (See [here](./PAIMode.md) for instructions about how to submit NNI job in pai mode)
    * Support training services on pai mode. NNI trials will be scheduled to run on OpenPAI cluster
    * NNI trial's output (including logs and model file) will be copied to OpenPAI HDFS for further debugging and checking
* Support [SMAC](https://www.cs.ubc.ca/~hutter/papers/10-TR-SMAC.pdf) tuner (See [here](HowToChooseTuner.md) for instructions about how to use SMAC tuner)
    * [SMAC](https://www.cs.ubc.ca/~hutter/papers/10-TR-SMAC.pdf) is based on Sequential Model-Based Optimization (SMBO). It adapts the most prominent previously used model class (Gaussian stochastic process models) and introduces the model class of random forests to SMBO to handle categorical parameters. The SMAC supported by NNI is a wrapper on [SMAC3](https://github.com/automl/SMAC3)
* Support NNI installation on [conda](https://conda.io/docs/index.html) and python virtual environment
* Others
......
...@@ -95,4 +95,4 @@ More detail example you could see:
> * [evolution-based-customized-tuner](../examples/tuners/ga_customer_tuner)
## Write a more advanced automl algorithm
The methods above are usually enough to write a general tuner. However, users may also want more methods, for example, intermediate results and trials' states (i.e., the methods in assessor), in order to have a more powerful automl algorithm. Therefore, we have another concept called `advisor`, which directly inherits from `MsgDispatcherBase` in [`src/sdk/pynni/nni/msg_dispatcher_base.py`](../src/sdk/pynni/nni/msg_dispatcher_base.py). Please refer to [here](./howto_3_CustomizedAdvisor.md) for how to write a customized advisor.
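As a rough sketch, a customized advisor might be referenced from the experiment configuration much like a customized tuner is; the file and class names below are hypothetical, and the exact `advisor` fields should be checked against the customized advisor guide:

```
# a minimal sketch; my_advisor.py and MyAdvisor are hypothetical names
advisor:
  codeDir: .
  classFileName: my_advisor.py
  className: MyAdvisor
  classArgs:
    optimize_mode: maximize
```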
...@@ -13,7 +13,7 @@ NNI provides an easy to adopt approach to set up parameter tuning algorithms as
### **Learn More about tuners**
* For the detailed definition and usage of the required fields, please refer to [Config an experiment](ExperimentConfig.md)
* [Tuners in the latest NNI release](HowToChooseTuner.md)
* [How to implement your own tuner](howto_2_CustomizedTuner.md)
......
...@@ -12,9 +12,6 @@ tuner:
  #choice: TPE, Random, Anneal, Evolution, BatchTuner
  #SMAC (SMAC should be installed through nnictl)
  builtinTunerName: BatchTuner
trial:
  command: python3 mnist-keras.py
  codeDir: .
......
...@@ -12,9 +12,6 @@ tuner:
  #choice: TPE, Random, Anneal, Evolution, BatchTuner
  #SMAC (SMAC should be installed through nnictl)
  builtinTunerName: BatchTuner
trial:
  command: python3 mnist-keras.py
  codeDir: .
......