# Search Space

## Overview

In NNI, the tuner samples parameters/architectures according to the search space, which is defined as a JSON file.

To define a search space, users should specify the name of each variable, the type of its sampling strategy, and the strategy's parameters.

* An example of a search space definition is as follows:

```json
{
    "dropout_rate":{"_type":"uniform","_value":[0.1,0.5]},
    "conv_size":{"_type":"choice","_value":[2,3,5,7]},
    "hidden_size":{"_type":"choice","_value":[124, 512, 1024]},
    "batch_size":{"_type":"choice","_value":[50, 250, 500]},
    "learning_rate":{"_type":"uniform","_value":[0.0001, 0.1]}
}

```

Take the first line as an example. `dropout_rate` is defined as a variable whose prior distribution is a uniform distribution over the range from `0.1` to `0.5`.
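
For reference, the minimal sketch below shows how a trial might consume one such sample. It assumes the `nni` Python package is installed and that this file is referenced as the experiment's search space; `train_model` is a hypothetical stand-in for the user's actual training code.

```python
import nni

def train_model(dropout_rate, conv_size, hidden_size, batch_size, learning_rate):
    """Hypothetical stand-in for the user's real training loop; returns a dummy metric."""
    return 0.9

if __name__ == "__main__":
    # One sample drawn by the tuner from the search space above, e.g.
    # {"dropout_rate": 0.32, "conv_size": 5, "hidden_size": 1024,
    #  "batch_size": 250, "learning_rate": 0.003}
    params = nni.get_next_parameter()
    accuracy = train_model(**params)
    # Report the final metric of this trial back to the tuner.
    nni.report_final_result(accuracy)
```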

## Types

All types of sampling strategies and their parameters are listed here:

* {"_type":"choice","_value":options}

  * Which means the variable's value is one of the options. Here 'options' should be a list, and each element of the list is a number or a string. An element can also be a nested sub-search-space; this sub-search-space takes effect only when the corresponding element is chosen. The variables in such a sub-search-space can be seen as conditional variables.

  * A simple [example](../../examples/trials/mnist-cascading-search-space/search_space.json) of a nested search space definition (see the sketch after this list). If an element in the options list is a dict, it is a sub-search-space, and for our built-in tuners you have to add a key '_name' to this dict, which helps you identify which element is chosen. Accordingly, here is a [sample](../../examples/trials/mnist-cascading-search-space/sample.json) of what users can get from NNI with a nested search space definition. The tuners that support nested search spaces are as follows:

    - Random Search 
    - TPE
    - Anneal
    - Evolution
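
As a rough illustration of the nested form (the layer name and option values below are made up for this sketch, loosely mirroring the linked example), a nested `choice` and one possible sampled result could look like this, written here as Python dicts that match the JSON syntax:

```python
# A nested `choice`: the inner variables only take effect when their
# enclosing option is chosen, and `_name` identifies which option it is.
search_space = {
    "layer0": {
        "_type": "choice",
        "_value": [
            {"_name": "Empty"},
            {"_name": "Conv",
             "kernel_size": {"_type": "choice", "_value": [1, 2, 3, 5]}},
            {"_name": "Avg_pool",
             "pooling_size": {"_type": "choice", "_value": [2, 3, 5]}},
        ],
    }
}

# One possible parameter dict a trial could receive from the tuner:
sampled = {"layer0": {"_name": "Conv", "kernel_size": 3}}
```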

* {"_type":"randint","_value":[upper]}

  * Which means the variable value is a random integer in the range [0, upper). The semantics of this distribution is that the loss function is no more correlated between nearby integer values than between distant ones; it is an appropriate distribution for describing random seeds, for example. If the loss function is probably more correlated for nearby integer values, you should use one of the "quantized" continuous distributions instead, such as quniform, qloguniform, qnormal, or qlognormal. Note that if you want to change the lower bound, you can use `quniform` for now (see the sketch below).
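
For example (an illustrative sketch; the variable names are made up), a non-zero lower bound can be expressed with `quniform` and q = 1:

```python
# `randint`: an integer drawn uniformly from [0, 10).
randint_entry = {"seed": {"_type": "randint", "_value": [10]}}

# To draw an integer from a range with a non-zero lower bound, e.g. [10, 20],
# use `quniform` with q = 1 as noted above.
quniform_entry = {"num_layers": {"_type": "quniform", "_value": [10, 20, 1]}}
```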

* {"_type":"uniform","_value":[low, high]}
  * Which means the variable value is sampled uniformly between low and high.
  * When optimizing, this variable is constrained to a two-sided interval.

* {"_type":"quniform","_value":[low, high, q]}
  * Which means the variable value is a value like round(uniform(low, high) / q) * q
  * Suitable for a discrete value with respect to which the objective is still somewhat "smooth", but which should be bounded both above and below. If you want to choose an integer uniformly from a range [low, high], you can write `_value` like this: `[low, high, 1]` (see the sketch below).
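
The following small sketch illustrates the rounding formula above with Python's `random` module; it only mirrors the stated semantics and is not NNI's internal sampler:

```python
import random

def quniform(low, high, q):
    """Illustration of the stated semantics: round(uniform(low, high) / q) * q."""
    return round(random.uniform(low, high) / q) * q

# A batch size quantized to multiples of 50 inside [50, 500].
print(quniform(50, 500, 50))   # e.g. 250
# q = 1 gives (roughly) a uniform integer in [low, high].
print(quniform(2, 8, 1))       # e.g. 5
```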

* {"_type":"loguniform","_value":[low, high]}
  * Which means the variable value is a value drawn from a range [low, high] according to a loguniform distribution like exp(uniform(log(low), log(high))), so that the logarithm of the return value is uniformly distributed.
  * When optimizing, this variable is constrained to be positive.

* {"_type":"qloguniform","_value":[low, high, q]}
  * Which means the variable value is a value like round(loguniform(low, high) / q) * q
  * Suitable for a discrete variable with respect to which the objective is "smooth" and gets smoother with the size of the value, but which should be bounded both above and below.
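
Similarly, a sketch of the log-scale formulas above (illustrative only, not NNI's internal sampler):

```python
import math
import random

def loguniform(low, high):
    """exp(uniform(log(low), log(high))): the log of the result is uniform."""
    return math.exp(random.uniform(math.log(low), math.log(high)))

def qloguniform(low, high, q):
    """round(loguniform(low, high) / q) * q."""
    return round(loguniform(low, high) / q) * q

# A learning rate spread evenly across orders of magnitude.
print(loguniform(0.0001, 0.1))
# A hidden size favoring smaller values, quantized to multiples of 16.
print(qloguniform(16, 1024, 16))
```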

* {"_type":"normal","_value":[mu, sigma]}

  * Which means the variable value is a real value that's normally-distributed with mean mu and standard deviation sigma. When optimizing, this is an unconstrained variable.

* {"_type":"qnormal","_value":[mu, sigma, q]}
  * Which means the variable value is a value like round(normal(mu, sigma) / q) * q
  * Suitable for a discrete variable that probably takes a value around mu, but is fundamentally unbounded.

* {"_type":"lognormal","_value":[mu, sigma]}

  * Which means the variable value is a value drawn according to exp(normal(mu, sigma)) so that the logarithm of the return value is normally distributed. When optimizing, this variable is constrained to be positive.

* {"_type":"qlognormal","_value":[mu, sigma, q]}
  * Which means the variable value is a value like round(exp(normal(mu, sigma)) / q) * q
  * Suitable for a discrete variable with respect to which the objective is smooth and gets smoother with the size of the variable, which is bounded from one side.
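
To round out the list, a sketch of the normal-based formulas above, again only as an illustration of the stated semantics:

```python
import math
import random

def normal(mu, sigma):
    """A real value from N(mu, sigma); unconstrained."""
    return random.gauss(mu, sigma)

def qnormal(mu, sigma, q):
    """round(normal(mu, sigma) / q) * q."""
    return round(normal(mu, sigma) / q) * q

def lognormal(mu, sigma):
    """exp(normal(mu, sigma)): the log of the result is normally distributed."""
    return math.exp(normal(mu, sigma))

def qlognormal(mu, sigma, q):
    """round(exp(normal(mu, sigma)) / q) * q."""
    return round(lognormal(mu, sigma) / q) * q

print(normal(0.0, 1.0), qnormal(100, 20, 5), lognormal(0.0, 1.0), qlognormal(3.0, 0.5, 2))
```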

## Search Space Types Supported by Each Tuner

| Tuner / Advisor   | choice  | randint | uniform | quniform | loguniform | qloguniform | normal  | qnormal | lognormal | qlognormal |
|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|:------:|
| TPE Tuner         | ✓ | ✓ | ✓ | ✓  | ✓    | ✓     | ✓ | ✓ | ✓   | ✓    |
| Random Search Tuner| ✓ | ✓ | ✓ | ✓  | ✓    | ✓     | ✓ | ✓ | ✓   | ✓    |
| Anneal Tuner   | ✓ | ✓ | ✓ | ✓  | ✓    | ✓     | ✓ | ✓ | ✓   | ✓    |
| Evolution Tuner   | ✓ | ✓ | ✓ | ✓  | ✓    | ✓     | ✓ | ✓ | ✓   | ✓    |
| SMAC Tuner        | ✓ | ✓ | ✓ | ✓  | ✓    |      |  |  |    |     |
| Batch Tuner       | ✓ |  |  |   |     |      |  |  |    |     |
| Grid Search Tuner | ✓ |  |  | ✓  |     | ✓     |  |  |    |     |
| Hyperband Advisor | ✓ | ✓ | ✓ | ✓  | ✓    | ✓     | ✓ | ✓ | ✓   | ✓    |
| Metis Tuner   | ✓ | ✓ | ✓ | ✓  |     |      |  |  |    |     |

Note that in the Grid Search Tuner, for users' convenience, the definitions of `quniform` and `qloguniform` change: here q specifies the number of values that will be sampled. Details are listed as follows:

* Type 'quniform' will receive three values [low, high, q], where [low, high] specifies a range and 'q' specifies the number of values that will be sampled evenly. Note that q should be at least 2. It will be sampled in a way that the first sampled value is 'low', and each of the following values is (high-low)/q larger than the value before it.
* Type 'qloguniform' behaves like 'quniform' except that it first changes the range to [log(low), log(high)], samples in that range, and then changes the sampled value back (see the sketch below).
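
As an illustrative sketch of the description above (not the tuner's actual implementation), the grid values could be enumerated like this:

```python
import math

def grid_quniform(low, high, q):
    """Enumerate q evenly spaced values: the first is `low`,
    and each following value is (high - low) / q larger."""
    step = (high - low) / q
    return [low + i * step for i in range(q)]

def grid_qloguniform(low, high, q):
    """Like `grid_quniform`, but spaced evenly in log space and mapped back."""
    return [math.exp(v) for v in grid_quniform(math.log(low), math.log(high), q)]

print(grid_quniform(0, 10, 5))          # [0.0, 2.0, 4.0, 6.0, 8.0]
print(grid_qloguniform(0.0001, 1, 4))   # roughly [0.0001, 0.001, 0.01, 0.1]
```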

Note that the Metis Tuner only supports numerical `choice` now.