.. role:: raw-html(raw)
   :format: html

Search Space
============

Overview
--------

In NNI, the tuner samples parameters/architectures according to the search space.

To define a search space, users should specify the name of each variable, the type of its sampling strategy, and the strategy's parameters.

* An example of a search space definition in a JSON file is as follows:

.. code-block:: json 

   {
       "dropout_rate": {"_type": "uniform", "_value": [0.1, 0.5]},
       "conv_size": {"_type": "choice", "_value": [2, 3, 5, 7]},
       "hidden_size": {"_type": "choice", "_value": [124, 512, 1024]},
       "batch_size": {"_type": "choice", "_value": [50, 250, 500]},
       "learning_rate": {"_type": "uniform", "_value": [0.0001, 0.1]}
   }

Take the first line as an example. ``dropout_rate`` is defined as a variable whose prior distribution is a uniform distribution with a range from ``0.1`` to ``0.5``.
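
At run time, the tuner draws one concrete value for each variable in the search space and sends the resulting set of parameters to a trial. As a minimal sketch (assuming the search space above and a hypothetical ``train_and_evaluate`` function), trial code typically receives the sampled values like this:

.. code-block:: python

   import nni

   def train_and_evaluate(params):
       """Hypothetical training routine; replace it with your own model code."""
       # params is a plain dict, e.g. {"dropout_rate": 0.32, "conv_size": 5, ...}
       return 0.9  # the metric you want NNI to optimize

   if __name__ == '__main__':
       # Receive one set of sampled hyperparameters from the tuner.
       params = nni.get_next_parameter()
       accuracy = train_and_evaluate(params)
       # Report the final metric back to NNI.
       nni.report_final_result(accuracy)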

.. note:: In the `experiment configuration (V2) schema <ExperimentConfig.rst>`_, NNI supports defining the search space directly in the configuration file; detailed usage can be found `here <QuickStart.rst#step-2-define-the-search-space>`__. When using the Python API, users can write the search space directly in Python code; refer to `here <HowToLaunchFromPython.rst>`__.
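
For example, when launching an experiment from Python, the same search space can be written as a plain Python dict and assigned to the experiment configuration. The following is a minimal sketch; the exact configuration fields follow the NNI Python launch API and may vary slightly between NNI versions, and ``trial.py`` is a placeholder for your own trial script:

.. code-block:: python

   from nni.experiment import Experiment

   # The same search space as the JSON example above, expressed as a Python dict.
   search_space = {
       "dropout_rate": {"_type": "uniform", "_value": [0.1, 0.5]},
       "conv_size": {"_type": "choice", "_value": [2, 3, 5, 7]},
       "hidden_size": {"_type": "choice", "_value": [124, 512, 1024]},
       "batch_size": {"_type": "choice", "_value": [50, 250, 500]},
       "learning_rate": {"_type": "uniform", "_value": [0.0001, 0.1]},
   }

   experiment = Experiment('local')
   experiment.config.search_space = search_space
   experiment.config.trial_command = 'python trial.py'  # your trial script
   experiment.config.trial_code_directory = '.'
   experiment.config.tuner.name = 'TPE'
   experiment.config.tuner.class_args = {'optimize_mode': 'maximize'}
   experiment.config.max_trial_number = 10
   experiment.config.trial_concurrency = 1
   experiment.run(8080)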

Note that the available sampling strategies within a search space depend on the tuner you want to use. We list the supported types for each built-in tuner below. For a customized tuner, you don't have to follow our convention; you have the flexibility to define any type you want.

Types
-----

All types of sampling strategies and their parameters are listed here:


* 
  ``{"_type": "choice", "_value": options}``


  * The variable's value is one of the options. Here ``options`` should be a list of **numbers** or a list of **strings**. Using arbitrary objects as members of this list (such as sublists, a mixture of numbers and strings, or null values) should work in most cases, but may trigger undefined behavior.
  * ``options`` can also be a nested sub-search-space, which takes effect only when the corresponding element is chosen. The variables in this sub-search-space can be seen as conditional variables. Here is a simple :githublink:`example of a nested search space definition <examples/trials/mnist-nested-search-space/search_space.json>`; an illustrative sketch is also given after this list. If an element in the options list is a dict, it is a sub-search-space, and for our built-in tuners you have to add a ``_name`` key to this dict, which helps you identify which element is chosen. Accordingly, here is a :githublink:`sample <examples/trials/mnist-nested-search-space/sample.json>` of the parameters users can receive from NNI with a nested search space definition. See the table below for the tuners that support nested search spaces.

* 
  ``{"_type": "randint", "_value": [lower, upper]}``


  * Chooses a random integer from ``lower`` (inclusive) to ``upper`` (exclusive).
  * Note: Different tuners may interpret ``randint`` differently. Some (e.g., TPE, GridSearch) treat the integers from lower
    to upper as unordered categories, while others respect the ordering (e.g., SMAC). If you want all tuners to respect
    the ordering, please use ``quniform`` with ``q=1``.

* 
  ``{"_type": "uniform", "_value": [low, high]}``


  * The variable value is uniformly sampled between low and high.
  * When optimizing, this variable is constrained to a two-sided interval.

* 
  ``{"_type": "quniform", "_value": [low, high, q]}``


  * The variable value is determined using ``clip(round(uniform(low, high) / q) * q, low, high)``\ , where the clip operation is used to constrain the generated value within the bounds. For example, for ``_value`` specified as [0, 10, 2.5], possible values are [0, 2.5, 5.0, 7.5, 10.0]; for ``_value`` specified as [2, 10, 5], possible values are [2, 5, 10].
  * Suitable for a discrete value with respect to which the objective is still somewhat "smooth", but which should be bounded both above and below. If you want to uniformly choose an integer from a range [low, high], you can write ``_value`` like this: ``[low, high, 1]``.

* 
  ``{"_type": "loguniform", "_value": [low, high]}``


  * The variable value is drawn from a range [low, high] according to a loguniform distribution like exp(uniform(log(low), log(high))), so that the logarithm of the return value is uniformly distributed.
  * When optimizing, this variable is constrained to be positive.

* 
  ``{"_type": "qloguniform", "_value": [low, high, q]}``


  * The variable value is determined using ``clip(round(loguniform(low, high) / q) * q, low, high)``\ , where the clip operation is used to constrain the generated value within the bounds.
  * Suitable for a discrete variable with respect to which the objective is "smooth" and gets smoother with the size of the value, but which should be bounded both above and below.

* 
  ``{"_type": "normal", "_value": [mu, sigma]}``


  * The variable value is a real value that's normally-distributed with mean mu and standard deviation sigma. When optimizing, this is an unconstrained variable.

* 
  ``{"_type": "qnormal", "_value": [mu, sigma, q]}``


  * The variable value is determined using ``round(normal(mu, sigma) / q) * q``
  * Suitable for a discrete variable that probably takes a value around mu, but is fundamentally unbounded.

* 
  ``{"_type": "lognormal", "_value": [mu, sigma]}``


  * The variable value is drawn according to ``exp(normal(mu, sigma))`` so that the logarithm of the return value is normally distributed. When optimizing, this variable is constrained to be positive.

* 
  ``{"_type": "qlognormal", "_value": [mu, sigma, q]}``


  * The variable value is determined using ``round(exp(normal(mu, sigma)) / q) * q``
  * Suitable for a discrete variable with respect to which the objective is smooth and gets smoother with the size of the variable, which is bounded from one side.
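
As referenced in the ``choice`` entry above, a nested sub-search-space is written by putting dicts inside the ``options`` list, each carrying a ``_name`` key. The following illustrative sketch (the variable and option names are made up for illustration; see the linked example for a real definition) is written as a Python dict, which has the same structure as the JSON form:

.. code-block:: python

   # A nested search space: the inner variables only exist
   # when their enclosing option is chosen.
   search_space = {
       "layer0": {
           "_type": "choice",
           "_value": [
               # Option 1: no extra hyperparameters.
               {"_name": "Empty"},
               # Option 2: has its own conditional variable "kernel_size".
               {
                   "_name": "Conv",
                   "kernel_size": {"_type": "choice", "_value": [1, 2, 3, 5]},
               },
               # Option 3: has its own conditional variable "pooling_size".
               {
                   "_name": "Max_pool",
                   "pooling_size": {"_type": "choice", "_value": [2, 3, 5]},
               },
           ],
       },
   }

A set of parameters sampled from this space then looks like ``{"layer0": {"_name": "Conv", "kernel_size": 2}}``: the ``_name`` key tells the trial which option was chosen, and only the variables belonging to that option are present.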

Search Space Types Supported by Each Tuner
------------------------------------------

.. list-table::
   :header-rows: 1
   :widths: auto

   * - Tuner
     - choice
     - choice(nested)
     - randint
     - uniform
     - quniform
     - loguniform
     - qloguniform
     - normal
     - qnormal
     - lognormal
     - qlognormal
   * - TPE Tuner
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
   * - Random Search Tuner
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
   * - Anneal Tuner
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
   * - Evolution Tuner
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
   * - SMAC Tuner
     - :raw-html:`&#10003;`
     - 
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - 
     - 
     - 
     - 
     - 
   * - Batch Tuner
     - :raw-html:`&#10003;`
     - 
     - 
     - 
     - 
     - 
     - 
     - 
     - 
     - 
     - 
   * - Grid Search Tuner
     - :raw-html:`&#10003;`
     - 
     - :raw-html:`&#10003;`
     - 
     - :raw-html:`&#10003;`
     - 
     - 
     - 
     - 
     - 
     - 
   * - Hyperband Advisor
     - :raw-html:`&#10003;`
     - 
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
   * - Metis Tuner
     - :raw-html:`&#10003;`
     - 
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - 
     - 
     - 
     - 
     - 
     - 
   * - GP Tuner
     - :raw-html:`&#10003;`
     - 
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - 
     - 
     - 
     - 
   * - DNGO Tuner
     - :raw-html:`&#10003;`
     - 
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - :raw-html:`&#10003;`
     - 
     - 
     - 
     - 


Known Limitations:


* 
  GP Tuner, Metis Tuner and DNGO Tuner support only **numerical values** in the search space (\ ``choice``-type values can be non-numerical with other tuners, e.g., string values). Both GP Tuner and Metis Tuner use a Gaussian Process Regressor (GPR). GPR makes predictions based on a kernel function and the 'distance' between different points, and it is hard to obtain a meaningful distance between non-numerical values.

* 
  Note that for nested search spaces:


  * Only the Random Search, TPE, Anneal, Evolution, and Grid Search tuners support nested search spaces