GP Tuner on NNI
===

## GP Tuner

Bayesian optimization works by constructing a posterior distribution of functions (a Gaussian Process) that best describes the function you want to optimize. As the number of observations grows, the posterior distribution improves, and the algorithm becomes more certain of which regions in parameter space are worth exploring and which are not.
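
As an illustration of this idea (not GP Tuner's internal code), the sketch below fits a Gaussian Process posterior to a handful of observations using scikit-learn. The `Matern` kernel choice and the toy data points are assumptions made for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Toy observations of an expensive objective f at a few sampled points
# (hypothetical values, for illustration only).
X_observed = np.array([[0.1], [0.4], [0.9]])
y_observed = np.array([0.3, 0.8, 0.5])

# Fit a Gaussian Process posterior to the observations.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_observed, y_observed)

# Posterior mean and uncertainty at candidate points: the uncertainty
# shrinks near observed points and grows in unexplored regions,
# which is what makes the posterior more informative as data accumulates.
X_candidates = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
mean, std = gp.predict(X_candidates, return_std=True)
```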

GP Tuner is designed to minimize the number of steps required to find a combination of parameters close to the optimal one. To do so, it solves a proxy optimization problem (finding the maximum of an acquisition function) that, albeit still a hard problem, is computationally cheaper to solve and amenable to common tools. Bayesian optimization is therefore suggested for situations where sampling the function to be optimized is very expensive.
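
To make the proxy problem concrete, here is a minimal sketch of maximizing an upper confidence bound (UCB) acquisition function over a cheap candidate grid, reusing the `gp` surrogate fitted in the sketch above. The `kappa` value and the grid-search approach are illustrative assumptions, not GP Tuner's actual settings.

```python
import numpy as np

def upper_confidence_bound(mean, std, kappa=2.576):
    # Trade off exploitation (high posterior mean) against
    # exploration (high posterior uncertainty).
    return mean + kappa * std

# Maximizing the acquisition function over a dense grid is cheap,
# because it only queries the GP surrogate, never the expensive objective.
X_grid = np.linspace(0.0, 1.0, 1000).reshape(-1, 1)
mean, std = gp.predict(X_grid, return_std=True)  # gp fitted above
x_next = X_grid[np.argmax(upper_confidence_bound(mean, std))]
```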

Note that the only acceptable types in the search space are `randint`, `uniform`, `quniform`, `loguniform`, `qloguniform`, and numerical `choice`.
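
For illustration, a search space file restricted to these types might look like the following; the parameter names and ranges are hypothetical. Note that the `choice` values here are all numerical, as GP Tuner requires.

```json
{
    "learning_rate": {"_type": "loguniform", "_value": [0.00001, 0.1]},
    "dropout_rate": {"_type": "uniform", "_value": [0.1, 0.5]},
    "batch_size": {"_type": "choice", "_value": [16, 32, 64, 128]},
    "hidden_layers": {"_type": "randint", "_value": [2, 8]}
}
```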

This optimization approach is described in Section 3 of [Algorithms for Hyper-Parameter Optimization](https://papers.nips.cc/paper/4443-algorithms-for-hyper-parameter-optimization.pdf).