Unverified Commit a5764016 authored by chicm-ms's avatar chicm-ms Committed by GitHub

Install builtin tuners (#2439)

parent 0f7f9460
......@@ -192,6 +192,7 @@ Within the following table, we summarized the current NNI capabilities, we are g
<ul>
<li><a href="docs/en_US/Tuner/CustomizeTuner.md">CustomizeTuner</a></li>
<li><a href="docs/en_US/Assessor/CustomizeAssessor.md">CustomizeAssessor</a></li>
<li><a href="docs/en_US/Tutorial/InstallCustomizedAlgos.md">Install Customized Algorithms as Builtin Tuners/Assessors/Advisors</a></li>
</ul>
</td>
<td style="border-top:#FF0000 solid 0px;">
......
......@@ -63,7 +63,8 @@ setuptools.setup(
'scipy',
'coverage',
'colorama',
'scikit-learn>=0.20,<0.22'
'scikit-learn>=0.20,<0.22',
'pkginfo'
],
classifiers = [
'Programming Language :: Python :: 3',
......
# How to install customized tuner as a builtin tuner
You can follow the steps below to install the customized tuner in `nni/examples/tuners/customized_tuner` as a builtin tuner.
## Prepare installation source and install package
There are 2 options to install this customized tuner:
### Option 1: install from directory
Step 1: From `nni/examples/tuners/customized_tuner` directory, run:
`python setup.py develop`
This command will build the `nni/examples/tuners/customized_tuner` directory as a pip installation source.
Step 2: Run command:
`nnictl package install ./`
### Option 2: install from whl file
Step 1: From `nni/examples/tuners/customized_tuner` directory, run:
`python setup.py bdist_wheel`
This command builds a whl file, which is a pip installation source.
Step 2: Run command:
`nnictl package install dist/demo_tuner-0.1-py3-none-any.whl`
## Check the installed package
Then run the command `nnictl package list`; you should see that demotuner is installed:
```
+-----------------+------------+-----------+----------------------+------------------------------------------+
| Name | Type | Installed | Class Name | Module Name |
+-----------------+------------+-----------+----------------------+------------------------------------------+
| demotuner | tuners | Yes | DemoTuner | demo_tuner |
+-----------------+------------+-----------+----------------------+------------------------------------------+
```
## Use the installed tuner in experiment
Now you can use the demotuner in experiment configuration file the same way as other builtin tuners:
```yaml
tuner:
builtinTunerName: demotuner
classArgs:
#choice: maximize, minimize
optimize_mode: maximize
```
**How to install customized algorithms as builtin tuners, assessors and advisors**
===
## Overview
NNI provides many [builtin tuners](../Tuner/BuiltinTuner.md), [advisors](../Tuner/BuiltinTuner.md#Hyperband) and [assessors](../Assessor/BuiltinAssessor.md) that can be used directly for hyper-parameter optimization, and some extra algorithms can be installed via `nnictl package install --name <name>` after NNI is installed. You can check these extra algorithms via the `nnictl package list` command.
NNI also provides the ability to build your own customized tuners, advisors and assessors. To use the customized algorithm, users can simply follow the spec in experiment config file to properly reference the algorithm, which has been illustrated in the tutorials of [customized tuners](../Tuner/CustomizeTuner.md)/[advisors](../Tuner/CustomizeAdvisor.md)/[assessors](../Assessor/CustomizeAssessor.md).
NNI also allows users to install a customized algorithm as a builtin algorithm, so that it can be used the same way as NNI's builtin tuners/advisors/assessors. More importantly, this makes it much easier to share or distribute an implemented algorithm to others. Once customized tuners/advisors/assessors are installed into NNI, you can use them in your experiment configuration file exactly like builtin ones. For example, if you built a customized tuner and installed it into NNI under the builtin name `mytuner`, you can use it in your configuration file like below:
```yaml
tuner:
builtinTunerName: mytuner
```
## Install customized algorithms as builtin tuners, assessors and advisors
You can follow the steps below to build a customized tuner/assessor/advisor and install it into NNI as a builtin algorithm.
### 1. Create a customized tuner/assessor/advisor
Refer to the following instructions to create one:
* [customized tuner](../Tuner/CustomizeTuner.md)
* [customized assessor](../Assessor/CustomizeAssessor.md)
* [customized advisor](../Tuner/CustomizeAdvisor.md)
### 2. (Optional) Create a validator to validate classArgs
NNI provides a `ClassArgsValidator` interface for customized algorithm authors to validate the classArgs parameters in the experiment configuration file, which are passed to the customized algorithm's constructor.
The `ClassArgsValidator` interface is defined as:
```python
class ClassArgsValidator(object):
def validate_class_args(self, **kwargs):
"""
The classArgs fields in experiment configuration are packed as a dict and
passed to validator as kwargs.
"""
pass
```
For example, you can implement your validator as follows:
```python
from schema import Schema, Optional
from nni import ClassArgsValidator
class MedianstopClassArgsValidator(ClassArgsValidator):
def validate_class_args(self, **kwargs):
Schema({
Optional('optimize_mode'): self.choices('optimize_mode', 'maximize', 'minimize'),
Optional('start_step'): self.range('start_step', int, 0, 9999),
}).validate(kwargs)
```
The validator will be invoked before the experiment is started to check whether the classArgs fields are valid for your customized algorithm.
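To illustrate how such a validation pass behaves, the minimal stand-alone sketch below mimics the medianstop checks with plain Python, so it does not depend on `nni` or `schema`; the class name and helper logic here are hypothetical, not part of the NNI API:

```python
class FakeMedianstopValidator:
    """Hypothetical stand-in for a ClassArgsValidator subclass (illustration only)."""
    def validate_class_args(self, **kwargs):
        # optimize_mode, if present, must be one of two choices.
        if 'optimize_mode' in kwargs:
            if kwargs['optimize_mode'] not in ('maximize', 'minimize'):
                raise ValueError('optimize_mode {} is invalid!'.format(kwargs['optimize_mode']))
        # start_step, if present, must be an int in [0, 9999].
        if 'start_step' in kwargs:
            step = kwargs['start_step']
            if not isinstance(step, int) or not 0 <= step <= 9999:
                raise ValueError('start_step {!r} is out of range!'.format(step))

validator = FakeMedianstopValidator()
validator.validate_class_args(optimize_mode='maximize', start_step=10)  # passes silently
```

With invalid classArgs (say `optimize_mode='max'`), the call raises before any trial is launched, which is exactly the point of validating up front.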
### 3. Prepare package installation source
In order to be installed as builtin tuners, assessors and advisors, the customized algorithms need to be packaged as an installable source that can be recognized by the `pip` command; under the hood, NNI calls `pip` to install the package.
Besides being a common pip source, the package needs to provide meta information in the `classifiers` field.
The format of the classifiers field is as follows:
```
NNI Package :: <type> :: <builtin name> :: <full class name of tuner> :: <full class name of class args validator>
```
* `type`: type of algorithms, could be one of `tuner`, `assessor`, `advisor`
* `builtin name`: builtin name used in experiment configuration file
* `full class name of tuner`: tuner class name, including its module name, for example: `demo_tuner.DemoTuner`
* `full class name of class args validator`: class args validator class name, including its module name, for example: `demo_tuner.MyClassArgsValidator`
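As a sketch (not NNI's actual parsing code), the meta string can be split on the ` :: ` separator into its five fields:

```python
# The classifier string from this document's demo tuner example.
CLASSIFIER = ('NNI Package :: tuner :: demotuner :: '
              'demo_tuner.DemoTuner :: demo_tuner.MyClassArgsValidator')

# Split into the five fields described above.
prefix, algo_type, builtin_name, class_name, validator_name = CLASSIFIER.split(' :: ')
assert prefix == 'NNI Package'
print(algo_type, builtin_name, class_name, validator_name)
```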
Following is an example of classifiers in a package's `setup.py`:
```python
classifiers = [
'Programming Language :: Python :: 3',
'License :: OSI Approved :: MIT License',
'Operating System :: ',
'NNI Package :: tuner :: demotuner :: demo_tuner.DemoTuner :: demo_tuner.MyClassArgsValidator'
],
```
Once you have the meta info in `setup.py`, you can build your pip installation source via:
* Run command `python setup.py develop` from the package directory; this command builds the directory as a pip installation source.
* Run command `python setup.py bdist_wheel` from the package directory; this command builds a whl file, which is a pip installation source.
NNI looks for the classifier that starts with `NNI Package` to retrieve the package meta information when the package is installed with the `nnictl package install <source>` command.
See the [customized tuner example](https://github.com/microsoft/nni/blob/master/examples/tuners/customized_tuner/README.md) for a full example.
### 4. Install customized algorithms package into NNI
If your installation source is prepared as a directory with `python setup.py develop`, you can install the package with the following command:
`nnictl package install <installation source directory>`
For example:
`nnictl package install nni/examples/tuners/customized_tuner/`
If your installation source is prepared as a whl file with `python setup.py bdist_wheel`, you can install the package with the following command:
`nnictl package install <whl file path>`
For example:
`nnictl package install nni/examples/tuners/customized_tuner/dist/demo_tuner-0.1-py3-none-any.whl`
### 5. Use the installed builtin algorithms in experiment
Once your customized algorithm is installed, you can use it in the experiment configuration file the same way as other builtin tuners/assessors/advisors, for example:
```yaml
tuner:
builtinTunerName: demotuner
classArgs:
#choice: maximize, minimize
optimize_mode: maximize
```
## Manage packages using `nnictl package`
### List installed packages
Run the following command to list the installed packages:
```
nnictl package list
+-----------------+------------+-----------+----------------------+------------------------------------------+
| Name | Type | Installed | Class Name | Module Name |
+-----------------+------------+-----------+----------------------+------------------------------------------+
| demotuner | tuners | Yes | DemoTuner | demo_tuner |
| SMAC | tuners | No | SMACTuner | nni.smac_tuner.smac_tuner |
| PPOTuner | tuners | No | PPOTuner | nni.ppo_tuner.ppo_tuner |
| BOHB | advisors | Yes | BOHB | nni.bohb_advisor.bohb_advisor |
+-----------------+------------+-----------+----------------------+------------------------------------------+
```
Run the following command to list all packages, including the builtin packages, which cannot be uninstalled:
```
nnictl package list --all
+-----------------+------------+-----------+----------------------+------------------------------------------+
| Name | Type | Installed | Class Name | Module Name |
+-----------------+------------+-----------+----------------------+------------------------------------------+
| TPE | tuners | Yes | HyperoptTuner | nni.hyperopt_tuner.hyperopt_tuner |
| Random | tuners | Yes | HyperoptTuner | nni.hyperopt_tuner.hyperopt_tuner |
| Anneal | tuners | Yes | HyperoptTuner | nni.hyperopt_tuner.hyperopt_tuner |
| Evolution | tuners | Yes | EvolutionTuner | nni.evolution_tuner.evolution_tuner |
| BatchTuner | tuners | Yes | BatchTuner | nni.batch_tuner.batch_tuner |
| GridSearch | tuners | Yes | GridSearchTuner | nni.gridsearch_tuner.gridsearch_tuner |
| NetworkMorphism | tuners | Yes | NetworkMorphismTuner | nni.networkmorphism_tuner.networkmo... |
| MetisTuner | tuners | Yes | MetisTuner | nni.metis_tuner.metis_tuner |
| GPTuner | tuners | Yes | GPTuner | nni.gp_tuner.gp_tuner |
| PBTTuner | tuners | Yes | PBTTuner | nni.pbt_tuner.pbt_tuner |
| SMAC | tuners | No | SMACTuner | nni.smac_tuner.smac_tuner |
| PPOTuner | tuners | No | PPOTuner | nni.ppo_tuner.ppo_tuner |
| Medianstop | assessors | Yes | MedianstopAssessor | nni.medianstop_assessor.medianstop_... |
| Curvefitting | assessors | Yes | CurvefittingAssessor | nni.curvefitting_assessor.curvefitt... |
| Hyperband | advisors | Yes | Hyperband | nni.hyperband_advisor.hyperband_adv... |
| BOHB | advisors | Yes | BOHB | nni.bohb_advisor.bohb_advisor |
+-----------------+------------+-----------+----------------------+------------------------------------------+
```
### Uninstall package
Run the following command to uninstall an installed package:
`nnictl package uninstall <builtin name>`
For example:
`nnictl package uninstall demotuner`
......@@ -702,40 +702,108 @@ Debug mode will disable version check function in Trialkeeper.
* __nnictl package install__
* Description
Install the packages needed in nni experiments.
Install a package (customized algorithms or NNI-provided algorithms) as a builtin tuner/assessor/advisor.
* Usage
```bash
nnictl package install [OPTIONS]
nnictl package install --name <package name>
```
The available `<package name>` can be checked via `nnictl package list` command.
or
```bash
nnictl package install <installation source>
```
See [Install customized algorithms](InstallCustomizedAlgos.md) for how to prepare the installation source.
* Example
> Install SMAC tuner
```bash
nnictl package install --name SMAC
```
> Install a customized tuner
```bash
nnictl package install nni/examples/tuners/customized_tuner/dist/demo_tuner-0.1-py3-none-any.whl
```
* __nnictl package show__
* Description
Show the detailed information of specified packages.
* Usage
```bash
nnictl package show <package name>
```
* Example
```bash
nnictl package show SMAC
```
* __nnictl package list__
* Description
List the installed/all packages.
* Usage
```bash
nnictl package list [OPTIONS]
```
* Options
|Name, shorthand|Required|Default|Description|
|------|------|------ |------|
|--name| True| |The name of package to be installed|
|--all| False| |List all packages|
* Example
> Install the packages needed in tuner SMAC
> List installed packages
```bash
nnictl package install --name=SMAC
nnictl package list
```
* __nnictl package show__
> List all packages
```bash
nnictl package list --all
```
* __nnictl package uninstall__
* Description
List the packages supported.
Uninstall a package.
* Usage
```bash
nnictl package show
nnictl package uninstall <package name>
```
* Example
> Uninstall the SMAC package
```bash
nnictl package uninstall SMAC
```
<a name="ss_gen"></a>
![](https://placehold.it/15/1589F0/000000?text=+) `Generate search space`
......
......@@ -214,6 +214,7 @@
<ul class="firstUl">
<li><a href="{{ pathto('Tuner/CustomizeTuner') }}">CustomizeTuner</a></li>
<li><a href="{{ pathto('Assessor/CustomizeAssessor') }}">CustomizeAssessor</a></li>
<li><a href="{{ pathto('Tutorial/InstallCustomizedAlgos') }}">Install Customized Algorithms as Builtin Tuners/Assessors/Advisors</a></li>
</ul>
</td>
<td>
......
......@@ -8,3 +8,5 @@ Advanced Features
Write a New Assessor <Assessor/CustomizeAssessor>
Write a New Advisor <Tuner/CustomizeAdvisor>
Write a New Training Service <TrainingService/HowToImplementTrainingService>
Install Customized Algorithms as Builtin Tuners/Assessors/Advisors <Tutorial/InstallCustomizedAlgos>
How to install customized tuner as a builtin tuner <Tuner/InstallCustomizedTuner.md>
# How to install this customized tuner as a builtin tuner
See [this document](https://github.com/microsoft/nni/blob/master/docs/en_US/Tuner/InstallCustomizedTuner.md) for how to install this customized tuner as a builtin tuner.
\ No newline at end of file
from .demo_tuner import DemoTuner, MyClassArgsValidator
import random
import numpy as np
from nni.tuner import Tuner
from nni.utils import ClassArgsValidator
class DemoTuner(Tuner):
def __init__(self, optimize_mode='maximize'):
# optimize_mode is used to demo how to create ClassArgsValidator
self.optimize_mode = optimize_mode
def update_search_space(self, search_space):
self._space = search_space
def generate_parameters(self, parameter_id, **kwargs):
params = {}
for k in self._space:
t, v = self._space[k]['_type'], self._space[k]['_value']
if t == 'choice':
params[k] = random.choice(v)
elif t == 'randint':
params[k] = random.choice(range(v[0], v[1]))
elif t == 'uniform':
params[k] = np.random.uniform(v[0], v[1])
else:
raise RuntimeError('parameter type {} is not supported by DemoTuner!'.format(t))
return params
def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
pass
class MyClassArgsValidator(ClassArgsValidator):
def validate_class_args(self, **kwargs):
if 'optimize_mode' in kwargs:
assert kwargs['optimize_mode'] in ['maximize', 'minimize'], \
'optimize_mode {} is invalid!'.format(kwargs['optimize_mode'])
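The sampling logic in `DemoTuner.generate_parameters` can be exercised on its own; the sketch below re-implements it as a free function without the `nni` dependency (the function name and search-space dict here are illustrative, not part of the tuner's API):

```python
import random

def sample_params(search_space):
    """Draw one configuration from an NNI-style search space dict.

    Standalone sketch mirroring DemoTuner.generate_parameters:
    each key maps to {'_type': ..., '_value': ...}.
    """
    params = {}
    for name, spec in search_space.items():
        t, v = spec['_type'], spec['_value']
        if t == 'choice':
            params[name] = random.choice(v)
        elif t == 'randint':
            params[name] = random.choice(range(v[0], v[1]))
        elif t == 'uniform':
            params[name] = random.uniform(v[0], v[1])
        else:
            raise RuntimeError('parameter type {} is not supported!'.format(t))
    return params

space = {
    'batch_size': {'_type': 'choice', '_value': [16, 32, 64]},
    'lr': {'_type': 'uniform', '_value': [0.001, 0.1]},
}
print(sample_params(space))
```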
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT license.
import setuptools
setuptools.setup(
name = 'demo-tuner',
version = '0.1',
packages = setuptools.find_packages(exclude=['*test*']),
python_requires = '>=3.5',
classifiers = [
'Programming Language :: Python :: 3',
'License :: OSI Approved :: MIT License',
'Operating System :: ',
'NNI Package :: tuner :: demotuner :: demo_tuner.DemoTuner :: demo_tuner.MyClassArgsValidator'
],
author = 'Microsoft NNI Team',
author_email = 'nni@microsoft.com',
description = 'NNI control for Neural Network Intelligence project',
license = 'MIT',
url = 'https://github.com/Microsoft/nni'
)
......@@ -41,7 +41,8 @@ setup(
'schema',
'PythonWebHDFS',
'colorama',
'scikit-learn>=0.20,<0.22'
'scikit-learn>=0.20,<0.22',
'pkginfo'
],
entry_points = {
......
......@@ -169,7 +169,7 @@ export namespace ValidationSchemas {
versionCheck: joi.boolean(),
logCollection: joi.string(),
advisor: joi.object({
builtinAdvisorName: joi.string().valid('Hyperband', 'BOHB'),
builtinAdvisorName: joi.string(),
codeDir: joi.string(),
classFileName: joi.string(),
className: joi.string(),
......@@ -178,7 +178,7 @@ export namespace ValidationSchemas {
gpuIndices: joi.string()
}),
tuner: joi.object({
builtinTunerName: joi.string().valid('TPE', 'Random', 'Anneal', 'Evolution', 'SMAC', 'BatchTuner', 'GridSearch', 'NetworkMorphism', 'MetisTuner', 'GPTuner', 'PPOTuner', 'PBTTuner'),
builtinTunerName: joi.string(),
codeDir: joi.string(),
classFileName: joi.string(),
className: joi.string(),
......@@ -188,7 +188,7 @@ export namespace ValidationSchemas {
gpuIndices: joi.string()
}),
assessor: joi.object({
builtinAssessorName: joi.string().valid('Medianstop', 'Curvefitting'),
builtinAssessorName: joi.string(),
codeDir: joi.string(),
classFileName: joi.string(),
className: joi.string(),
......
......@@ -4,6 +4,7 @@
__version__ = '999.0.0-developing'
from .env_vars import dispatcher_env_vars
from .utils import ClassArgsValidator
if dispatcher_env_vars.SDK_PROCESS != 'dispatcher':
from .trial import *
......
......@@ -2,16 +2,14 @@
# Licensed under the MIT license.
import os
import sys
import argparse
import logging
import json
import importlib
import base64
from .common import enable_multi_thread, enable_multi_phase
from .constants import ModuleName, ClassName, ClassArgs, AdvisorModuleName, AdvisorClassName
from .msg_dispatcher import MsgDispatcher
from .package_utils import create_builtin_class_instance, create_customized_class_instance
logger = logging.getLogger('nni.main')
logger.debug('START')
......@@ -20,49 +18,6 @@ if os.environ.get('COVERAGE_PROCESS_START'):
import coverage
coverage.process_startup()
def augment_classargs(input_class_args, classname):
if classname in ClassArgs:
for key, value in ClassArgs[classname].items():
if key not in input_class_args:
input_class_args[key] = value
return input_class_args
def create_builtin_class_instance(class_name, class_args, builtin_module_dict, builtin_class_dict):
if class_name not in builtin_module_dict or \
importlib.util.find_spec(builtin_module_dict[class_name]) is None:
raise RuntimeError('Builtin module is not found: {}'.format(class_name))
class_module = importlib.import_module(builtin_module_dict[class_name])
class_constructor = getattr(class_module, builtin_class_dict[class_name])
if class_args is None:
class_args = {}
class_args = augment_classargs(class_args, class_name)
instance = class_constructor(**class_args)
return instance
def create_customized_class_instance(class_params):
code_dir = class_params.get('codeDir')
class_filename = class_params.get('classFileName')
class_name = class_params.get('className')
class_args = class_params.get('classArgs')
if not os.path.isfile(os.path.join(code_dir, class_filename)):
raise ValueError('Class file not found: {}'.format(
os.path.join(code_dir, class_filename)))
sys.path.append(code_dir)
module_name = os.path.splitext(class_filename)[0]
class_module = importlib.import_module(module_name)
class_constructor = getattr(class_module, class_name)
if class_args is None:
class_args = {}
instance = class_constructor(**class_args)
return instance
def main():
parser = argparse.ArgumentParser(description='Dispatcher command line parser')
......@@ -106,11 +61,11 @@ def main():
def _run_advisor(exp_params):
if exp_params.get('advisor').get('builtinAdvisorName') in AdvisorModuleName:
if exp_params.get('advisor').get('builtinAdvisorName'):
dispatcher = create_builtin_class_instance(
exp_params.get('advisor').get('builtinAdvisorName'),
exp_params.get('advisor').get('classArgs'),
AdvisorModuleName, AdvisorClassName)
'advisors')
else:
dispatcher = create_customized_class_instance(exp_params.get('advisor'))
if dispatcher is None:
......@@ -123,11 +78,11 @@ def _run_advisor(exp_params):
def _create_tuner(exp_params):
if exp_params.get('tuner').get('builtinTunerName') in ModuleName:
if exp_params.get('tuner').get('builtinTunerName'):
tuner = create_builtin_class_instance(
exp_params.get('tuner').get('builtinTunerName'),
exp_params.get('tuner').get('classArgs'),
ModuleName, ClassName)
'tuners')
else:
tuner = create_customized_class_instance(exp_params.get('tuner'))
if tuner is None:
......@@ -136,11 +91,11 @@ def _create_tuner(exp_params):
def _create_assessor(exp_params):
if exp_params.get('assessor').get('builtinAssessorName') in ModuleName:
if exp_params.get('assessor').get('builtinAssessorName'):
assessor = create_builtin_class_instance(
exp_params.get('assessor').get('builtinAssessorName'),
exp_params.get('assessor').get('classArgs'),
ModuleName, ClassName)
'assessors')
else:
assessor = create_customized_class_instance(exp_params.get('assessor'))
if assessor is None:
......
......@@ -9,10 +9,11 @@ import sys
import math
import logging
import json_tricks
from schema import Schema, Optional
import ConfigSpace as CS
import ConfigSpace.hyperparameters as CSH
from nni import ClassArgsValidator
from nni.protocol import CommandType, send
from nni.msg_dispatcher_base import MsgDispatcherBase
from nni.utils import OptimizeMode, MetricType, extract_scalar_reward
......@@ -230,6 +231,20 @@ class Bracket:
self.num_configs_to_run.append(len(hyper_configs))
self.increase_i()
class BOHBClassArgsValidator(ClassArgsValidator):
def validate_class_args(self, **kwargs):
Schema({
'optimize_mode': self.choices('optimize_mode', 'maximize', 'minimize'),
Optional('min_budget'): self.range('min_budget', int, 0, 9999),
Optional('max_budget'): self.range('max_budget', int, 0, 9999),
Optional('eta'): self.range('eta', int, 0, 9999),
Optional('min_points_in_model'): self.range('min_points_in_model', int, 0, 9999),
Optional('top_n_percent'): self.range('top_n_percent', int, 1, 99),
Optional('num_samples'): self.range('num_samples', int, 1, 9999),
Optional('random_fraction'): self.range('random_fraction', float, 0, 9999),
Optional('bandwidth_factor'): self.range('bandwidth_factor', float, 0, 9999),
Optional('min_bandwidth'): self.range('min_bandwidth', float, 0, 9999),
}).validate(kwargs)
class BOHB(MsgDispatcherBase):
"""
......
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT license.
ModuleName = {
'TPE': 'nni.hyperopt_tuner.hyperopt_tuner',
'Random': 'nni.hyperopt_tuner.hyperopt_tuner',
'Anneal': 'nni.hyperopt_tuner.hyperopt_tuner',
'Evolution': 'nni.evolution_tuner.evolution_tuner',
'SMAC': 'nni.smac_tuner.smac_tuner',
'BatchTuner': 'nni.batch_tuner.batch_tuner',
'Medianstop': 'nni.medianstop_assessor.medianstop_assessor',
'GridSearch': 'nni.gridsearch_tuner.gridsearch_tuner',
'NetworkMorphism': 'nni.networkmorphism_tuner.networkmorphism_tuner',
'Curvefitting': 'nni.curvefitting_assessor.curvefitting_assessor',
'MetisTuner': 'nni.metis_tuner.metis_tuner',
'GPTuner': 'nni.gp_tuner.gp_tuner',
'PPOTuner': 'nni.ppo_tuner.ppo_tuner',
'PBTTuner': 'nni.pbt_tuner.pbt_tuner'
}
ClassName = {
'TPE': 'HyperoptTuner',
'Random': 'HyperoptTuner',
'Anneal': 'HyperoptTuner',
'Evolution': 'EvolutionTuner',
'SMAC': 'SMACTuner',
'BatchTuner': 'BatchTuner',
'GridSearch': 'GridSearchTuner',
'NetworkMorphism':'NetworkMorphismTuner',
'MetisTuner':'MetisTuner',
'GPTuner':'GPTuner',
'PPOTuner': 'PPOTuner',
'PBTTuner': 'PBTTuner',
'Medianstop': 'MedianstopAssessor',
'Curvefitting': 'CurvefittingAssessor'
}
ClassArgs = {
'TPE': {
'algorithm_name': 'tpe'
},
'Random': {
'algorithm_name': 'random_search'
},
'Anneal': {
'algorithm_name': 'anneal'
}
}
AdvisorModuleName = {
'Hyperband': 'nni.hyperband_advisor.hyperband_advisor',
'BOHB': 'nni.bohb_advisor.bohb_advisor'
}
AdvisorClassName = {
'Hyperband': 'Hyperband',
'BOHB': 'BOHB'
BuiltinAlgorithms = {
'tuners': [
{
'name': 'TPE',
'class_name': 'nni.hyperopt_tuner.hyperopt_tuner.HyperoptTuner',
'class_args': {
'algorithm_name': 'tpe'
},
'class_args_validator': 'nni.hyperopt_tuner.hyperopt_tuner.HyperoptClassArgsValidator'
},
{
'name': 'Random',
'class_name': 'nni.hyperopt_tuner.hyperopt_tuner.HyperoptTuner',
'class_args': {
'algorithm_name': 'random_search'
},
'accept_class_args': False,
'class_args_validator': 'nni.hyperopt_tuner.hyperopt_tuner.HyperoptClassArgsValidator'
},
{
'name': 'Anneal',
'class_name': 'nni.hyperopt_tuner.hyperopt_tuner.HyperoptTuner',
'class_args': {
'algorithm_name': 'anneal'
},
'class_args_validator': 'nni.hyperopt_tuner.hyperopt_tuner.HyperoptClassArgsValidator'
},
{
'name': 'Evolution',
'class_name': 'nni.evolution_tuner.evolution_tuner.EvolutionTuner',
'class_args_validator': 'nni.evolution_tuner.evolution_tuner.EvolutionClassArgsValidator'
},
{
'name': 'BatchTuner',
'class_name': 'nni.batch_tuner.batch_tuner.BatchTuner',
'accept_class_args': False,
},
{
'name': 'GridSearch',
'class_name': 'nni.gridsearch_tuner.gridsearch_tuner.GridSearchTuner',
'accept_class_args': False,
},
{
'name': 'NetworkMorphism',
'class_name': 'nni.networkmorphism_tuner.networkmorphism_tuner.NetworkMorphismTuner',
'class_args_validator': 'nni.networkmorphism_tuner.networkmorphism_tuner.NetworkMorphismClassArgsValidator'
},
{
'name': 'MetisTuner',
'class_name': 'nni.metis_tuner.metis_tuner.MetisTuner',
'class_args_validator': 'nni.metis_tuner.metis_tuner.MetisClassArgsValidator'
},
{
'name': 'GPTuner',
'class_name': 'nni.gp_tuner.gp_tuner.GPTuner',
'class_args_validator': 'nni.gp_tuner.gp_tuner.GPClassArgsValidator'
},
{
'name': 'PBTTuner',
'class_name': 'nni.pbt_tuner.pbt_tuner.PBTTuner',
'class_args_validator': 'nni.pbt_tuner.pbt_tuner.PBTClassArgsValidator'
}
],
'assessors': [
{
'name': 'Medianstop',
'class_name': 'nni.medianstop_assessor.medianstop_assessor.MedianstopAssessor',
'class_args_validator': 'nni.medianstop_assessor.medianstop_assessor.MedianstopClassArgsValidator'
},
{
'name': 'Curvefitting',
'class_name': 'nni.curvefitting_assessor.curvefitting_assessor.CurvefittingAssessor',
'class_args_validator': 'nni.curvefitting_assessor.curvefitting_assessor.CurvefittingClassArgsValidator'
},
],
'advisors': [
{
'name': 'Hyperband',
'class_name': 'nni.hyperband_advisor.hyperband_advisor.Hyperband',
'class_args_validator': 'nni.hyperband_advisor.hyperband_advisor.HyperbandClassArgsValidator'
}
]
}
......@@ -3,12 +3,23 @@
import logging
import datetime
from schema import Schema, Optional
from nni import ClassArgsValidator
from nni.assessor import Assessor, AssessResult
from nni.utils import extract_scalar_history
from .model_factory import CurveModel
logger = logging.getLogger('curvefitting_Assessor')
class CurvefittingClassArgsValidator(ClassArgsValidator):
def validate_class_args(self, **kwargs):
Schema({
'epoch_num': self.range('epoch_num', int, 0, 9999),
Optional('start_step'): self.range('start_step', int, 0, 9999),
Optional('threshold'): self.range('threshold', float, 0, 9999),
Optional('gap'): self.range('gap', int, 1, 9999),
}).validate(kwargs)
class CurvefittingAssessor(Assessor):
"""CurvefittingAssessor uses learning curve fitting algorithm to predict the learning curve performance in the future.
......
......@@ -9,6 +9,9 @@ import copy
import random
import numpy as np
from schema import Schema, Optional
from nni import ClassArgsValidator
from nni.tuner import Tuner
from nni.utils import OptimizeMode, extract_scalar_reward, split_index, json2parameter, json2space
......@@ -65,6 +68,12 @@ class Individual:
self.save_dir = save_dir
self.info = info
class EvolutionClassArgsValidator(ClassArgsValidator):
def validate_class_args(self, **kwargs):
Schema({
'optimize_mode': self.choices('optimize_mode', 'maximize', 'minimize'),
Optional('population_size'): self.range('population_size', int, 0, 99999),
}).validate(kwargs)
class EvolutionTuner(Tuner):
"""
......
......@@ -10,10 +10,12 @@ See :class:`GPTuner` for details.
import warnings
import logging
import numpy as np
from schema import Schema, Optional
from sklearn.gaussian_process.kernels import Matern
from sklearn.gaussian_process import GaussianProcessRegressor
from nni import ClassArgsValidator
from nni.tuner import Tuner
from nni.utils import OptimizeMode, extract_scalar_reward
......@@ -22,6 +24,19 @@ from .util import UtilityFunction, acq_max
logger = logging.getLogger("GP_Tuner_AutoML")
class GPClassArgsValidator(ClassArgsValidator):
def validate_class_args(self, **kwargs):
Schema({
Optional('optimize_mode'): self.choices('optimize_mode', 'maximize', 'minimize'),
Optional('utility'): self.choices('utility', 'ei', 'ucb', 'poi'),
Optional('kappa'): float,
Optional('xi'): float,
Optional('nu'): float,
Optional('alpha'): float,
Optional('cold_start_num'): int,
Optional('selection_num_warm_up'): int,
Optional('selection_num_starting_points'): int,
}).validate(kwargs)
class GPTuner(Tuner):
"""
......