Unverified Commit 1c56fea8 authored by chicm-ms, committed by GitHub

Merge pull request #21 from microsoft/master

pull code
parents 12410686 97829ccd
......@@ -31,7 +31,7 @@ class CustomizedTuner(Tuner):
def __init__(self, ...):
...
def receive_trial_result(self, parameter_id, parameters, value):
def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
'''
Receive trial's final result.
parameter_id: int
......@@ -41,7 +41,7 @@ class CustomizedTuner(Tuner):
# your code implements here.
...
def generate_parameters(self, parameter_id):
def generate_parameters(self, parameter_id, **kwargs):
'''
Returns a set of trial (hyper-)parameters, as a serializable object
parameter_id: int
......@@ -51,7 +51,7 @@ class CustomizedTuner(Tuner):
...
```
`receive_trial_result` receives `parameter_id`, `parameters`, and `value` as input. The `value` object the Tuner receives is exactly the same value that the Trial sends.
`receive_trial_result` receives `parameter_id`, `parameters`, and `value` as input. The `value` object the Tuner receives is exactly the same value that the Trial sends. If `multiPhase` is set to `true` in the experiment configuration file, an additional `trial_job_id` parameter is passed to `receive_trial_result` and `generate_parameters` through the `**kwargs` parameter.
The `your_parameters` returned from the `generate_parameters` function will be packaged as a JSON object by the NNI SDK. The SDK then unpacks this JSON object, so the Trial receives exactly the same `your_parameters` from the Tuner.
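To make the shape of these methods concrete, here is a minimal, hypothetical tuner sketch (it assumes a `choice`-only search space and uses a made-up random sampling strategy, so it is an illustration rather than a recommended implementation):
```python
import random

from nni.tuner import Tuner


class MyCustomizedTuner(Tuner):
    def __init__(self):
        self.search_space = {}

    def update_search_space(self, search_space):
        # NNI calls this with the search space defined for the experiment
        self.search_space = search_space

    def generate_parameters(self, parameter_id, **kwargs):
        # with multiPhase enabled, kwargs may carry 'trial_job_id'
        trial_job_id = kwargs.get('trial_job_id')
        # placeholder strategy: sample each 'choice' field at random
        return {name: random.choice(spec['_value'])
                for name, spec in self.search_space.items()}

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        # 'value' is exactly the final result the Trial reported
        trial_job_id = kwargs.get('trial_job_id')
        print('result for', parameter_id, 'from job', trial_job_id, ':', value)
```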
......@@ -59,7 +59,7 @@ For example:
If you implement `generate_parameters` like this:
```python
def generate_parameters(self, parameter_id):
def generate_parameters(self, parameter_id, **kwargs):
'''
Returns a set of trial (hyper-)parameters, as a serializable object
parameter_id: int
......
......@@ -150,10 +150,15 @@ machineList:
Note: The maxExecDuration spec sets the duration of an experiment, not of a trial job. When the experiment reaches its max duration, it does not stop, but it can no longer submit new trial jobs.
* __versionCheck__
* Description
NNI checks that the version of the nniManager process matches the version of the trialKeeper on the remote, pai and kubernetes platforms. If you want to disable this version check, set versionCheck to false.
* __debug__
* Description
If debug is set to true, NNI runs in debug mode: versionCheck is set to false and logLevel is set to 'debug'. This gives another way to skip the version check while also getting verbose logs.
* __maxTrialNum__
* Description
......
......@@ -38,7 +38,15 @@ To enable multi-phase, you should also add `multiPhase: true` in your experiment
### Write a tuner that leverages multi-phase:
Before writing a multi-phase tuner, we highly suggest you go through [Customize Tuner](https://nni.readthedocs.io/en/latest/Customize_Tuner.html). Different from writing a normal tuner, your tuner needs to inherit from `MultiPhaseTuner` (in nni.multi_phase_tuner). The key difference between `Tuner` and `MultiPhaseTuner` is that the methods in `MultiPhaseTuner` are aware of an additional piece of information, `trial_job_id`. With this information, the tuner knows which trial is requesting a configuration and which trial is reporting results, which gives it enough flexibility to deal with different trials and different phases. For example, you may want to use the `trial_job_id` parameter of the `generate_parameters` method to generate hyperparameters for a specific trial job.
Before writing a multi-phase tuner, we highly suggest you go through [Customize Tuner](https://nni.readthedocs.io/en/latest/Customize_Tuner.html). As with a normal tuner, your tuner needs to inherit from the `Tuner` class. When you enable multi-phase through configuration (set `multiPhase` to true), your tuner gets an additional parameter `trial_job_id` via the following tuner methods:
```
generate_parameters
generate_multiple_parameters
receive_trial_result
receive_customized_trial_result
trial_end
```
With this information, the tuner knows which trial is requesting a configuration and which trial is reporting results. This provides enough flexibility for your tuner to deal with different trials and different phases. For example, you may want to use the `trial_job_id` parameter of the `generate_parameters` method to generate hyperparameters for a specific trial job, as in the sketch below.
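For illustration, here is a rough sketch of a tuner that uses `trial_job_id` to keep per-trial-job state (the search strategy is a made-up placeholder, and it assumes a `choice`-only search space and trials that report a plain numeric final result):
```python
from nni.tuner import Tuner


class MultiPhaseAwareTuner(Tuner):
    def __init__(self):
        self.search_space = {}
        self.history = {}  # trial_job_id -> list of (parameters, value)

    def update_search_space(self, search_space):
        self.search_space = search_space

    def generate_parameters(self, parameter_id, **kwargs):
        trial_job_id = kwargs.get('trial_job_id')
        previous = self.history.get(trial_job_id, [])
        if not previous:
            # first phase of this trial job: start from the first choice of each field
            return {name: spec['_value'][0]
                    for name, spec in self.search_space.items()}
        # later phases: re-use the best configuration this trial job has seen so far
        best_parameters, _ = max(previous, key=lambda item: item[1])
        return best_parameters

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        trial_job_id = kwargs.get('trial_job_id')
        self.history.setdefault(trial_job_id, []).append((parameters, value))

    def trial_end(self, parameter_id, success, **kwargs):
        # drop the per-job state once the trial job has finished all of its phases
        self.history.pop(kwargs.get('trial_job_id'), None)
```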
Of course, to use your multi-phase tuner, __you should add `multiPhase: true` to your experiment YAML configuration file__.
......
......@@ -144,7 +144,7 @@ export NNI_TRIAL_SEQ_ID=1
export MULTI_PHASE=false
export CUDA_VISIBLE_DEVICES=
eval python3 mnist.py 2>/home/user_name/nni/experiments/$experiment_id$/trials/$trial_id$/stderr
echo $? `date +%s000` >/home/user_name/nni/experiments/$experiment_id$/trials/$trial_id$/.nni/state
echo $? `date +%s%3N` >/home/user_name/nni/experiments/$experiment_id$/trials/$trial_id$/.nni/state
```
### Other Modes
......
......@@ -8,6 +8,7 @@ Click the tab "Overview".
* Support for downloading the experiment result.
* Support for exporting the nni-manager and dispatcher log files.
* If you have any question, you can click "Feedback" to report it.
* If your experiment has more than 1000 trials, you can change the refresh interval here.
![](../img/webui-img/over1.png)
* See trials with good performance.
......@@ -58,6 +59,10 @@ Click the tab "Trials Detail" to see the status of all trials. Specifically:
![](../img/webui-img/addColumn.png)
* If you want to compare some trials, you can select them and then click "Compare" to see the results.
![](../img/webui-img/compare.png)
* You can use the "Copy as python" button to copy a trial's parameters.
![](../img/webui-img/copyParameter.png)
......@@ -69,6 +74,6 @@ Click the tab "Trials Detail" to see the status of all trials. Specifically:
* Kill: you can kill a job whose status is running.
* Support for searching for a specific trial.
* Intermediate Result Graph.
* Intermediate Result Graph: you can see the default metric and other reported keys in this graph.
![](../img/intermediate.png)
![](../img/webui-img/intermediate.png)
Image files changed in this commit:
docs/img/webui-img/addColumn.png (26.2 KB → 35.7 KB)
docs/img/webui-img/copyParameter.png (33.5 KB → 34.9 KB)
docs/img/webui-img/detail-local.png (33 KB → 37.2 KB)
docs/img/webui-img/detail-pai.png (22.1 KB → 13.8 KB)
docs/img/webui-img/over1.png (61.9 KB → 67.7 KB)
......@@ -149,7 +149,7 @@ export NNI_TRIAL_SEQ_ID=1
export MULTI_PHASE=false
export CUDA_VISIBLE_DEVICES=
eval python3 mnist.py 2>/home/user_name/nni/experiments/$experiment_id$/trials/$trial_id$/stderr
echo $? `date +%s000` >/home/user_name/nni/experiments/$experiment_id$/trials/$trial_id$/.nni/state
echo $? `date +%s%3N` >/home/user_name/nni/experiments/$experiment_id$/trials/$trial_id$/.nni/state
```
### Other Modes
......@@ -166,4 +166,4 @@ echo $? `date +%s000` >/home/user_name/nni/experiments/$experiment_id$/trials/$t
* [Find the best optimizer for CIFAR-10 classification](Cifar10Examples.md)
* [How to tune SciKit-learn parameters with NNI](SklearnExamples.md)
* [Automatic model architecture search for reading comprehension.](SquadEvolutionExamples.md)
* [How to tune GBDT on NNI](GbdtExample.md)
\ No newline at end of file
* [How to tune GBDT on NNI](GbdtExample.md)
......@@ -79,7 +79,7 @@ class CustomerTuner(Tuner):
logger.debug('init population done.')
return
def generate_parameters(self, parameter_id):
def generate_parameters(self, parameter_id, **kwargs):
"""Returns a set of trial graph config, as a serializable object.
parameter_id : int
"""
......@@ -109,7 +109,7 @@ class CustomerTuner(Tuner):
return temp
def receive_trial_result(self, parameter_id, parameters, value):
def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
'''
Record an observation of the objective function
parameter_id : int
......
......@@ -49,12 +49,12 @@ class RandomNASTuner(Tuner):
self.searchspace_json = search_space
self.random_state = np.random.RandomState()
def generate_parameters(self, parameter_id):
def generate_parameters(self, parameter_id, **kwargs):
'''generate
'''
return random_archi_generator(self.searchspace_json, self.random_state)
def receive_trial_result(self, parameter_id, parameters, value):
def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
'''receive
'''
pass
......@@ -112,7 +112,7 @@ class CustomerTuner(Tuner):
population.append(Individual(indiv_id=self.generate_new_id(), graph_cfg=graph_tmp, result=None))
return population
def generate_parameters(self, parameter_id):
def generate_parameters(self, parameter_id, **kwargs):
"""Returns a set of trial graph config, as a serializable object.
An example configuration:
```json
......@@ -196,7 +196,7 @@ class CustomerTuner(Tuner):
logger.debug("trial {} ready".format(indiv.indiv_id))
return param_json
def receive_trial_result(self, parameter_id, parameters, value):
def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
'''
Record an observation of the objective function
parameter_id : int
......
......@@ -375,7 +375,7 @@ function countFilesRecursively(directory: string, timeoutMilliSeconds?: number):
}
function validateFileName(fileName: string): boolean {
let pattern: string = '^[a-z0-9A-Z\.-_]+$';
let pattern: string = '^[a-z0-9A-Z\._-]+$';
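// note: in the old pattern the sequence '.-_' formed an accidental character range ('.' through '_'); putting '-' last makes it a literal hyphen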
const validateResult = fileName.match(pattern);
if(validateResult) {
return true;
......
......@@ -58,6 +58,10 @@ export abstract class ClusterJobRestServer extends RestServer {
this.port = basePort + 1;
}
get apiRootUrl(): string {
return this.API_ROOT_URL;
}
public get clusterRestServerPort(): number {
if (this.port === undefined) {
throw new Error('PAI Rest server port is undefined');
......@@ -87,7 +91,7 @@ export abstract class ClusterJobRestServer extends RestServer {
protected abstract handleTrialMetrics(jobId : string, trialMetrics : any[]) : void;
// tslint:disable: no-unsafe-any no-any
private createRestHandler() : Router {
protected createRestHandler() : Router {
const router: Router = Router();
router.use((req: Request, res: Response, next: any) => {
......
......@@ -355,7 +355,8 @@ class LocalTrainingService implements TrainingService {
this.log.info('Stopping local machine training service...');
this.stopping = true;
for (const stream of this.jobStreamMap.values()) {
stream.destroy();
stream.end(0);
stream.emit('end');
}
if (this.gpuScheduler !== undefined) {
await this.gpuScheduler.stop();
......@@ -372,7 +373,9 @@ class LocalTrainingService implements TrainingService {
if (stream === undefined) {
throw new Error(`Could not find stream in trial ${trialJob.id}`);
}
stream.destroy();
// Refer to https://github.com/Juul/tail-stream/issues/20
stream.end(0);
stream.emit('end');
this.jobStreamMap.delete(trialJob.id);
}
}
......@@ -507,12 +510,12 @@ class LocalTrainingService implements TrainingService {
script.push(
`cmd /c ${localTrailConfig.command} 2>${path.join(workingDirectory, 'stderr')}`,
`$NOW_DATE = [int64](([datetime]::UtcNow)-(get-date "1/1/1970")).TotalSeconds`,
`$NOW_DATE = "$NOW_DATE" + "000"`,
`$NOW_DATE = "$NOW_DATE" + (Get-Date -Format fff).ToString()`,
`Write $LASTEXITCODE " " $NOW_DATE | Out-File ${path.join(workingDirectory, '.nni', 'state')} -NoNewline -encoding utf8`);
} else {
script.push(
`eval ${localTrailConfig.command} 2>${path.join(workingDirectory, 'stderr')}`,
`echo $? \`date +%s000\` >${path.join(workingDirectory, '.nni', 'state')}`);
`echo $? \`date +%s%3N\` >${path.join(workingDirectory, '.nni', 'state')}`);
}
return script;
......@@ -567,7 +570,6 @@ class LocalTrainingService implements TrainingService {
buffer = remain;
}
});
this.jobStreamMap.set(trialJobDetail.id, stream);
}
......