**Get Started with NNI**
===

## **Installation**
* __Dependencies__

      python >= 3.5
      git
      wget

    python pip should also be correctly installed. You could use `python3 -m pip -V` to check it on Linux.
    
    * Note: we don't support virtual environments in the current release.

* __Install NNI through pip__

      python3 -m pip install -v --user git+https://github.com/Microsoft/nni.git@v0.2
      source ~/.bashrc

* __Install NNI through source code__
   
      git clone -b v0.2 https://github.com/Microsoft/nni.git
      cd nni
      chmod +x install.sh
      source install.sh

## **Quick start: run a customized experiment**
An experiment runs multiple trial jobs; each trial job tries a configuration, which includes a specific neural architecture (or model) and a set of hyper-parameter values. To run an experiment through NNI, you should:

* Provide a runnable trial
* Provide or choose a tuner
* Provide a YAML experiment configuration file
* (optional) Provide or choose an assessor (see the sketch after this list)
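
The assessor is optional: it lets NNI stop unpromising trials early. If you want one, an *assessor* section can be added to the configuration file introduced below. This is a minimal sketch, assuming the *Medianstop* builtin assessor is available in this release; check the assessor documentation for the exact identifier:

      assessor:
        builtinAssessorName: Medianstop
        classArgs:
          optimize_mode: maximize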

**Prepare trial**: Let's use a simple trial example provided by NNI, e.g. mnist. After you have installed NNI, the NNI examples are placed under ~/nni/examples; run `ls ~/nni/examples/trials` to see all the trial examples. You can simply execute the following command to run the NNI mnist example:

      python3 ~/nni/examples/trials/mnist-annotation/mnist.py

This command will be filled into the YAML configuration file below. Please refer to [here]() for how to write your own trial.

**Prepare tuner**: NNI supports several popular AutoML algorithms, including Random Search, Tree of Parzen Estimators (TPE), the Evolution algorithm, etc. Users can also write their own tuner (refer to [here](CustomizedTuner.md)), but for simplicity, here we choose a tuner provided by NNI as below:

      tuner:
        builtinTunerName: TPE
        classArgs:
          optimize_mode: maximize

*builtinTunerName* is used to specify a tuner in NNI, *classArgs* are the arguments passed to the tuner (the spec of builtin tuners can be found [here]()), and *optimize_mode* indicates whether you want to maximize or minimize your trial's result.
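
Switching to another builtin tuner only requires changing this section. For example, here is a sketch for the Evolution algorithm mentioned above; treat the identifier as an assumption and check the builtin tuner spec for the exact name:

      tuner:
        builtinTunerName: Evolution
        classArgs:
          optimize_mode: maximize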

**Prepare configuration file**: Since you already know which trial code you are going to run and which tuner you are going to use, it is time to prepare the YAML configuration file. NNI provides a demo configuration file for each trial example; run `cat ~/nni/examples/trials/mnist-annotation/config.yml` to see it. Its content is basically as shown below:

```
authorName: your_name
experimentName: auto_mnist

# how many trials could be concurrently running
trialConcurrency: 2

# maximum experiment running duration
maxExecDuration: 3h

# empty means never stop
maxTrialNum: 100

# choice: local, remote, pai
trainingServicePlatform: local

# choice: true, false  
useAnnotation: true
tuner:
  builtinTunerName: TPE
  classArgs:
    optimize_mode: maximize
trial:
  command: python mnist.py
  codeDir: ~/nni/examples/trials/mnist-annotation
  gpuNum: 0
``` 

Here *useAnnotation* is true because this trial example uses our Python annotation (refer to [here](../tools/annotation/README.md) for details). In the *trial* section, *command* is the command to run the trial, *codeDir* is the directory where the trial code is located (the command will be executed in this directory), and *gpuNum* specifies how many GPUs a trial requires.
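
If your trial does not use annotation, the configuration would instead set *useAnnotation* to false and point to an explicit search space file. A hedged sketch, assuming a *searchSpacePath* field and that the plain mnist example ships a `search_space.json` (verify both against the experiment configuration documentation):

      # sketch only: annotation disabled, search space supplied as a separate file
      useAnnotation: false
      searchSpacePath: ~/nni/examples/trials/mnist/search_space.json
      trial:
        command: python mnist.py
        codeDir: ~/nni/examples/trials/mnist
        gpuNum: 0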

With all these steps done, we can run the experiment with the following command:

      nnictl create --config ~/nni/examples/trials/mnist-annotation/config.yml

You can refer to [here](NNICTLDOC.md) for more usage of the *nnictl* command line tool.

## View experiment results
The experiment is now running. NNI provides a WebUI for you to view experiment progress, control your experiment, and access some other appealing features. The WebUI is opened by default by `nnictl create`.

## Further reading
* [Overview](Overview.md)
* [Installation](InstallNNI_Ubuntu.md)
* [Use command line tool nnictl](NNICTLDOC.md)
* [Use NNIBoard](WebUI.md)
* [Define search space](SearchSpaceSpec.md)
* [Config an experiment](ExperimentConfig.md)
* [How to run an experiment on local (with multiple GPUs)?](tutorial_1_CR_exp_local_api.md)
* [How to run an experiment on multiple machines?](tutorial_2_RemoteMachineMode.md)
* [How to run an experiment on OpenPAI?](PAIMode.md)