**Getting Started with NNI**
===
NNI (Neural Network Intelligence) is a toolkit to help users run automated machine learning experiments.
The tool dispatches and runs trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyper-parameters in different environments (e.g. local machine, remote servers, cloud).

```
            AutoML experiment                                 Training Services
┌────────┐        ┌────────────────────────┐                  ┌────────────────┐
│ nnictl │ ─────> │  nni_manager           │                  │ Local Machine  │
└────────┘        │    sdk/tuner           │                  └────────────────┘
                  │      hyperopt_tuner    │
│      evolution_tuner   │    trial jobs    ┌────────────────┐
                  │      ...               │     ────────>    │ Remote Servers │          
                  ├────────────────────────┤                  └────────────────┘
│  trial job source code │
                  │    sdk/annotation      │                  ┌────────────────┐
                  ├────────────────────────┤                  │ Yarn,K8s,      │
                  │  nni_board             │                  │ ...            │
                  └────────────────────────┘                  └────────────────┘
```
## **Who should consider using NNI**
* You want to try different AutoML algorithms for your training code (model) locally
* You want to run AutoML trial jobs in different environments to speed up search (e.g. remote servers, cloud)
* As a researcher or data scientist, you want to implement your own AutoML algorithms and compare them with other algorithms
* As an ML platform owner, you want to support AutoML in your platform

## **Setup**
* __Dependencies__  
NNI requires:
```
  python >= 3.5
  node >= 10.9.0
  yarn >= 1.9.4
```
Before installing NNI, please make sure your Python environment is set up correctly.
* __User installation__

   * Clone the NNI repository
   
         git clone https://github.com/Microsoft/NeuralNetworkIntelligence

   * Run install.sh
   
         cd NeuralNetworkIntelligence
         sh ./install.sh

For more details about installation, please refer to [Installation instructions](Installation.md).

## **Quick start: run an experiment locally**
Requirements:
* local environment setup [TODO]

Run the following command to create an experiment for [mnist]:
```bash
    nnictl create --config /usr/share/nni/examples/trials/mnist-annotation/config.yml
```
This command will start the experiment and the WebUI. The WebUI endpoint will be shown in the output of this command (for example, `http://localhost:8080`). Open this URL in your browser. Through the WebUI you can analyze your experiment or open a trial's TensorBoard.

## **Quick start: run a customized experiment**
An experiment runs multiple trial jobs; each trial job tries a configuration, which includes a specific neural architecture (or model) and hyper-parameter values. To run an experiment through NNI, you should:

* Provide a runnable trial
* Provide or choose a tuner
* Provide a YAML experiment configuration file
* (optional) Provide or choose an assessor

**Prepare trial**: Let's use a simple trial example provided by NNI, e.g. mnist. After you have installed NNI, the NNI examples are placed in /usr/share/nni/examples; run `ls /usr/share/nni/examples/trials` to see all the trial examples. You can simply execute the following command to run the NNI mnist example:
      python /usr/share/nni/examples/trials/mnist-annotation/mnist.py

This command will also be filled into the YAML configuration file below. Please refer to [here]() for how to write your own trial.
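To give a feel for what an annotated trial looks like, here is a minimal, self-contained sketch with a toy objective standing in for mnist's model training (the hyper-parameter name and values are made up for illustration, and the annotation placement is a sketch rather than a verbatim copy of the mnist example). Because NNI annotations live inside plain string literals, the script also runs unchanged without NNI:

```python
# Minimal sketch of an NNI-annotated trial. The objective below is a toy
# stand-in for model training; the annotation strings are inert when the
# script is run standalone.

def run_trial():
    """@nni.variable(nni.choice(0.001, 0.01, 0.1), name=learning_rate)"""
    learning_rate = 0.01
    # Toy "accuracy": best when learning_rate is 0.01.
    accuracy = 1.0 - abs(learning_rate - 0.01) * 5
    """@nni.report_final_result(accuracy)"""
    return accuracy

if __name__ == '__main__':
    print(run_trial())
```

When run under NNI, the tuner would substitute different `learning_rate` values and collect the reported result for each trial.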

**Prepare tuner**: NNI supports several popular AutoML algorithms, including Random Search, Tree of Parzen Estimators (TPE), Bayesian Optimization, etc. Users can write their own tuner (refer to [here]()), but for simplicity, here we choose a tuner provided by NNI as below:

      tunerName: TPE
      optimizationMode: maximize

*tunerName* specifies a tuner in NNI, and *optimizationMode* indicates whether you want to maximize or minimize your trial's result.
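To illustrate what a tuner does conceptually, below is a hypothetical random-search tuner sketch. It follows the general propose/observe contract of a tuner; the class and method names are illustrative and are not claimed to match NNI's SDK base class exactly:

```python
import random

class RandomSearchTuner:
    """Hypothetical tuner sketch: proposes configurations and receives
    trial results. Not NNI's actual SDK API."""

    def __init__(self, search_space, optimization_mode='maximize'):
        self.search_space = search_space          # e.g. {'lr': [0.001, 0.01, 0.1]}
        self.optimization_mode = optimization_mode
        self.history = []                         # (parameters, result) pairs

    def generate_parameters(self, parameter_id):
        # Randomly pick one candidate value per hyper-parameter.
        return {name: random.choice(values)
                for name, values in self.search_space.items()}

    def receive_trial_result(self, parameter_id, parameters, result):
        self.history.append((parameters, result))

    def best(self):
        # Honor optimizationMode when picking the best observed trial.
        pick = max if self.optimization_mode == 'maximize' else min
        return pick(self.history, key=lambda pair: pair[1])

tuner = RandomSearchTuner({'lr': [0.001, 0.01, 0.1], 'batch_size': [32, 64]})
for i in range(10):
    params = tuner.generate_parameters(i)
    score = params['lr'] * 10          # stand-in for a real trial's result
    tuner.receive_trial_result(i, params, score)
```

A smarter tuner (e.g. TPE) would use `self.history` to bias the next `generate_parameters` call toward promising regions instead of sampling uniformly.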

**Prepare configure file**: Now that you know which trial code you are going to run and which tuner you are going to use, it is time to prepare the YAML configuration file. NNI provides a demo configuration file for each trial example; run `cat /usr/share/nni/examples/trials/mnist-annotation/config.yml` to see it. Its content is basically as shown below:

```
authorName: your_name
experimentName: auto_mnist
# how many trials could be concurrently running
trialConcurrency: 2
# maximum experiment running duration
maxExecDuration: 3h
# empty means never stop
maxTrialNum: 100
# choice: local, remote  
trainingServicePlatform: local
# choice: true, false  
useAnnotation: true
tuner:
  tunerName: TPE
  optimizationMode: Maximize
trial:
  trialCommand: python mnist.py
  trialCodeDir: /usr/share/nni/examples/trials/mnist-annotation
  trialGpuNum: 0
``` 

Here *useAnnotation* is true because this trial example uses our Python annotation (refer to [here]() for details). For the trial, we should provide *trialCommand*, the command that runs the trial, and *trialCodeDir*, where the trial code is located; the command will be executed in that directory. We should also specify how many GPUs a trial requires (*trialGpuNum*).
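Conceptually, a training service combines *trialCommand*, *trialCodeDir*, and *trialGpuNum* roughly as in the sketch below. This is a simplified illustration of the semantics just described, not NNI's actual implementation (real GPU scheduling in particular is more involved):

```python
import subprocess

def launch_trial(trial_command, trial_code_dir, trial_gpu_num):
    # Run trialCommand with trialCodeDir as the working directory.
    # A real scheduler would pick specific free GPUs; this sketch just
    # exposes the first trialGpuNum indices via CUDA_VISIBLE_DEVICES.
    env = {'CUDA_VISIBLE_DEVICES': ','.join(str(i) for i in range(trial_gpu_num))}
    return subprocess.run(trial_command, shell=True, cwd=trial_code_dir, env=env)
```

With the config above, this would amount to running `python mnist.py` inside `/usr/share/nni/examples/trials/mnist-annotation` with no GPUs assigned.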

With all these steps done, we can run the experiment with the following command:

      nnictl create --config /usr/share/nni/examples/trials/mnist-annotation/config.yml

You can refer to [here](NNICTLDOC.md) for a more complete usage guide of the *nnictl* command line tool.

## View experiment results
Now that the experiment is running, NNI provides a WebUI for you to view experiment progress, control your experiment, and use some other appealing features. The WebUI is opened by default by `nnictl create`.

## Further reading
* [How to write a trial running on NNI (Mnist as an example)?](WriteYourTrial.md)
* [Tutorial of NNI python annotation.](../tools/annotation/README.md)
* [Tuners supported by NNI.](../src/sdk/pynni/nni/README.md)
* [How to enable early stop (i.e. assessor) in an experiment?](EnableAssessor.md)
* [How to run an experiment on multiple machines?](RemoteMachineMode.md)
* [How to write a customized tuner?](../examples/tuners/README.md)
* [How to write a customized assessor?](../examples/assessors/README.md)
* [How to resume an experiment?]()
* [Tutorial of the command tool *nnictl*.](NNICTLDOC.md)
* [How to use *nnictl* to control multiple experiments?]()

## How to contribute
TBD