"test/vscode:/vscode.git/clone" did not exist on "3af2ad9654784325c96aa3f7157ec53bb61db62f"
QuickStart.md 12.3 KB
Newer Older
Yan Ni's avatar
Yan Ni committed
1
2
3
4
# QuickStart

## Installation

We currently support Linux, macOS, and Windows; Ubuntu 16.04 or higher, macOS 10.14.1, and Windows 10.1809 have been tested and are supported. Simply run the following `pip install` in an environment that has `python >= 3.6`.

### Linux and macOS

```bash
python3 -m pip install --upgrade nni
```

### Windows

```bash
python -m pip install --upgrade nni
```

```eval_rst
.. Note:: For Linux and macOS, ``--user`` can be added if you want to install NNI in your home directory; this does not require any special privileges.
```
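
For example, a user-level install followed by a quick sanity check could look like this (an illustrative sketch; `nnictl --version` simply prints the installed version):

```bash
# Install NNI into your home directory; no special privileges required
python3 -m pip install --user --upgrade nni

# Verify that the command-line tool is available
# (if the command is not found, make sure ~/.local/bin is on your PATH)
nnictl --version
```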

```eval_rst
.. Note:: If there is an error like ``Segmentation fault``, please refer to the :doc:`FAQ <FAQ>`.
```

```eval_rst
.. Note:: For the system requirements of NNI, please refer to :doc:`Install NNI on Linux & Mac <InstallationLinux>` or :doc:`Windows <InstallationWin>`.
```
### Enable NNI Command-line Auto-Completion (Optional)

After the installation, you may want to enable the auto-completion feature for __nnictl__ commands. Please refer to this [tutorial](../CommunitySharings/AutoCompletion.md).

## "Hello World" example on MNIST

NNI is a toolkit that helps users run automated machine learning experiments. It automates the cyclic process of getting hyperparameters, running trials, testing results, and tuning hyperparameters. Here, we'll show how to use NNI to find the optimal hyperparameters for an MNIST model.

Here is an example script to train a CNN on the MNIST dataset **without NNI**:

```python
def run_trial(params):
    # Input data
    mnist = input_data.read_data_sets(params['data_dir'], one_hot=True)
    # Build network
    mnist_network = MnistNetwork(channel_1_num=params['channel_1_num'],
                                 channel_2_num=params['channel_2_num'],
                                 conv_size=params['conv_size'],
                                 hidden_size=params['hidden_size'],
                                 pool_size=params['pool_size'],
                                 learning_rate=params['learning_rate'])
    mnist_network.build_network()

    test_acc = 0.0
    with tf.Session() as sess:
        # Train network
        mnist_network.train(sess, mnist)
        # Evaluate network
        test_acc = mnist_network.evaluate(mnist)

if __name__ == '__main__':
    params = {'data_dir': '/tmp/tensorflow/mnist/input_data',
              'dropout_rate': 0.5,
              'channel_1_num': 32,
              'channel_2_num': 64,
              'conv_size': 5,
              'pool_size': 2,
              'hidden_size': 1024,
              'learning_rate': 1e-4,
              'batch_num': 2000,
              'batch_size': 32}
    run_trial(params)
```

If you want to see the full implementation, please refer to [examples/trials/mnist-tfv1/mnist_before.py](https://github.com/Microsoft/nni/tree/v1.9/examples/trials/mnist-tfv1/mnist_before.py).

The above code can only try one set of parameters at a time; if we want to tune the learning rate, we have to modify the hyperparameters manually and start the trial over and over again.

NNI was built to take this tuning work off your hands; its working process is presented below:

```text
input: search space, trial code, config file
output: one optimal hyperparameter configuration

1: For t = 0, 1, 2, ..., maxTrialNum,
2:      hyperparameter = choose a set of parameters from the search space
3:      final result = run_trial_and_evaluate(hyperparameter)
4:      report the final result to NNI
5:      If the time limit is reached,
6:          Stop the experiment
7: return the hyperparameter set with the best final result
```
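
To make the loop above concrete, here is a small, self-contained Python sketch of the same process using naive random choice. It is only an illustration of the abstract workflow; NNI's tuners, trial scheduling, and bookkeeping replace all of this for you, and every function below is a hypothetical placeholder:

```python
import random
import time

def choose_hyperparameters(search_space):
    # Hypothetical stand-in for a tuner: pick one configuration from the search space.
    return {name: random.choice(values) for name, values in search_space.items()}

def run_trial_and_evaluate(hyperparameter):
    # Hypothetical stand-in for your trial code: train a model and return its accuracy.
    return random.random()

def naive_experiment(search_space, max_trial_num, max_exec_seconds):
    best_params, best_result = None, float("-inf")
    start = time.time()
    for t in range(max_trial_num):
        params = choose_hyperparameters(search_space)       # line 2 of the pseudocode
        result = run_trial_and_evaluate(params)             # line 3
        if result > best_result:                            # line 4: record the final result
            best_params, best_result = params, result
        if time.time() - start > max_exec_seconds:          # lines 5-6: stop at the time limit
            break
    return best_params, best_result                         # line 7

space = {"conv_size": [2, 3, 5, 7], "learning_rate": [0.0001, 0.001, 0.01, 0.1]}
print(naive_experiment(space, max_trial_num=10, max_exec_seconds=3600))
```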

If you want to use NNI to automatically train your model and find the optimal hyper-parameters, you need to make three changes to your code:

### Three steps to start an experiment

**Step 1**: Write a `Search Space` file in JSON, including the `name` and the `distribution` (discrete-valued or continuous-valued) of all the hyperparameters you need to search.

```diff
-   params = {'data_dir': '/tmp/tensorflow/mnist/input_data', 'dropout_rate': 0.5, 'channel_1_num': 32, 'channel_2_num': 64,
-   'conv_size': 5, 'pool_size': 2, 'hidden_size': 1024, 'learning_rate': 1e-4, 'batch_num': 2000, 'batch_size': 32}
+ {
+     "dropout_rate":{"_type":"uniform","_value":[0.5, 0.9]},
+     "conv_size":{"_type":"choice","_value":[2,3,5,7]},
+     "hidden_size":{"_type":"choice","_value":[124, 512, 1024]},
+     "batch_size": {"_type":"choice", "_value": [1, 4, 8, 16, 32]},
+     "learning_rate":{"_type":"choice","_value":[0.0001, 0.001, 0.01, 0.1]}
+ }
```

*Example: [search_space.json](https://github.com/Microsoft/nni/tree/v1.9/examples/trials/mnist-tfv1/search_space.json)*
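
If it helps to see what these entries mean, the sketch below interprets the two `_type`s used here: `choice` picks one value from a discrete list, and `uniform` samples a float from a continuous range. This is only an illustration of the semantics; NNI's tuners decide how to sample for you:

```python
import random

# The search space from the example above, written as a Python dict.
search_space = {
    "dropout_rate":  {"_type": "uniform", "_value": [0.5, 0.9]},
    "conv_size":     {"_type": "choice",  "_value": [2, 3, 5, 7]},
    "hidden_size":   {"_type": "choice",  "_value": [124, 512, 1024]},
    "batch_size":    {"_type": "choice",  "_value": [1, 4, 8, 16, 32]},
    "learning_rate": {"_type": "choice",  "_value": [0.0001, 0.001, 0.01, 0.1]},
}

def sample_once(space):
    """Draw one hyperparameter set from the search space (illustrative only)."""
    params = {}
    for name, spec in space.items():
        if spec["_type"] == "choice":        # discrete-valued
            params[name] = random.choice(spec["_value"])
        elif spec["_type"] == "uniform":     # continuous-valued
            low, high = spec["_value"]
            params[name] = random.uniform(low, high)
        else:
            raise ValueError(f"unsupported _type: {spec['_type']}")
    return params

print(sample_once(search_space))
```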

**Step 2**: Modify your `Trial` file to get the hyperparameter set from NNI and report the final result to NNI.

```diff
+ import nni

  def run_trial(params):
      mnist = input_data.read_data_sets(params['data_dir'], one_hot=True)

      mnist_network = MnistNetwork(channel_1_num=params['channel_1_num'], channel_2_num=params['channel_2_num'], conv_size=params['conv_size'], hidden_size=params['hidden_size'], pool_size=params['pool_size'], learning_rate=params['learning_rate'])
      mnist_network.build_network()

      with tf.Session() as sess:
          mnist_network.train(sess, mnist)
          test_acc = mnist_network.evaluate(mnist)
+         nni.report_final_result(test_acc)

  if __name__ == '__main__':
-     params = {'data_dir': '/tmp/tensorflow/mnist/input_data', 'dropout_rate': 0.5, 'channel_1_num': 32, 'channel_2_num': 64,
-     'conv_size': 5, 'pool_size': 2, 'hidden_size': 1024, 'learning_rate': 1e-4, 'batch_num': 2000, 'batch_size': 32}
+     params = nni.get_next_parameter()
      run_trial(params)
```

*Example: [mnist.py](https://github.com/Microsoft/nni/tree/v1.9/examples/trials/mnist-tfv1/mnist.py)*
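
Stripped of the MNIST specifics, the contract between a trial and NNI comes down to a few API calls. The skeleton below is a minimal sketch of that contract; `build_and_train_model` is a hypothetical placeholder for your own training code, and `nni.report_intermediate_result` is optional but lets assessors and the WebUI track progress per epoch:

```python
import nni

def build_and_train_model(params):
    # Hypothetical placeholder: build a model from `params` and train it,
    # yielding an accuracy after each epoch.
    for epoch in range(3):
        yield 0.5 + 0.1 * epoch

def main():
    # 1. Ask NNI for the next hyperparameter set chosen by the tuner.
    params = nni.get_next_parameter()

    final_acc = 0.0
    for epoch_acc in build_and_train_model(params):
        final_acc = epoch_acc
        # 2. (Optional) Report intermediate results, e.g. accuracy per epoch.
        nni.report_intermediate_result(epoch_acc)

    # 3. Report the final result of this trial back to NNI.
    nni.report_final_result(final_acc)

if __name__ == '__main__':
    main()
```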

**Step 3**: Define a `config` file in YAML which declares the `path` to the search space and trial files. It also specifies other information such as the tuning algorithm, the maximum number of trials, and the maximum duration.

```yaml
authorName: default
experimentName: example_mnist
trialConcurrency: 1
maxExecDuration: 1h
maxTrialNum: 10
trainingServicePlatform: local
# The path to Search Space
searchSpacePath: search_space.json
useAnnotation: false
tuner:
  builtinTunerName: TPE
# The path and the running command of trial
trial:
  command: python3 mnist.py
  codeDir: .
  gpuNum: 0
```

```eval_rst
.. Note:: If you are planning to use remote machines or clusters as your :doc:`training service <../TrainingService/Overview>`, NNI limits the number of files to 2000 and their total size to 300MB to avoid putting too much pressure on the network. If your codeDir contains too many files, you can add a ``.nniignore`` file to choose which files and subfolders to exclude; it works like a ``.gitignore`` file. For more details on how to write this file, see the `git documentation <https://git-scm.com/docs/gitignore#_pattern_format>`_.
```
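
For illustration, a `.nniignore` in your codeDir might look like the following; the patterns here are hypothetical examples and follow ordinary `.gitignore` syntax:

```text
# Exclude large or generated files from being uploaded with the trial code
data/
checkpoints/
*.ckpt
__pycache__/
```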

*Example: [config.yml](https://github.com/Microsoft/nni/tree/v1.9/examples/trials/mnist-tfv1/config.yml) [.nniignore](https://github.com/Microsoft/nni/tree/v1.9/examples/trials/mnist-tfv1/.nniignore)*

All the code above is already prepared and stored in [examples/trials/mnist-tfv1/](https://github.com/Microsoft/nni/tree/v1.9/examples/trials/mnist-tfv1).

#### Linux and macOS

Run the **config.yml** file from your command line to start an MNIST experiment.

```bash
nnictl create --config nni/examples/trials/mnist-tfv1/config.yml
```

#### Windows

Run the **config_windows.yml** file from your command line to start an MNIST experiment.

```bash
nnictl create --config nni\examples\trials\mnist-tfv1\config_windows.yml
```

```eval_rst
.. Note:: If you're using NNI on Windows, you probably need to change ``python3`` to ``python`` in the config.yml file or use the config_windows.yml file to start the experiment.
```
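
Concretely, the change described in the note above amounts to a `trial` section like this on Windows (a sketch based on the config shown earlier):

```yaml
trial:
  command: python mnist.py   # "python3 mnist.py" on Linux and macOS
  codeDir: .
  gpuNum: 0
```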

```eval_rst
.. Note:: ``nnictl`` is a command line tool that can be used to control experiments, such as start/stop/resume an experiment, start/stop NNIBoard, etc. Click :doc:`here <Nnictl>` for more usage of ``nnictl``.
```

Wait for the message `INFO: Successfully started experiment!` in the command line; it indicates that your experiment has been started successfully. This is the output we expect to see:

```text
INFO: Starting restful server...
INFO: Successfully started Restful server!
INFO: Setting local config...
INFO: Successfully set local config!
INFO: Starting experiment...
INFO: Successfully started experiment!
-----------------------------------------------------------------------
The experiment id is egchD4qy
The Web UI urls are: [Your IP]:8080
-----------------------------------------------------------------------

You can use these commands to get more information about the experiment
-----------------------------------------------------------------------
         commands                       description
1. nnictl experiment show        show the information of experiments
2. nnictl trial ls               list all of trial jobs
3. nnictl top                    monitor the status of running experiments
4. nnictl log stderr             show stderr log content
5. nnictl log stdout             show stdout log content
6. nnictl stop                   stop an experiment
7. nnictl trial kill             kill a trial job by id
8. nnictl --help                 get help information about nnictl
-----------------------------------------------------------------------
```

If you prepared the `trial`, `search space`, and `config` according to the above steps and successfully created an NNI job, NNI will automatically run each trial with a different set of hyper-parameters, searching for the optimal ones according to the requirements you set. You can clearly follow its progress through the NNI WebUI.

## WebUI

After you start your experiment in NNI successfully, you can find a message in the command-line interface that tells you the `Web UI url` like this:

```text
The Web UI urls are: [Your IP]:8080
```

Open the `Web UI url` (Here it's: `[Your IP]:8080`) in your browser; you can view detailed information about the experiment and all the submitted trial jobs as shown below. If you cannot open the WebUI link in your terminal, please refer to the [FAQ](FAQ.md).

### View summary page

Click the "Overview" tab.

Information about this experiment will be shown in the WebUI, including the experiment trial profile and the search space. NNI also supports downloading this information and the parameters through the **Download** button. You can download the experiment results at any time while the experiment is running, or wait until the execution has finished.

![](../../img/QuickStart1.png)

The top 10 trials will be listed on the Overview page. You can browse all the trials on the "Trials Detail" page.

![](../../img/QuickStart2.png)

### View trials detail page

Click the "Default Metric" tab to see the point graph of all trials. Hover to see specific default metrics and search space messages.

![](../../img/QuickStart3.png)

Click the "Hyper Parameter" tab to see the parallel graph.

* You can select a percentage to see only the top trials.
* Choose two axes to swap their positions.

![](../../img/QuickStart4.png)

Click the "Trial Duration" tab to see the bar graph.

![](../../img/QuickStart5.png)

Below is the status of all trials. Specifically:

* Trial detail: a trial's id, duration, start time, end time, status, accuracy, and search space file.
* If you run on the OpenPAI platform, you can also see the hdfsLogPath.
* Kill: you can kill a job whose status is `Running`.
* Support: search for a specific trial.

![](../../img/QuickStart6.png)

* Intermediate Result Graph

![](../../img/QuickStart7.png)

## Related Topics

* [Try different Tuners](../Tuner/BuiltinTuner.md)
* [Try different Assessors](../Assessor/BuiltinAssessor.md)
* [How to use command line tool nnictl](Nnictl.md)
* [How to write a trial](../TrialExample/Trials.md)
* [How to run an experiment on local (with multiple GPUs)?](../TrainingService/LocalMode.md)
* [How to run an experiment on multiple machines?](../TrainingService/RemoteMachineMode.md)
* [How to run an experiment on OpenPAI?](../TrainingService/PaiMode.md)
* [How to run an experiment on Kubernetes through Kubeflow?](../TrainingService/KubeflowMode.md)
* [How to run an experiment on Kubernetes through FrameworkController?](../TrainingService/FrameworkControllerMode.md)
* [How to run an experiment on Kubernetes through AdaptDL?](../TrainingService/AdaptDLMode.md)