<p align="center">
<img src="docs/img/nni_logo.png" width="300"/>
</p>

-----------

[![MIT licensed](https://img.shields.io/badge/license-MIT-brightgreen.svg)](LICENSE)
[![Build Status](https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/full%20test%20-%20linux?branchName=master)](https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=62&branchName=master)
[![Issues](https://img.shields.io/github/issues-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen)
[![Bugs](https://img.shields.io/github/issues/Microsoft/nni/bug.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3Abug)
[![Pull Requests](https://img.shields.io/github/issues-pr-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/pulls?q=is%3Apr+is%3Aopen)
[![Version](https://img.shields.io/github/release/Microsoft/nni.svg)](https://github.com/Microsoft/nni/releases) [![Join the chat at https://gitter.im/Microsoft/nni](https://badges.gitter.im/Microsoft/nni.svg)](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![Documentation Status](https://readthedocs.org/projects/nni/badge/?version=latest)](https://nni.readthedocs.io/en/latest/?badge=latest)

[NNI Doc](https://nni.readthedocs.io/) | [简体中文](README_zh_CN.md)

**NNI (Neural Network Intelligence)** is a lightweight but powerful toolkit to help users **automate** <a href="docs/en_US/FeatureEngineering/Overview.rst">Feature Engineering</a>, <a href="docs/en_US/NAS/Overview.rst">Neural Architecture Search</a>, <a href="docs/en_US/Tuner/BuiltinTuner.rst">Hyperparameter Tuning</a> and <a href="docs/en_US/Compression/Overview.rst">Model Compression</a>.

The tool manages automated machine learning (AutoML) experiments, **dispatches and runs** experiments' trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyper-parameters in **different training environments** like <a href="docs/en_US/TrainingService/LocalMode.rst">Local Machine</a>, <a href="docs/en_US/TrainingService/RemoteMachineMode.rst">Remote Servers</a>, <a href="docs/en_US/TrainingService/PaiMode.rst">OpenPAI</a>, <a href="docs/en_US/TrainingService/KubeflowMode.rst">Kubeflow</a>, <a href="docs/en_US/TrainingService/FrameworkControllerMode.rst">FrameworkController on K8S (AKS etc.)</a>, <a href="docs/en_US/TrainingService/DLTSMode.rst">DLWorkspace (aka. DLTS)</a>, <a href="docs/en_US/TrainingService/AMLMode.rst">AML (Azure Machine Learning)</a>, <a href="docs/en_US/TrainingService/AdaptDLMode.rst">AdaptDL (aka. ADL)</a>, other cloud options and even <a href="docs/en_US/TrainingService/HybridMode.rst">Hybrid mode</a>.

## **Who should consider using NNI**

* Those who want to **try different AutoML algorithms** in their training code/model.
* Those who want to run AutoML trial jobs **in different environments** to speed up search.
* Researchers and data scientists who want to easily **implement and experiment with new AutoML algorithms**, whether hyperparameter tuning algorithms, neural architecture search algorithms, or model compression algorithms.
* ML Platform owners who want to **support AutoML in their platform**.

### **[NNI v2.0 has been released!](https://github.com/microsoft/nni/releases) &nbsp;<a href="#nni-released-reminder"><img width="48" src="docs/img/release_icon.png"></a>**


## **NNI capabilities at a glance**

NNI provides a command line tool as well as a user-friendly web UI to manage training experiments. With the extensible API, you can customize your own AutoML algorithms and training services. To make it easy for new users, NNI also provides a set of built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms.

The following table summarizes current NNI capabilities. We are gradually adding new capabilities, and we'd love to have your contribution.

<p align="center">
  <a href="#nni-has-been-released"><img src="docs/img/overview.svg" /></a>
</p>

<table>
  <tbody>
    <tr align="center" valign="bottom">
    <td>
      </td>
      <td>
        <b>Frameworks & Libraries</b>
        <img src="docs/img/bar.png"/>
      </td>
      <td>
        <b>Algorithms</b>
        <img src="docs/img/bar.png"/>
      </td>
      <td>
        <b>Training Services</b>
        <img src="docs/img/bar.png"/>
      </td>
    </tr>
    <tr valign="top">
    <td align="center" valign="middle">
    <b>Built-in</b>
      </td>
      <td>
      <ul><li><b>Supported Frameworks</b></li>
        <ul>
          <li>PyTorch</li>
          <li>Keras</li>
          <li>TensorFlow</li>
          <li>MXNet</li>
          <li>Caffe2</li>
          <a href="docs/en_US/SupportedFramework_Library.rst">More...</a><br/>
        </ul>
        </ul>
      <ul>
        <li><b>Supported Libraries</b></li>
          <ul>
           <li>Scikit-learn</li>
           <li>XGBoost</li>
           <li>LightGBM</li>
           <a href="docs/en_US/SupportedFramework_Library.rst">More...</a><br/>
          </ul>
      </ul>
        <ul>
        <li><b>Examples</b></li>
         <ul>
           <li><a href="examples/trials/mnist-pytorch">MNIST-pytorch</a></li>
           <li><a href="examples/trials/mnist-tfv1">MNIST-tensorflow</a></li>
           <li><a href="examples/trials/mnist-keras">MNIST-keras</a></li>
           <li><a href="docs/en_US/TrialExample/GbdtExample.rst">Auto-gbdt</a></li>
           <li><a href="docs/en_US/TrialExample/Cifar10Examples.rst">Cifar10-pytorch</a></li>
           <li><a href="docs/en_US/TrialExample/SklearnExamples.rst">Scikit-learn</a></li>
           <li><a href="docs/en_US/TrialExample/EfficientNet.rst">EfficientNet</a></li>
           <li><a href="docs/en_US/TrialExample/OpEvoExamples.rst">Kernel Tuning</a></li>
              <a href="docs/en_US/SupportedFramework_Library.rst">More...</a><br/>
          </ul>
        </ul>
      </td>
      <td align="left" >
        <a href="docs/en_US/Tuner/BuiltinTuner.rst">Hyperparameter Tuning</a>
        <ul>
          <b>Exhaustive search</b>
          <ul>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Random">Random Search</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#GridSearch">Grid Search</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Batch">Batch</a></li>
            </ul>
          <b>Heuristic search</b>
          <ul>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Evolution">Naïve Evolution</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Anneal">Anneal</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Hyperband">Hyperband</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#PBTTuner">PBT</a></li>
          </ul>
          <b>Bayesian optimization</b>
            <ul>
              <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#BOHB">BOHB</a></li>
              <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#TPE">TPE</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#SMAC">SMAC</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#MetisTuner">Metis Tuner</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#GPTuner">GP Tuner</a></li>
            </ul>
          <b>RL Based</b>
          <ul>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#PPOTuner">PPO Tuner</a> </li>
          </ul>
        </ul>
          <a href="docs/en_US/NAS/Overview.rst">Neural Architecture Search</a>
          <ul>
            <ul>
              <li><a href="docs/en_US/NAS/ENAS.rst">ENAS</a></li>
              <li><a href="docs/en_US/NAS/DARTS.rst">DARTS</a></li>
              <li><a href="docs/en_US/NAS/PDARTS.rst">P-DARTS</a></li>
              <li><a href="docs/en_US/NAS/CDARTS.rst">CDARTS</a></li>
              <li><a href="docs/en_US/NAS/SPOS.rst">SPOS</a></li>
              <li><a href="docs/en_US/NAS/Proxylessnas.rst">ProxylessNAS</a></li>
              <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#NetworkMorphism">Network Morphism</a></li>
              <li><a href="docs/en_US/NAS/TextNAS.rst">TextNAS</a></li>
              <li><a href="docs/en_US/NAS/Cream.rst">Cream</a></li>
            </ul>
          </ul>
          <a href="docs/en_US/Compression/Overview.rst">Model Compression</a>
          <ul>
            <b>Pruning</b>
            <ul>
              <li><a href="docs/en_US/Compression/Pruner.rst#agp-pruner">AGP Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#slim-pruner">Slim Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#fpgm-pruner">FPGM Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#netadapt-pruner">NetAdapt Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#simulatedannealing-pruner">SimulatedAnnealing Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#admm-pruner">ADMM Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#autocompress-pruner">AutoCompress Pruner</a></li>
            </ul>
            <b>Quantization</b>
            <ul>
              <li><a href="docs/en_US/Compression/Quantizer.rst#qat-quantizer">QAT Quantizer</a></li>
              <li><a href="docs/en_US/Compression/Quantizer.rst#dorefa-quantizer">DoReFa Quantizer</a></li>
            </ul>
          </ul>
          <a href="docs/en_US/FeatureEngineering/Overview.rst">Feature Engineering (Beta)</a>
          <ul>
          <li><a href="docs/en_US/FeatureEngineering/GradientFeatureSelector.rst">GradientFeatureSelector</a></li>
          <li><a href="docs/en_US/FeatureEngineering/GBDTSelector.rst">GBDTSelector</a></li>
          </ul>
          <a href="docs/en_US/Assessor/BuiltinAssessor.rst">Early Stop Algorithms</a>
          <ul>
          <li><a href="docs/en_US/Assessor/BuiltinAssessor.rst#Medianstop">Median Stop</a></li>
          <li><a href="docs/en_US/Assessor/BuiltinAssessor.rst#Curvefitting">Curve Fitting</a></li>
          </ul>
      </td>
      <td>
      <ul>
        <li><a href="docs/en_US/TrainingService/LocalMode.rst">Local Machine</a></li>
        <li><a href="docs/en_US/TrainingService/RemoteMachineMode.rst">Remote Servers</a></li>
        <li><a href="docs/en_US/TrainingService/HybridMode.rst">Hybrid mode</a></li>
        <li><a href="docs/en_US/TrainingService/AMLMode.rst">AML (Azure Machine Learning)</a></li>
        <li><b>Kubernetes based services</b></li>
        <ul>
          <li><a href="docs/en_US/TrainingService/PaiMode.rst">OpenPAI</a></li>
          <li><a href="docs/en_US/TrainingService/KubeflowMode.rst">Kubeflow</a></li>
          <li><a href="docs/en_US/TrainingService/FrameworkControllerMode.rst">FrameworkController on K8S (AKS etc.)</a></li>
          <li><a href="docs/en_US/TrainingService/DLTSMode.rst">DLWorkspace (aka. DLTS)</a></li>
          <li><a href="docs/en_US/TrainingService/AdaptDLMode.rst">AdaptDL (aka. ADL)</a></li>
        </ul>
      </ul>
      </td>
    </tr>
      <tr align="center" valign="bottom">
      </tr>
      <tr valign="top">
       <td valign="middle">
    <b>References</b>
      </td>
     <td style="border-top:#FF0000 solid 0px;">
      <ul>
        <li><a href="https://nni.readthedocs.io/en/latest/autotune_ref.html#trial">Python API</a></li>
        <li><a href="docs/en_US/Tutorial/AnnotationSpec.rst">NNI Annotation</a></li>
         <li><a href="https://nni.readthedocs.io/en/latest/installation.html">Supported OS</a></li>
      </ul>
      </td>
       <td style="border-top:#FF0000 solid 0px;">
      <ul>
        <li><a href="docs/en_US/Tuner/CustomizeTuner.rst">CustomizeTuner</a></li>
        <li><a href="docs/en_US/Assessor/CustomizeAssessor.rst">CustomizeAssessor</a></li>
        <li><a href="docs/en_US/Tutorial/InstallCustomizedAlgos.rst">Install Customized Algorithms as Builtin Tuners/Assessors/Advisors</a></li>
      </ul>
      </td>
        <td style="border-top:#FF0000 solid 0px;">
      <ul>
        <li><a href="docs/en_US/TrainingService/Overview.rst">Support TrainingService</a></li>
        <li><a href="docs/en_US/TrainingService/HowToImplementTrainingService.rst">Implement TrainingService</a></li>
      </ul>
      </td>
    </tr>
  </tbody>
</table>

## **Installation**

### **Install**

NNI supports and is tested on Ubuntu >= 16.04, macOS >= 10.14.1, and Windows 10 >= 1809. Simply run the following `pip install` in an environment that has 64-bit Python >= 3.6.

Linux or macOS

```bash
python3 -m pip install --upgrade nni
```

Windows

```bash
python -m pip install --upgrade nni
```
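Once installed, a trial script talks to NNI through the trial API: it fetches the next set of hyperparameters from the tuner, trains, and reports a metric back. Below is a minimal illustrative sketch, not one of the shipped examples: `nni.get_next_parameter` and `nni.report_final_result` are the standard trial API calls, while the parameter names and the dummy training function are made up for illustration. The `try/except` fallback lets the sketch run even where NNI is not installed.

```python
# Minimal sketch of an NNI trial script (illustrative; not a shipped example).
try:
    import nni
except ImportError:  # fallback so the sketch runs without NNI installed
    nni = None

def get_params():
    """Start from defaults; let the tuner override them when running under NNI."""
    params = {"lr": 0.01, "batch_size": 32}
    if nni is not None:
        tuned = nni.get_next_parameter()  # next hyperparameter set from the tuner
        if tuned:
            params.update(tuned)
    return params

def train(params):
    """Stand-in for a real training loop; returns a dummy 'accuracy'."""
    return 1.0 - params["lr"]

if __name__ == "__main__":
    params = get_params()
    accuracy = train(params)
    if nni is not None:
        nni.report_final_result(accuracy)  # final metric seen by the tuner
```

In a real trial, `train` would be your model's actual training and evaluation code, and you could additionally call `nni.report_intermediate_result` once per epoch so assessors can stop unpromising trials early.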

If you want to try the latest code, please [install NNI](https://nni.readthedocs.io/en/latest/installation.html) from source code.

For detailed system requirements of NNI, please refer to [here](https://nni.readthedocs.io/en/latest/Tutorial/InstallationLinux.html#system-requirements) for Linux & macOS, and [here](https://nni.readthedocs.io/en/latest/Tutorial/InstallationWin.html#system-requirements) for Windows.

Note:

* If there is any permission issue, add `--user` to install NNI in the user directory.
* Currently NNI on Windows supports local, remote and pai mode. Anaconda or Miniconda is highly recommended for installing [NNI on Windows](docs/en_US/Tutorial/InstallationWin.rst).
* If there is any error like `Segmentation fault`, please refer to [FAQ](docs/en_US/Tutorial/FAQ.rst). For FAQ on Windows, please refer to [NNI on Windows](docs/en_US/Tutorial/InstallationWin.rst#faq).

### **Verify installation**

* Download the examples by cloning the source code.

  ```bash
  git clone -b v2.0 https://github.com/Microsoft/nni.git
  ```

* Run the MNIST example.

  Linux or macOS

  ```bash
  nnictl create --config nni/examples/trials/mnist-pytorch/config.yml
  ```

  Windows

  ```powershell
  nnictl create --config nni\examples\trials\mnist-pytorch\config_windows.yml
  ```

* Wait for the message `INFO: Successfully started experiment!` in the command line. This message indicates that your experiment has been successfully started. You can explore the experiment using the `Web UI url`.

```text
INFO: Starting restful server...
INFO: Successfully started Restful server!
INFO: Setting local config...
INFO: Successfully set local config!
INFO: Starting experiment...
INFO: Successfully started experiment!
-----------------------------------------------------------------------
The experiment id is egchD4qy
The Web UI urls are: http://223.255.255.1:8080   http://127.0.0.1:8080
-----------------------------------------------------------------------

You can use these commands to get more information about the experiment
-----------------------------------------------------------------------
         commands                       description
1. nnictl experiment show        show the information of experiments
2. nnictl trial ls               list all of trial jobs
3. nnictl top                    monitor the status of running experiments
4. nnictl log stderr             show stderr log content
5. nnictl log stdout             show stdout log content
6. nnictl stop                   stop an experiment
7. nnictl trial kill             kill a trial job by id
8. nnictl --help                 get help information about nnictl
-----------------------------------------------------------------------
```

* Open the `Web UI url` in your browser; you can view detailed information about the experiment and all the submitted trial jobs as shown below. [Here](docs/en_US/Tutorial/WebUI.rst) are more Web UI pages.

<table style="border: none">
    <th><img src="./docs/img/webui-img/full-oview.png" alt="drawing" width="395" height="300"/></th>
    <th><img src="./docs/img/webui-img/full-detail.png" alt="drawing" width="410" height="300"/></th>
</table>
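The experiment configuration passed to `nnictl create` pairs the trial command with a search space. A search space is a JSON file mapping each hyperparameter to a sampling `_type` and `_value`. The sketch below is illustrative (the parameter names are made up) rather than the exact file shipped with the MNIST example:

```json
{
  "lr": {"_type": "loguniform", "_value": [0.0001, 0.1]},
  "batch_size": {"_type": "choice", "_value": [16, 32, 64, 128]},
  "hidden_size": {"_type": "choice", "_value": [64, 128, 256]}
}
```

Each trial then receives one concrete sample from this space via `nni.get_next_parameter()`.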

## **Releases and Contributing**
NNI has a monthly release cycle (major releases). Please let us know if you encounter a bug by [filing an issue](https://github.com/microsoft/nni/issues/new/choose).

We appreciate all contributions. If you are planning to contribute any bug fixes, please do so without further discussion.

If you plan to contribute new features, new tuners, new training services, etc., please first open an issue or reuse an existing issue, and discuss the feature with us. We will discuss with you on the issue in a timely manner, or set up conference calls if needed.

To learn more about making a contribution to NNI, please refer to our [How-to contribution page](https://nni.readthedocs.io/en/stable/contribution.html). 

We appreciate all contributions and thank all the contributors!

<a href="https://github.com/microsoft/nni/graphs/contributors"><img src="docs/img/contributors.png" /></a>


## **Feedback**
* [File an issue](https://github.com/microsoft/nni/issues/new/choose) on GitHub.
* Discuss on the NNI [Gitter](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge).

Join IM discussion groups:
|Gitter||WeChat|
|----|----|----|
|![image](https://user-images.githubusercontent.com/39592018/80665738-e0574a80-8acc-11ea-91bc-0836dc4cbf89.png)| OR |![image](https://github.com/scarlett2018/nniutil/raw/master/wechat.png)|


## Related Projects

Targeting openness and advancing state-of-the-art technology, [Microsoft Research (MSR)](https://www.microsoft.com/en-us/research/group/systems-and-networking-research-group-asia/) has also released a few other open source projects.

* [OpenPAI](https://github.com/Microsoft/pai) : an open source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports on-premise, cloud and hybrid environments at various scales.
* [FrameworkController](https://github.com/Microsoft/frameworkcontroller) : an open source general-purpose Kubernetes Pod Controller that orchestrates all kinds of applications on Kubernetes with a single controller.
* [MMdnn](https://github.com/Microsoft/MMdnn) : A comprehensive, cross-framework solution to convert, visualize and diagnose deep neural network models. The "MM" in MMdnn stands for model management and "dnn" is an acronym for deep neural network.
* [SPTAG](https://github.com/Microsoft/SPTAG) : Space Partition Tree And Graph (SPTAG) is an open source library for large-scale approximate nearest neighbor search of vectors.

We encourage researchers and students to leverage these projects to accelerate AI development and research.

## **License**

The entire codebase is under the [MIT license](LICENSE).