"Dockerfile" did not exist on "fae086848240876984fe53340ca399fb14540571"
<p align="center">
<img src="docs/img/nni_logo.png" width="300"/>
</p>

-----------

[![MIT licensed](https://img.shields.io/badge/license-MIT-brightgreen.svg)](LICENSE)
[![Build Status](https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/full%20test%20-%20linux?branchName=master)](https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=62&branchName=master)
[![Issues](https://img.shields.io/github/issues-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen)
[![Bugs](https://img.shields.io/github/issues/Microsoft/nni/bug.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3Abug)
[![Pull Requests](https://img.shields.io/github/issues-pr-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/pulls?q=is%3Apr+is%3Aopen)
[![Version](https://img.shields.io/github/release/Microsoft/nni.svg)](https://github.com/Microsoft/nni/releases) [![Join the chat at https://gitter.im/Microsoft/nni](https://badges.gitter.im/Microsoft/nni.svg)](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![Documentation Status](https://readthedocs.org/projects/nni/badge/?version=latest)](https://nni.readthedocs.io/en/latest/?badge=latest)

[简体中文](README_zh_CN.md)

**NNI (Neural Network Intelligence)** is a lightweight but powerful toolkit to help users **automate** <a href="docs/en_US/FeatureEngineering/Overview.rst">Feature Engineering</a>, <a href="docs/en_US/NAS/Overview.rst">Neural Architecture Search</a>, <a href="docs/en_US/Tuner/BuiltinTuner.rst">Hyperparameter Tuning</a> and <a href="docs/en_US/Compression/Overview.rst">Model Compression</a>.

The tool manages automated machine learning (AutoML) experiments, **dispatches and runs** the trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyper-parameters in **different training environments**, such as <a href="docs/en_US/TrainingService/LocalMode.rst">Local Machine</a>, <a href="docs/en_US/TrainingService/RemoteMachineMode.rst">Remote Servers</a>, <a href="docs/en_US/TrainingService/PaiMode.rst">OpenPAI</a>, <a href="docs/en_US/TrainingService/KubeflowMode.rst">Kubeflow</a>, <a href="docs/en_US/TrainingService/FrameworkControllerMode.rst">FrameworkController on K8S (AKS etc.)</a>, <a href="docs/en_US/TrainingService/DLTSMode.rst">DLWorkspace (aka. DLTS)</a>, <a href="docs/en_US/TrainingService/AMLMode.rst">AML (Azure Machine Learning)</a>, <a href="docs/en_US/TrainingService/AdaptDLMode.rst">AdaptDL (aka. ADL)</a> and other cloud options.
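
To make the trial jobs concrete: a trial is just your training script plus two NNI calls, one to fetch the hyperparameters proposed by the tuner and one to report the resulting metric. Below is a minimal sketch; `train_and_evaluate`, `lr` and `batch_size` are placeholders for your own code and search space, not part of NNI.

```python
# Minimal trial sketch. The model code and hyperparameter names are placeholders.
import nni

def train_and_evaluate(lr, batch_size):
    # Your own training loop goes here; return a metric such as validation accuracy.
    return 0.9

if __name__ == '__main__':
    params = {'lr': 0.01, 'batch_size': 32}   # defaults for running the script standalone
    params.update(nni.get_next_parameter())   # hyperparameters proposed by the tuner
    accuracy = train_and_evaluate(**params)
    nni.report_final_result(accuracy)         # feed the metric back to NNI
```

During an experiment, NNI launches many such trials with different parameter sets and uses the reported metrics to guide the search.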

## **Who should consider using NNI**

* Those who want to **try different AutoML algorithms** in their training code/model.
* Those who want to run AutoML trial jobs **in different environments** to speed up search.
* Researchers and data scientists who want to easily **implement and experiment with new AutoML algorithms**, be it a hyperparameter tuning algorithm, a neural architecture search algorithm, or a model compression algorithm.
* ML Platform owners who want to **support AutoML in their platform**.

### **[NNI v1.9 has been released!](https://github.com/microsoft/nni/releases) &nbsp;<a href="#nni-released-reminder"><img width="48" src="docs/img/release_icon.png"></a>**

## **NNI capabilities at a glance**

NNI provides a command-line tool as well as a user-friendly WebUI to manage training experiments. With the extensible API, you can customize your own AutoML algorithms and training services. To make it easy for new users, NNI also provides a set of built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms.
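
As a small illustration of that extensible API, the sketch below implements a toy random-sampling tuner against the `Tuner` interface described in [Customize Tuner](docs/en_US/Tuner/CustomizeTuner.rst). The class name is made up, and only the `choice` and `uniform` search-space types are handled; it is a sketch, not one of the built-in tuners.

```python
# A toy random-sampling tuner sketched against the documented Tuner interface.
import random

from nni.tuner import Tuner

class RandomSketchTuner(Tuner):
    def update_search_space(self, search_space):
        # Receives the search space defined for the experiment.
        self.search_space = search_space

    def generate_parameters(self, parameter_id, **kwargs):
        # Returns one set of hyperparameters for a new trial.
        params = {}
        for name, spec in self.search_space.items():
            if spec['_type'] == 'choice':
                params[name] = random.choice(spec['_value'])
            elif spec['_type'] == 'uniform':
                low, high = spec['_value']
                params[name] = random.uniform(low, high)
        return params

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        # A real tuner would use the reported metric to guide later samples.
        pass
```

A customized tuner like this can then be registered and selected in the experiment configuration, as described in the linked tutorial.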

The following table summarizes the current NNI capabilities; we are gradually adding new ones and would love to have your contribution. A short sketch of applying one of the built-in compression algorithms follows the table.

<p align="center">
  <a href="#nni-has-been-released"><img src="docs/img/overview.svg" /></a>
</p>

<table>
  <tbody>
    <tr align="center" valign="bottom">
    <td>
      </td>
      <td>
        <b>Frameworks & Libraries</b>
        <img src="docs/img/bar.png"/>
      </td>
      <td>
        <b>Algorithms</b>
        <img src="docs/img/bar.png"/>
      </td>
      <td>
        <b>Training Services</b>
        <img src="docs/img/bar.png"/>
      </td>
    </tr>
    <tr valign="top">
    <td align="center" valign="middle">
    <b>Built-in</b>
      </td>
      <td>
      <ul><li><b>Supported Frameworks</b></li>
        <ul>
          <li>PyTorch</li>
          <li>Keras</li>
          <li>TensorFlow</li>
          <li>MXNet</li>
          <li>Caffe2</li>
          <a href="docs/en_US/SupportedFramework_Library.rst">More...</a><br/>
        </ul>
        </ul>
      <ul>
        <li><b>Supported Libraries</b></li>
          <ul>
           <li>Scikit-learn</li>
           <li>XGBoost</li>
           <li>LightGBM</li>
           <a href="docs/en_US/SupportedFramework_Library.rst">More...</a><br/>
          </ul>
      </ul>
        <ul>
        <li><b>Examples</b></li>
         <ul>
           <li><a href="examples/trials/mnist-pytorch">MNIST-pytorch</a></li>
           <li><a href="examples/trials/mnist-tfv1">MNIST-tensorflow</a></li>
           <li><a href="examples/trials/mnist-keras">MNIST-keras</a></li>
           <li><a href="docs/en_US/TrialExample/GbdtExample.rst">Auto-gbdt</a></li>
           <li><a href="docs/en_US/TrialExample/Cifar10Examples.rst">Cifar10-pytorch</a></li>
           <li><a href="docs/en_US/TrialExample/SklearnExamples.rst">Scikit-learn</a></li>
           <li><a href="docs/en_US/TrialExample/EfficientNet.rst">EfficientNet</a></li>
           <li><a href="docs/en_US/TrialExample/OpEvoExamples.rst">Kernel Tuning</a></li>
              <a href="docs/en_US/SupportedFramework_Library.rst">More...</a><br/>
          </ul>
        </ul>
      </td>
      <td align="left" >
        <a href="docs/en_US/Tuner/BuiltinTuner.rst">Hyperparameter Tuning</a>
        <ul>
          <b>Exhaustive search</b>
          <ul>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Random">Random Search</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#GridSearch">Grid Search</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Batch">Batch</a></li>
            </ul>
          <b>Heuristic search</b>
          <ul>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Evolution">Naïve Evolution</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Anneal">Anneal</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#Hyperband">Hyperband</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#PBTTuner">PBT</a></li>
          </ul>
          <b>Bayesian optimization</b>
            <ul>
              <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#BOHB">BOHB</a></li>
              <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#TPE">TPE</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#SMAC">SMAC</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#MetisTuner">Metis Tuner</a></li>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#GPTuner">GP Tuner</a></li>
            </ul>
          <b>RL Based</b>
          <ul>
            <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#PPOTuner">PPO Tuner</a> </li>
          </ul>
        </ul>
          <a href="docs/en_US/NAS/Overview.rst">Neural Architecture Search</a>
          <ul>
            <ul>
              <li><a href="docs/en_US/NAS/ENAS.rst">ENAS</a></li>
              <li><a href="docs/en_US/NAS/DARTS.rst">DARTS</a></li>
              <li><a href="docs/en_US/NAS/PDARTS.rst">P-DARTS</a></li>
              <li><a href="docs/en_US/NAS/CDARTS.rst">CDARTS</a></li>
              <li><a href="docs/en_US/NAS/SPOS.rst">SPOS</a></li>
              <li><a href="docs/en_US/NAS/Proxylessnas.rst">ProxylessNAS</a></li>
              <li><a href="docs/en_US/Tuner/BuiltinTuner.rst#NetworkMorphism">Network Morphism</a></li>
              <li><a href="docs/en_US/NAS/TextNAS.rst">TextNAS</a></li>
              <li><a href="docs/en_US/NAS/Cream.rst">Cream</a></li>
            </ul>
          </ul>
          <a href="docs/en_US/Compression/Overview.rst">Model Compression</a>
          <ul>
            <b>Pruning</b>
            <ul>
              <li><a href="docs/en_US/Compression/Pruner.rst#agp-pruner">AGP Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#slim-pruner">Slim Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#fpgm-pruner">FPGM Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#netadapt-pruner">NetAdapt Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#simulatedannealing-pruner">SimulatedAnnealing Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#admm-pruner">ADMM Pruner</a></li>
              <li><a href="docs/en_US/Compression/Pruner.rst#autocompress-pruner">AutoCompress Pruner</a></li>
            </ul>
            <b>Quantization</b>
            <ul>
              <li><a href="docs/en_US/Compression/Quantizer.rst#qat-quantizer">QAT Quantizer</a></li>
              <li><a href="docs/en_US/Compression/Quantizer.rst#dorefa-quantizer">DoReFa Quantizer</a></li>
            </ul>
          </ul>
          <a href="docs/en_US/FeatureEngineering/Overview.rst">Feature Engineering (Beta)</a>
          <ul>
          <li><a href="docs/en_US/FeatureEngineering/GradientFeatureSelector.rst">GradientFeatureSelector</a></li>
          <li><a href="docs/en_US/FeatureEngineering/GBDTSelector.rst">GBDTSelector</a></li>
          </ul>
          <a href="docs/en_US/Assessor/BuiltinAssessor.rst">Early Stop Algorithms</a>
          <ul>
          <li><a href="docs/en_US/Assessor/BuiltinAssessor.rst#Medianstop">Median Stop</a></li>
          <li><a href="docs/en_US/Assessor/BuiltinAssessor.rst#Curvefitting">Curve Fitting</a></li>
          </ul>
      </td>
      <td>
      <ul>
        <li><a href="docs/en_US/TrainingService/LocalMode.rst">Local Machine</a></li>
        <li><a href="docs/en_US/TrainingService/RemoteMachineMode.rst">Remote Servers</a></li>
        <li><a href="docs/en_US/TrainingService/AMLMode.rst">AML (Azure Machine Learning)</a></li>
        <li><b>Kubernetes based services</b></li>
        <ul>
          <li><a href="docs/en_US/TrainingService/PaiMode.rst">OpenPAI</a></li>
          <li><a href="docs/en_US/TrainingService/KubeflowMode.rst">Kubeflow</a></li>
          <li><a href="docs/en_US/TrainingService/FrameworkControllerMode.rst">FrameworkController on K8S (AKS etc.)</a></li>
          <li><a href="docs/en_US/TrainingService/DLTSMode.rst">DLWorkspace (aka. DLTS)</a></li>
          <li><a href="docs/en_US/TrainingService/AdaptDLMode.rst">AdaptDL (aka. ADL)</a></li>
        </ul>
      </ul>
      </td>
    </tr>
      <tr align="center" valign="bottom">
      </tr>
      <tr valign="top">
       <td valign="middle">
    <b>References</b>
      </td>
     <td style="border-top:#FF0000 solid 0px;">
      <ul>
        <li><a href="https://nni.readthedocs.io/en/latest/autotune_ref.html#trial">Python API</a></li>
        <li><a href="docs/en_US/Tutorial/AnnotationSpec.rst">NNI Annotation</a></li>
         <li><a href="https://nni.readthedocs.io/en/latest/installation.html">Supported OS</a></li>
      </ul>
      </td>
       <td style="border-top:#FF0000 solid 0px;">
      <ul>
        <li><a href="docs/en_US/Tuner/CustomizeTuner.rst">CustomizeTuner</a></li>
        <li><a href="docs/en_US/Assessor/CustomizeAssessor.rst">CustomizeAssessor</a></li>
        <li><a href="docs/en_US/Tutorial/InstallCustomizedAlgos.rst">Install Customized Algorithms as Builtin Tuners/Assessors/Advisors</a></li>
      </ul>
      </td>
        <td style="border-top:#FF0000 solid 0px;">
      <ul>
        <li><a href="docs/en_US/TrainingService/Overview.rst">Support TrainingService</a></li>
        <li><a href="docs/en_US/TrainingService/HowToImplementTrainingService.rst">Implement TrainingService</a></li>
      </ul>
      </td>
    </tr>
  </tbody>
</table>
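
As a taste of the built-in model compression algorithms listed above, the sketch below applies a pruner to a small PyTorch model using the documented pattern of a configuration list plus `compress()`. The tiny model and the 50% sparsity value are illustrative only, and the import path shown is the NNI 1.x one; it differs in newer releases, so follow [Model Compression](docs/en_US/Compression/Overview.rst) for your version.

```python
# Rough sketch: applying a built-in pruner to a PyTorch model.
# The model and sparsity setting are illustrative; import paths vary across NNI releases.
import torch.nn as nn
from nni.compression.torch import LevelPruner  # NNI 1.x import path

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
config_list = [{'sparsity': 0.5, 'op_types': ['default']}]  # prune 50% of the weights
pruner = LevelPruner(model, config_list)
model = pruner.compress()  # wraps layers with weight masks; fine-tune the model afterwards
```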

## **Installation**

### **Install**

NNI supports and is tested on Ubuntu >= 16.04, macOS >= 10.14.1, and Windows 10 >= 1809. Simply run the following `pip install` in an environment that has 64-bit Python 3.6 or later.

Linux or macOS

```bash
python3 -m pip install --upgrade nni
```

Windows

```bash
python -m pip install --upgrade nni
```

If you want to try the latest code, please [install NNI](https://nni.readthedocs.io/en/latest/installation.html) from source.

For detailed system requirements, please refer to [here](https://nni.readthedocs.io/en/latest/Tutorial/InstallationLinux.html#system-requirements) for Linux & macOS and [here](https://nni.readthedocs.io/en/latest/Tutorial/InstallationWin.html#system-requirements) for Windows.

Note:

* If you run into a permission issue, add `--user` to install NNI in the user directory.
* Currently, NNI on Windows supports local, remote, and OpenPAI modes. Anaconda or Miniconda is highly recommended for installing [NNI on Windows](docs/en_US/Tutorial/InstallationWin.rst).
* If you hit an error such as `Segmentation fault`, please refer to the [FAQ](docs/en_US/Tutorial/FAQ.rst). For Windows-specific issues, see [NNI on Windows](docs/en_US/Tutorial/InstallationWin.rst#faq).

### **Verify installation**

* Download the examples by cloning the source code.

  ```bash
  git clone -b v2.0 https://github.com/Microsoft/nni.git
  ```

* Run the MNIST example.

  Linux or macOS

  ```bash
  nnictl create --config nni/examples/trials/mnist-pytorch/config.yml
  ```

  Windows

  ```powershell
  nnictl create --config nni\examples\trials\mnist-pytorch\config_windows.yml
  ```

* Wait for the message `INFO: Successfully started experiment!` in the command line; it indicates that your experiment has started successfully. You can then explore the experiment using the `Web UI url`.

```text
INFO: Starting restful server...
INFO: Successfully started Restful server!
INFO: Setting local config...
INFO: Successfully set local config!
INFO: Starting experiment...
INFO: Successfully started experiment!
-----------------------------------------------------------------------
The experiment id is egchD4qy
The Web UI urls are: http://223.255.255.1:8080   http://127.0.0.1:8080
-----------------------------------------------------------------------

You can use these commands to get more information about the experiment
-----------------------------------------------------------------------
         commands                       description
1. nnictl experiment show        show the information of experiments
2. nnictl trial ls               list all of trial jobs
3. nnictl top                    monitor the status of running experiments
4. nnictl log stderr             show stderr log content
5. nnictl log stdout             show stdout log content
6. nnictl stop                   stop an experiment
7. nnictl trial kill             kill a trial job by id
8. nnictl --help                 get help information about nnictl
-----------------------------------------------------------------------
```

* Open the `Web UI url` in your browser to view detailed information about the experiment and all submitted trial jobs, as shown below. [Here](docs/en_US/Tutorial/WebUI.rst) are more Web UI pages.

<table style="border: none">
    <th><img src="./docs/img/webui-img/full-oview.png" alt="drawing" width="395" height="300"/></th>
    <th><img src="./docs/img/webui-img/full-detail.png" alt="drawing" width="410" height="300"/></th>
</table>

## **Documentation**

* To learn what NNI is, read the [NNI Overview](https://nni.readthedocs.io/en/latest/Overview.html).
* To get familiar with how to use NNI, read the [documentation](https://nni.readthedocs.io/en/latest/index.html).
* To get started and install NNI on your system, please refer to [Install NNI](https://nni.readthedocs.io/en/latest/installation.html).

## **Contributing**
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact opencode@microsoft.com with any additional questions or comments.

After getting familiar with the contribution agreements, you are ready to create your first PR =). Follow the NNI developer tutorials to get started:

* We recommend that new contributors start with simple issues: [good first issue](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) or [help-wanted](https://github.com/microsoft/nni/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22).
* [NNI developer environment installation tutorial](docs/en_US/Tutorial/SetupNniDeveloperEnvironment.rst)
* [How to debug](docs/en_US/Tutorial/HowToDebug.rst)
* If you have any questions about usage, review the [FAQ](https://github.com/microsoft/nni/blob/master/docs/en_US/Tutorial/FAQ.rst) first. If you cannot find a relevant issue or answer, reach out to the NNI dev team and users on [Gitter](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) or [file an issue](https://github.com/microsoft/nni/issues/new/choose) on GitHub.
* [Customize your own Tuner](docs/en_US/Tuner/CustomizeTuner.rst)
* [Implement customized TrainingService](docs/en_US/TrainingService/HowToImplementTrainingService.rst)
* [Implement a new NAS trainer on NNI](docs/en_US/NAS/Advanced.rst)
* [Customize your own Advisor](docs/en_US/Tuner/CustomizeAdvisor.rst)

## **External Repositories and References**
With the authors' permission, we list a set of NNI usage examples and relevant articles.

* ### **External Repositories** ###
   * Run [ENAS](examples/nas/enas/README.md) with NNI
   * [Automatic Feature Engineering](examples/feature_engineering/auto-feature-engineering/README.md) with NNI
   * [Hyperparameter Tuning for Matrix Factorization](https://github.com/microsoft/recommenders/blob/master/examples/04_model_select_and_optimize/nni_surprise_svd.ipynb) with NNI
   * [scikit-nni](https://github.com/ksachdeva/scikit-nni): hyper-parameter search for scikit-learn pipelines using NNI
* ### **Relevant Articles** ###
  * [Hyper Parameter Optimization Comparison](docs/en_US/CommunitySharings/HpoComparison.rst)
  * [Neural Architecture Search Comparison](docs/en_US/CommunitySharings/NasComparison.rst)
  * [Parallelizing a Sequential Algorithm TPE](docs/en_US/CommunitySharings/ParallelizingTpeSearch.rst)
  * [Automatically tuning SVD with NNI](docs/en_US/CommunitySharings/RecommendersSvd.rst)
  * [Automatically tuning SPTAG with NNI](docs/en_US/CommunitySharings/SptagAutoTune.rst)
  * [Find thy hyper-parameters for scikit-learn pipelines using Microsoft NNI](https://towardsdatascience.com/find-thy-hyper-parameters-for-scikit-learn-pipelines-using-microsoft-nni-f1015b1224c1)
  * **Blog (in Chinese)** - [AutoML tools (Advisor, NNI and Google Vizier) comparison](http://gaocegege.com/Blog/%E6%9C%BA%E5%99%A8%E5%AD%A6%E4%B9%A0/katib-new#%E6%80%BB%E7%BB%93%E4%B8%8E%E5%88%86%E6%9E%90) by [@gaocegege](https://github.com/gaocegege) - the "Summary and Analysis" (总结与分析) section of the design and implementation of kubeflow/katib
  * **Blog (in Chinese)** - [A summary of NNI new capabilities in 2019](https://mp.weixin.qq.com/s/7_KRT-rRojQbNuJzkjFMuA) by @squirrelsc

## **Feedback**
* [File an issue](https://github.com/microsoft/nni/issues/new/choose) on GitHub.
* Ask a question with NNI tags on [Stack Overflow](https://stackoverflow.com/questions/tagged/nni?sort=Newest&edited=true).
* Discuss on the NNI [Gitter](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) channel.

Join IM discussion groups:
|Gitter||WeChat|
|----|----|----|
|![image](https://user-images.githubusercontent.com/39592018/80665738-e0574a80-8acc-11ea-91bc-0836dc4cbf89.png)| OR |![image](https://github.com/scarlett2018/nniutil/raw/master/wechat.png)|


## Related Projects

Aiming at openness and advancing state-of-the-art technology, [Microsoft Research (MSR)](https://www.microsoft.com/en-us/research/group/systems-and-networking-research-group-asia/) has also released a few other open source projects.

* [OpenPAI](https://github.com/Microsoft/pai): an open source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports on-premise, cloud, and hybrid environments at various scales.
* [FrameworkController](https://github.com/Microsoft/frameworkcontroller): an open source general-purpose Kubernetes Pod controller that orchestrates all kinds of applications on Kubernetes with a single controller.
* [MMdnn](https://github.com/Microsoft/MMdnn): a comprehensive, cross-framework solution to convert, visualize and diagnose deep neural network models. The "MM" in MMdnn stands for model management and "dnn" is an acronym for deep neural network.
* [SPTAG](https://github.com/Microsoft/SPTAG): Space Partition Tree And Graph (SPTAG) is an open source library for large-scale approximate nearest neighbor search of vectors.

We encourage researchers and students to leverage these projects to accelerate AI development and research.

## **License**

The entire codebase is under the [MIT license](LICENSE).