Commit 51d261e7 authored by J-shang, committed by GitHub

Merge pull request #4668 from microsoft/doc-refactor

parents d63a2ea3 b469e1c1
Advanced Features
=================
.. toctree::
:maxdepth: 2
Write a New Tuner <Tuner/CustomizeTuner>
Write a New Assessor <Assessor/CustomizeAssessor>
Write a New Advisor <Tuner/CustomizeAdvisor>
Write a New Training Service <TrainingService/HowToImplementTrainingService>
Install Customized Algorithms as Builtin Tuners/Assessors/Advisors <Tutorial/InstallCustomizedAlgos>
.. 43bb394b1e25458a948c134058ec68ac
Advanced Features
=================
.. toctree::
:maxdepth: 2
Write a New Tuner <Tuner/CustomizeTuner>
Write a New Assessor <Assessor/CustomizeAssessor>
Write a New Advisor <Tuner/CustomizeAdvisor>
Write a New Training Service <TrainingService/HowToImplementTrainingService>
Install Customized Tuners/Assessors/Advisors <Tutorial/InstallCustomizedAlgos>
#############################
Auto (Hyper-parameter) Tuning
#############################
Auto tuning is one of the key features provided by NNI, and its main application scenario is
hyper-parameter tuning. Tuning is applied to the trial code: NNI repeatedly runs the trial with
different hyper-parameter combinations and optimizes based on the results the trial reports.
We provide many popular auto tuning algorithms (called Tuners) and several early stopping
algorithms (called Assessors).
NNI supports running trials on various training platforms, for example on a local machine,
on several servers in a distributed manner, or on platforms such as OpenPAI and Kubernetes.
Other key features of NNI, such as model compression and feature engineering, can also be further
enhanced by auto tuning; we'll describe this when introducing those features.
NNI is highly extensible: advanced users can customize their own Tuner, Assessor, and Training Service
according to their needs.
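To make the workflow concrete, here is a minimal trial sketch. It uses only the public ``nni`` trial API; the ``train_one_epoch`` and ``evaluate`` helpers and the ``lr``/``momentum`` parameter names are hypothetical placeholders for your own code and search space.

.. code-block:: python

   import random

   import nni

   def train_one_epoch(lr, momentum):
       """Hypothetical placeholder for one epoch of your own training loop."""

   def evaluate():
       """Hypothetical placeholder; return the metric the tuner should optimize."""
       return random.random()

   # receive the hyper-parameter combination chosen by the tuner;
   # defaults are provided because the dict may be empty when run standalone
   params = nni.get_next_parameter() or {}
   lr = params.get('lr', 0.01)
   momentum = params.get('momentum', 0.9)

   for epoch in range(10):
       train_one_epoch(lr, momentum)
       # intermediate results let an assessor stop unpromising trials early
       nni.report_intermediate_result(evaluate())

   # the final result is what the tuner uses to propose the next hyper-parameters
   nni.report_final_result(evaluate())
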
.. toctree::
:maxdepth: 2
Write Trial <TrialExample/Trials>
Tuners <builtin_tuner>
Assessors <builtin_assessor>
Training Platform <training_services>
Examples <examples>
WebUI <Tutorial/WebUI>
How to Debug <Tutorial/HowToDebug>
Advanced <hpo_advanced>
HPO Benchmarks <hpo_benchmark>
.. 6ed30d3a87dbc4c1c4650cf56f074045
################################
Automatic Hyper-parameter Tuning
################################
Auto tuning is one of NNI's main features. It works by repeatedly running the trial code,
each time feeding it a different combination of hyper-parameters, and tuning based on the results the trial produces.
NNI provides many popular auto tuning algorithms (called Tuners) and some early stopping algorithms (called Assessors).
NNI supports running trials on many training platforms, including the local machine,
remote servers, Azure Machine Learning, Kubernetes-based clusters (such as OpenPAI and Kubeflow), and more.
Other features, such as model compression and feature engineering, can also
make use of auto tuning; these are described in detail when introducing the corresponding features.
NNI is highly extensible:
users can implement their own Tuner algorithms and training platforms according to their needs.
.. toctree::
:maxdepth: 2
Write Trial <./TrialExample/Trials>
Tuners <builtin_tuner>
Assessors <builtin_assessor>
Training Platform <training_services>
Examples <examples>
WebUI <Tutorial/WebUI>
How to Debug <Tutorial/HowToDebug>
Advanced <hpo_advanced>
Tuner Benchmarks <hpo_benchmark>
.. modified from index.html
.. replace \{\{ pathto\('(.*)'\) \}\} -> $1.html
###########################
Neural Network Intelligence
###########################
.. toctree::
:maxdepth: 2
:caption: Get Started
:hidden:
installation
quickstart
Learning NNI <tutorials>
.. toctree::
:maxdepth: 2
:caption: Full-scale Materials
:hidden:
Hyperparameter Optimization <hpo/index>
Neural Architecture Search <nas/index>
Model Compression <compression/index>
Feature Engineering <feature_engineering>
Experiment <experiment/overview>
.. toctree::
:maxdepth: 2
:caption: References
:hidden:
nnictl Commands <reference/nnictl>
Experiment Configuration <reference/experiment_config>
Python API <reference/_modules/nni>
API Reference <reference/python_api_ref>
.. toctree::
:maxdepth: 2
:caption: Misc
:hidden:
Use Cases and Solutions <misc/community_sharings>
Research and Publications <misc/research_publications>
FAQ <misc/faq>
notes/build_from_source
Contribution Guide <notes/contributing>
Change Log <Release>
**NNI (Neural Network Intelligence)** is a lightweight but powerful toolkit to help users **automate**:
* :doc:`Hyperparameter Tuning </hpo/overview>`,
* :doc:`Neural Architecture Search </nas/index>`,
* :doc:`Model Compression </compression/index>`,
* :doc:`Feature Engineering </FeatureEngineering/Overview>`.
.. Can't use section title here due to the limitation of toc
.. raw:: html
<h2>Get Started Now</h2>
To install the current release:
.. code-block:: bash
$ pip install nni
See the :doc:`installation guide </installation>` if you need additional help on installation.
Then, please read :doc:`quickstart` and :doc:`tutorials` to start your journey with NNI!
.. Please keep this part sync with readme
.. raw:: html
<h2>Latest Updates
.. image:: ../img/release_icon.png
:class: release-icon
.. raw:: html
<div class="rowHeight">
<div class="chinese"><a href="https://nni.readthedocs.io/zh/stable/">简体中文</a></div>
<b>NNI (Neural Network Intelligence)</b> is a lightweight but powerful toolkit to
help users <b>automate</b>
<a href="FeatureEngineering/Overview.html">Feature Engineering</a>,
<a href="NAS/Overview.html">Neural Architecture Search</a>,
<a href="Tuner/BuiltinTuner.html">Hyperparameter Tuning</a> and
<a href="Compression/Overview.html">Model Compression</a>.
</div>
<p class="gap rowHeight">
The tool manages automated machine learning (AutoML) experiments,
<b>dispatches and runs</b>
experiments' trial jobs generated by tuning algorithms to search for the best neural
architecture and/or hyper-parameters in
<b>different training environments</b> like
<a href="TrainingService/LocalMode.html">Local Machine</a>,
<a href="TrainingService/RemoteMachineMode.html">Remote Servers</a>,
<a href="TrainingService/PaiMode.html">OpenPAI</a>,
<a href="TrainingService/KubeflowMode.html">Kubeflow</a>,
<a href="TrainingService/FrameworkControllerMode.html">FrameworkController on K8S (AKS etc.)</a>,
<a href="TrainingService/DLTSMode.html">DLWorkspace (aka. DLTS)</a>,
<a href="TrainingService/AMLMode.html">AML (Azure Machine Learning)</a>,
<a href="TrainingService/AdaptDLMode.html">AdaptDL (aka. ADL)</a>, other cloud options and even <a href="TrainingService/HybridMode.html">Hybrid mode</a>.
</p>
<!-- Who should consider using NNI -->
<div>
<h2 class="title">Who should consider using NNI</h2>
<ul>
<li>Those who want to <b>try different AutoML algorithms</b> in their training code/model.</li>
<li>Those who want to run AutoML trial jobs <b>in different environments</b> to speed up search.</li>
<li class="rowHeight">Researchers and data scientists who want to easily <b>implement and experiement new AutoML
algorithms</b>
, may it be: hyperparameter tuning algorithm,
neural architect search algorithm or model compression algorithm.
</li>
<li>ML Platform owners who want to <b>support AutoML in their platform</b></li>
</ul>
</div>
<!-- what's new -->
<div>
<div class="inline gap">
<h2>What's NEW! </h2>
<img width="48" src="_static/img/release_icon.png">
</div>
<hr class="whatNew"/>
<ul>
<li><b>New release:</b> <a href='https://github.com/microsoft/nni/releases/tag/v2.6'>v2.6 is available. <i>- released on Jan-18-2022</i></a></li>
<li><b>New demo available:</b> <a href="https://www.youtube.com/channel/UCKcafm6861B2mnYhPbZHavw">Youtube entry</a> | <a href="https://space.bilibili.com/1649051673">Bilibili entry</a> <i>- last updated on May-26-2021</i></li>
<li><b>New webinar:</b> <a href="https://note.microsoft.com/MSR-Webinar-Retiarii-Registration-On-Demand.html">
Introducing Retiarii: A deep learning exploratory-training framework on NNI
</a> <i>- scheduled on June-24-2021</i>
</li>
<li><b>New community channel:</b> <a href="https://github.com/microsoft/nni/discussions">Discussions</a></li>
<li>
<div><b>New emoticons release:</b> <a href="nnSpider.html">nnSpider</a></div>
<img class="gap" src="_static/img/home.svg"></img>
</li>
</ul>
</div>
<!-- NNI capabilities in a glance -->
<div class="gap">
<h2 class="title">NNI capabilities in a glance</h2>
<p class="rowHeight">
NNI provides CommandLine Tool as well as an user friendly WebUI to manage training experiements.
With the extensible API, you can customize your own AutoML algorithms and training services.
To make it easy for new users, NNI also provides a set of build-in stat-of-the-art
AutoML algorithms and out of box support for popular training platforms.
</p>
<p class="rowHeight">
Within the following table, we summarized the current NNI capabilities,
we are gradually adding new capabilities and we'd love to have your contribution.
</p>
</div>
<p align="center">
<a href="#overview"><img src="_static/img/overview.svg" /></a>
</p>
<table class="main-table">
<tbody>
<tr align="center" valign="bottom" class="column">
<td></td>
<td class="framework">
<b>Frameworks & Libraries</b>
</td>
<td>
<b>Algorithms</b>
</td>
<td>
<b>Training Services</b>
</td>
</tr>
<tr>
<td class="verticalMiddle"><b>Built-in</b></td>
<td>
<ul class="firstUl">
<li><b>Supported Frameworks</b></li>
<ul class="circle">
<li>PyTorch</li>
<li>Keras</li>
<li>TensorFlow</li>
<li>MXNet</li>
<li>Caffe2</li>
<a href="SupportedFramework_Library.html">More...</a><br />
</ul>
</ul>
<ul class="firstUl">
<li><b>Supported Libraries</b></li>
<ul class="circle">
<li>Scikit-learn</li>
<li>XGBoost</li>
<li>LightGBM</li>
<a href="SupportedFramework_Library.html">More...</a><br />
</ul>
</ul>
<ul class="firstUl">
<li><b>Examples</b></li>
<ul class="circle">
<li><a href="https://github.com/microsoft/nni/tree/master/examples/trials/mnist-pytorch">MNIST-pytorch</li>
</a>
<li><a href="https://github.com/microsoft/nni/tree/master/examples/trials/mnist-tfv2">MNIST-tensorflow</li>
</a>
<li><a href="https://github.com/microsoft/nni/tree/master/examples/trials/mnist-keras">MNIST-keras</li></a>
<li><a href="TrialExample/GbdtExample.html">Auto-gbdt</a></li>
<li><a href="TrialExample/Cifar10Examples.html">Cifar10-pytorch</li></a>
<li><a href="TrialExample/SklearnExamples.html">Scikit-learn</a></li>
<li><a href="TrialExample/EfficientNet.html">EfficientNet</a></li>
<li><a href="TrialExample/OpEvoExamples.html">Kernel Tunning</li></a>
<a href="SupportedFramework_Library.html">More...</a><br />
</ul>
</ul>
</td>
<td align="left">
<a href="Tuner/BuiltinTuner.html">Hyperparameter Tuning</a>
<ul class="firstUl">
<div><b>Exhaustive search</b></div>
<ul class="circle">
<li><a href="Tuner/BuiltinTuner.html#Random">Random Search</a></li>
<li><a href="Tuner/BuiltinTuner.html#GridSearch">Grid Search</a></li>
<li><a href="Tuner/BuiltinTuner.html#Batch">Batch</a></li>
</ul>
<div><b>Heuristic search</b></div>
<ul class="circle">
<li><a href="Tuner/BuiltinTuner.html#Evolution">Naïve Evolution</a></li>
<li><a href="Tuner/BuiltinTuner.html#Anneal">Anneal</a></li>
<li><a href="Tuner/BuiltinTuner.html#Hyperband">Hyperband</a></li>
<li><a href="Tuner/BuiltinTuner.html#PBTTuner">PBT</a></li>
</ul>
<div><b>Bayesian optimization</b></div>
<ul class="circle">
<li><a href="Tuner/BuiltinTuner.html#BOHB">BOHB</a></li>
<li><a href="Tuner/BuiltinTuner.html#TPE">TPE</a></li>
<li><a href="Tuner/BuiltinTuner.html#SMAC">SMAC</a></li>
<li><a href="Tuner/BuiltinTuner.html#MetisTuner">Metis Tuner</a></li>
<li><a href="Tuner/BuiltinTuner.html#GPTuner">GP Tuner</a> </li>
<li><a href="Tuner/BuiltinTuner.html#DNGOTuner">DNGO Tuner</a></li>
</ul>
</ul>
<a href="NAS/Overview.html">Neural Architecture Search (Retiarii)</a>
<ul class="firstUl">
<ul class="circle">
<li><a href="NAS/ENAS.html">ENAS</a></li>
<li><a href="NAS/DARTS.html">DARTS</a></li>
<li><a href="NAS/SPOS.html">SPOS</a></li>
<li><a href="NAS/Proxylessnas.html">ProxylessNAS</a></li>
<li><a href="NAS/FBNet.html">FBNet</a></li>
<li><a href="NAS/ExplorationStrategies.html">Reinforcement Learning</a></li>
<li><a href="NAS/ExplorationStrategies.html">Regularized Evolution</a></li>
<li><a href="NAS/Overview.html">More...</a></li>
</ul>
</ul>
<a href="Compression/Overview.html">Model Compression</a>
<ul class="firstUl">
<div><b>Pruning</b></div>
<ul class="circle">
<li><a href="Compression/Pruner.html#agp-pruner">AGP Pruner</a></li>
<li><a href="Compression/Pruner.html#slim-pruner">Slim Pruner</a></li>
<li><a href="Compression/Pruner.html#fpgm-pruner">FPGM Pruner</a></li>
<li><a href="Compression/Pruner.html#netadapt-pruner">NetAdapt Pruner</a></li>
<li><a href="Compression/Pruner.html#simulatedannealing-pruner">SimulatedAnnealing Pruner</a></li>
<li><a href="Compression/Pruner.html#admm-pruner">ADMM Pruner</a></li>
<li><a href="Compression/Pruner.html#autocompress-pruner">AutoCompress Pruner</a></li>
<li><a href="Compression/Overview.html">More...</a></li>
</ul>
<div><b>Quantization</b></div>
<ul class="circle">
<li><a href="Compression/Quantizer.html#qat-quantize">QAT Quantizer</a></li>
<li><a href="Compression/Quantizer.html#dorefa-quantizer">DoReFa Quantizer</a></li>
<li><a href="Compression/Quantizer.html#bnn-quantizer">BNN Quantizer</a></li>
</ul>
</ul>
<a href="FeatureEngineering/Overview.html">Feature Engineering (Beta)</a>
<ul class="circle">
<li><a href="FeatureEngineering/GradientFeatureSelector.html">GradientFeatureSelector</a></li>
<li><a href="FeatureEngineering/GBDTSelector.html">GBDTSelector</a></li>
</ul>
<a href="Assessor/BuiltinAssessor.html">Early Stop Algorithms</a>
<ul class="circle">
<li><a href="Assessor/BuiltinAssessor.html#MedianStop">Median Stop</a></li>
<li><a href="Assessor/BuiltinAssessor.html#Curvefitting">Curve Fitting</a></li>
</ul>
</td>
<td>
<ul class="firstUl">
<li><a href="TrainingService/LocalMode.html">Local Machine</a></li>
<li><a href="TrainingService/RemoteMachineMode.html">Remote Servers</a></li>
<li><a href="TrainingService/HybridMode.html">Hybrid mode</a></li>
<li><a href="TrainingService/AMLMode.html">AML(Azure Machine Learning)</a></li>
<li><b>Kubernetes based services</b></li>
<ul>
<li><a href="TrainingService/PaiMode.html">OpenPAI</a></li>
<li><a href="TrainingService/KubeflowMode.html">Kubeflow</a></li>
<li><a href="TrainingService/FrameworkControllerMode.html">FrameworkController on K8S (AKS etc.)</a></li>
<li><a href="TrainingService/DLTSMode.html">DLWorkspace (aka. DLTS)</a></li>
<li><a href="TrainingService/AdaptDLMode.html">AdaptDL (aka. ADL)</a></li>
</ul>
</ul>
</td>
</tr>
<tr valign="top">
<td class="verticalMiddle"><b>References</b></td>
<td>
<ul class="firstUl">
<li><a href="Tutorial/HowToLaunchFromPython.html">Python API</a></li>
<li><a href="Tutorial/AnnotationSpec.html">NNI Annotation</a></li>
<li><a href="installation.html">Supported OS</a></li>
</ul>
</td>
<td>
<ul class="firstUl">
<li><a href="Tuner/CustomizeTuner.html">CustomizeTuner</a></li>
<li><a href="Assessor/CustomizeAssessor.html">CustomizeAssessor</a></li>
<li><a href="Tutorial/InstallCustomizedAlgos.html">Install Customized Algorithms as Builtin Tuners/Assessors/Advisors</a></li>
<li><a href="NAS/QuickStart.html">Define NAS Model Space</a></li>
<li><a href="NAS/ApiReference.html">NAS/Retiarii APIs</a></li>
</ul>
</td>
<td>
<ul class="firstUl">
<li><a href="TrainingService/Overview.html">Support TrainingService</a></li>
<li><a href="TrainingService/HowToImplementTrainingService.html">Implement TrainingService</a></li>
</ul>
</td>
</tr>
</tbody>
</table>
<!-- Installation -->
<div class="gap">
<h2 class="title">Installation</h2>
<div>
<h3 class="second-title">Install</h3>
<div class="gap2">
NNI supports and is tested on Ubuntu >= 16.04, macOS >= 10.14.1,
and Windows 10 >= 1809. Simply run the following <code>pip install</code>
in an environment that has <code>python 64-bit >= 3.6</code>.
</div>
<div class="command-intro">Linux or macOS</div>
<div class="command">python3 -m pip install --upgrade nni</div>
<div class="command-intro">Windows</div>
<div class="command">python -m pip install --upgrade nni</div>
<div class="command-intro">If you want to try latest code, please <a href="installation.html">install
NNI</a> from source code.
</div>
<div class="chinese">For detail system requirements of NNI, please refer to <a href="Tutorial/InstallationLinux.html">here</a>
for Linux & macOS, and <a href="Tutorial/InstallationWin.html">here</a> for Windows.</div>
</div>
<div>
<p>Note:</p>
<ul>
<li>If there is any privilege issue, add --user to install NNI in the user directory.</li>
<li class="rowHeight">Currently NNI on Windows supports local, remote and pai mode. Anaconda or Miniconda is highly
recommended to install <a href="Tutorial/InstallationWin.html">NNI on Windows</a>.</li>
<li>If there is any error like Segmentation fault, please refer to <a
href="installation.html">FAQ</a>. For FAQ on Windows, please refer
to <a href="Tutorial/InstallationWin.html">NNI on Windows</a>.</li>
</ul>
</div>
<div>
<h3 class="second-title gap">Verify installation</h3>
<div>
The following example is built on PyTorch. Make sure <b>PyTorch is installed</b> when running
it.
</div>
<ul>
<li>
<div class="command-intro">Download the examples via clone the source code.</div>
<div class="command">git clone -b v2.6 https://github.com/Microsoft/nni.git</div>
</li>
<li>
<div>Run the MNIST example.</div>
<div class="command-intro">Linux or macOS</div>
<div class="command">nnictl create --config nni/examples/trials/mnist-pytorch/config.yml</div>
<div class="command-intro">Windows</div>
<div class="command">nnictl create --config nni\examples\trials\mnist-pytorch\config_windows.yml</div>
</li>
<li>
<div class="rowHeight">
Wait for the message <code>INFO: Successfully started experiment!</code> in the command line.
This message indicates that your experiment has been successfully started.
You can explore the experiment using the Web UI URL.
</div>
<!-- Indentation affects style! -->
<pre class="main-code">
INFO: Starting restful server...
INFO: Successfully started Restful server!
INFO: Setting local config...
INFO: Successfully set local config!
INFO: Starting experiment...
INFO: Successfully started experiment!
-----------------------------------------------------------------------
The experiment id is egchD4qy
The Web UI urls are: http://223.255.255.1:8080 http://127.0.0.1:8080
-----------------------------------------------------------------------
You can use these commands to get more information about the experiment
-----------------------------------------------------------------------
commands description
1. nnictl experiment show show the information of experiments
2. nnictl trial ls list all of trial jobs
3. nnictl top monitor the status of running experiments
4. nnictl log stderr show stderr log content
5. nnictl log stdout show stdout log content
6. nnictl stop stop an experiment
7. nnictl trial kill kill a trial job by id
8. nnictl --help get help information about nnictl
-----------------------------------------------------------------------
</pre>
</li>
<li class="rowHeight">
Open the Web UI url in your browser, you can view detail information of the experiment and
all the submitted trial jobs as shown below. <a href="Tutorial/WebUI.html">Here</a> are more Web UI
pages.
<img class="gap" src="_static/img/webui.gif" width="100%"/>
</li>
</ul>
</div>
<!-- Releases and Contributing -->
<div class="gap">
<h2 class="title">Releases and Contributing</h2>
<div>NNI has a monthly release cycle (major releases). Please let us know if you encounter a bug by filing an issue.</div>
<br/>
<div>We appreciate all contributions. If you are planning to contribute any bug fixes, please do so without further discussion.</div>
<br/>
<div class="rowHeight">If you plan to contribute new features, new tuners, new training services, etc. please first open an issue or reuse an exisiting issue, and discuss the feature with us. We will discuss with you on the issue timely or set up conference calls if needed.</div>
<br/>
<div>To learn more about making a contribution to NNI, please refer to our <a href="contribution.html">how-to contribution page</a>.</div>
<br/>
<div>We appreciate all contributions and thank all the contributors!</div>
<img class="gap" src="_static/img/contributors.png"></img>
</div>
<!-- feedback -->
<div class="gap">
<h2 class="title">Feedback</h2>
<ul>
<li><a href="https://github.com/microsoft/nni/issues/new/choose">File an issue</a> on GitHub.</li>
<li>Open or participate in a <a href="https://github.com/microsoft/nni/discussions">discussion</a>.</li>
<li>Discuss on the <a href="https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge">NNI Gitter</a> channel.</li>
</ul>
<div>
<div class="rowHeight">Join IM discussion groups:</div>
<table class="gap" border=1 style="border-collapse: collapse;">
<tbody>
<tr style="line-height: 30px;">
<th>Gitter</th>
<td></td>
<th>WeChat</th>
</tr>
<tr>
<td class="QR">
<img src="https://user-images.githubusercontent.com/39592018/80665738-e0574a80-8acc-11ea-91bc-0836dc4cbf89.png" alt="Gitter" />
</td>
<td width="80" align="center" class="or">OR</td>
<td class="QR">
<img src="https://github.com/scarlett2018/nniutil/raw/master/wechat.png" alt="NNI Wechat" />
</td>
</tr>
</tbody>
</table>
</div>
</div>
<!-- Test status -->
<div class="gap">
<h2 class="title">Test status</h2>
<h3>Essentials</h3>
<table class="pipeline">
<tr>
<th>Type</th>
<th>Status</th>
</tr>
<tr>
<td>Fast test</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=54&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/fast%20test?branchName=master"/>
</a>
</td>
</tr>
<tr>
<td>Full linux</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=62&repoName=microsoft%2Fnni&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/full%20test%20-%20linux?repoName=microsoft%2Fnni&branchName=master"/>
</a>
</td>
</tr>
<tr>
<td>Full windows</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=63&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/full%20test%20-%20windows?branchName=master"/>
</a>
</td>
</tr>
</table>
<h3 class="gap">Training services</h3>
<table class="pipeline">
<tr>
<th>Type</th>
<th>Status</th>
</tr>
<tr>
<td>Remote - linux to linux</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=64&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration%20test%20-%20remote%20-%20linux%20to%20linux?branchName=master"/>
</a>
</td>
</tr>
<tr>
<td>Remote - linux to windows</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=67&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration%20test%20-%20remote%20-%20linux%20to%20windows?branchName=master"/>
</a>
</td>
</tr>
<tr>
<td>Remote - windows to linux</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=68&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration%20test%20-%20remote%20-%20windows%20to%20linux?branchName=master"/>
</a>
</td>
</tr>
<tr>
<td>OpenPAI</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=65&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration%20test%20-%20openpai%20-%20linux?branchName=master"/>
</a>
</td>
</tr>
<tr>
<td>Frameworkcontroller</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=70&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration%20test%20-%20frameworkcontroller?branchName=master"/>
</a>
</td>
</tr>
<tr>
<td>Kubeflow</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=69&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration%20test%20-%20kubeflow?branchName=master"/>
</a>
</td>
</tr>
<tr>
<td>Hybrid</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=79&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration%20test%20-%20hybrid?branchName=master"/>
</a>
</td>
</tr>
<tr>
<td>AzureML</td>
<td>
<a href="https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=78&branchName=master">
<img src="https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration%20test%20-%20aml?branchName=master"/>
</a>
</td>
</tr>
</table>
</div>
<!-- Related Projects -->
<div class="gap">
<h2 class="title">Related Projects</h2>
<p class="rowHeight">
Targeting at openness and advancing state-of-art technology,
<a href="https://www.microsoft.com/en-us/research/group/systems-and-networking-research-group-asia/">Microsoft Research (MSR)</a>
had also released few
other open source projects.</p>
<ul id="relatedProject">
<li class="rowHeight">
<a href="https://github.com/Microsoft/pai">OpenPAI</a> : an open source platform that provides complete AI model
training and resource management
capabilities, it is easy to extend and supports on-premise,
cloud and hybrid environments in various scale.
</li>
<li class="rowHeight">
<a href="https://github.com/Microsoft/frameworkcontroller">FrameworkController</a> : an open source
general-purpose Kubernetes Pod Controller that orchestrate
all kinds of applications on Kubernetes by a single controller.
</li>
<li class="rowHeight">
<a href="https://github.com/Microsoft/MMdnn">MMdnn</a> : A comprehensive, cross-framework solution to convert,
visualize and diagnose deep neural network
models. The "MM" in MMdnn stands for model management
and "dnn" is an acronym for deep neural network.
</li>
<li class="rowHeight">
<a href="https://github.com/Microsoft/SPTAG">SPTAG</a> : Space Partition Tree And Graph (SPTAG) is an open
source library
for large scale vector approximate nearest neighbor search scenario.
</li>
<li class="rowHeight">
<a href="https://github.com/Microsoft/SPTAG">nn-Meter</a> : An accurate inference latency predictor for DNN models on diverse edge devices.
</li>
</ul>
<p>We encourage researchers and students to leverage these projects to accelerate AI development and research.</p>
</div>
<!-- License -->
<div>
<h2 class="title">License</h2>
<p>The entire codebase is under <a href="https://github.com/microsoft/nni/blob/master/LICENSE">MIT license</a></p>
</div>
</div>
</h2>
* **New release**: `v2.6 is available <https://github.com/microsoft/nni/releases/tag/v2.6>`_ - *released on Jan-19-2022*
* **New demo available**: `Youtube entry <https://www.youtube.com/channel/UCKcafm6861B2mnYhPbZHavw>`_ | `Bilibili 入口 <https://space.bilibili.com/1649051673>`_ - *last updated on May-26-2021*
* **New webinar**: `Introducing Retiarii, A deep learning exploratory-training framework on NNI <https://note.microsoft.com/MSR-Webinar-Retiarii-Registration-Live.html>`_ - *scheduled on June-24-2021*
* **New community channel**: `Discussions <https://github.com/microsoft/nni/discussions>`_
* **New emoticons release**: :doc:`nnSpider <nnSpider>`
.. raw:: html
<h2>Why choose NNI?</h2>
<h3>NNI makes AutoML techniques plug-and-play.</h3>
<div class="codesnippet-card-container">
.. codesnippetcard::
:icon: ../img/thumbnails/hpo-small.svg
:title: Hyper-parameter Tuning
:link: tutorials/hpo_quickstart_pytorch/main
.. code-block::

   params = nni.get_next_parameter()

   class Net(nn.Module):
       ...

   model = Net()
   optimizer = optim.SGD(model.parameters(),
                         params['lr'],
                         params['momentum'])

   for epoch in range(10):
       train(...)

   accuracy = test(model)
   nni.report_final_result(accuracy)
.. codesnippetcard::
:icon: ../img/thumbnails/pruning-small.svg
:title: Model Pruning
:link: tutorials/pruning_quick_start_mnist
.. code-block::

   # define a config_list
   config = [{
       'sparsity': 0.8,
       'op_types': ['Conv2d']
   }]

   # generate masks for simulated pruning
   wrapped_model, masks = \
       L1NormPruner(model, config). \
       compress()

   # apply the masks for real speed up
   ModelSpeedup(unwrapped_model, input, masks). \
       speedup_model()
.. codesnippetcard::
:icon: ../img/thumbnails/quantization-small.svg
:title: Quantization
:link: tutorials/quantization_speed_up
.. code-block::

   # define a config_list
   config = [{
       'quant_types': ['input', 'weight'],
       'quant_bits': {'input': 8, 'weight': 8},
       'op_types': ['Conv2d']
   }]

   # in case the quantizer needs extra training
   quantizer = QAT_Quantizer(model, config)
   quantizer.compress()
   # Training...

   # export calibration config and
   # generate TensorRT engine for real speed up
   calibration_config = quantizer.export_model(
       model_path, calibration_path)
   engine = ModelSpeedupTensorRT(
       model, input_shape, config=calibration_config)
   engine.compress()
.. codesnippetcard::
:icon: ../img/thumbnails/multi-trial-nas-small.svg
:title: Neural Architecture Search
:link: tutorials/hello_nas
.. code-block:: diff

     # define model space
   - self.conv2 = nn.Conv2d(32, 64, 3, 1)
   + self.conv2 = nn.LayerChoice([
   +     nn.Conv2d(32, 64, 3, 1),
   +     DepthwiseSeparableConv(32, 64)
   + ])
     # search strategy + evaluator
     strategy = RegularizedEvolution()
     evaluator = FunctionalEvaluator(
         train_eval_fn)
     # run experiment
     RetiariiExperiment(model_space,
                        evaluator, strategy).run()
.. codesnippetcard::
:icon: ../img/thumbnails/one-shot-nas-small.svg
:title: One-shot NAS
:link: nas/index
.. code-block::

   # define model space
   space = AnySearchSpace()

   # get a darts trainer
   trainer = DartsTrainer(space, loss, metrics)
   trainer.fit()

   # get final searched architecture
   arch = trainer.export()
.. codesnippetcard::
:icon: ../img/thumbnails/feature-engineering-small.svg
:title: Feature Engineering
:link: FeatureEngineering/Overview
.. code-block::

   selector = GBDTSelector()
   selector.fit(
       X_train, y_train,
       lgb_params=lgb_params,
       eval_ratio=eval_ratio,
       early_stopping_rounds=10,
       importance_type='gain',
       num_boost_round=1000)

   # get selected features
   features = selector.get_selected_features()
.. End of code snippet card
.. raw:: html
</div>
<h3>NNI eases the effort to scale and manage AutoML experiments.</h3>
.. codesnippetcard::
:icon: ../img/thumbnails/training-service-small.svg
:title: Training Service
:link: experiment/training_service
:seemore: See more here.
An AutoML experiment requires many trials to explore feasible and potentially good-performing models.
**Training service** aims to make the tuning process easily scalable on distributed platforms.
It provides a unified user experience across diverse computation resources (e.g., local machine, remote servers, AKS).
Currently, NNI supports **more than 9** kinds of training services.
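As an illustrative sketch (not tied to the card above), selecting a training service with the Python experiment API is roughly a one-argument change; the trial command and search space below are hypothetical placeholders.

.. code-block:: python

   from nni.experiment import Experiment

   # 'local' can be swapped for another training service, e.g. 'remote'
   experiment = Experiment('local')

   # hypothetical trial command and search space, for illustration only
   experiment.config.trial_command = 'python trial.py'
   experiment.config.trial_code_directory = '.'
   experiment.config.search_space = {
       'lr': {'_type': 'loguniform', '_value': [1e-4, 1e-1]},
   }
   experiment.config.tuner.name = 'TPE'
   experiment.config.tuner.class_args = {'optimize_mode': 'maximize'}
   experiment.config.max_trial_number = 10
   experiment.config.trial_concurrency = 2

   # start the experiment and serve the web portal on port 8080
   experiment.run(8080)
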
.. codesnippetcard::
:icon: ../img/thumbnails/web-portal-small.svg
:title: Web Portal
:link: experiment/web_portal
:seemore: See more here.
Web portal visualizes the tuning process, exposing the ability to inspect, monitor and control the experiment.
.. image:: ../static/img/webui.gif
:width: 100%
.. codesnippetcard::
:icon: ../img/thumbnails/experiment-management-small.svg
:title: Experiment Management
:link: experiment/exp_management
:seemore: See more here.
DNN model tuning often requires more than one experiment.
Users might try different tuning algorithms, fine-tune their search space, or switch to another training service.
**Experiment management** provides the power to aggregate and compare tuning results from multiple experiments,
so that the tuning workflow stays clean and organized.
.. raw:: html
<h2>Get Support and Contribute Back</h2>
NNI is maintained on the `NNI GitHub repository <https://github.com/microsoft/nni>`_. We collect feedback and new proposals/ideas on GitHub. You can:
* Open a `GitHub issue <https://github.com/microsoft/nni/issues>`_ for bugs and feature requests.
* Open a `pull request <https://github.com/microsoft/nni/pulls>`_ to contribute code (make sure to read the `contribution guide </contribution>` before doing this).
* Participate in `NNI Discussion <https://github.com/microsoft/nni/discussions>`_ for general questions and new ideas.
* Join the following IM groups.
.. list-table::
:header-rows: 1
:widths: auto
* - Gitter
- WeChat
* -
.. image:: https://user-images.githubusercontent.com/39592018/80665738-e0574a80-8acc-11ea-91bc-0836dc4cbf89.png
-
.. image:: https://github.com/scarlett2018/nniutil/raw/master/wechat.png
.. raw:: html
<h2>Citing NNI</h2>
If you use NNI in a scientific publication, please consider citing NNI in your references.
Microsoft. Neural Network Intelligence (version |release|). https://github.com/microsoft/nni
Bibtex entry (please replace the version with the particular version you are using): ::
@software{nni2021,
author = {{Microsoft}},
month = {1},
title = {{Neural Network Intelligence}},
url = {https://github.com/microsoft/nni},
version = {2.0},
year = {2021}
}
.. ff683903b57318e8baa425ef7a2afaf1
.. 4d622b7ee5031e9cccec635bf6c7427d
###########################
Neural Network Intelligence
###########################
:titlesonly:
:hidden:
Getting Started <quickstart>
Installation <installation>
Tutorials <tutorials>
Auto (Hyper-parameter) Tuning <hpo/index>
Neural Architecture Search <nas/index>
Model Compression <compression/index>
Feature Engineering <feature_engineering>
NNI Experiment <experiment/overview>
HPO API Reference <reference/hpo>
Experiment API Reference <reference/experiment>
Use Cases and Solutions <misc/community_sharings>
Research and Publications <misc/research_publications>
FAQ <misc/faq>
Build from Source <notes/build_from_source>
Contribution Guide <notes/contributing>
Change Log <Release>
<h1 class="title">许可协议</h1>
<p>代码库遵循 <a href="https://github.com/microsoft/nni/blob/master/LICENSE">MIT 许可协议</a></p>
</div>
</div>
\ No newline at end of file
</div>
############
Installation
############
Install NNI
===========
NNI requires Python >= 3.7.
It is tested and supported on Ubuntu >= 18.04,
Windows 10 >= 21H2, and macOS >= 11.
Using pip
---------
NNI provides official packages for x86-64 CPUs. They can be installed with pip:
.. code-block::
python -m pip install --upgrade nni
You can check installation with:
.. code-block::
nnictl --version
On Linux systems without Conda, you may encounter ``bash: nnictl: command not found``.
In this case you need to add the pip script directory to ``PATH``:
.. code-block:: bash
echo 'export PATH=${PATH}:${HOME}/.local/bin' >> ~/.bashrc
source ~/.bashrc
Installing from Source Code
---------------------------
NNI hosts source code on `GitHub <https://github.com/microsoft/nni>`__.
NNI has experimental support for ARM64 CPUs, including Apple M1.
This requires installing from source code.
See :doc:`/notes/build_from_source`.
Using Docker
------------
NNI provides an official Docker image on `Docker Hub <https://hub.docker.com/r/msranni/nni>`__.
.. code-block::
docker pull msranni/nni
Installing Extra Dependencies
-----------------------------
Some built-in algorithms of NNI require extra packages.
Use ``nni[<algorithm-name>]`` to install their dependencies.
For example, to install the dependencies of the :class:`DNGO tuner <nni.algorithms.hpo.dngo_tuner.DNGOTuner>`:
.. code-block::
python -m pip install nni[DNGO]
This command will not reinstall NNI itself, even if it was installed in development mode.
Alternatively, you may install all extra dependencies at once:
.. code-block::
python -m pip install nni[all]
**NOTE**: SMAC tuner depends on swig3, which requires a manual downgrade on Ubuntu:
.. code-block::
sudo apt install swig3.0
sudo rm /usr/bin/swig
sudo ln -s swig3.0 /usr/bin/swig
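As a hedged illustration of how the extra dependencies are used afterwards: once ``nni[DNGO]`` is installed, the corresponding built-in tuner can be selected by name in an experiment configuration. The registered name ``DNGOTuner`` is assumed here; double-check it against the built-in tuner reference.

.. code-block:: python

   from nni.experiment import Experiment

   experiment = Experiment('local')
   # 'DNGOTuner' is the assumed registered name of the DNGO tuner
   experiment.config.tuner.name = 'DNGOTuner'
   experiment.config.tuner.class_args = {'optimize_mode': 'maximize'}
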
######################
Automatic Model Tuning
######################
NNI can be applied to various model tuning tasks. Some state-of-the-art model search algorithms, such as EfficientNet, can easily be built on NNI, and popular models, e.g., recommendation models, can be tuned with NNI. The following use cases illustrate how to leverage NNI in your model tuning tasks and how to build your own pipeline with NNI.
.. toctree::
:maxdepth: 1
Tuning SVD automatically <recommenders_svd>
EfficientNet on NNI <efficientnet>
Automatic Model Architecture Search for Reading Comprehension <squad_evolution_examples>
Parallelizing Optimization for TPE <parallelizing_tpe_search>
.. 21be18c35dee2702eb1c7a805dcfd939
.. 41ac2690980be694ff26b4a06b820fd1
######################
Automatic Model Tuning
######################
NNI can be applied to various model tuning tasks. Some state-of-the-art model search algorithms, such as EfficientNet, can easily be built on NNI. Popular models, for example recommendation models, can be tuned with NNI. The following use cases illustrate how to use NNI in your model tuning tasks and how to build your own pipeline with NNI.
.. toctree::
:maxdepth: 1
Tuning SVD automatically <recommenders_svd>
EfficientNet on NNI <efficientnet>
Automatic Model Architecture Search for Reading Comprehension <squad_evolution_examples>
Parallelizing Optimization for TPE <parallelizing_tpe_search>
#######################
Automatic System Tuning
#######################
The performance of systems, such as databases and tensor operator implementations, often needs to be tuned to adapt to specific hardware configurations, targeted workloads, etc. Manually tuning a system is complicated and often requires a detailed understanding of the hardware and workload. NNI can make such tasks much easier and help system owners automatically find the best configuration for the system. The detailed design philosophy of automatic system tuning can be found in this `paper <https://dl.acm.org/doi/10.1145/3352020.3352031>`__. The following are some typical cases where NNI can help.
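For a flavor of what a system-tuning trial looks like, here is a minimal, hypothetical sketch; ``run_benchmark`` and the parameter names stand in for your own benchmarking code and search space.

.. code-block:: python

   import nni

   def run_benchmark(cache_size_mb, num_threads):
       """Hypothetical placeholder; benchmark the system and return, e.g., operations per second."""
       return 0.0

   # NNI proposes a system configuration ...
   params = nni.get_next_parameter() or {}
   throughput = run_benchmark(
       cache_size_mb=params.get('cache_size_mb', 64),
       num_threads=params.get('num_threads', 4),
   )
   # ... and the measured metric is reported back to the tuner
   nni.report_final_result(throughput)
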
.. toctree::
:maxdepth: 1
Tuning SPTAG (Space Partition Tree And Graph) automatically <sptag_auto_tune>
Tuning the performance of RocksDB <rocksdb_examples>
Tuning Tensor Operators automatically <op_evo_examples>
.. e0791b39c8c362669300ce55b42e997b
.. 71f843b3da6b65a4ed7a4683380aa0b4
#######################
Automatic System Tuning
#######################
The performance of systems such as databases and tensor operator implementations often needs to be tuned to adapt to specific hardware configurations, targeted workloads, etc. Manually tuning a system is complicated and usually requires a detailed understanding of the hardware and workload. NNI can make such tasks much easier and help system owners automatically find the best configuration for the system. The detailed design philosophy of automatic system tuning can be found in `this paper <https://dl.acm.org/doi/10.1145/3352020.3352031>`__. The following are some typical cases where NNI can help.
.. toctree::
:maxdepth: 1
Tuning SPTAG (Space Partition Tree And Graph) automatically <sptag_auto_tune>
Tuning the performance of RocksDB <rocksdb_examples>
Tuning Tensor Operators automatically <op_evo_examples>
Use Cases and Solutions
=======================
Different from the tutorials and examples in the rest of the documentation, which show the usage of a single feature, this part mainly introduces end-to-end scenarios and use cases to help users further understand how NNI can help them. NNI can be widely adopted in various scenarios. We also encourage community contributors to share their AutoML practices, especially their experience of using NNI.
.. toctree::
:maxdepth: 1
Automatic Model Tuning (HPO/NAS) <automodel>
Automatic System Tuning (AutoSys) <autosys>
Model Compression <model_compression>
Feature Engineering <feature_engineering>
Performance measurement, comparison and analysis <perf_compare>
Use NNI on Google Colab <nni_colab_support>
###################
Feature Engineering
###################
The following is an article, shared by a community contributor, about how NNI helps with automatic feature engineering. More use cases and solutions will be added in the future.
.. toctree::
:maxdepth: 1
NNI review article from Zhihu, by Garvin Li <nni_autofeatureeng>
.. 6b887244cf8fbace30971173f8c6fe8a
.. 9d25b0a6269198806ffda03644282b20
###################
Feature Engineering
###################
The following is an article, shared by a community contributor, about how NNI helps with feature engineering. More use cases and solutions will be added in the future.
.. toctree::
:maxdepth: 1
NNI review article from Zhihu, by Garvin Li <nni_autofeatureeng>
#################
Model Compression
#################
The following shows how to apply knowledge distillation with NNI model compression. More use cases and solutions will be added in the future.
.. toctree::
:maxdepth: 1
Knowledge distillation with NNI model compression <kd_example>