"src/git@developer.sourcefind.cn:gaoqiong/migraphx.git" did not exist on "d673e0c440831a60be1b87d9a9cf75da92d9544a"
Unverified commit 36e6e350 authored by SparkSnail, committed by GitHub

Merge pull request #221 from microsoft/master

merge master
parents 543239c6 7cbde508
......@@ -14,11 +14,24 @@
[简体中文](README_zh_CN.md)
**NNI (Neural Network Intelligence)** is an efficient and automatic toolkit to help users design and search neural network architecture, tune machine learning model's parameters or complex system's parameters. The tool manages automated machine learning (AutoML) experiments, dispatches and runs experiments' trial jobs generated by tuning algorithms to search the best neural architecture and/or hyper-parameters in different environments like local machine, remote servers and cloud.
**NNI (Neural Network Intelligence)** is a lightweight but powerful toolkit to help users **automate** <a href="docs/en_US/FeatureEngineering/Overview.md">Feature Engineering</a>, <a href="docs/en_US/NAS/Overview.md">Neural Architecture Search</a>, <a href="docs/en_US/Tuner/BuiltinTuner.md">Hyperparameter Tuning</a> and <a href="docs/en_US/Compressor/Overview.md">Model Compression</a>.
The tool manages automated machine learning (AutoML) experiments, **dispatches and runs** experiments' trial jobs generated by tuning algorithms to search the best neural architecture and/or hyper-parameters in **different training environments** like <a href="docs/en_US/TrainingService/LocalMode.md">Local Machine</a>, <a href="docs/en_US/TrainingService/RemoteMachineMode.md">Remote Servers</a>, <a href="docs/en_US/TrainingService/PaiMode.md">OpenPAI</a>, <a href="docs/en_US/TrainingService/KubeflowMode.md">Kubeflow</a>, <a href="docs/en_US/TrainingService/FrameworkControllerMode.md">FrameworkController on K8S (AKS etc.)</a> and other cloud options.
## **Who should consider using NNI**
* Those who want to **try different AutoML algorithms** in their training code/model.
* Those who want to run AutoML trial jobs **in different environments** to speed up search.
* Researchers and data scientists who want to easily **implement and experiment with new AutoML algorithms**, be it a hyperparameter tuning algorithm, a neural architecture search algorithm or a model compression algorithm.
* ML Platform owners who want to **support AutoML in their platform**.
### **NNI v1.2 has been released! &nbsp;<a href="#nni-released-reminder"><img width="48" src="docs/img/release_icon.png"></a>**
## **NNI capabilities in a glance**
NNI provides a command-line tool as well as a user-friendly WebUI to manage training experiments. With the extensible API, you can customize your own AutoML algorithms and training services. To make it easy for new users, NNI also provides a set of built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms.
The following table summarizes NNI's current capabilities. We are gradually adding new ones, and we'd love to have your contribution.
<p align="center">
<a href="#nni-has-been-released"><img src="docs/img/overview.svg" /></a>
</p>
......@@ -80,47 +93,66 @@
</ul>
</td>
<td align="left" >
<a href="docs/en_US/Tuner/BuiltinTuner.md">Tuner</a>
<a href="docs/en_US/Tuner/BuiltinTuner.md">Hyperparameter Tuning</a>
<ul>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Random">Random Search</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Evolution">Naïve Evolution</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#TPE">TPE</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Anneal">Anneal</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#SMAC">SMAC</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Batch">Batch</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#GridSearch">Grid Search</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Hyperband">Hyperband</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#MetisTuner">Metis Tuner</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#BOHB">BOHB</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#GPTuner">GP Tuner</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#PPOTuner">PPO Tuner</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#NetworkMorphism">Network Morphism</a></li>
</ul>
<a href="docs/en_US/Assessor/BuiltinAssessor.md">Assessor</a>
<b>Exhaustive search</b>
<ul>
<li><a href="docs/en_US/Assessor/BuiltinAssessor.md#Medianstop">Median Stop</a></li>
<li><a href="docs/en_US/Assessor/BuiltinAssessor.md#Curvefitting">Curve Fitting</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Random">Random Search</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#GridSearch">Grid Search</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Batch">Batch</a></li>
</ul>
<b>Heuristic search</b>
<ul>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Evolution">Naïve Evolution</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Anneal">Anneal</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#Hyperband">Hyperband</a></li>
</ul>
<a href="docs/en_US/NAS/Overview.md">NAS (Beta)</a>
<b>Bayesian optimization</b>
<ul>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#BOHB">BOHB</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#TPE">TPE</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#SMAC">SMAC</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#MetisTuner">Metis Tuner</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#GPTuner">GP Tuner</a> </li>
</ul>
<b>RL Based</b>
<ul>
<li><a href="docs/en_US/NAS/Overview.md#enas">ENAS</a></li>
<li><a href="docs/en_US/NAS/Overview.md#darts">DARTS</a></li>
<li><a href="docs/en_US/NAS/Overview.md#p-darts">P-DARTS</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#PPOTuner">PPO Tuner</a> </li>
</ul>
<a href="docs/en_US/Compressor/Overview.md">Model Compression (Beta)</a>
</ul>
<a href="docs/en_US/NAS/Overview.md">Neural Architecture Search</a>
<ul>
<ul>
<li><a href="docs/en_US/NAS/Overview.md#enas">ENAS</a></li>
<li><a href="docs/en_US/NAS/Overview.md#darts">DARTS</a></li>
<li><a href="docs/en_US/NAS/Overview.md#p-darts">P-DARTS</a></li>
<li><a href="docs/en_US/Tuner/BuiltinTuner.md#NetworkMorphism">Network Morphism</a> </li>
</ul>
</ul>
<a href="docs/en_US/Compressor/Overview.md">Model Compression</a>
<ul>
<li><a href="docs/en_US/Compressor/Pruner.md#agp-pruner">AGP Pruner</a></li>
<li><a href="docs/en_US/Compressor/Pruner.md#slim-pruner">Slim Pruner</a></li>
<li><a href="docs/en_US/Compressor/Pruner.md#fpgm-pruner">FPGM Pruner</a></li>
<li><a href="docs/en_US/Compressor/Quantizer.md#qat-quantizer">QAT Quantizer</a></li>
<li><a href="docs/en_US/Compressor/Quantizer.md#dorefa-quantizer">DoReFa Quantizer</a></li>
<li><a href="docs/en_US/Compressor/Overview.md">More...</a></li>
<b>Pruning</b>
<ul>
<li><a href="docs/en_US/Compressor/Pruner.md#agp-pruner">AGP Pruner</a></li>
<li><a href="docs/en_US/Compressor/Pruner.md#slim-pruner">Slim Pruner</a></li>
<li><a href="docs/en_US/Compressor/Pruner.md#fpgm-pruner">FPGM Pruner</a></li>
</ul>
<b>Quantization</b>
<ul>
<li><a href="docs/en_US/Compressor/Quantizer.md#qat-quantizer">QAT Quantizer</a></li>
<li><a href="docs/en_US/Compressor/Quantizer.md#dorefa-quantizer">DoReFa Quantizer</a></li>
</ul>
</ul>
<a href="docs/en_US/FeatureEngineering/Overview.md">Feature Engineering (Beta)</a>
<ul>
<li><a href="docs/en_US/FeatureEngineering/GradientFeatureSelector.md">GradientFeatureSelector</a></li>
<li><a href="docs/en_US/FeatureEngineering/GBDTSelector.md">GBDTSelector</a></li>
</ul>
<a href="docs/en_US/Assessor/BuiltinAssessor.md">Early Stop Algorithms</a>
<ul>
<li><a href="docs/en_US/Assessor/BuiltinAssessor.md#Medianstop">Median Stop</a></li>
<li><a href="docs/en_US/Assessor/BuiltinAssessor.md#Curvefitting">Curve Fitting</a></li>
</ul>
</td>
<td>
<ul>
......@@ -164,28 +196,6 @@
</tbody>
</table>
## **Who should consider using NNI**
* Those who want to try different AutoML algorithms in their training code (model) on their local machine.
* Those who want to run AutoML trial jobs in different environments to speed up search (e.g. remote servers and cloud).
* Researchers and data scientists who want to implement their own AutoML algorithms and compare them with other algorithms.
* ML Platform owners who want to support AutoML in their platform.
## Related Projects
Targeting openness and advancing state-of-the-art technology, [Microsoft Research (MSR)](https://www.microsoft.com/en-us/research/group/systems-research-group-asia/) has also released a few other open source projects.
* [OpenPAI](https://github.com/Microsoft/pai) : an open source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports on-premise, cloud and hybrid environments at various scales.
* [FrameworkController](https://github.com/Microsoft/frameworkcontroller) : an open source general-purpose Kubernetes Pod Controller that orchestrates all kinds of applications on Kubernetes through a single controller.
* [MMdnn](https://github.com/Microsoft/MMdnn) : A comprehensive, cross-framework solution to convert, visualize and diagnose deep neural network models. The "MM" in MMdnn stands for model management and "dnn" is an acronym for deep neural network.
* [SPTAG](https://github.com/Microsoft/SPTAG) : Space Partition Tree And Graph (SPTAG) is an open source library for large-scale approximate nearest neighbor search of vectors.
We encourage researchers and students to leverage these projects to accelerate AI development and research.
## **Install & Verify**
**Install through pip**
......@@ -300,58 +310,25 @@ You can use these commands to get more information about the experiment
</table>
## **Documentation**
Our primary documentation is [here](https://nni.readthedocs.io/en/latest/Overview.html) and is generated from this repository.<br/>
You may want to read:
* [NNI overview](docs/en_US/Overview.md)
* [Quick start](docs/en_US/Tutorial/QuickStart.md)
* [WebUI tutorial](docs/en_US/Tutorial/WebUI.md)
* [Contributing](docs/en_US/Tutorial/Contributing.md)
## **How to**
* [Install NNI](docs/en_US/Tutorial/Installation.md)
* [Use command line tool nnictl](docs/en_US/Tutorial/Nnictl.md)
* [Define a trial](docs/en_US/TrialExample/Trials.md)
* [Config an experiment](docs/en_US/Tutorial/ExperimentConfig.md)
* [Define search space](docs/en_US/Tutorial/SearchSpaceSpec.md)
* [choose tuner/search-algorithm](docs/en_US/Tuner/BuiltinTuner.md)
* [Use annotation](docs/en_US/TrialExample/Trials.md#nni-python-annotation)
* [Use NNIBoard](docs/en_US/Tutorial/WebUI.md)
* To learn what NNI is, read the [NNI Overview](https://nni.readthedocs.io/en/latest/Overview.html).
* To get yourself familiar with how to use NNI, read the [documentation](https://nni.readthedocs.io/en/latest/index.html).
* To get started and install NNI on your system, please refer to [Install NNI](docs/en_US/Tutorial/Installation.md).
## **Contributing**
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
## **Tutorials**
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
* [Run an experiment on local (with multiple GPUs)](docs/en_US/TrainingService/LocalMode.md)
* [Run an experiment on OpenPAI](docs/en_US/TrainingService/PaiMode.md)
* [Run an experiment on Kubeflow](docs/en_US/TrainingService/KubeflowMode.md)
* [Run an experiment on multiple machines](docs/en_US/TrainingService/RemoteMachineMode.md)
* [Try different tuners](docs/en_US/Tuner/BuiltinTuner.md)
* [Try different assessors](docs/en_US/Assessor/BuiltinAssessor.md)
* [Implement a customized tuner](docs/en_US/Tuner/CustomizeTuner.md)
* [Implement a customized assessor](docs/en_US/Assessor/CustomizeAssessor.md)
* [Implement TrainingService in NNI](docs/en_US/TrainingService/HowToImplementTrainingService.md)
* [Use Genetic Algorithm to find good model architectures for Reading Comprehension task](docs/en_US/TrialExample/SquadEvolutionExamples.md)
* [Advanced Neural Architecture Search](docs/en_US/AdvancedFeature/AdvancedNas.md)
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact opencode@microsoft.com with any additional questions or comments.
## **Contribute**
This project welcomes contributions and there are many ways in which you can participate in the project, for example:
* Open [bug reports](https://github.com/microsoft/nni/issues/new/choose).
* Request a [new feature](https://github.com/microsoft/nni/issues/new/choose).
* Ask questions or make suggestions about the [How to Debug](docs/en_US/Tutorial/HowToDebug.md) guidance document.
* Find issues tagged ['good first issue'](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) or ['help-wanted'](https://github.com/microsoft/nni/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22); these are simple and easy to get started with, and we recommend new contributors to start with them.
Before you start coding, you can review the [Contributing Instruction](docs/en_US/Tutorial/Contributing.md) for more information. In addition, we also provide the following documents:
After getting familiar with the contribution agreements, you are ready to create your first PR =). Follow the NNI developer tutorials to get started:
* We recommend new contributors to start with ['good first issue'](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) or ['help-wanted'](https://github.com/microsoft/nni/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22); these issues are simple and easy to get started with.
* [NNI developer environment installation tutorial](docs/en_US/Tutorial/SetupNniDeveloperEnvironment.md)
* [How to debug](docs/en_US/Tutorial/HowToDebug.md)
* [Customize Your Own Advisor](docs/en_US/Tuner/CustomizeAdvisor.md)
* [Customize Your Own Tuner](docs/en_US/Tuner/CustomizeTuner.md)
* [Customize your own Tuner](docs/en_US/Tuner/CustomizeTuner.md)
* [Implement customized TrainingService](docs/en_US/TrainingService/HowToImplementTrainingService.md)
* [Implement a new NAS trainer on NNI](https://github.com/microsoft/nni/blob/master/docs/en_US/NAS/NasInterface.md#implement-a-new-nas-trainer-on-nni)
* [Customize your own Advisor](docs/en_US/Tuner/CustomizeAdvisor.md)
## **External Repositories and References**
With the authors' permission, we list a set of NNI usage examples and relevant articles.
......@@ -377,6 +354,15 @@ With authors' permission, we listed a set of NNI usage examples and relevant art
* [File an issue](https://github.com/microsoft/nni/issues/new/choose) on GitHub.
* Ask a question with NNI tags on [Stack Overflow](https://stackoverflow.com/questions/tagged/nni?sort=Newest&edited=true).
## Related Projects
Targeting openness and advancing state-of-the-art technology, [Microsoft Research (MSR)](https://www.microsoft.com/en-us/research/group/systems-research-group-asia/) has also released a few other open source projects.
* [OpenPAI](https://github.com/Microsoft/pai) : an open source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports on-premise, cloud and hybrid environments at various scales.
* [FrameworkController](https://github.com/Microsoft/frameworkcontroller) : an open source general-purpose Kubernetes Pod Controller that orchestrates all kinds of applications on Kubernetes through a single controller.
* [MMdnn](https://github.com/Microsoft/MMdnn) : A comprehensive, cross-framework solution to convert, visualize and diagnose deep neural network models. The "MM" in MMdnn stands for model management and "dnn" is an acronym for deep neural network.
* [SPTAG](https://github.com/Microsoft/SPTAG) : Space Partition Tree And Graph (SPTAG) is an open source library for large-scale approximate nearest neighbor search of vectors.
We encourage researchers and students to leverage these projects to accelerate AI development and research.
## **License**
......
......@@ -8,9 +8,24 @@
[English](README.md)
NNI (Neural Network Intelligence) is a toolkit for automated machine learning (AutoML). It uses various tuning algorithms to search for the best neural network architecture and/or hyper-parameters, and supports different running environments such as a single machine, multiple local machines, and the cloud.
**NNI (Neural Network Intelligence)** is a lightweight but powerful toolkit that helps users **automate** [Feature Engineering](docs/zh_CN/FeatureEngineering/Overview.md), [Neural Architecture Search](docs/zh_CN/NAS/Overview.md), [Hyperparameter Tuning](docs/zh_CN/Tuner/BuiltinTuner.md) and [Model Compression](docs/zh_CN/Compressor/Overview.md).
### **NNI v1.1 has been released! &nbsp;[<img width="48" src="docs/img/release_icon.png" />](#nni-released-reminder)**
NNI manages automated machine learning (AutoML) experiments and **dispatches and runs** trial jobs generated by tuning algorithms to find the best neural architecture and/or hyper-parameters in **different training environments**, such as [Local Machine](docs/zh_CN/TrainingService/LocalMode.md), [Remote Servers](docs/zh_CN/TrainingService/RemoteMachineMode.md), [OpenPAI](docs/zh_CN/TrainingService/PaiMode.md), [Kubeflow](docs/zh_CN/TrainingService/KubeflowMode.md), [FrameworkController on K8S (AKS etc.)](docs/zh_CN/TrainingService/FrameworkControllerMode.md) and other cloud options.
## **Who should consider using NNI**
* Those who want to **try different AutoML algorithms** in their own code or models.
* Those who want to run AutoML **in different environments** to speed up search.
* Researchers and data scientists who want to more easily **implement or experiment with new AutoML algorithms**, including hyperparameter tuning algorithms, neural architecture search algorithms and model compression algorithms.
* ML platform owners who want to **support AutoML in their platform**.
### **NNI v1.2 has been released! &nbsp;[<img width="48" src="docs/img/release_icon.png" />](#nni-released-reminder)**
## **NNI capabilities in a glance**
NNI provides a command-line tool as well as a user-friendly WebUI to manage training experiments. With the extensible API, you can customize your own AutoML algorithms and training platforms. To make it easy for new users, NNI also has built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms.
The following table summarizes NNI's current capabilities; we keep adding new ones and would love to have your contribution.
<p align="center">
<a href="#nni-has-been-released"><img src="docs/img/overview.svg" /></a>
......@@ -26,7 +41,7 @@ NNI (Neural Network Intelligence) 是自动机器学习(AutoML)的工具包
<img src="docs/img/bar.png"/>
</td>
<td>
<b>Tuning Algorithms</b>
<b>Algorithms</b>
<img src="docs/img/bar.png"/>
</td>
<td>
......@@ -63,7 +78,7 @@ NNI (Neural Network Intelligence) 是自动机器学习(AutoML)的工具包
<li><b>Examples</b></li>
<ul>
<li><a href="examples/trials/mnist-pytorch">MNIST-pytorch</li></a>
<li><a href="examples/trials/mnist">MNIST-tensorflow</li></a>
<li><a href="examples/trials/mnist-tfv1">MNIST-tensorflow</li></a>
<li><a href="examples/trials/mnist-keras">MNIST-keras</li></a>
<li><a href="docs/zh_CN/TrialExample/GbdtExample.md">Auto-gbdt</a></li>
<li><a href="docs/zh_CN/TrialExample/Cifar10Examples.md">Cifar10-pytorch</li></a>
......@@ -73,38 +88,66 @@ NNI (Neural Network Intelligence) 是自动机器学习(AutoML)的工具包
</ul>
</td>
<td align="left" >
<a href="docs/zh_CN/Tuner/BuiltinTuner.md">Tuner(调参器)</a>
<a href="docs/zh_CN/Tuner/BuiltinTuner.md">超参调优</a>
<ul>
<li><b>通用 Tuner</b></li>
<b>穷举搜索</b>
<ul>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Random">Random Search(随机搜索)</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Evolution">Naïve Evolution(朴素进化)</a></li>
</ul>
<li><b><a href="docs/zh_CN/CommunitySharings/HpoComparision.md">超参调优</a> Tuner</b></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Random">Random Search(随机搜索)</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#GridSearch">Grid Search(遍历搜索)</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Batch">Batch(批处理)</a></li>
</ul>
<b>启发式搜索</b>
<ul>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#TPE">TPE</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Anneal">Anneal(退火算法)</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#SMAC">SMAC</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Batch">Batch(批处理)</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#GridSearch">Grid Search(遍历搜索)</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Hyperband">Hyperband</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#MetisTuner">Metis Tuner</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#BOHB">BOHB</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#GPTuner">GP Tuner</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Evolution">Naïve Evolution(朴素进化)</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Anneal">Anneal(退火算法)</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#Hyperband">Hyperband</a></li>
</ul>
<li><b><a href="docs/zh_CN/AdvancedFeature/GeneralNasInterfaces.md">NAS</a> Tuner</b></li>
<b>Bayesian optimization</b>
<ul>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#BOHB">BOHB</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#TPE">TPE</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#SMAC">SMAC</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#MetisTuner">Metis Tuner</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#GPTuner">GP Tuner</a> </li>
</ul>
<b>RL Based</b>
<ul>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#NetworkMorphism">Network Morphism</a></li>
<li><a href="examples/tuners/enas_nni/README_zh_CN.md">ENAS</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#PPOTuner">PPO Tuner</a> </li>
</ul>
</ul>
<a href="docs/zh_CN/Assessor/BuiltinAssessor.md">Assessor(评估器)</a>
<a href="docs/zh_CN/NAS/Overview.md">神经网络架构搜索</a>
<ul>
<ul>
<li><a href="docs/zh_CN/NAS/Overview.md#enas">ENAS</a></li>
<li><a href="docs/zh_CN/NAS/Overview.md#darts">DARTS</a></li>
<li><a href="docs/zh_CN/NAS/Overview.md#p-darts">P-DARTS</a></li>
<li><a href="docs/zh_CN/Tuner/BuiltinTuner.md#NetworkMorphism">Network Morphism</a> </li>
</ul>
</ul>
<a href="docs/zh_CN/Compressor/Overview.md">模型压缩</a>
<ul>
<b>剪枝</b>
<ul>
<li><a href="docs/zh_CN/Compressor/Pruner.md#agp-pruner">AGP Pruner</a></li>
<li><a href="docs/zh_CN/Compressor/Pruner.md#slim-pruner">Slim Pruner</a></li>
<li><a href="docs/zh_CN/Compressor/Pruner.md#fpgm-pruner">FPGM Pruner</a></li>
</ul>
<b>Quantization</b>
<ul>
<li><a href="docs/zh_CN/Compressor/Quantizer.md#qat-quantizer">QAT Quantizer</a></li>
<li><a href="docs/zh_CN/Compressor/Quantizer.md#dorefa-quantizer">DoReFa Quantizer</a></li>
</ul>
</ul>
<a href="docs/zh_CN/FeatureEngineering/Overview.md">特征工程(测试版)</a>
<ul>
<li><a href="docs/zh_CN/FeatureEngineering/GradientFeatureSelector.md">GradientFeatureSelector</a></li>
<li><a href="docs/zh_CN/FeatureEngineering/GBDTSelector.md">GBDTSelector</a></li>
</ul>
<a href="docs/zh_CN/Assessor/BuiltinAssessor.md">提前终止算法</a>
<ul>
<li><a href="docs/zh_CN/Assessor/BuiltinAssessor.md#Medianstop">Median Stop(中位数终止)</a></li>
<li><a href="docs/zh_CN/Assessor/BuiltinAssessor.md#Curvefitting">Curve Fitting(曲线拟合)</a></li>
</ul>
</ul>
</td>
<td>
<ul>
......@@ -148,24 +191,6 @@ NNI (Neural Network Intelligence) 是自动机器学习(AutoML)的工具包
</tbody>
</table>
## **Use cases**
* Try different automated machine learning (AutoML) algorithms to train models on your local machine.
* Speed up automated machine learning in a distributed environment (e.g. remote GPU workstations and cloud servers).
* Customize AutoML algorithms, or compare different AutoML algorithms.
* Support automated machine learning in your machine learning platform.
## Related Projects
Targeting openness and advanced technology, [Microsoft Research (MSR)](https://www.microsoft.com/en-us/research/group/systems-research-group-asia/) has released a few other open source projects.
* [OpenPAI](https://github.com/Microsoft/pai): an open source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports private deployments, cloud and hybrid environments at various scales.
* [FrameworkController](https://github.com/Microsoft/frameworkcontroller): an open source general-purpose Kubernetes Pod Controller that orchestrates all kinds of applications on Kubernetes through a single controller.
* [MMdnn](https://github.com/Microsoft/MMdnn): a comprehensive, cross-framework solution to convert, visualize and diagnose deep neural network models. The "MM" in MMdnn stands for model management and "dnn" is an acronym for deep neural network.
* [SPTAG](https://github.com/Microsoft/SPTAG): Space Partition Tree And Graph (SPTAG) is an open source library for large-scale approximate nearest neighbor search of vectors.
We encourage researchers and students to leverage these projects to accelerate AI development and research.
## **Install & Verify**
**Install through pip**
......@@ -194,12 +219,12 @@ python -m pip install --upgrade nni
* Currently supported: Linux (Ubuntu 16.04 or above), macOS (10.14.1) and Windows 10 (version 1809).
Linux and macOS
Linux and MacOS
* In a `python >= 3.5` environment, make sure `git` and `wget` are installed, then run the following commands.
```bash
git clone -b v1.1 https://github.com/Microsoft/nni.git
git clone -b v1.2 https://github.com/Microsoft/nni.git
cd nni
source install.sh
```
......@@ -209,7 +234,7 @@ Windows
* In a `python >= 3.5` environment, make sure `git` and `PowerShell` are installed, then run the following commands.
```bash
git clone -b v1.1 https://github.com/Microsoft/nni.git
git clone -b v1.2 https://github.com/Microsoft/nni.git
cd nni
powershell -ExecutionPolicy Bypass -file install.ps1
```
......@@ -225,15 +250,15 @@ Windows 上参考 [Windows 上使用 NNI](docs/zh_CN/Tutorial/NniOnWindows.md)
* Download the examples by cloning the source code.
```bash
git clone -b v1.1 https://github.com/Microsoft/nni.git
git clone -b v1.2 https://github.com/Microsoft/nni.git
```
Linux and macOS
Linux and MacOS
* Run the MNIST example.
```bash
nnictl create --config nni/examples/trials/mnist/config.yml
nnictl create --config nni/examples/trials/mnist-tfv1/config.yml
```
Windows
......@@ -241,7 +266,7 @@ Windows
* Run the MNIST example.
```bash
nnictl create --config nni\examples\trials\mnist\config_windows.yml
nnictl create --config nni\examples\trials\mnist-tfv1\config_windows.yml
```
* Wait for the message `INFO: Successfully started experiment!` in the command line. This message indicates that your experiment has been started successfully. You can explore the experiment using the `Web UI url` printed in the command line output.
......@@ -282,55 +307,27 @@ You can use these commands to get more information about the experiment
## **Documentation**
Our primary documentation can be found [here](https://nni.readthedocs.io/cn/latest/Overview.html) and is generated from this repository.
You may want to read:
* [NNI overview](docs/zh_CN/Overview.md)
* [Quick start](docs/zh_CN/Tutorial/QuickStart.md)
* [WebUI tutorial](docs/zh_CN/Tutorial/WebUI.md)
* [Contributing](docs/zh_CN/Tutorial/Contributing.md)
## **Get started**
* [Install NNI](docs/zh_CN/Tutorial/Installation.md)
* [Use the command line tool nnictl](docs/zh_CN/Tutorial/Nnictl.md)
* [Implement a trial](docs/zh_CN/TrialExample/Trials.md)
* [Configure an experiment](docs/zh_CN/Tutorial/ExperimentConfig.md)
* [Define a search space](docs/zh_CN/Tutorial/SearchSpaceSpec.md)
* [Choose a tuner/search algorithm](docs/zh_CN/Tuner/BuiltinTuner.md)
* [Use annotation](docs/zh_CN/TrialExample/Trials.md#nni-python-annotation)
* [Use NNIBoard](docs/zh_CN/Tutorial/WebUI.md)
## **Tutorials**
* [Run an experiment on the local machine (with multiple GPUs)](docs/zh_CN/TrainingService/LocalMode.md)
* [Run an experiment on OpenPAI](docs/zh_CN/TrainingService/PaiMode.md)
* [Run an experiment on Kubeflow](docs/zh_CN/TrainingService/KubeflowMode.md)
* [Run an experiment on multiple machines](docs/zh_CN/TrainingService/RemoteMachineMode.md)
* [Try different tuners](docs/zh_CN/Tuner/BuiltinTuner.md)
* [Try different assessors](docs/zh_CN/Assessor/BuiltinAssessor.md)
* [Implement a customized tuner](docs/zh_CN/Tuner/CustomizeTuner.md)
* [Implement a customized assessor](docs/zh_CN/Assessor/CustomizeAssessor.md)
* [Implement a TrainingService in NNI](docs/zh_CN/TrainingService/HowToImplementTrainingService.md)
* [Use a genetic algorithm to find good model architectures for the reading comprehension task](docs/zh_CN/TrialExample/SquadEvolutionExamples.md)
* [Advanced neural architecture search](docs/zh_CN/AdvancedFeature/AdvancedNas.md)
* To learn about NNI, read the [NNI overview](https://nni.readthedocs.io/zh/latest/Overview.html).
* To get familiar with how to use NNI, read the [documentation](https://nni.readthedocs.io/zh/latest/index.html).
* To install NNI, refer to [Install NNI](docs/zh_CN/Tutorial/Installation.md).
## **Contributing**
You are very welcome to participate in this project in many ways, for example:
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.
* [Report a bug](https://github.com/microsoft/nni/issues/new/choose).
* [Request a new feature](https://github.com/microsoft/nni/issues/new/choose).
* Ask questions or make suggestions about the [How to Debug](docs/zh_CN/Tutorial/HowToDebug.md) guidance document.
* Find issues tagged ['good first issue'](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) or ['help-wanted'](https://github.com/microsoft/nni/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22); these are simple issues that new contributors can start with.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the pull request appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You only need to agree to the CLA once, and it applies to all repositories using our CLA.
Before writing code, you can review the [Contributing](docs/zh_CN/Tutorial/Contributing.md) guide for more information. In addition, the following documents are provided:
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact opencode@microsoft.com with any additional questions or comments.
After getting familiar with the contribution agreement, you are ready to create your first PR =). Follow the NNI developer tutorials to get started:
* We recommend new contributors to start with issues tagged ['good first issue'](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) or ['help-wanted'](https://github.com/microsoft/nni/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22); these are fairly simple and a good place to start.
* [NNI developer environment installation tutorial](docs/zh_CN/Tutorial/SetupNniDeveloperEnvironment.md)
* [How to debug](docs/zh_CN/Tutorial/HowToDebug.md)
* [Customize your own Advisor](docs/zh_CN/Tuner/CustomizeAdvisor.md)
* [Customize your own Tuner](docs/zh_CN/Tuner/CustomizeTuner.md)
* [Implement a customized TrainingService](docs/zh_CN/TrainingService/HowToImplementTrainingService.md)
* [Implement a new NAS trainer on NNI](https://github.com/microsoft/nni/blob/master/docs/en_US/NAS/NasInterface.md#implement-a-new-nas-trainer-on-nni)
* [Customize your own Advisor](docs/zh_CN/Tuner/CustomizeAdvisor.md)
## **External Repositories and References**
......@@ -340,7 +337,7 @@ You can use these commands to get more information about the experiment
* Run [ENAS](examples/tuners/enas_nni/README_zh_CN.md) in NNI
* Run [Neural Architecture Search](examples/trials/nas_cifar10/README_zh_CN.md) in NNI
* [Automatic feature engineering with NNI](examples/trials/auto-feature-engineering/README_zh_CN.md)
* [Automatic feature engineering with NNI](examples/feature_engineering/auto-feature-engineering/README_zh_CN.md)
* [Hyperparameter tuning for matrix factorization](https://github.com/microsoft/recommenders/blob/master/notebooks/04_model_select_and_optimize/nni_surprise_svd.ipynb) with NNI
* [scikit-nni](https://github.com/ksachdeva/scikit-nni): hyperparameter search for scikit-learn, powered by NNI.
### **Relevant Articles**
......@@ -359,6 +356,17 @@ You can use these commands to get more information about the experiment
* [File an issue](https://github.com/microsoft/nni/issues/new/choose) on GitHub.
* Ask a question on [Stack Overflow](https://stackoverflow.com/questions/tagged/nni?sort=Newest&edited=true) using the nni tag.
## Related Projects
Targeting the exploration of advanced technology and openness, [Microsoft Research (MSR)](https://www.microsoft.com/en-us/research/group/systems-research-group-asia/) has also released a few related open source projects.
* [OpenPAI](https://github.com/Microsoft/pai): an open source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports private deployments, cloud and hybrid environments at various scales.
* [FrameworkController](https://github.com/Microsoft/frameworkcontroller): an open source general-purpose Kubernetes Pod Controller that orchestrates all kinds of applications on Kubernetes through a single controller.
* [MMdnn](https://github.com/Microsoft/MMdnn): a comprehensive, cross-framework solution to convert, visualize and diagnose deep neural network models. The "MM" in MMdnn stands for model management and "dnn" is an acronym for deep neural network.
* [SPTAG](https://github.com/Microsoft/SPTAG): Space Partition Tree And Graph (SPTAG) is an open source library for large-scale approximate nearest neighbor search of vectors.
We encourage researchers and students to leverage these projects to accelerate AI development and research.
## **License**
The entire codebase is under the [MIT license](LICENSE)
\ No newline at end of file
......@@ -21,6 +21,9 @@ jobs:
set -e
cd src/nni_manager
yarn eslint
# uncomment following 2 lines to enable webui eslint
# cd ../webui
# yarn eslint
displayName: 'Run eslint'
- script: |
python3 -m pip install torch==0.4.1 --user
......@@ -43,7 +46,7 @@ jobs:
displayName: 'Run pylint'
- script: |
python3 -m pip install flake8 --user
EXCLUDES=./src/nni_manager/,./tools/nni_annotation/testcase/,./examples/trials/mnist-nas/*/mnist*.py,./examples/trials/nas_cifar10/src/cifar10/general_child.py
EXCLUDES=./src/nni_manager/,./src/webui,./tools/nni_annotation/testcase/,./examples/trials/mnist-nas/*/mnist*.py,./examples/trials/nas_cifar10/src/cifar10/general_child.py
python3 -m flake8 . --count --exclude=$EXCLUDES --select=E9,F63,F72,F82 --show-source --statistics
displayName: 'Run flake8 tests to find Python syntax errors and undefined names'
- script: |
......
......@@ -16,9 +16,9 @@ Install NNI on each of your machines following the install guide [here](../Tutor
## Run an experiment
Install NNI on another machine which has network accessibility to those three machines above, or you can just use any machine above to run nnictl command line tool.
Install NNI on another machine which has network accessibility to those three machines above, or you can just run `nnictl` on any one of the three to launch the experiment.
We use `examples/trials/mnist-annotation` as an example here. `cat ~/nni/examples/trials/mnist-annotation/config_remote.yml` to see the detailed configuration file:
We use `examples/trials/mnist-annotation` as an example here. Shown here is `examples/trials/mnist-annotation/config_remote.yml`:
```yaml
authorName: default
......@@ -57,24 +57,15 @@ machineList:
username: bob
passwd: bob123
```
You can use different systems to run experiments on the remote machine.
#### Linux and MacOS
Simply fill in the `machineList` section and then run:
```bash
nnictl create --config ~/nni/examples/trials/mnist-annotation/config_remote.yml
```
to start the experiment.
#### Windows
Simply fill in the `machineList` section and then run:
Files in `codeDir` will be automatically uploaded to the remote machines. You can run NNI on different operating systems (Windows, Linux, MacOS) to spawn experiments on the remote machines (the remote machines must run Linux):
```bash
nnictl create --config %userprofile%\nni\examples\trials\mnist-annotation\config_remote.yml
nnictl create --config examples/trials/mnist-annotation/config_remote.yml
```
to start the experiment.
You can also use public/private key pairs instead of username/password for authentication. For advanced usages, please refer to [Experiment Config Reference](../Tutorial/ExperimentConfig.md).
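For example, a `machineList` entry using key-based authentication might look like the sketch below; the IP, username, and key path are placeholders, and the full list of fields is in the [Experiment Config Reference](../Tutorial/ExperimentConfig.md):

```yaml
machineList:
  - ip: 10.1.1.1                # placeholder remote host
    port: 22
    username: bob
    sshKeyPath: ~/.ssh/id_rsa   # private key used instead of passwd
    passphrase: qwert           # only needed if the key is passphrase-protected
```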
## Version check
## version check
NNI supports the version check feature since version 0.6, [refer](PaiMode.md)
\ No newline at end of file
NNI has supported the version check feature since version 0.6; see the [reference](PaiMode.md).
\ No newline at end of file
......@@ -112,10 +112,6 @@ trial:
memoryMB: 32869
#The docker image to run nni job on OpenPAI
image: msranni/nni:latest
#The hdfs directory to store data on OpenPAI, format 'hdfs://host:port/directory'
dataDir: hdfs://10.10.10.10:9000/username/nni
#The hdfs directory to store output data generated by nni, format 'hdfs://host:port/directory'
outputDir: hdfs://10.10.10.10:9000/username/nni
paiConfig:
#The username to login OpenPAI
userName: username
......@@ -125,7 +121,7 @@ paiConfig:
host: 10.10.10.10
```
Please change the default value to your personal account and machine information. Including `nniManagerIp`, `dataDir`, `outputDir`, `userName`, `passWord` and `host`.
Please change the default values to your personal account and machine information, including `nniManagerIp`, `userName`, `passWord` and `host`.
In the "trial" part, if you want to use GPU to perform the architecture search, change `gpuNum` from `0` to `1`. You need to increase the `maxTrialNum` and `maxExecDuration`, according to how long you want to wait for the search result.
......
# Experiment config reference
# Experiment Config Reference
A config file is needed when creating an experiment. The path of the config file is provided to `nnictl`.
The config file is in YAML format.
This document describes the rules to write the config file, and provides some examples and templates.
- [Experiment config reference](#experiment-config-reference)
- [Template](#template)
- [Configuration spec](#configuration-spec)
- [Examples](#examples)
- [Experiment Config Reference](#experiment-config-reference)
* [Template](#template)
* [Configuration Spec](#configuration-spec)
+ [authorName](#authorname)
+ [experimentName](#experimentname)
+ [trialConcurrency](#trialconcurrency)
+ [maxExecDuration](#maxexecduration)
+ [versionCheck](#versioncheck)
+ [debug](#debug)
+ [maxTrialNum](#maxtrialnum)
+ [trainingServicePlatform](#trainingserviceplatform)
+ [searchSpacePath](#searchspacepath)
+ [useAnnotation](#useannotation)
+ [multiPhase](#multiphase)
+ [multiThread](#multithread)
+ [nniManagerIp](#nnimanagerip)
+ [logDir](#logdir)
+ [logLevel](#loglevel)
+ [logCollection](#logcollection)
+ [tuner](#tuner)
- [builtinTunerName](#builtintunername)
- [codeDir](#codedir)
- [classFileName](#classfilename)
- [className](#classname)
- [classArgs](#classargs)
- [gpuIndices](#gpuindices)
- [includeIntermediateResults](#includeintermediateresults)
+ [assessor](#assessor)
- [builtinAssessorName](#builtinassessorname)
- [codeDir](#codedir-1)
- [classFileName](#classfilename-1)
- [className](#classname-1)
- [classArgs](#classargs-1)
+ [advisor](#advisor)
- [builtinAdvisorName](#builtinadvisorname)
- [codeDir](#codedir-2)
- [classFileName](#classfilename-2)
- [className](#classname-2)
- [classArgs](#classargs-2)
- [gpuIndices](#gpuindices-1)
+ [trial](#trial)
+ [localConfig](#localconfig)
- [gpuIndices](#gpuindices-2)
- [maxTrialNumPerGpu](#maxtrialnumpergpu)
- [useActiveGpu](#useactivegpu)
+ [machineList](#machinelist)
- [ip](#ip)
- [port](#port)
- [username](#username)
- [passwd](#passwd)
- [sshKeyPath](#sshkeypath)
- [passphrase](#passphrase)
- [gpuIndices](#gpuindices-3)
- [maxTrialNumPerGpu](#maxtrialnumpergpu-1)
- [useActiveGpu](#useactivegpu-1)
+ [kubeflowConfig](#kubeflowconfig)
- [operator](#operator)
- [storage](#storage)
- [nfs](#nfs)
- [keyVault](#keyvault)
- [azureStorage](#azurestorage)
- [uploadRetryCount](#uploadretrycount)
+ [paiConfig](#paiconfig)
- [userName](#username)
- [password](#password)
- [token](#token)
- [host](#host)
* [Examples](#examples)
+ [Local mode](#local-mode)
+ [Remote mode](#remote-mode)
+ [PAI mode](#pai-mode)
+ [Kubeflow mode](#kubeflow-mode)
+ [Kubeflow with azure storage](#kubeflow-with-azure-storage)
## Template
* __light weight(without Annotation and Assessor)__
* __Light weight (without Annotation and Assessor)__
```yaml
authorName:
......@@ -130,442 +199,481 @@ machineList:
passwd:
```
## Configuration spec
## Configuration Spec
* __authorName__
* Description
### authorName
__authorName__ is the name of the author who create the experiment.
Required. String.
TBD: add default value
The name of the author who creates the experiment.
* __experimentName__
* Description
*TBD: add default value.*
__experimentName__ is the name of the experiment created.
### experimentName
TBD: add default value
Required. String.
* __trialConcurrency__
* Description
The name of the experiment created.
__trialConcurrency__ specifies the max num of trial jobs run simultaneously.
*TBD: add default value.*
Note: if trialGpuNum is bigger than the free gpu numbers, and the trial jobs running simultaneously can not reach trialConcurrency number, some trial jobs will be put into a queue to wait for gpu allocation.
### trialConcurrency
* __maxExecDuration__
* Description
Required. Integer between 1 and 99999.
__maxExecDuration__ specifies the max duration time of an experiment.The unit of the time is {__s__, __m__, __h__, __d__}, which means {_seconds_, _minutes_, _hours_, _days_}.
Specifies the maximum number of trial jobs that run simultaneously.
Note: The maxExecDuration spec set the time of an experiment, not a trial job. If the experiment reach the max duration time, the experiment will not stop, but could not submit new trial jobs any more.
If trialGpuNum is larger than the number of free GPUs, and the number of trial jobs running simultaneously cannot reach __trialConcurrency__, some trial jobs will be put into a queue to wait for GPU allocation.
* __versionCheck__
* Description
### maxExecDuration
Optional. String. Default: 999d.
__maxExecDuration__ specifies the maximum duration of an experiment. The unit of the time is {__s__, __m__, __h__, __d__}, which means {_seconds_, _minutes_, _hours_, _days_}.
Note: The maxExecDuration spec sets the duration of an experiment, not of a trial job. If the experiment reaches the max duration, it will not stop, but it can no longer submit new trial jobs.
### versionCheck
Optional. Bool. Default: false.
NNI will check the version of nniManager process and the version of trialKeeper in remote, pai and kubernetes platform. If you want to disable version check, you could set versionCheck be false.
NNI will check the version of the nniManager process and the version of trialKeeper on the remote, pai and kubernetes platforms. If you want to disable version check, you can set versionCheck to false.
### debug
Optional. Bool. Default: false.
Debug mode will set versionCheck to false and logLevel to 'debug'.
### maxTrialNum
Optional. Integer between 1 and 99999. Default: 99999.
Specifies the max number of trial jobs created by NNI, including succeeded and failed jobs.
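To illustrate the fields described so far, here is a minimal sketch of the top of a config file; all values are illustrative:

```yaml
authorName: default
experimentName: example_mnist
trialConcurrency: 1     # at most one trial runs at a time
maxExecDuration: 1h     # stop submitting new trials after one hour
maxTrialNum: 10         # create at most 10 trials in total
```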
### trainingServicePlatform
* __debug__
* Description
Required. String.
Debug mode will set versionCheck be False and set logLevel be 'debug'
Specifies the platform to run the experiment, including __local__, __remote__, __pai__, __kubeflow__, __frameworkcontroller__.
* __maxTrialNum__
* Description
* __local__ run an experiment on local ubuntu machine.
__maxTrialNum__ specifies the max number of trial jobs created by NNI, including succeeded and failed jobs.
* __remote__ submit trial jobs to remote Ubuntu machines, and the __machineList__ field should be filled in order to set up the SSH connection to the remote machines.
* __trainingServicePlatform__
* Description
* __pai__ submit trial jobs to [OpenPAI](https://github.com/Microsoft/pai) of Microsoft. For more details of pai configuration, please refer to [Guide to PAI Mode](../TrainingService/PaiMode.md)
__trainingServicePlatform__ specifies the platform to run the experiment, including {__local__, __remote__, __pai__, __kubeflow__}.
* __kubeflow__ submit trial jobs to [kubeflow](https://www.kubeflow.org/docs/about/kubeflow/). NNI supports kubeflow based on normal kubernetes and [azure kubernetes](https://azure.microsoft.com/en-us/services/kubernetes-service/). For details, please refer to the [Kubeflow Docs](../TrainingService/KubeflowMode.md)
* __local__ run an experiment on local ubuntu machine.
* TODO: explain frameworkcontroller.
* __remote__ submit trial jobs to remote ubuntu machines, and __machineList__ field should be filed in order to set up SSH connection to remote machine.
### searchSpacePath
* __pai__ submit trial jobs to [OpenPai](https://github.com/Microsoft/pai) of Microsoft. For more details of pai configuration, please reference [PAIMOdeDoc](../TrainingService/PaiMode.md)
Optional. Path to existing file.
* __kubeflow__ submit trial jobs to [kubeflow](https://www.kubeflow.org/docs/about/kubeflow/), NNI support kubeflow based on normal kubernetes and [azure kubernetes](https://azure.microsoft.com/en-us/services/kubernetes-service/). Detail please reference [KubeflowDoc](../TrainingService/KubeflowMode.md)
Specifies the path of the search space file, which should be a valid path on the local Linux machine.
* __searchSpacePath__
* Description
The only case in which __searchSpacePath__ can be left unset is when `useAnnotation=True`.
__searchSpacePath__ specifies the path of search space file, which should be a valid path in the local linux machine.
### useAnnotation
Note: if set useAnnotation=True, the searchSpacePath field should be removed.
Optional. Bool. Default: false.
* __useAnnotation__
* Description
Use annotation to analyze trial code and generate the search space.
__useAnnotation__ use annotation to analysis trial code and generate search space.
Note: if __useAnnotation__ is true, the searchSpacePath field should be removed.
Note: if useAnnotation is set to True, the searchSpacePath field should be removed.
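A sketch of how these two fields interact; the file name is illustrative:

```yaml
# Using a search space file:
searchSpacePath: search_space.json
useAnnotation: false

# Using annotation instead: remove searchSpacePath and set
# useAnnotation: true
```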
### multiPhase
* __multiPhase__
* Description
Optional. Bool. Default: false.
__multiPhase__ enable [multi-phase experiment](../AdvancedFeature/MultiPhase.md).
Enable [multi-phase experiment](../AdvancedFeature/MultiPhase.md).
* __multiThread__
* Description
### multiThread
__multiThread__ enable multi-thread mode for dispatcher, if multiThread is set to `true`, dispatcher will start a thread to process each command from NNI Manager.
Optional. Bool. Default: false.
* __nniManagerIp__
* Description
Enable multi-thread mode for dispatcher. If multiThread is enabled, dispatcher will start a thread to process each command from NNI Manager.
__nniManagerIp__ set the IP address of the machine on which NNI manager process runs. This field is optional, and if it's not set, eth0 device IP will be used instead.
### nniManagerIp
Note: run ifconfig on NNI manager's machine to check if eth0 device exists. If not, we recommend to set nnimanagerIp explicitly.
Optional. String. Default: eth0 device IP.
* __logDir__
* Description
Set the IP address of the machine on which NNI manager process runs. This field is optional, and if it's not set, eth0 device IP will be used instead.
__logDir__ configures the directory to store logs and data of the experiment. The default value is `<user home directory>/nni/experiment`
Note: run `ifconfig` on the NNI manager's machine to check whether the eth0 device exists. If not, it is recommended to set __nniManagerIp__ explicitly.
* __logLevel__
* Description
### logDir
__logLevel__ sets log level for the experiment, available log levels are: `trace, debug, info, warning, error, fatal`. The default value is `info`.
Optional. Path to a directory. Default: `<user home directory>/nni/experiment`.
* __logCollection__
* Description
Configures the directory to store logs and data of the experiment.
__logCollection__ set the way to collect log in remote, pai, kubeflow, frameworkcontroller platform. There are two ways to collect log, one way is from `http`, trial keeper will post log content back from http request in this way, but this way may slow down the speed to process logs in trialKeeper. The other way is `none`, trial keeper will not post log content back, and only post job metrics. If your log content is too big, you could consider setting this param be `none`.
### logLevel
* __tuner__
* Description
Optional. String. Default: `info`.
__tuner__ specifies the tuner algorithm in the experiment, there are two kinds of ways to set tuner. One way is to use tuner provided by NNI sdk, need to set __builtinTunerName__ and __classArgs__. Another way is to use users' own tuner file, and need to set __codeDirectory__, __classFileName__, __className__ and __classArgs__.
* __builtinTunerName__ and __classArgs__
* __builtinTunerName__
Sets log level for the experiment. Available log levels are: `trace`, `debug`, `info`, `warning`, `error`, `fatal`.
__builtinTunerName__ specifies the name of system tuner, NNI sdk provides different tuners introduced [here](../Tuner/BuiltinTuner.md).
### logCollection
* __classArgs__
Optional. `http` or `none`. Default: `none`.
__classArgs__ specifies the arguments of tuner algorithm. Please refer to [this file](../Tuner/BuiltinTuner.md) for the configurable arguments of each built-in tuner.
* __codeDir__, __classFileName__, __className__ and __classArgs__
* __codeDir__
Set the way to collect logs on the remote, pai, kubeflow and frameworkcontroller platforms. There are two ways to collect logs. One is `http`: the trial keeper posts log content back via HTTP requests, which may slow down log processing in trialKeeper. The other is `none`: the trial keeper does not post log content back and only posts job metrics. If your log content is too big, consider setting this parameter to `none`.
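A sketch combining these optional experiment-level fields; all values are illustrative:

```yaml
nniManagerIp: 10.10.10.10          # set explicitly if the eth0 device does not exist
logDir: /home/user/nni/experiment  # defaults to <user home directory>/nni/experiment
logLevel: info
logCollection: none                # avoid posting large log content back
multiThread: false
```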
__codeDir__ specifies the directory of tuner code.
* __classFileName__
### tuner
__classFileName__ specifies the name of tuner file.
* __className__
Required.
__className__ specifies the name of tuner class.
* __classArgs__
Specifies the tuner algorithm in the experiment. There are two ways to set the tuner. One way is to use a tuner provided by the NNI sdk (a built-in tuner), in which case you need to set __builtinTunerName__ and __classArgs__. The other way is to use your own tuner file, in which case __codeDir__, __classFileName__, __className__ and __classArgs__ are needed. *Users must choose exactly one way.*
__classArgs__ specifies the arguments of tuner algorithm.
#### builtinTunerName
* __gpuIndices__
Required if using built-in tuners. String.
__gpuIndices__ specifies the gpus that can be used by the tuner process. Single or multiple GPU indices can be specified, multiple GPU indices are seperated by comma(,), such as `1` or `0,1,3`. If the field is not set, `CUDA_VISIBLE_DEVICES` will be '' in script, that is, no GPU is visible to tuner.
Specifies the name of a built-in tuner; the NNI sdk provides different tuners, introduced [here](../Tuner/BuiltinTuner.md).
* __includeIntermediateResults__
#### codeDir
If __includeIntermediateResults__ is true, the last intermediate result of the trial that is early stopped by assessor is sent to tuner as final result. The default value of __includeIntermediateResults__ is false.
Required if using customized tuners. Path relative to the location of config file.
Note: users could only use one way to specify tuner, either specifying `builtinTunerName` and `classArgs`, or specifying `codeDir`, `classFileName`, `className` and `classArgs`.
Specifies the directory of tuner code.
* __assessor__
#### classFileName
* Description
Required if using customized tuners. File path relative to __codeDir__.
__assessor__ specifies the assessor algorithm to run an experiment, there are two kinds of ways to set assessor. One way is to use assessor provided by NNI sdk, users need to set __builtinAssessorName__ and __classArgs__. Another way is to use users' own assessor file, and need to set __codeDirectory__, __classFileName__, __className__ and __classArgs__.
* __builtinAssessorName__ and __classArgs__
* __builtinAssessorName__
Specifies the name of tuner file.
__builtinAssessorName__ specifies the name of built-in assessor, NNI sdk provides different assessors introducted [here](../Assessor/BuiltinAssessor.md).
* __classArgs__
#### className
__classArgs__ specifies the arguments of assessor algorithm
Required if using customized tuners. String.
* __codeDir__, __classFileName__, __className__ and __classArgs__
Specifies the name of tuner class.
* __codeDir__
#### classArgs
__codeDir__ specifies the directory of assessor code.
Optional. Key-value pairs. Default: empty.
* __classFileName__
Specifies the arguments of tuner algorithm. Please refer to [this file](../Tuner/BuiltinTuner.md) for the configurable arguments of each built-in tuner.
__classFileName__ specifies the name of assessor file.
#### gpuIndices
* __className__
Optional. String. Default: empty.
__className__ specifies the name of assessor class.
Specifies the GPUs that can be used by the tuner process. Single or multiple GPU indices can be specified. Multiple GPU indices are separated by comma `,`. For example, `1`, or `0,1,3`. If the field is not set, no GPU will be visible to tuner (by setting `CUDA_VISIBLE_DEVICES` to be an empty string).
* __classArgs__
#### includeIntermediateResults
__classArgs__ specifies the arguments of assessor algorithm.
Optional. Bool. Default: false.
Note: users could only use one way to specify assessor, either specifying `builtinAssessorName` and `classArgs`, or specifying `codeDir`, `classFileName`, `className` and `classArgs`. If users do not want to use assessor, assessor fileld should leave to empty.
If __includeIntermediateResults__ is true, the last intermediate result of the trial that is early stopped by assessor is sent to tuner as final result.
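For reference, a sketch of the two ways to configure a tuner; the argument values are illustrative, and the arguments each built-in tuner accepts are listed in [the built-in tuner doc](../Tuner/BuiltinTuner.md):

```yaml
# Built-in tuner
tuner:
  builtinTunerName: TPE
  classArgs:
    optimize_mode: maximize

# Customized tuner (mutually exclusive with the form above; names are hypothetical)
#tuner:
#  codeDir: ~/mytuner
#  classFileName: my_tuner.py
#  className: MyTuner
#  classArgs:
#    arg1: value1
```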
* __advisor__
* Description
### assessor
__advisor__ specifies the advisor algorithm in the experiment, there are two kinds of ways to specify advisor. One way is to use advisor provided by NNI sdk, need to set __builtinAdvisorName__ and __classArgs__. Another way is to use users' own advisor file, and need to set __codeDirectory__, __classFileName__, __className__ and __classArgs__.
* __builtinAdvisorName__ and __classArgs__
* __builtinAdvisorName__
Specifies the assessor algorithm to run in an experiment. Similar to tuners, there are two ways to set the assessor. One way is to use an assessor provided by the NNI sdk, in which case you need to set __builtinAssessorName__ and __classArgs__. The other way is to use your own assessor file, in which case __codeDir__, __classFileName__, __className__ and __classArgs__ are needed. *Users must choose exactly one way.*
__builtinAdvisorName__ specifies the name of a built-in advisor, NNI sdk provides [different advisors](../Tuner/BuiltinTuner.md).
By default, there is no assessor enabled.
* __classArgs__
#### builtinAssessorName
__classArgs__ specifies the arguments of the advisor algorithm. Please refer to [this file](../Tuner/BuiltinTuner.md) for the configurable arguments of each built-in advisor.
* __codeDir__, __classFileName__, __className__ and __classArgs__
* __codeDir__
Required if using built-in assessors. String.
__codeDir__ specifies the directory of advisor code.
* __classFileName__
Specifies the name of built-in assessor, NNI sdk provides different assessors introduced [here](../Assessor/BuiltinAssessor.md).
__classFileName__ specifies the name of advisor file.
* __className__
#### codeDir
__className__ specifies the name of advisor class.
* __classArgs__
Required if using customized assessors. Path relative to the location of config file.
__classArgs__ specifies the arguments of advisor algorithm.
Specifies the directory of assessor code.
* __gpuIndices__
#### classFileName
__gpuIndices__ specifies the gpus that can be used by the advisor process. Single or multiple GPU indices can be specified, multiple GPU indices are seperated by comma(,), such as `1` or `0,1,3`. If the field is not set, `CUDA_VISIBLE_DEVICES` will be '' in script, that is, no GPU is visible to tuner.
Required if using customized assessors. File path relative to __codeDir__.
Note: users could only use one way to specify advisor, either specifying `builtinAdvisorName` and `classArgs`, or specifying `codeDir`, `classFileName`, `className` and `classArgs`.
Specifies the name of assessor file.
* __trial(local, remote)__
#### className
* __command__
Required if using customized assessors. String.
__command__ specifies the command to run trial process.
Specifies the name of assessor class.
* __codeDir__
#### classArgs
__codeDir__ specifies the directory of your own trial file.
Optional. Key-value pairs. Default: empty.
* __gpuNum__
Specifies the arguments of assessor algorithm.
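A sketch of a built-in assessor configuration; the arguments are illustrative, and the supported assessors and their arguments are described in [the built-in assessor doc](../Assessor/BuiltinAssessor.md):

```yaml
assessor:
  builtinAssessorName: Medianstop
  classArgs:
    optimize_mode: maximize
    start_step: 5   # illustrative: start assessing after 5 intermediate results
```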
__gpuNum__ specifies the num of gpu to run the trial process. Default value is 0.
### advisor
* __trial(pai)__
Optional.
* __command__
Specifies the advisor algorithm in the experiment. Similar to tuners and assessors, there are two ways to specify the advisor. One way is to use an advisor provided by the NNI sdk, in which case you need to set __builtinAdvisorName__ and __classArgs__. The other way is to use your own advisor file, in which case __codeDir__, __classFileName__, __className__ and __classArgs__ are needed.
__command__ specifies the command to run trial process.
When advisor is enabled, settings of tuner and assessor will be bypassed.
* __codeDir__
#### builtinAdvisorName
__codeDir__ specifies the directory of the own trial file.
Specifies the name of a built-in advisor. NNI sdk provides [BOHB](../Tuner/BohbAdvisor.md) and [Hyperband](../Tuner/HyperbandAdvisor.md).
* __gpuNum__
#### codeDir
__gpuNum__ specifies the num of gpu to run the trial process. Default value is 0.
Required if using customized advisors. Path relative to the location of config file.
* __cpuNum__
Specifies the directory of advisor code.
__cpuNum__ is the cpu number of cpu to be used in pai container.
#### classFileName
* __memoryMB__
Required if using customized advisors. File path relative to __codeDir__.
__memoryMB__ set the momory size to be used in pai's container.
Specifies the name of advisor file.
* __image__
#### className
__image__ set the image to be used in pai.
Required if using customized advisors. String.
* __dataDir__
Specifies the name of advisor class.
__dataDir__ is the data directory in hdfs to be used.
#### classArgs
* __outputDir__
Optional. Key-value pairs. Default: empty.
__outputDir__ is the output directory in hdfs to be used in pai, the stdout and stderr files are stored in the directory after job finished.
Specifies the arguments of advisor.
* __trial(kubeflow)__
#### gpuIndices
* __codeDir__
Optional. String. Default: empty.
__codeDir__ is the local directory where the code files in.
Specifies the GPUs that can be used. Single or multiple GPU indices can be specified. Multiple GPU indices are separated by comma `,`. For example, `1`, or `0,1,3`. If the field is not set, no GPU will be visible to tuner (by setting `CUDA_VISIBLE_DEVICES` to be an empty string).
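A sketch of a built-in advisor configuration; the arguments are illustrative, and remember that when this block is present, the tuner and assessor settings are bypassed:

```yaml
advisor:
  builtinAdvisorName: Hyperband
  classArgs:
    optimize_mode: maximize
```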
* __ps(optional)__
### trial
__ps__ is the configuration for kubeflow's tensorflow-operator.
Required. Key-value pairs.
* __replicas__
In local and remote mode, the following keys are required.
__replicas__ is the replica number of __ps__ role.
* __command__: Required string. Specifies the command to run trial process.
* __command__
* __codeDir__: Required string. Specifies the directory of your own trial file. This directory will be automatically uploaded in remote mode.
__command__ is the run script in __ps__'s container.
* __gpuNum__: Optional integer. Specifies the num of gpu to run the trial process. Default value is 0.
* __gpuNum__
In PAI mode, the following keys are required.
__gpuNum__ set the gpu number to be used in __ps__ container.
* __command__: Required string. Specifies the command to run trial process.
* __cpuNum__
* __codeDir__: Required string. Specifies the directory of your own trial files. Files in the directory will be uploaded in PAI mode.
__cpuNum__ set the cpu number to be used in __ps__ container.
* __gpuNum__: Required integer. Specifies the num of gpu to run the trial process. Default value is 0.
* __memoryMB__
* __cpuNum__: Required integer. Specifies the number of CPUs to be used in the pai container.
__memoryMB__ set the memory size of the container.
* __memoryMB__: Required integer. Set the memory size to be used in pai container, in megabytes.
* __image__
* __image__: Required string. Set the image to be used in pai.
__image__ set the image to be used in __ps__.
* __authFile__: Optional string. Used to provide Docker registry which needs authentication for image pull in PAI. [Reference](https://github.com/microsoft/pai/blob/2ea69b45faa018662bc164ed7733f6fdbb4c42b3/docs/faq.md#q-how-to-use-private-docker-registry-job-image-when-submitting-an-openpai-job).
* __worker__
* __shmMB__: Optional integer. Shared memory size of container.
__worker__ is the configuration for kubeflow's tensorflow-operator.
* __portList__: List of key-values pairs with `label`, `beginAt`, `portNumber`. See [job tutorial of PAI](https://github.com/microsoft/pai/blob/master/docs/job_tutorial.md) for details.
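A PAI-mode trial section using these keys might look like the following sketch; the resource numbers, image, and `authFile` path are placeholders, and the optional `portList` is omitted:
```yaml
trial:
  command: python3 mnist.py
  codeDir: ~/nni/examples/trials/mnist
  gpuNum: 0
  cpuNum: 1
  memoryMB: 8192
  image: msranni/nni:latest
  shmMB: 4096
  authFile: ~/nni/.docker_config.json
```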
In Kubeflow mode, the following keys are required.
* __codeDir__: The local directory where the code files are.
* __ps__: An optional configuration for kubeflow's tensorflow-operator, which includes
    * __replicas__: The replica number of the __ps__ role.
    * __command__: The run script in __ps__'s container.
    * __gpuNum__: The number of GPUs to be used in the __ps__ container.
    * __cpuNum__: The number of CPUs to be used in the __ps__ container.
    * __memoryMB__: The memory size of the container.
    * __image__: The image to be used in __ps__.
* __worker__: An optional configuration for kubeflow's tensorflow-operator, which includes
    * __replicas__: The replica number of the __worker__ role.
    * __command__: The run script in __worker__'s container.
    * __gpuNum__: The number of GPUs to be used in the __worker__ container.
    * __cpuNum__: The number of CPUs to be used in the __worker__ container.
    * __memoryMB__: The memory size of the container.
    * __image__: The image to be used in __worker__.
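For illustration, a Kubeflow-mode trial section with one ps and one worker might look like this sketch (commands, images, and resource numbers are placeholders):
```yaml
trial:
  codeDir: .
  ps:
    replicas: 1
    command: python3 dist_mnist.py
    gpuNum: 0
    cpuNum: 1
    memoryMB: 8192
    image: msranni/nni:latest
  worker:
    replicas: 1
    command: python3 dist_mnist.py
    gpuNum: 1
    cpuNum: 1
    memoryMB: 8192
    image: msranni/nni:latest
```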
### localConfig
Optional in local mode. Key-value pairs.
Only applicable if __trainingServicePlatform__ is set to `local`; otherwise there should not be a __localConfig__ section in the configuration file.
#### gpuIndices
Optional. String. Default: none.
Used to specify designated GPU devices for NNI. If it is set, only the specified GPU devices are used for NNI trial jobs. Single or multiple GPU indices can be specified; multiple GPU indices should be separated with a comma (`,`), such as `1` or `0,1,3`. By default, all available GPUs will be used.
#### maxTrialNumPerGpu
Optional. Integer. Default: 99999.
Used to specify the max concurrency trial number on a GPU device.
#### useActiveGpu
Optional. Bool. Default: false.
Used to specify whether to use a GPU even if there is another process running on it. By default, NNI will use a GPU only if there is no other active process on it. If __useActiveGpu__ is set to true, NNI will use the GPU regardless of other processes. This field is not applicable for NNI on Windows.
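A minimal localConfig illustrating these fields might look like this sketch (the GPU indices are placeholders):
```yaml
localConfig:
  gpuIndices: 0,1
  maxTrialNumPerGpu: 2
  useActiveGpu: false
```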
### machineList
Required in remote mode. A list of key-value pairs with the following keys.
#### ip
Required. IP address that is accessible from the current machine.
The IP address of the remote machine.
#### port
Optional. Integer. Valid port. Default: 22.
The ssh port used to connect to the machine.
#### username
Required if authenticating with username/password. String.
The account of the remote machine.
#### passwd
Required if authenticating with username/password. String.
Specifies the password of the account.
#### sshKeyPath
Required if authenticating with an ssh key. Path to the private key file.
If users use an ssh key to log in to the remote machine, __sshKeyPath__ should be a valid path to an ssh key file.
*Note: if users set passwd and sshKeyPath simultaneously, NNI will try passwd first.*
#### passphrase
Optional. String.
Used to protect ssh key, which could be empty if users don't have passphrase.
#### gpuIndices
Optional. String. Default: none.
Used to specify designated GPU devices for NNI. If it is set, only the specified GPU devices are used for NNI trial jobs. Single or multiple GPU indices can be specified; multiple GPU indices should be separated with a comma (`,`), such as `1` or `0,1,3`. By default, all available GPUs will be used.
#### maxTrialNumPerGpu
Optional. Integer. Default: 99999.
Used to specify the max concurrency trial number on a GPU device.
#### useActiveGpu
Optional. Bool. Default: false.
Used to specify whether to use a GPU even if there is another process running on it. By default, NNI will use a GPU only if there is no other active process on it. If __useActiveGpu__ is set to true, NNI will use the GPU regardless of other processes. This field is not applicable for NNI on Windows.
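For reference, a machineList with one password-authenticated machine and one ssh-key-authenticated machine might look like the following sketch; IPs and credentials are placeholders:
```yaml
machineList:
  - ip: 10.10.10.10
    port: 22
    username: test
    passwd: test
  - ip: 10.10.10.11
    username: test
    sshKeyPath: ~/.ssh/id_rsa
    passphrase: qwert
```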
### kubeflowConfig
#### operator
Required. String. Has to be `tf-operator` or `pytorch-operator`.
Specifies the Kubeflow operator to be used. NNI supports `tf-operator` in the current version.
#### storage
Optional. String. Default: `nfs`.
Specifies the storage type of kubeflow, including `nfs` and `azureStorage`.
#### nfs
Required if using nfs. Key-value pairs.
* __server__ is the host of the nfs server.
* __path__ is the mounted path of nfs.
#### keyVault
Required if using azure storage. Key-value pairs.
Set __keyVault__ to store the private key of your azure storage account. Refer to https://docs.microsoft.com/en-us/azure/key-vault/key-vault-manage-with-cli2.
* __vaultName__ is the value of `--vault-name` used in the az command.
* __name__ is the value of `--name` used in the az command.
#### azureStorage
Required if using azure storage. Key-value pairs.
Set the azure storage account to store code files.
* __accountName__ is the name of the azure storage account.
* __azureShare__ is the share of the azure file storage.
#### uploadRetryCount
Required if using azure storage. Integer between 1 and 99999.
If uploading files to azure storage fails, NNI will retry the upload; this field specifies the number of attempts to re-upload files.
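Putting these fields together, a kubeflowConfig using azure storage might look like this sketch; the vault, secret, account, and share names are placeholders:
```yaml
kubeflowConfig:
  operator: tf-operator
  storage: azureStorage
  keyVault:
    vaultName: nni-demo-vault
    name: nni-demo-storage-key
  azureStorage:
    accountName: nnidemostorage
    azureShare: nni-share
  uploadRetryCount: 3
```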
### paiConfig
#### userName
Required. String.
The user name of your pai account.
#### password
Required if using password authentication. String.
The password of the pai account.
#### token
Required if using token authentication. String.
Personal access token that can be retrieved from the PAI portal.
#### host
Required. String.
The hostname or IP address of PAI.
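For instance, a paiConfig using token authentication instead of a password might look like this sketch (the values are placeholders):
```yaml
paiConfig:
  userName: your_pai_user
  token: your_pai_access_token
  host: 10.10.10.10
```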
## Examples
### Local mode
If users want to run trial jobs on the local machine and use annotation to generate the search space, they can use the following config:
```yaml
authorName: test
......@@ -589,7 +697,7 @@ machineList:
gpuNum: 0
```
You can add an assessor configuration:
```yaml
authorName: test
......@@ -620,7 +728,7 @@ machineList:
gpuNum: 0
```
Or you could specify your own tuner and assessor files as follows:
```yaml
authorName: test
......@@ -653,9 +761,9 @@ machineList:
gpuNum: 0
```
### Remote mode
To run trial jobs on remote machines, users could specify the remote machine information in the following format:
```yaml
authorName: test
......@@ -695,7 +803,7 @@ machineList:
passphrase: qwert
```
### PAI mode
```yaml
authorName: test
......@@ -723,10 +831,6 @@ machineList:
memoryMB: 10000
#The docker image to run NNI job on pai
image: msranni/nni:latest
paiConfig:
#The username to login pai
userName: test
......@@ -736,7 +840,7 @@ machineList:
host: 10.10.10.10
```
### Kubeflow mode
Kubeflow with nfs storage:
......@@ -773,7 +877,7 @@ machineList:
path: /var/nfs/general
```
### Kubeflow with azure storage
```yaml
authorName: default
......
......@@ -32,9 +32,17 @@ Config the network mode to bridge mode or other mode that could make virtual mac
### Could not open webUI link
There are several possible reasons why the WebUI cannot be opened:
* `http://127.0.0.1`, `http://172.17.0.1` and `http://10.0.0.15` refer to localhost. If you start your experiment on a server or remote machine, you can replace the IP with your server IP to view the WebUI, like `http://[your_server_ip]:8080`
* If you still can't see the WebUI after you use the server IP, you can check the proxy and the firewall of your machine. Or use the browser on the machine where you start your NNI experiment.
* Another reason may be that your experiment failed, so NNI could not get the experiment information. You can check the NNIManager log in the following directory: `~/nni/experiment/[your_experiment_id]/log/nnimanager.log`
### Restful server start failed
Probably it's a problem with your network config. Here is a checklist.
* You might need to link `127.0.0.1` with `localhost`. Add a line `127.0.0.1 localhost` to `/etc/hosts`.
* It's also possible that you have set some proxy config. Check your environment for variables like `HTTP_PROXY` or `HTTPS_PROXY` and unset if they are set.
### NNI on Windows problems
Please refer to [NNI on Windows](NniOnWindows.md)
......
# Installation of NNI
Currently we support installation on Linux, Mac and Windows.
## **Installation on Linux & Mac**
......
# NNI on Windows (experimental feature)
Running NNI on Windows is an experimental feature. Windows 10.1809 is well tested and recommended.
## **Installation on Windows**
......@@ -41,6 +41,9 @@ Make sure C++ 14.0 compiler installed then try to run `nnictl package install --
### Not supported tuner on Windows
SMAC is not supported currently, the specific reason can be referred to this [GitHub issue](https://github.com/automl/SMAC3/issues/483).
### Use a Windows server as a remote worker
Currently you can't.
Note:
* If there is any error like `Segmentation fault`, please refer to [FAQ](FAQ.md)
[docs/img/overview.svg: NNI architecture overview diagrams showing the Command Line Tool (NNICTL), the Visualized UI (NNI Board), the Python SDK, and NNI modules such as neural architecture search and model compression.]
......@@ -12,3 +12,4 @@
Comparison of Neural Architecture Search (NAS) <NasComparision>
Comparison of Hyperparameter Tuning Algorithms <HpoComparision>
Parallelizing TPE Search <ParallelizingTpeSearch>
Automatically Tuning Systems with NNI <TuningSystems>
......@@ -84,7 +84,7 @@ config_list_agp = [{'initial_sparsity': 0, 'final_sparsity': conv0_sparsity,
{'initial_sparsity': 0, 'final_sparsity': conv1_sparsity,
'start_epoch': 0, 'end_epoch': 3,
'frequency': 1,'op_name': 'conv1' },]
PRUNERS = {'level':LevelPruner(model, config_list_level)'agp':AGP_Pruner(model, config_list_agp)}
PRUNERS = {'level':LevelPruner(model, config_list_level), 'agp':AGP_Pruner(model, config_list_agp)}
pruner = PRUNERS[params['prune_method']['_name']]
pruner.compress()
... # fine tuning
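For orientation, a minimal sketch of the kind of nested search space this snippet assumes is shown below; the layout is illustrative only (the real one lives in the example's search-space file), but it shows where `_name`, `conv0_sparsity`, and `conv1_sparsity` would come from:

```python
# Illustrative only: normally this lives in the experiment's search_space.json.
# A nested "choice" entry is handed to the trial as a dict with a "_name" key
# plus the sampled values, so params['prune_method']['_name'] is 'level' or 'agp'.
search_space = {
    "prune_method": {
        "_type": "choice",
        "_value": [
            {"_name": "level",
             "sparsity": {"_type": "uniform", "_value": [0.1, 0.9]}},
            {"_name": "agp",
             "conv0_sparsity": {"_type": "uniform", "_value": [0.1, 0.9]},
             "conv1_sparsity": {"_type": "uniform", "_value": [0.1, 0.9]}},
        ],
    },
}
```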
......
......@@ -6,16 +6,23 @@ NNI provides an easy-to-use toolkit to help users design and use compression algorithms
## Supported algorithms
NNI provides two naive compression algorithms and three popular compression algorithms, including two pruning algorithms and three quantization algorithms:
NNI provides several compression algorithms, including pruning and quantization algorithms:
**Pruning**
| Name | Algorithm description |
| ----------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------- |
| [Level Pruner](./Pruner.md#level-pruner) | Prunes weights proportionally according to their absolute values. |
| [AGP Pruner](./Pruner.md#agp-pruner) | Automated gradual pruning (to prune, or not to prune: exploring the efficacy of pruning for model compression). [Reference paper](https://arxiv.org/abs/1710.01878) |
| [L1Filter Pruner](./Pruner.md#l1filter-pruner) | Prunes the least important filters in convolution layers (Pruning Filters for Efficient ConvNets). [Reference paper](https://arxiv.org/abs/1608.08710) |
| [Slim Pruner](./Pruner.md#slim-pruner) | Prunes channels in convolution layers by pruning the scaling factors in BN layers (Learning Efficient Convolutional Networks through Network Slimming). [Reference paper](https://arxiv.org/abs/1708.06519) |
| [Lottery Ticket Pruner](./Pruner.md#agp-pruner) | The pruning process proposed in "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks". It prunes the model iteratively. [Reference paper](https://arxiv.org/abs/1803.03635) |
| [FPGM Pruner](./Pruner.md#fpgm-pruner) | Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration. [Reference paper](https://arxiv.org/pdf/1811.00250.pdf) |
**Quantization**
| Name | Algorithm description |
| --------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [Level Pruner](./Pruner.md#level-pruner) | Prunes weights proportionally according to their absolute values. |
| [AGP Pruner](./Pruner.md#agp-pruner) | Automated gradual pruning (to prune, or not to prune: exploring the efficacy of pruning for model compression). [Reference paper](https://arxiv.org/abs/1710.01878) |
| [L1Filter Pruner](./Pruner.md#l1filter-pruner) | Prunes the least important filters in convolution layers (Pruning Filters for Efficient ConvNets). [Reference paper](https://arxiv.org/abs/1608.08710) |
| [Slim Pruner](./Pruner.md#slim-pruner) | Prunes channels in convolution layers by pruning the scaling factors in BN layers (Learning Efficient Convolutional Networks through Network Slimming). [Reference paper](https://arxiv.org/abs/1708.06519) |
| [Lottery Ticket Pruner](./Pruner.md#agp-pruner) | The pruning process proposed in "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks". It prunes the model iteratively. [Reference paper](https://arxiv.org/abs/1803.03635) |
| [FPGM Pruner](./Pruner.md#fpgm-pruner) | Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration. [Reference paper](https://arxiv.org/pdf/1811.00250.pdf) |
| [Naive Quantizer](./Quantizer.md#naive-quantizer) | Quantizes weights to 8 bits by default. |
| [QAT Quantizer](./Quantizer.md#qat-quantizer) | Quantization and training of neural networks for efficient integer-arithmetic-only inference. [Reference paper](http://openaccess.thecvf.com/content_cvpr_2018/papers/Jacob_Quantization_and_Training_CVPR_2018_paper.pdf) |
| [DoReFa Quantizer](./Quantizer.md#dorefa-quantizer) | DoReFa-Net: training low-bitwidth convolutional neural networks with low-bitwidth gradients. [Reference paper](https://arxiv.org/abs/1606.06160) |
......@@ -24,25 +31,26 @@ NNI provides two naive compression algorithms and three popular compression algorithms, including
A simple example shows how to modify the trial code to use a compression algorithm. For instance, to prune 80% of the weights with Level Pruner, add the following to your code before training the model ([full code](https://github.com/microsoft/nni/tree/master/examples/model_compress)).
TensorFlow code
PyTorch code
```python
from nni.compression.tensorflow import LevelPruner
from nni.compression.torch import LevelPruner
config_list = [{ 'sparsity': 0.8, 'op_types': ['default'] }]
pruner = LevelPruner(tf.get_default_graph(), config_list)
pruner = LevelPruner(model, config_list)
pruner.compress()
```
PyTorch code
TensorFlow code
```python
from nni.compression.torch import LevelPruner
from nni.compression.tensorflow import LevelPruner
config_list = [{ 'sparsity': 0.8, 'op_types': ['default'] }]
pruner = LevelPruner(model, config_list)
pruner = LevelPruner(tf.get_default_graph(), config_list)
pruner.compress()
```
Other compression algorithms in `nni.compression` can be used in the same way. The algorithms are implemented in `nni.compression.torch` and `nni.compression.tensorflow` to support PyTorch and TensorFlow respectively. Refer to [Pruner](./Pruner.md) and [Quantizer](./Quantizer.md) for more details about the supported algorithms.
Other compression algorithms in `nni.compression` can be used in the same way. The algorithms are implemented in `nni.compression.torch` and `nni.compression.tensorflow` to support PyTorch and TensorFlow respectively. Refer to [Pruner](./Pruner.md) and [Quantizer](./Quantizer.md) for more details about the supported algorithms. In addition, to use a knowledge distillation algorithm, refer to the [KD example](../TrialExample/KDExample.md).
The call to `pruner.compress()` modifies the user-defined model (in TensorFlow the model is obtained through `tf.get_default_graph()`, while in PyTorch `model` is the defined model class) and inserts masks into it. When the model is then run, the masks take effect. The masks can be adjusted by the algorithm at runtime.
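As a minimal PyTorch-flavoured sketch of that flow (here `model`, `optimizer`, and `train_loader` are placeholders): once `compress()` has inserted the masks, fine-tuning is just an ordinary training loop and the masks stay in effect.

```python
import torch.nn.functional as F
from nni.compression.torch import LevelPruner

config_list = [{'sparsity': 0.8, 'op_types': ['default']}]
pruner = LevelPruner(model, config_list)   # `model` is your torch.nn.Module
pruner.compress()                          # inserts masks into the selected layers

# fine-tune as usual; the masks are applied whenever the model runs
for data, target in train_loader:
    optimizer.zero_grad()
    loss = F.cross_entropy(model(data), target)
    loss.backward()
    optimizer.step()
```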
......
......@@ -240,16 +240,17 @@ print("Pipeline Score: ", pipeline.score(X_train, y_train))
# Benchmark
`Baseline` means no feature selection is performed and the data is fed to LogisticRegression directly. In this benchmark, only 10% of the training data is used as test data.
| Dataset | Baseline | GradientFeatureSelector | TreeBasedClassifier | #Training samples | #Features |
| ------------- | -------- | ----------------------- | ------------------- | ----------------- | --------- |
| colon-cancer | 0.7547 | 0.7368 | 0.7223 | 62 | 2,000 |
| gisette | 0.9725 | 0.89416 | 0.9792 | 6,000 | 5,000 |
| avazu | 0.8834 | N/A | N/A | 40,428,967 | 1,000,000 |
| rcv1 | 0.9644 | 0.7333 | 0.9615 | 20,242 | 47,236 |
| news20.binary | 0.9208 | 0.6870 | 0.9070 | 19,996 | 1,355,191 |
| real-sim | 0.9681 | 0.7969 | 0.9591 | 72,309 | 20,958 |
The datasets for this benchmark can be downloaded [here](https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/).
`Baseline` means no feature selection is performed and the data is fed to LogisticRegression directly. In this benchmark, only 10% of the training data is used as test data. For GradientFeatureSelector, only the top 20 features are kept. The metrics below are the mean accuracy on the given test data and labels.
| Dataset | All features + LR (acc, time, memory) | GradientFeatureSelector + LR (acc, time, memory) | TreeBasedClassifier + LR (acc, time, memory) | #Training samples | #Features |
| ------------- | ------------------------------------- | ------------------------------------------------ | -------------------------------------------- | ----------------- | --------- |
| colon-cancer | 0.7547, 890ms, 348MiB | 0.7368, 363ms, 286MiB | 0.7223, 171ms, 1171 MiB | 62 | 2,000 |
| gisette | 0.9725, 215ms, 584MiB | 0.89416, 446ms, 397MiB | 0.9792, 911ms, 234MiB | 6,000 | 5,000 |
| avazu | 0.8834, N/A, N/A | N/A, N/A, N/A | N/A, N/A, N/A | 40,428,967 | 1,000,000 |
| rcv1 | 0.9644, 557ms, 241MiB | 0.7333, 401ms, 281MiB | 0.9615, 752ms, 284MiB | 20,242 | 47,236 |
| news20.binary | 0.9208, 707ms, 361MiB | 0.6870, 565ms, 371MiB | 0.9070, 904ms, 364MiB | 19,996 | 1,355,191 |
| real-sim | 0.9681, 433ms, 274MiB | 0.7969, 251ms, 274MiB | 0.9591, 643ms, 367MiB | 72,309 | 20,958 |
The datasets for this benchmark can be downloaded [here](https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/).
The code can be found at `/examples/feature_engineering/gradient_feature_selector/benchmark_test.py`.
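A hedged sketch of the kind of pipeline the benchmark uses, assuming the `FeatureGradientSelector` import path and its scikit-learn style `fit`/`transform` interface (the dataset variables are placeholders):

```python
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# import path assumed from the NNI feature-engineering package layout
from nni.feature_engineering.gradient_selector import FeatureGradientSelector

# keep the top 20 features, as in the benchmark above, then fit a plain LR on them
pipeline = make_pipeline(
    FeatureGradientSelector(n_features=20),
    LogisticRegression(),
)
pipeline.fit(X_train, y_train)
print("Pipeline Score: ", pipeline.score(X_test, y_test))
```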
# DARTS on NNI
## Introduction
The paper [DARTS: Differentiable Architecture Search](https://arxiv.org/abs/1806.09055) addresses the scalability challenge of architecture search in a differentiable way. The method is based on a continuous relaxation of the architecture representation, which allows gradient descent to be used for architecture search.
To implement this, the authors optimize the network weights and the architecture weights alternately on mini-batches. They further explore the possibility of using second-order optimization (unrolling) instead of first-order optimization to improve performance.
The NNI implementation is based on the [official implementation](https://github.com/quark0/darts) as well as a [third-party implementation](https://github.com/khanrc/pt.darts). So far, both first-order and second-order optimization with training from scratch on CIFAR10 have been implemented.
## Reproduced results
To reproduce the results in the paper, we ran experiments with both first-order and second-order optimization. Due to time limits, we retrained the *best architecture* from the second stage only *once*. Our results are currently on par with those reported in the paper. More results will be added later.
|                        | In paper      | Reproduction |
| ---------------------- | ------------- | ------------ |
| First order (CIFAR10)  | 3.00 +/- 0.14 | 2.78         |
| Second order (CIFAR10) | 2.76 +/- 0.09 | 2.89         |
# ENAS on NNI
## Introduction
The paper [Efficient Neural Architecture Search via Parameter Sharing](https://arxiv.org/abs/1802.03268) speeds up the NAS process by sharing parameters among child models. In ENAS, a controller discovers neural network architectures by learning to search for an optimal subgraph within a large computational graph. The controller is trained with policy gradient to select the subgraph that maximizes the expected reward on the validation set, while the model corresponding to the selected subgraph is trained to minimize the canonical cross-entropy loss.
The NNI implementation is based on the [official TensorFlow implementation](https://github.com/melodyguan/enas) and covers the macro/micro search spaces on CIFAR10. The training-from-scratch code in NNI is not finished yet, so there are no reproduced results for now.
......@@ -2,8 +2,6 @@
We are trying to support various NAS algorithms through a unified programming interface, and it is currently in an experimental stage. This means the current programming interface may change significantly.
*The previous [NAS annotation](../AdvancedFeature/GeneralNasInterfaces.md) interface will be deprecated soon.*
## Programming interface for models
A programming interface for designing and searching a model is needed in two scenarios.
......@@ -55,7 +53,7 @@ def forward(self, x):
out = self.input_switch([in_tensor1, in_tensor2, in_tensor3])
...
```
`InputChoice` is a PyTorch module that needs meta information at initialization, for example, how many inputs to choose and from how many candidates, and the name of the initialized `InputChoice`. The real candidate input tensors can only be obtained in the `forward` function. In `InputChoice`, `forward` is called with the actual candidate input tensors.
`InputChoice` is a PyTorch module that needs meta information at initialization, for example, how many inputs to choose and from how many candidates, and the name of the initialized `InputChoice`. The real candidate input tensors can only be obtained in the `forward` function. There, the `InputChoice` module created in `__init__` (e.g., `self.input_switch`) is called once the actual candidate input tensors are available.
Some [NAS trainers](#one-shot-training-mode) need to know the source layer of the input tensors, so the input argument `choose_from` was added to `InputChoice` to indicate the source layer of each candidate input tensor. `choose_from` is a list of str, where each element is the `key` of a `LayerChoice` or `InputChoice`, or the name of a module (see the [code](https://github.com/microsoft/nni/blob/master/src/sdk/pynni/nni/nas/pytorch/mutables.py) for details).
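Putting `LayerChoice` and `InputChoice` together, here is a minimal sketch of a searchable block; the import path and keyword names are assumptions based on the mutables code linked above:

```python
import torch.nn as nn
from nni.nas.pytorch import mutables  # import path assumed

class Block(nn.Module):
    def __init__(self):
        super().__init__()
        # one layer with two candidate operations; the trainer/tuner picks one
        self.op = mutables.LayerChoice([
            nn.Conv2d(16, 16, 3, padding=1),
            nn.Conv2d(16, 16, 5, padding=2),
        ], key='block_op')
        # choose 1 of 2 candidate inputs; the candidates only exist in forward()
        self.input_switch = mutables.InputChoice(n_candidates=2, n_chosen=1,
                                                 key='block_input')

    def forward(self, x, skip):
        out = self.op(x)
        return self.input_switch([out, skip])
```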
......@@ -75,7 +73,7 @@ class Cell(mutables.MutableScope):
## Two training modes
After embedding the search space into the model with the APIs above, the next step is to find the best model from the search space. There are two training modes: [one-shot training mode](#one-shot-training-mode) and [classic distributed search](#classic-distributed-search).
After embedding the search space into the model with the APIs above, the next step is to find the best model from the search space. There are two training modes: [one-shot training mode](#one-shot-training-mode) and [classic distributed search](#classic-distributed-search).
### One-shot training mode
......@@ -100,9 +98,7 @@ trainer.export(file='./chosen_arch')
Different trainers may have different input arguments depending on their algorithms. Refer to the specific [trainer code](https://github.com/microsoft/nni/tree/master/src/sdk/pynni/nni/nas/pytorch) for detailed arguments. After training, the best model found can be exported with `trainer.export()`. There is no need to start an NNI experiment through `nnictl`.
[Here](./Overview.md#supported-one-shot-nas-algorithms) are all the supported trainers. [Here](https://github.com/microsoft/nni/tree/master/examples/nas/simple/train.py) is a simple example that uses the NNI NAS API.
[Here]() is the code of a complete example.
[Here](Overview.md#supported-one-shot-nas-algorithms) are all the supported trainers. [Here](https://github.com/microsoft/nni/tree/master/examples/nas/simple/train.py) is a simple example that uses the NNI NAS API.
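For instance, a hedged sketch of driving one of the one-shot trainers (`DartsTrainer` here; the import path and constructor arguments are assumptions based on the trainer code linked above, and `model`, `criterion`, `accuracy`, `optimizer`, and the datasets are placeholders):

```python
from nni.nas.pytorch.darts import DartsTrainer  # import path assumed

trainer = DartsTrainer(model,
                       loss=criterion,
                       metrics=accuracy,
                       optimizer=optimizer,
                       num_epochs=50,
                       dataset_train=dataset_train,
                       dataset_valid=dataset_valid)
trainer.train()                        # runs the one-shot search
trainer.export(file='./chosen_arch')   # dumps the best architecture found
```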
### Classic distributed search
......@@ -174,4 +170,4 @@ The NAS tuner on NNI requires the search space to be generated automatically. `LayerChoice` and `InputC
"_idex": [1]
}
}
```
\ No newline at end of file
```
......@@ -6,11 +6,11 @@
Motivated by this, NNI aims to provide a unified architecture to accelerate innovation on NAS and to apply state-of-the-art algorithms to real-world problems more quickly.
Through the [unified interface](NasInterface.md), there are two ways to do architecture search. [The first one](#supported-one-shot-nas-algorithms), called one-shot NAS, builds a super network based on the search space and uses one-shot training to generate well-performing child models. [The second one](.ClassicNas.md) is the classic search approach, where each child model in the search space runs as an independent trial, the performance results are sent to the tuner, and the tuner generates new child models.
Through the [unified interface](./NasInterface.md), there are two ways to do architecture search. [The first one](#supported-one-shot-nas-algorithms), called one-shot NAS, builds a super network based on the search space and uses one-shot training to generate well-performing child models. [The second one](./NasInterface.md#classic-distributed-search) is the classic search approach, where each child model in the search space runs as an independent trial, the performance results are sent to the tuner, and the tuner generates new child models.
* [Supported one-shot NAS algorithms](#supported-one-shot-nas-algorithms)
* [Classic distributed NAS with an NNI experiment](.NasInterface.md#classic-distributed-search)
* [NNI NAS programming interface](.NasInterface.md)
* [Classic distributed NAS with an NNI experiment](./NasInterface.md#classic-distributed-search)
* [NNI NAS programming interface](./NasInterface.md)
## Supported one-shot NAS algorithms
......@@ -37,7 +37,7 @@ NNI currently supports the NAS algorithms listed below, and more are being added. Users can
#### Usage
ENAS on NNI is still under development; only the search phase with the macro/micro search space on CIFAR10 is currently supported. Training from scratch and the search space on PTB are not finished yet.
ENAS on NNI is still under development; only the search phase with the macro/micro search space on CIFAR10 is currently supported. Training from scratch and the search space on PTB are not finished yet. [Detailed description](ENAS.md)
```bash
# If the NNI code has not been cloned. If it is already cloned, skip this line and go directly into the code directory.
......@@ -58,7 +58,7 @@ python3 search.py -h
### DARTS
The main algorithmic contribution of [DARTS: Differentiable Architecture Search](https://arxiv.org/abs/1806.09055) is the introduction of a differentiable algorithm used in bilevel network optimization.
The main algorithmic contribution of [DARTS: Differentiable Architecture Search](https://arxiv.org/abs/1806.09055) is the introduction of a differentiable algorithm used in bilevel network optimization. [Detailed description](DARTS.md)
#### Usage
......@@ -97,8 +97,6 @@ python3 retrain.py --arc-checkpoint ../pdarts/checkpoints/epoch_2.json
Note that we are trying to support various NAS algorithms through a unified programming interface, and it is currently in an experimental stage. This means the current programming interface will change in the future.
*The previous [NAS annotation](../AdvancedFeature/GeneralNasInterfaces.md) interface will be deprecated soon.*
### Programming interface
A programming interface for designing and searching a model is needed in two scenarios.
......
......@@ -45,6 +45,36 @@ An experiment runs as follows: the tuner receives the search space and generates configurations. These
For more details about running an experiment, refer to [Quick Start](Tutorial/QuickStart.md).
## Core features
NNI provides the ability to run multiple instances in parallel to find the best combination of parameters. This feature can be used in various fields, for example, finding the best hyperparameters for a deep learning model, or finding the best configuration for databases and other complex systems with real data.
NNI also aims to provide algorithm toolkits for machine learning and deep learning, especially neural architecture search (NAS) algorithms, model compression algorithms, and feature engineering algorithms.
### Hyperparameter tuning
This is NNI's core and most basic feature, which provides many popular [automatic tuning algorithms](Tuner/BuiltinTuner.md) (i.e., tuners) and [early-termination algorithms](Assessor/BuiltinAssessor.md) (i.e., assessors). You can follow the [Quick Start](Tutorial/QuickStart.md) to tune your model or system. Basically, an NNI experiment can be started with the three steps above.
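A minimal trial-code sketch of that loop is shown below; `train_and_evaluate` is a hypothetical user-defined function, while `nni.get_next_parameter()` and `nni.report_final_result()` are the two calls the tuner interacts with:

```python
import nni

# 1) receive one hyperparameter configuration generated by the tuner
params = nni.get_next_parameter()        # e.g. {'lr': 0.01, 'batch_size': 32}

# 2) train and evaluate with it (hypothetical user-defined function)
accuracy = train_and_evaluate(**params)

# 3) report the final metric so the tuner can generate the next configuration
nni.report_final_result(accuracy)
```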
### General NAS framework
This NAS framework lets users easily specify candidate neural architectures, for example, by specifying multiple candidate operations (e.g., separable conv, dilated conv) for a single layer and specifying possible skip connections. NNI will automatically find the best candidate. On the other hand, the NAS framework provides a simple interface for another type of user (e.g., NAS algorithm researchers) to implement new NAS algorithms. See [here](NAS/Overview.md) for details and usage.
NNI supports several one-shot NAS algorithms, such as ENAS and DARTS, through the trial SDK. To use these algorithms you do not need to start an NNI experiment; just import the algorithm in the trial code and run it directly. To tune the hyperparameters used in the algorithms, or to run multiple instances, you can use a tuner and start an NNI experiment.
Besides one-shot NAS, NAS can also run in the NNI mode, where each candidate network structure runs as an independent trial job. In this mode, similar to hyperparameter tuning, an NNI experiment must be started and a tuner chosen for NAS.
### Model compression
Model compression on NNI includes pruning and quantization algorithms. These algorithms are provided through the NNI trial SDK. They can be used directly in the trial code, and the trial code can be run without starting an NNI experiment. See [here](Compressor/Overview.md) for details and usage.
There are different kinds of hyperparameters in model compression. One kind is the hyperparameters in the input configuration, such as the sparsity of a compression algorithm or the bit width for quantization. The other kind is the hyperparameters of the compression algorithm itself. NNI's hyperparameter tuning can automatically find the best compressed model. Refer to the [simple example](Compressor/AutoCompression.md).
### Automatic feature engineering
Automatic feature engineering finds the most effective features for downstream tasks. See [here](FeatureEngineering/Overview.md) for details and usage. It is supported through the NNI trial SDK, so you do not need to create an NNI experiment; just import a built-in auto feature engineering algorithm in the trial code and run it directly.
Auto feature engineering algorithms usually have a number of hyperparameters. To tune these hyperparameters automatically, you can leverage NNI's hyperparameter tuning, i.e., choose a tuning algorithm (tuner) and start an NNI experiment.
## Learn more
* [Get started](Tutorial/QuickStart.md)
......@@ -56,4 +86,7 @@ Experiment 的运行过程为:Tuner 接收搜索空间并生成配置。 这
* [How to run an experiment on the local machine?](TrainingService/LocalMode.md)
* [How to run an experiment on multiple machines?](TrainingService/RemoteMachineMode.md)
* [How to run an experiment on OpenPAI?](TrainingService/PaiMode.md)
* [Examples](TrialExample/MnistExamples.md)
\ No newline at end of file
* [Examples](TrialExample/MnistExamples.md)
* [Neural architecture search on NNI](NAS/Overview.md)
* [Automatic model compression on NNI](Compressor/Overview.md)
* [Automatic feature engineering on NNI](FeatureEngineering/Overview.md)
\ No newline at end of file
# Change log
## Release 1.2 - 12/02/2019
### Major features
* [Feature engineering](https://github.com/microsoft/nni/blob/v1.2/docs/zh_CN/FeatureEngineering/Overview.md)
- New feature engineering interface
- Feature selection algorithms: [Gradient feature selector](https://github.com/microsoft/nni/blob/v1.2/docs/zh_CN/FeatureEngineering/GradientFeatureSelector.md) & [GBDT selector](https://github.com/microsoft/nni/blob/v1.2/docs/zh_CN/FeatureEngineering/GBDTSelector.md)
- [Feature engineering examples](https://github.com/microsoft/nni/tree/v1.2/examples/feature_engineering)
- Neural architecture search on NNI
- [New NAS interface](https://github.com/microsoft/nni/blob/v1.2/docs/zh_CN/NAS/NasInterface.md)
- NAS algorithms: [ENAS](https://github.com/microsoft/nni/blob/v1.2/docs/zh_CN/NAS/Overview.md#enas), [DARTS](https://github.com/microsoft/nni/blob/v1.2/docs/zh_CN/NAS/Overview.md#darts), [P-DARTS](https://github.com/microsoft/nni/blob/v1.2/docs/zh_CN/NAS/Overview.md#p-darts) (in PyTorch)
- NAS in classic mode (each trial runs independently)
- Model compression
- [New model pruning algorithms](https://github.com/microsoft/nni/blob/v1.2/docs/zh_CN/Compressor/Overview.md): Lottery Ticket Pruner, L1Filter Pruner, Slim Pruner, FPGM Pruner
- [New model quantization algorithms](https://github.com/microsoft/nni/blob/v1.2/docs/zh_CN/Compressor/Overview.md): QAT Quantizer, DoReFa Quantizer
- API for exporting compressed models.
- Training services
- Support OpenPAI token authentication
- Examples:
- [An example of automatically tuning RocksDB configuration with NNI](https://github.com/microsoft/nni/tree/v1.2/examples/trials/systems/rocksdb-fillrandom)
- [A new trial example supporting TensorFlow 2.0](https://github.com/microsoft/nni/tree/v1.2/examples/trials/mnist-tfv2)
- Improvements
- Trial jobs that do not require a GPU on the remote training service now use random scheduling instead of round-robin scheduling.
- Added pylint rules to check pull requests; new pull requests need to comply with the [pylint rules](https://github.com/microsoft/nni/blob/v1.2/pylintrc)
- Web portal and user experience
- Support users adding customized trials.
- Besides the hyperparameter graph, users can zoom in and out of the detail graphs.
- Documentation
- Improved the NNI API documentation with more docstrings.
### Bug fixes
- Fixed the table sorting issue when a failed trial has no metrics. -Issue #1773
- Keep the selected (max/min) status after switching pages. -PR #1710
- Make the default metric yAxis of the hyperparameter graph more accurate. -PR #1736
- Fixed a GPU script permission issue. -Issue #1665
## Release 1.1 - 10/23/2019
### Major features
* New tuner: [PPO Tuner](https://github.com/microsoft/nni/blob/v1.1/docs/zh_CN/Tuner/PPOTuner.md)
* [View stopped experiments](https://github.com/microsoft/nni/blob/v1.1/docs/zh_CN/Tutorial/Nnictl.md#view)
* Tuners can use dedicated GPU resources (see `gpuIndices` in the [tutorial](https://github.com/microsoft/nni/blob/v1.1/docs/zh_CN/Tutorial/ExperimentConfig.md) for details)
* Web UI improvements
- The trial detail page can list the hyperparameters of each trial as well as its start and end time (added via "add column")
- Optimized display performance for large experiments
- More examples
- [EfficientNet PyTorch example](https://github.com/ultmaster/EfficientNet-PyTorch)
- [Cifar10 NAS example](https://github.com/microsoft/nni/blob/v1.1/examples/trials/nas_cifar10/README_zh_CN.md)
- [Model compression toolkit - alpha release](https://github.com/microsoft/nni/blob/v1.1/docs/zh_CN/Compressor/Overview.md): we are glad to announce the alpha release of NNI's model compression toolkit. It is still in an experimental stage and will be improved based on user feedback. We sincerely invite you to use it, give feedback, or contribute more.
### Bug fixes
* Multi-phase jobs could deadlock when the search space was exhausted (issue #1204)
* `nnictl` failed when there was no log (issue #1548)
## Release 1.0 - 9/2/2019
### Major features
* Tuners and assessors
- Support automatic feature generation and selection -Issue #877 -PR #1387
  + Provide an automatic feature interface
  + Tuner based on beam search
  + [Add Pakdd example](https://github.com/microsoft/nni/tree/master/examples/trials/auto-feature-engineering)
- Added a parallel algorithm to improve TPE performance under high concurrency. -PR #1052
- Support multi-phase for Hyperband -PR #1257
- Training services
- Support private Docker registry -PR #755
* Improvements
* Added a Python wrapper for the RESTful API to support retrieving metric values from code. PR #1318
* New Python APIs: get_experiment_id(), get_trial_id() -PR #1353 -Issue #1331 & -Issue #1368
* Optimized the NAS search space -PR #1393
+ Unify the NAS search space with _type -- "mutable_type"
+ Update the random search tuner
+ Make gpuNum optional -Issue #1365
+ Remove the outputDir and dataDir configuration in OpenPAI mode -Issue #1342
+ When creating a trial in Kubeflow mode, codeDir is no longer copied to logDir -Issue #1224
+ Web portal and user experience
- Show the curve of the best metric during the search process in the web UI -Issue #1218
- Show the current values in the parameter list for multi-phase experiments -Issue #1210 -PR #1348
- Added an "Intermediate count" option in AddColumn. -Issue #1210
- Support searching by parameter values in the web UI -Issue #1208
- Enable auto-scaling of the metric axis in the default metric graph -Issue #1360
- Added links to the detailed documentation for nnictl commands in the command line -Issue #1260
- UX improvement: show error logs -Issue #1173
- Documentation
- Updated the documentation structure -Issue #1231
- [Improved the multi-phase documentation](AdvancedFeature/MultiPhase.md) -Issue #1233 -PR #1242
  + Added a configuration example
- [Improved the web UI description](Tutorial/WebUI.md) -PR #1419
### Bug fixes
* (Bug fix) Fixed links in the 0.9 release notes -Issue #1236
* (Bug fix) Auto-completion script
* (Bug fix) Fixed the issue that the pipeline only checked the exit code of the last command in a script. -PR #1417
* (Bug fix) quniform for tuners -Issue #1377
* (Bug fix) 'quniform' had different meanings between GridSearch and other tuners. -Issue #1335
* (Bug fix) "nnictl experiment list" showed a "RUNNING" experiment as "INITIALIZED" -PR #1388
* (Bug fix) SMAC could not be installed in NNI dev installation mode. -Issue #1376
* (Bug fix) The filter button for intermediate results could not be clicked -Issue #1263
* (Bug fix) The API "/api/v1/nni/trial-jobs/xxx" did not show all parameters of a trial in a multi-phase experiment -Issue #1258
* (Bug fix) A succeeded trial had no final result but the web UI showed ×××(FINAL) -Issue #1207
* (Bug fix) nnictl stop -Issue #1298
* (Bug fix) Fixed security warnings
* (Bug fix) The hyperparameter page was broken -Issue #1332
* (Bug fix) Run flake8 tests to find Python syntax errors and undefined names -PR #1217
## Release 0.9 - 7/1/2019
### Major features
* Generated NAS programming interface
* General NAS programming interface
* Added `enas-mode` and `oneshot-mode` for the NAS interface: [PR #1201](https://github.com/microsoft/nni/pull/1201#issue-291094510)
* [Gaussian Process Tuner with Matern kernel](Tuner/GPTuner.md)
* Multi-phase experiment support
* Added a new training service for multi-phase experiments: PAI mode supports multi-phase experiments since v0.9.
* Added multi-phase capability to the following built-in tuners:
* TPE, Random Search, Anneal, Naïve Evolution, SMAC, Network Morphism, Metis Tuner.
For details, refer to [Write a tuner that leverages multi-phase](AdvancedFeature/MultiPhase.md).
* Added multi-phase capability to the following built-in tuners:
* TPE, Random Search, Anneal, Naïve Evolution, SMAC, Network Morphism, Metis Tuner.
For details, refer to [Write a tuner that leverages multi-phase](AdvancedFeature/MultiPhase.md).
* Web UI
* Trials can be compared in the web UI. For details, refer to [View trial status](Tutorial/WebUI.md)
* Users can adjust the refresh interval of the web UI. For details, refer to [View the overview page](Tutorial/WebUI.md)
* Intermediate results are shown in a friendlier way. For details, refer to [View trial status](Tutorial/WebUI.md)
......@@ -48,7 +158,7 @@
* Trials can run on GPUs that are already running non-NNI jobs
* Support the Kubeflow v1beta2 operator
* Support Kubeflow TFJob/PyTorchJob v1beta2
* [Generated NAS programming interface](AdvancedFeature/GeneralNasInterfaces.md)
* [General NAS programming interface](AdvancedFeature/GeneralNasInterfaces.md)
* Implemented a programming interface for NAS that makes it easy to express a neural architecture search space through NNI annotation
* Provide a new command `nnictl trial codegen` for debugging the NAS code generation part
* Provide a NAS programming interface tutorial, a NAS example on MNIST, and a customizable random tuner for NAS
......@@ -251,12 +361,12 @@
### New features and updates for nnictl
* Support running multiple experiments simultaneously.
Before v0.3, NNI only supported running one experiment at a time. Starting from this release, users can run multiple experiments simultaneously. Each experiment needs a unique port; the first experiment uses the default port as in previous releases. A unique port needs to be specified for every other experiment:
```bash
nnictl create --port 8081 --config <config file path>
```
```bash
nnictl create --port 8081 --config <config file path>
```
* Support updating the maximum number of trials. Use `nnictl update --help` for details, or refer to [NNICTL](Tutorial/Nnictl.md) for the complete usage.
......@@ -265,15 +375,15 @@
* <span style="color:red"><strong>不兼容的改动</strong></span>:nn.get_parameters() 改为 nni.get_next_parameter。 所有以前版本的样例将无法在 v0.3 上运行,需要重新克隆 NNI 代码库获取新样例。 如果在自己的代码中使用了 NNI,也需要相应的更新。
* 新 API **nni.get_sequence_id()**。 每个 Trial 任务都会被分配一个唯一的序列数字,可通过 nni.get_sequence_id() API 来获取。
```bash
git clone -b v0.3 https://github.com/microsoft/nni.git
```
```bash
git clone -b v0.3 https://github.com/microsoft/nni.git
```
* The **nni.report_final_result(result)** API supports more data types for the result argument.
Supported types:
* int
* float
* A dict containing a 'default' key whose value must be an int or float. The dict can contain any other key-value pairs (see the sketch below).
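For example, a minimal sketch of reporting a dict result (keys other than 'default' are illustrative):

```python
import nni

# 'default' is required and must be an int or float; other keys are free-form
nni.report_final_result({'default': 0.93, 'loss': 0.21, 'runtime_sec': 118})
```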
......@@ -284,11 +394,11 @@
### New examples
* NNI Docker image:
```bash
docker pull msranni/nni:latest
```
* NNI Docker image:
```bash
docker pull msranni/nni:latest
```
* New trial example: [NNI sklearn example](https://github.com/microsoft/nni/tree/master/examples/trials/sklearn)
......