-----------

[![MIT licensed](https://img.shields.io/badge/license-MIT-brightgreen.svg)](LICENSE) [![Build Status](https://msrasrg.visualstudio.com/NNIOpenSource/_apis/build/status/integration-test-local?branchName=master)](https://msrasrg.visualstudio.com/NNIOpenSource/_build/latest?definitionId=17&branchName=master) [![Issues](https://img.shields.io/github/issues-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen) [![Bugs](https://img.shields.io/github/issues/Microsoft/nni/bug.svg)](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3Abug) [![Pull Requests](https://img.shields.io/github/issues-pr-raw/Microsoft/nni.svg)](https://github.com/Microsoft/nni/pulls?q=is%3Apr+is%3Aopen) [![Version](https://img.shields.io/github/release/Microsoft/nni.svg)](https://github.com/Microsoft/nni/releases) [![Join the chat at https://gitter.im/Microsoft/nni](https://badges.gitter.im/Microsoft/nni.svg)](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) [![Documentation Status](https://readthedocs.org/projects/nni/badge/?version=latest)](https://nni.readthedocs.io/en/latest/?badge=latest)

[简体中文](README_zh_CN.md)

**NNI (Neural Network Intelligence)** is a lightweight but powerful toolkit that helps users **automate** Feature Engineering, Neural Architecture Search, Hyperparameter Tuning and Model Compression. The tool manages automated machine learning (AutoML) experiments and **dispatches and runs** the trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyper-parameters in **different training environments** such as Local Machine, Remote Servers, OpenPAI, Kubeflow, FrameworkController on K8S (AKS etc.) and other cloud options.

## **Who should consider using NNI**

* Those who want to **try different AutoML algorithms** in their training code/model.
* Those who want to run AutoML trial jobs **in different environments** to speed up the search.
* Researchers and data scientists who want to easily **implement and experiment with new AutoML algorithms**, be it a hyperparameter tuning algorithm, a neural architecture search algorithm or a model compression algorithm.
* ML platform owners who want to **support AutoML on their platform**.

### **NNI v1.3 has been released!**

## **NNI capabilities at a glance**

NNI provides a command-line tool as well as a user-friendly WebUI to manage training experiments. With the extensible API, you can customize your own AutoML algorithms and training services. To make it easy for new users, NNI also provides a set of built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms. The following table summarizes NNI's current capabilities; we are gradually adding new ones and would love to have your contributions. (A minimal trial-side code sketch follows the table.)

**Frameworks & Libraries**

* Built-in
  * Supported Frameworks
    * PyTorch
    * Keras
    * TensorFlow
    * MXNet
    * Caffe2
    * More...
  * Supported Libraries
    * Scikit-learn
    * XGBoost
    * LightGBM
    * More...

**Algorithms (built-in)**

* Hyperparameter Tuning
* Neural Architecture Search
* Model Compression
* Feature Engineering (Beta)
* Early Stop Algorithms

**Training Services**

* Local Machine, Remote Servers, OpenPAI, Kubeflow, FrameworkController on K8S (AKS etc.) and other cloud options

For references and details of each built-in algorithm and training service, see the [documentation](https://nni.readthedocs.io/en/latest/index.html).
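To give a concrete sense of what "adding NNI to your training code" looks like for hyperparameter tuning, here is a minimal, illustrative trial-side sketch. The hyperparameter names (`lr`, `batch_size`) and the dummy training loop are placeholders, not part of any shipped example; `nni.get_next_parameter`, `nni.report_intermediate_result` and `nni.report_final_result` are the core trial APIs.

```python
# Minimal, illustrative NNI trial sketch (not the shipped MNIST example).
# The hyperparameter names and the fake "training" are placeholders;
# the nni.* calls are the standard trial API.
import nni


def train_and_evaluate(params):
    """Stand-in for a real training loop; returns a made-up accuracy."""
    accuracy = 0.0
    for epoch in range(10):
        # Pretend higher learning rates converge a bit faster.
        accuracy = min(1.0, accuracy + params["lr"])
        # Report per-epoch results so assessors (early stopping) can act on them.
        nni.report_intermediate_result(accuracy)
    return accuracy


if __name__ == "__main__":
    # Ask the tuner for the next hyperparameter configuration, e.g.
    # {"lr": 0.01, "batch_size": 32} for a matching search space.
    params = nni.get_next_parameter()
    final_accuracy = train_and_evaluate(params)
    # Report the final metric back to the tuner.
    nni.report_final_result(final_accuracy)
```

Such a script is launched by `nnictl` together with a search space and experiment config, as shown in the verification example below.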
## **Install & Verify**

**Install through pip**

* We support Linux, MacOS and Windows (local, remote and pai mode) at the current stage. Ubuntu 16.04 or higher, MacOS 10.14.1 and Windows 10.1809 are tested and supported. Simply run the following `pip install` in an environment that has `python >= 3.5`.

Linux and MacOS

```bash
python3 -m pip install --upgrade nni
```

Windows

```bash
python -m pip install --upgrade nni
```

Note:

* `--user` can be added if you want to install NNI in your home directory, which does not require any special privileges.
* Currently NNI on Windows supports local, remote and pai mode. Anaconda or Miniconda is highly recommended for installing NNI on Windows.
* If you see an error like `Segmentation fault`, please refer to the [FAQ](docs/en_US/Tutorial/FAQ.md).

**Install through source code**

* We support Linux (Ubuntu 16.04 or higher), MacOS (10.14.1) and Windows (10.1809) at the current stage.

Linux and MacOS

* Run the following commands in an environment that has `python >= 3.5`, `git` and `wget`.

```bash
git clone -b v1.3 https://github.com/Microsoft/nni.git
cd nni
source install.sh
```

Windows

* Run the following commands in an environment that has `python >= 3.5`, `git` and `PowerShell`.

```bash
git clone -b v1.3 https://github.com/Microsoft/nni.git
cd nni
powershell -ExecutionPolicy Bypass -file install.ps1
```

For the system requirements of NNI, please refer to [Install NNI](docs/en_US/Tutorial/Installation.md).

For NNI on Windows, please refer to [NNI on Windows](docs/en_US/Tutorial/NniOnWindows.md).

**Verify install**

The following example is an experiment built on TensorFlow. Make sure you have **TensorFlow 1.x installed** before running it. Note that **TensorFlow 2.0 is currently NOT supported**.

* Download the examples by cloning the source code.

```bash
git clone -b v1.3 https://github.com/Microsoft/nni.git
```

Linux and MacOS

* Run the MNIST example.

```bash
nnictl create --config nni/examples/trials/mnist-tfv1/config.yml
```

Windows

* Run the MNIST example.

```bash
nnictl create --config nni\examples\trials\mnist-tfv1\config_windows.yml
```

* Wait for the message `INFO: Successfully started experiment!` in the command line. This message indicates that your experiment has been successfully started. You can explore the experiment using the `Web UI url`.

```text
INFO: Starting restful server...
INFO: Successfully started Restful server!
INFO: Setting local config...
INFO: Successfully set local config!
INFO: Starting experiment...
INFO: Successfully started experiment!
-----------------------------------------------------------------------
The experiment id is egchD4qy
The Web UI urls are: http://223.255.255.1:8080   http://127.0.0.1:8080
-----------------------------------------------------------------------

You can use these commands to get more information about the experiment
-----------------------------------------------------------------------
         commands                       description
1. nnictl experiment show        show the information of experiments
2. nnictl trial ls               list all of trial jobs
3. nnictl top                    monitor the status of running experiments
4. nnictl log stderr             show stderr log content
5. nnictl log stdout             show stdout log content
6. nnictl stop                   stop an experiment
7. nnictl trial kill             kill a trial job by id
8. nnictl --help                 get help information about nnictl
-----------------------------------------------------------------------
```

* Open the `Web UI url` in your browser; you can view detailed information about the experiment and all the submitted trial jobs as shown below.
[Here](docs/en_US/Tutorial/WebUI.md) are more Web UI pages.
*(Web UI screenshots: experiment overview and submitted trial jobs.)*
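Separately from running the full MNIST experiment above, a quick way to confirm that the `nni` package itself installed correctly is a short Python check. This sketch only uses standard pip packaging metadata (`pkg_resources` from setuptools) plus a plain import; it does not start an experiment.

```python
# Quick post-install sanity check: confirm the nni package is importable
# and show which version pip installed. No experiment is started.
import pkg_resources
import nni  # raises ImportError if the installation is broken

print("nni version:", pkg_resources.get_distribution("nni").version)
print("nni imported from:", nni.__file__)
```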
## **Documentation**

* To learn what NNI is, read the [NNI Overview](https://nni.readthedocs.io/en/latest/Overview.html).
* To get familiar with how to use NNI, read the [documentation](https://nni.readthedocs.io/en/latest/index.html).
* To get started and install NNI on your system, please refer to [Install NNI](docs/en_US/Tutorial/Installation.md).

## **Contributing**

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact opencode@microsoft.com with any additional questions or comments.

After getting familiar with the contribution agreements, you are ready to create your first PR =). Follow the NNI developer tutorials to get started:

* We recommend that new contributors start with issues labeled ['good first issue'](https://github.com/Microsoft/nni/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) or ['help wanted'](https://github.com/microsoft/nni/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22); these issues are simple and easy to start with.
* [NNI developer environment installation tutorial](docs/en_US/Tutorial/SetupNniDeveloperEnvironment.md)
* [How to debug](docs/en_US/Tutorial/HowToDebug.md)
* [Customize your own Tuner](docs/en_US/Tuner/CustomizeTuner.md) (a minimal skeleton follows this list)
* [Implement customized TrainingService](docs/en_US/TrainingService/HowToImplementTrainingService.md)
* [Implement a new NAS trainer on NNI](https://github.com/microsoft/nni/blob/master/docs/en_US/NAS/NasInterface.md#implement-a-new-nas-trainer-on-nni)
* [Customize your own Advisor](docs/en_US/Tuner/CustomizeAdvisor.md)
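As a taste of what the "Customize your own Tuner" tutorial covers, here is a hedged skeleton of a custom tuner. The `Tuner` base class and the three overridden methods follow the NNI 1.x customization docs; the random-sampling logic and the `RandomChoiceTuner` name are illustrative placeholders, and you should consult the tutorial above for the exact interface of your NNI version.

```python
# Illustrative skeleton of a customized tuner (random sampling over a flat
# search space of "choice" parameters). The base class and method names
# follow the NNI 1.x "Customize Tuner" docs; the sampling logic is a
# placeholder, not a recommended algorithm.
import random

from nni.tuner import Tuner


class RandomChoiceTuner(Tuner):
    def __init__(self):
        self.search_space = {}
        self.results = {}  # parameter_id -> reported final metric

    def update_search_space(self, search_space):
        # Called with the search space defined in search_space.json.
        self.search_space = search_space

    def generate_parameters(self, parameter_id, **kwargs):
        # Return the next hyperparameter configuration for a new trial.
        return {
            name: random.choice(spec["_value"])
            for name, spec in self.search_space.items()
            if spec.get("_type") == "choice"
        }

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        # Record the final metric reported by the trial; a real tuner would
        # use this feedback to guide its next suggestions.
        self.results[parameter_id] = value
```

A custom tuner like this is referenced from the experiment configuration by its file path and class name, so it can be swapped in without changing the trial code.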
## **External Repositories and References**

With authors' permission, we list a set of NNI usage examples and relevant articles.

### **External Repositories**

* Run [ENAS](examples/tuners/enas_nni/README.md) with NNI
* Run [Neural Network Architecture Search](examples/trials/nas_cifar10/README.md) with NNI
* [Automatic Feature Engineering](examples/feature_engineering/auto-feature-engineering/README.md) with NNI
* [Hyperparameter Tuning for Matrix Factorization](https://github.com/microsoft/recommenders/blob/master/notebooks/04_model_select_and_optimize/nni_surprise_svd.ipynb) with NNI
* [scikit-nni](https://github.com/ksachdeva/scikit-nni): hyper-parameter search for scikit-learn pipelines using NNI

### **Relevant Articles**

* [Hyper Parameter Optimization Comparison](docs/en_US/CommunitySharings/HpoComparision.md)
* [Neural Architecture Search Comparison](docs/en_US/CommunitySharings/NasComparision.md)
* [Parallelizing a Sequential Algorithm TPE](docs/en_US/CommunitySharings/ParallelizingTpeSearch.md)
* [Automatically tuning SVD with NNI](docs/en_US/CommunitySharings/RecommendersSvd.md)
* [Automatically tuning SPTAG with NNI](docs/en_US/CommunitySharings/SptagAutoTune.md)
* [Find thy hyper-parameters for scikit-learn pipelines using Microsoft NNI](https://towardsdatascience.com/find-thy-hyper-parameters-for-scikit-learn-pipelines-using-microsoft-nni-f1015b1224c1)
* **Blog (in Chinese)** - [AutoML tools (Advisor, NNI and Google Vizier) comparison](http://gaocegege.com/Blog/%E6%9C%BA%E5%99%A8%E5%AD%A6%E4%B9%A0/katib-new#%E6%80%BB%E7%BB%93%E4%B8%8E%E5%88%86%E6%9E%90) by [@gaocegege](https://github.com/gaocegege) - the "Summary and Analysis" section of the post on the design and implementation of kubeflow/katib
* **Blog (in Chinese)** - [A summary of NNI new capabilities in 2019](https://mp.weixin.qq.com/s/7_KRT-rRojQbNuJzkjFMuA) by @squirrelsc

## **Feedback**

* Discuss on the NNI [Gitter](https://gitter.im/Microsoft/nni?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) channel.
* [File an issue](https://github.com/microsoft/nni/issues/new/choose) on GitHub.
* Ask a question with the NNI tag on [Stack Overflow](https://stackoverflow.com/questions/tagged/nni?sort=Newest&edited=true).

## Related Projects

Targeting openness and advancing state-of-the-art technology, [Microsoft Research (MSR)](https://www.microsoft.com/en-us/research/group/systems-research-group-asia/) has also released a few other open-source projects.

* [OpenPAI](https://github.com/Microsoft/pai): an open-source platform that provides complete AI model training and resource management capabilities; it is easy to extend and supports on-premise, cloud and hybrid environments at various scales.
* [FrameworkController](https://github.com/Microsoft/frameworkcontroller): an open-source general-purpose Kubernetes Pod Controller that orchestrates all kinds of applications on Kubernetes through a single controller.
* [MMdnn](https://github.com/Microsoft/MMdnn): a comprehensive, cross-framework solution to convert, visualize and diagnose deep neural network models. The "MM" in MMdnn stands for model management and "dnn" is an acronym for deep neural network.
* [SPTAG](https://github.com/Microsoft/SPTAG): Space Partition Tree And Graph (SPTAG) is an open-source library for large-scale approximate nearest neighbor search of vectors.

We encourage researchers and students to leverage these projects to accelerate AI development and research.

## **License**

The entire codebase is under the [MIT license](LICENSE).