Unverified Commit aa316742 authored by SparkSnail's avatar SparkSnail Committed by GitHub

Merge pull request #233 from microsoft/master

merge master
parents 3fe117f0 24fa4619
@@ -3,7 +3,36 @@
NNI supports running an experiment on [OpenPAI](https://github.com/Microsoft/pai) (aka pai), called pai mode. Before starting to use NNI pai mode, you should have an account to access an [OpenPAI](https://github.com/Microsoft/pai) cluster. See [here](https://github.com/Microsoft/pai#how-to-deploy) if you don't have an OpenPAI account and want to deploy an OpenPAI cluster. In pai mode, your trial program runs in a Docker container created by pai.
## Setup environment
Step 1. Install NNI, following the install guide [here](../Tutorial/QuickStart.md).
Step 2. Get PAI token.
Click the `My profile` button at the top right of PAI's webportal.
![](../../img/pai_token_button.jpg)
Find the token management region and copy one of the tokens as your account token.
![](../../img/pai_token_profile.jpg)
Step 3. Mount NFS storage to local machine.
Click the `Submit job` button in PAI's webportal.
![](../../img/pai_job_submission_page.jpg)
Find the data management region in the job submission page.
![](../../img/pai_data_management_page.jpg)
The `DEFAULT_STORAGE` field is the path mounted in PAI's container when a job starts. The `Preview container paths` field shows the NFS host and path that PAI provides; you need to mount that host and path to your local machine first, so that NNI can use PAI's NFS storage.
For example, mount it with the following command:
```
sudo mount -t nfs gcr-openpai-infra02:/pai/data /local/mnt
```
Then the `/data` folder in the container will correspond to the `/local/mnt` folder on your local machine.
You could then use the following configuration in NNI's config file:
```
nniManagerNFSMountPath: /local/mnt
containerNFSMountPath: /data
```
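The two fields describe the same NFS storage from two viewpoints: `containerNFSMountPath` is the path inside PAI's container, and `nniManagerNFSMountPath` is where the same storage is mounted on your local machine. A container path therefore maps to a local path by swapping prefixes; the sketch below (our own helper, not part of NNI, using the example paths above) shows the correspondence:

```python
import posixpath

def container_to_local(path, container_mount="/data", local_mount="/local/mnt"):
    """Map a path inside the PAI container to the corresponding local NFS path."""
    rel = posixpath.relpath(path, container_mount)
    # relpath returns "." when path equals the mount point itself
    return local_mount if rel == "." else posixpath.join(local_mount, rel)

print(container_to_local("/data/experiments/trial1/output.log"))
# -> /local/mnt/experiments/trial1/output.log
```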
Step 4. Get PAI's storage plugin name.
Contact PAI's admin to get the storage plugin name for the NFS storage. The default storage plugin name is `teamwise_storage`, and the corresponding setting in NNI's config file is:
```
paiStoragePlugin: teamwise_storage
```
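Putting the storage-related fields of Steps 2–4 together, a pai-mode config sketch might look like the fragment below. All values are placeholders; the `token` and `host` fields are assumptions based on the account token obtained in Step 2 and your cluster address, so check them against the full example later in this page:

```yaml
trial:
  command: python3 mnist.py
  codeDir: .
  gpuNum: 0
  # the NFS storage, as seen from the local machine and from inside the container
  nniManagerNFSMountPath: /local/mnt
  containerNFSMountPath: /data
  # storage plugin name from Step 4
  paiStoragePlugin: teamwise_storage
paiConfig:
  userName: your_pai_nni_user
  token: your_pai_token   # assumption: the account token copied in Step 2
  host: 10.10.10.10       # assumption: your OpenPAI cluster address
```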
## Run an experiment
Use `examples/trials/mnist-annotation` as an example. The NNI config YAML file's content is like:
@@ -37,6 +66,7 @@ trial:
  virtualCluster: default
  nniManagerNFSMountPath: /home/user/mnt
  containerNFSMountPath: /mnt/data/user
  paiStoragePlugin: team_wise
# Configuration to access OpenPAI Cluster
paiConfig:
  userName: your_pai_nni_user
......
@@ -6,7 +6,7 @@ The original `pai` mode has been renamed to `paiYarn` mode, which is a distributed ...
Install NNI, following the install guide [here](../Tutorial/QuickStart.md).
## Run an experiment
Use `examples/trials/mnist-tfv1` as an example. The NNI config YAML file's content is like:
```yaml
authorName: your_name
@@ -22,14 +22,14 @@ trainingServicePlatform: paiYarn
# search space file
searchSpacePath: search_space.json
# choice: true, false
useAnnotation: false
tuner:
  builtinTunerName: TPE
  classArgs:
    optimize_mode: maximize
trial:
  command: python3 mnist.py
  codeDir: ~/nni/examples/trials/mnist-tfv1
  gpuNum: 0
  cpuNum: 1
  memoryMB: 8196
......
@@ -4,9 +4,9 @@ NNI can run one experiment on multiple remote machines through SSH, called `remo ...
## Remote machine requirements
* Only Linux is supported for remote machines, and the [Linux part of the system specification](../Tutorial/InstallationLinux.md) is the same as for NNI local mode.
* Follow the [installation guide](../Tutorial/InstallationLinux.md) to install NNI on each machine.
* Make sure the remote machines meet the environment requirements of your trial code. If the default environment does not meet the requirements, a setup script can be added to the `command` field of the NNI config.
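For example, a setup step can be chained in front of the real trial command. This is a sketch; the requirements file name is hypothetical:

```yaml
trial:
  # install trial dependencies on the remote machine, then run the trial
  command: python3 -m pip install --user -r requirements.txt && python3 mnist.py
  codeDir: .
  gpuNum: 0
```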
......
@@ -45,7 +45,7 @@ Probably it's a problem with your network config. Here is a checklist. ...
### NNI on Windows problems
Please refer to [NNI on Windows](InstallationWin.md#FAQ)
### More FAQ issues
......
@@ -35,7 +35,7 @@ Note: ...
If you start a docker image using NNI's official image `msranni/nni`, you can directly start NNI experiments by using the `nnictl` command. Our official image includes NNI's runtime environment and basic Python and deep learning framework environments.
If you start your own docker image, you may need to install the NNI package first; please refer to [NNI installation](InstallationLinux.md).
If you want to run NNI's official examples, you may need to clone the NNI repo from GitHub using
``` ```
......
# Install on Linux & Mac
## Installation
Installation on Linux and macOS follows the same instructions below.
### Install NNI through pip
Prerequisite: `python 64-bit >= 3.5`
@@ -12,46 +12,22 @@ Currently we support installation on Linux, macOS and Windows. ...
```bash
python3 -m pip install --upgrade nni
```
### Install NNI through source code
If you are interested in a specific or the latest code version, you can install NNI from source.
Prerequisites: `python 64-bit >= 3.5`, `git`, `wget`
```bash
git clone -b v1.4 https://github.com/Microsoft/nni.git
cd nni
./install.sh
```
### Use NNI in a docker image
You can also install NNI in a docker image. Please follow the instructions [here](https://github.com/Microsoft/nni/tree/master/deployment/docker/README.md) to build an NNI docker image. The NNI docker image can also be retrieved from Docker Hub through the command `docker pull msranni/nni:latest`.
## Verify installation
The following example is built on TensorFlow 1.x. Make sure **TensorFlow 1.x is used** when running it.
@@ -59,23 +35,15 @@ The following example is built on TensorFlow 1.x. Make sure **TensorFlow 1.x is ...
* Download the examples by cloning the source code.
```bash
git clone -b v1.4 https://github.com/Microsoft/nni.git
```
* Run the MNIST example.
```bash
nnictl create --config nni/examples/trials/mnist-tfv1/config.yml
```
* Wait for the message `INFO: Successfully started experiment!` in the command line. This message indicates that your experiment has been successfully started. You can explore the experiment using the `Web UI url`.
```text
@@ -138,18 +106,6 @@ Due to potential programming changes, the minimum system requirements of NNI may ...
| **Internet** | Broadband internet connection |
| **Resolution** | 1024 x 768 minimum display resolution |
## Further reading
* [Overview](../Overview.md)
......
# Install on Windows
## Installation
Anaconda or Miniconda is highly recommended to manage multiple Python environments.
### Install NNI through pip
Prerequisites: `python 64-bit >= 3.5`
```bash
python -m pip install --upgrade nni
```
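To check whether the `python` on your PATH satisfies the 64-bit, version >= 3.5 prerequisite, here is a quick sanity check (our own snippet, not part of NNI):

```python
import struct
import sys

# 64-bit Python stores pointers in 8 bytes, so struct.calcsize("P") is 8.
is_64bit = struct.calcsize("P") * 8 == 64
meets_min_version = sys.version_info >= (3, 5)

print(sys.version.split()[0], "64-bit" if is_64bit else "32-bit")
assert is_64bit and meets_min_version, "NNI requires 64-bit Python >= 3.5"
```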
### Install NNI through source code
If you are interested in a specific or the latest code version, you can install NNI from source.
Prerequisites: `python 64-bit >= 3.5`, `git`, `PowerShell`.
```bash
git clone -b v1.4 https://github.com/Microsoft/nni.git
cd nni
powershell -ExecutionPolicy Bypass -file install.ps1
```
## Verify installation
The following example is built on TensorFlow 1.x. Make sure **TensorFlow 1.x is used** when running it.
* Download the examples via clone the source code.
```bash
git clone -b v1.4 https://github.com/Microsoft/nni.git
```
* Run the MNIST example.
```bash
nnictl create --config nni\examples\trials\mnist-tfv1\config_windows.yml
```
Note: for other examples, you need to change the trial command `python3` to `python` in each example YAML if Python 3 is invoked as `python` on your machine.
* Wait for the message `INFO: Successfully started experiment!` in the command line. This message indicates that your experiment has been successfully started. You can explore the experiment using the `Web UI url`.
```text
INFO: Starting restful server...
INFO: Successfully started Restful server!
INFO: Setting local config...
INFO: Successfully set local config!
INFO: Starting experiment...
INFO: Successfully started experiment!
-----------------------------------------------------------------------
The experiment id is egchD4qy
The Web UI urls are: http://223.255.255.1:8080 http://127.0.0.1:8080
-----------------------------------------------------------------------
You can use these commands to get more information about the experiment
-----------------------------------------------------------------------
commands description
1. nnictl experiment show show the information of experiments
2. nnictl trial ls list all of trial jobs
3. nnictl top monitor the status of running experiments
4. nnictl log stderr show stderr log content
5. nnictl log stdout show stdout log content
6. nnictl stop stop an experiment
7. nnictl trial kill kill a trial job by id
8. nnictl --help get help information about nnictl
-----------------------------------------------------------------------
```
* Open the `Web UI url` in your browser, you can view detail information of the experiment and all the submitted trial jobs as shown below. [Here](../Tutorial/WebUI.md) are more Web UI pages.
![overview](../../img/webui_overview_page.png)
![detail](../../img/webui_trialdetail_page.png)
## System requirements
Below are the minimum system requirements for NNI on Windows. Windows 10 version 1809 is well tested and recommended. Due to potential programming changes, the minimum system requirements for NNI may change over time.
| | Recommended | Minimum |
| -------------------- | ---------------------------------------------- | -------------------------------------- |
| **Operating System** | Windows 10 1809 or above |
| **CPU** | Intel® Core™ i5 or AMD Phenom™ II X3 or better | Intel® Core™ i3 or AMD Phenom™ X3 8650 |
| **GPU** | NVIDIA® GeForce® GTX 660 or better | NVIDIA® GeForce® GTX 460 |
| **Memory** | 6 GB RAM | 4 GB RAM |
| **Storage** | 30 GB available hard drive space |
| **Internet** | Broadband internet connection |
| **Resolution** | 1024 x 768 minimum display resolution |
## FAQ
### simplejson failed when installing NNI
Make sure a C++ 14.0 compiler is installed.
>building 'simplejson._speedups' extension error: [WinError 3] The system cannot find the path specified
### Trial failed with missing DLL in command line or PowerShell
This error is caused by missing LIBIFCOREMD.DLL and LIBMMD.DLL, which makes SciPy fail to install. Using Anaconda or Miniconda with 64-bit Python can solve it.
>ImportError: DLL load failed
### Trial failed on webUI
Please check the trial's stderr log file for more details.
If a stderr file exists, please check it. Two possible causes are:
* forgetting to change the trial command `python3` to `python` in the experiment YAML;
* forgetting to install experiment dependencies such as TensorFlow or Keras.
### Fail to use BOHB on Windows
Make sure a C++ 14.0 compiler is installed, then run `nnictl package install --name=BOHB` to install the dependencies.
### Unsupported tuners on Windows
SMAC is not supported currently; for the specific reason, refer to this [GitHub issue](https://github.com/automl/SMAC3/issues/483).
### Use a Windows server as a remote worker
Currently you can't.
Note:
* If there is any error like `Segmentation fault`, please refer to [FAQ](FAQ.md)
## Further reading
* [Overview](../Overview.md)
* [Use command line tool nnictl](Nnictl.md)
* [Use NNIBoard](WebUI.md)
* [Define search space](SearchSpaceSpec.md)
* [Config an experiment](ExperimentConfig.md)
* [How to run an experiment on local (with multiple GPUs)?](../TrainingService/LocalMode.md)
* [How to run an experiment on multiple machines?](../TrainingService/RemoteMachineMode.md)
* [How to run an experiment on OpenPAI?](../TrainingService/PaiMode.md)
* [How to run an experiment on Kubernetes through Kubeflow?](../TrainingService/KubeflowMode.md)
* [How to run an experiment on Kubernetes through FrameworkController?](../TrainingService/FrameworkControllerMode.md)
\ No newline at end of file
@@ -20,7 +20,7 @@ Note: ...
* For Linux and macOS, `--user` can be added if you want to install NNI in your home directory, which does not require any special privileges.
* If there is any error like `Segmentation fault`, please refer to the [FAQ](FAQ.md)
* For the system requirements of NNI, please refer to [Install NNI on Linux & Mac](InstallationLinux.md) or [Windows](InstallationWin.md)
## "Hello World" example on MNIST
......
Advanced Features
=====================
.. toctree::
   MultiPhase<./AdvancedFeature/MultiPhase>
Assessors
==============
To save computing resources, NNI supports an early-stopping policy, realized by the **Assessor**.
The Assessor receives intermediate results from a trial and decides, by a specific algorithm, whether the trial should be stopped early. Once a trial meets the early-stopping condition (which means the Assessor is pessimistic about its final result), the Assessor stops the trial, and the trial's status becomes `"EARLY_STOPPED"`.
Here is an experimental result on MNIST using the 'Curvefitting' Assessor in 'maximize' mode. You can see that the Assessor successfully **early stopped** many trials with bad hyperparameters in advance. With an Assessor, you may get better hyperparameters under the same computing resources.
*Implemented code:* `config_assessor.yml <https://github.com/Microsoft/nni/blob/master/examples/trials/mnist-tfv1/config_assessor.yml>`_
.. image:: ../img/Assessor.png
Like Tuners, users can either use built-in Assessors, or customize an Assessor on their own. Please refer to the following tutorials for detail:
.. toctree::
   :maxdepth: 2

   Builtin Assessors <builtin_assessor>
   Customized Assessors <Assessor/CustomizeAssessor>
# Python API Reference of Auto Tune
```eval_rst
.. contents::
```
## Trial
```eval_rst
.. autofunction:: nni.get_next_parameter
.. autofunction:: nni.get_current_parameter
.. autofunction:: nni.report_intermediate_result
.. autofunction:: nni.report_final_result
.. autofunction:: nni.get_experiment_id
.. autofunction:: nni.get_trial_id
.. autofunction:: nni.get_sequence_id
```
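A trial typically calls these functions in the shape sketched below. The sketch is self-contained: a tiny stand-in object replaces the real `nni` module (whose functions only work inside a launched experiment), and the training loop is a placeholder:

```python
# Stand-in for the real `nni` module so this sketch runs outside an experiment.
class _StandInNNI:
    def get_next_parameter(self):
        # In a real trial, the tuner generates these hyperparameters.
        return {"lr": 0.01, "batch_size": 32}

    def report_intermediate_result(self, metric):
        print("intermediate:", metric)  # consumed by the assessor

    def report_final_result(self, metric):
        print("final:", metric)  # consumed by the tuner

nni = _StandInNNI()  # in real trial code: import nni

params = nni.get_next_parameter()
best = 0.0
for epoch in range(3):
    accuracy = 0.5 + epoch * params["lr"]  # placeholder for real training
    nni.report_intermediate_result(accuracy)
    best = max(best, accuracy)
nni.report_final_result(best)
```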
## Tuner
```eval_rst
.. autoclass:: nni.tuner.Tuner
    :members:

.. autoclass:: nni.hyperopt_tuner.hyperopt_tuner.HyperoptTuner
    :members:

.. autoclass:: nni.evolution_tuner.evolution_tuner.EvolutionTuner
    :members:

.. autoclass:: nni.smac_tuner.SMACTuner
    :members:

.. autoclass:: nni.gridsearch_tuner.GridSearchTuner
    :members:

.. autoclass:: nni.networkmorphism_tuner.networkmorphism_tuner.NetworkMorphismTuner
    :members:

.. autoclass:: nni.metis_tuner.metis_tuner.MetisTuner
    :members:

.. autoclass:: nni.ppo_tuner.PPOTuner
    :members:

.. autoclass:: nni.batch_tuner.batch_tuner.BatchTuner
    :members:

.. autoclass:: nni.gp_tuner.gp_tuner.GPTuner
    :members:
```
## Assessor
```eval_rst
.. autoclass:: nni.assessor.Assessor
    :members:

.. autoclass:: nni.assessor.AssessResult
    :members:

.. autoclass:: nni.curvefitting_assessor.CurvefittingAssessor
    :members:

.. autoclass:: nni.medianstop_assessor.MedianstopAssessor
    :members:
```
## Advisor
```eval_rst
.. autoclass:: nni.msg_dispatcher_base.MsgDispatcherBase
    :members:

.. autoclass:: nni.hyperband_advisor.hyperband_advisor.Hyperband
    :members:

.. autoclass:: nni.bohb_advisor.bohb_advisor.BOHB
    :members:
```
Builtin-Assessors
=================
To save computing resources, NNI supports an early-stopping policy, realized by the **Assessor**.
The Assessor receives intermediate results from a trial and decides, by a specific algorithm, whether the trial should be stopped early. Once a trial meets the early-stopping condition (which means the Assessor is pessimistic about its final result), the Assessor stops the trial, and the trial's status becomes `"EARLY_STOPPED"`.
Here is an experimental result on MNIST using the 'Curvefitting' Assessor in 'maximize' mode. You can see that the Assessor successfully **early stopped** many trials with bad hyperparameters in advance. With an Assessor, you may get better hyperparameters under the same computing resources.
*Implemented code:* `config_assessor.yml <https://github.com/Microsoft/nni/blob/master/examples/trials/mnist-tfv1/config_assessor.yml>`_
.. image:: ../img/Assessor.png
.. toctree::
   :maxdepth: 1
......
Builtin-Tuners
==============
NNI provides an easy way to adopt parameter tuning algorithms, which we call **Tuners**.
A Tuner receives metrics from a trial to evaluate the performance of a specific parameter/architecture configuration, and sends the next hyper-parameter or architecture configuration to the trial.
.. toctree::
   :maxdepth: 1
......
@@ -28,7 +28,7 @@ author = 'Microsoft' ...
# The short X.Y version
version = ''
# The full version, including alpha/beta/rc tags
release = 'v1.4'
# -- General configuration ---------------------------------------------------
......
###################
Feature Engineering
###################
We are glad to introduce the Feature Engineering toolkit on top of NNI.
It is still in an experimental phase and may evolve based on usage feedback.
We'd like to invite you to use it, give feedback, and even contribute.
......
Advanced Features
=================
.. toctree::
   Enable Multi-phase <AdvancedFeature/MultiPhase>
   Write a New Tuner <Tuner/CustomizeTuner>
   Write a New Assessor <Assessor/CustomizeAssessor>
   Write a New Advisor <Tuner/CustomizeAdvisor>
   Write a New Training Service <TrainingService/HowToImplementTrainingService>
#############################
Auto (Hyper-parameter) Tuning
#############################
Auto tuning is one of the key features provided by NNI; a main application scenario is
hyper-parameter tuning. The trial code is what gets tuned. NNI provides many popular
auto-tuning algorithms (called Tuners) and some early-stopping algorithms (called Assessors).
NNI supports running trials on various training platforms, for example, on a local machine,
on several servers in a distributed manner, or on platforms such as OpenPAI and Kubernetes.
Other key features of NNI, such as model compression and feature engineering, can also be further
enhanced by auto tuning, as described when introducing those features.
NNI has high extensibility; advanced users can customize their own Tuner, Assessor, and Training Service
according to their needs.
.. toctree::
   :maxdepth: 2

   Write Trial <TrialExample/Trials>
   Tuners <builtin_tuner>
   Assessors <builtin_assessor>
   Training Platform <training_services>
   Examples <examples>
   WebUI <Tutorial/WebUI>
   How to Debug <Tutorial/HowToDebug>
   Advanced <hpo_advanced>
\ No newline at end of file
@@ -2,9 +2,6 @@
Neural Network Intelligence
###########################
.. toctree::
   :caption: Table of Contents
@@ -12,11 +9,14 @@ Contents ...
   :titlesonly:

   Overview
   Installation <installation>
   QuickStart <Tutorial/QuickStart>
   Auto (Hyper-parameter) Tuning <hyperparameter_tune>
   Neural Architecture Search <nas>
   Model Compression <model_compression>
   Feature Engineering <feature_engineering>
   References <reference>
   Community Sharings <CommunitySharings/community_sharings>
   FAQ <Tutorial/FAQ>
   How to Contribute <contribution>
   Changelog <Release>
\ No newline at end of file
############
Installation
############
Currently we support installation on Linux, Mac and Windows, and also allow you to use NNI in Docker.

.. toctree::
   :maxdepth: 2

   Linux & Mac <Tutorial/InstallationLinux>
   Windows <Tutorial/InstallationWin>
   Use Docker <Tutorial/HowToUseDocker>
\ No newline at end of file