Neural Network Intelligence (NNI) is a lightweight package that supports hyper-parameter tuning and neural architecture search.
It runs easily in different environments, such as a local machine, remote machines, or the cloud.
It also offers an annotation language that lets users conveniently design search spaces.
Users can write trial code in any language and with any machine learning framework.
# Getting Started
TODO: Guide users through getting your code up and running on their own system. In this section you can talk about:
1. Installation process
2. Software dependencies
3. Latest releases
4. API references
# Build and Test
TODO: Describe and show how to build your code and run the tests.
# Contribute
TODO: Explain how other users and developers can contribute to make your code better.
# Privacy Statement
The [Microsoft Enterprise and Developer Privacy Statement](https://privacy.microsoft.com/en-us/privacystatement) describes the privacy policy of this software.
For our built-in assessors, you need to fill in two fields: `assessorName`, which selects one of the assessors provided by NNI (refer to [here]() for the built-in assessors), and `optimizationMode`, which is either Maximize or Minimize, depending on whether you want to maximize or minimize your trial result.
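As an illustration, the assessor section of an experiment configuration might look like the following sketch. Only the `assessorName` and `optimizationMode` field names come from the text above; the surrounding `assessor:` key and the `Medianstop` value are assumptions for the example.

```yaml
# Hypothetical assessor section of an experiment config file.
# Field names assessorName / optimizationMode are from the docs;
# the assessor name shown here is an illustrative assumption.
assessor:
  assessorName: Medianstop
  optimizationMode: Maximize
```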
This command starts the experiment and the WebUI. The WebUI endpoint is shown in the command's output (for example, `http://localhost:8080`). Open this URL in your browser. You can analyze your experiment through the WebUI, or open the trials' TensorBoard.
An experiment runs multiple trial jobs; each trial job tries one configuration.
* Provide a yaml experiment configuration file
* (optional) Provide or choose an assessor
**Prepare trial**: Let's use a simple trial example provided by NNI, e.g. mnist. After installing NNI, the NNI examples are placed in /usr/share/nni/examples; run `ls /usr/share/nni/examples/trials` to see all the trial examples. You can simply execute the following command to run the NNI mnist example:
This command will be filled into the yaml configuration file below. Please refer to [here]() for how to write your own trial.
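To give a feel for the shape of a trial, here is a minimal sketch. The training logic is a stand-in placeholder, and the `nni.*` calls shown in comments are assumed API names for fetching parameters and reporting results; consult the trial-writing guide for the actual API.

```python
# Sketch of an NNI trial script. The nni.* calls shown in comments are
# assumed API names; the "training" here is a placeholder computation.

def run_trial(params):
    """Run one trial with the given hyper-parameters and return its metric."""
    # In a real trial, fetch the parameters chosen by the tuner first:
    #   params = nni.get_next_parameter()
    lr = params["learning_rate"]
    # Placeholder "training": a real trial would train a model here.
    accuracy = 0.9 - abs(lr - 0.01)
    # Then report the final metric back to NNI:
    #   nni.report_final_result(accuracy)
    return accuracy

print(run_trial({"learning_rate": 0.01}))
```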
*tunerName* specifies a tuner in NNI; *optimizationMode* indicates whether you want to maximize or minimize your trial's result.
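For example, the tuner section of a configuration file might look like this sketch. Only the `tunerName` and `optimizationMode` field names come from the text; the enclosing `tuner:` key and the `TPE` value are illustrative assumptions.

```yaml
# Hypothetical tuner section of an experiment config file.
# tunerName / optimizationMode are from the docs; the tuner
# name shown here is an illustrative assumption.
tuner:
  tunerName: TPE
  optimizationMode: Maximize
```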
**Prepare configuration file**: Now that you know which trial code you are going to run and which tuner you are going to use, it is time to prepare the yaml configuration file. NNI provides a demo configuration file for each trial example; run `cat /usr/share/nni/examples/trials/mnist-annotation/config.yml` to see it. Its content is basically shown below:
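A rough sketch of such a configuration file is given below. Apart from the tuner fields and the example path discussed above, every field name and value here is an assumption for illustration; check the demo file itself for the authoritative content.

```yaml
# Hypothetical experiment config sketch; field names other than the
# tuner section are assumptions, not the verbatim demo file.
experimentName: mnist-annotation
trialConcurrency: 1
maxTrialNum: 10
tuner:
  tunerName: TPE
  optimizationMode: Maximize
trial:
  command: python mnist.py
  codeDir: /usr/share/nni/examples/trials/mnist-annotation
```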