"...resnet50_tensorflow.git" did not exist on "3d03e675d444186a49a665ce2e1be32a24c59215"
Unverified commit fce6ab4d authored by J-shang, committed by GitHub

modify customized tuner doc and fix typo in training service doc (#3234)

* fix typo in training service doc

* modify local mode doc

* modify customized tuner doc

* fix lint

* fix typo

* remove indent

* remove install customized tuner doc

* fix typo

* fix typo
parent f68ba4a6
Run an Experiment on FrameworkController
========================================
NNI supports running experiments with `FrameworkController <https://github.com/Microsoft/frameworkcontroller>`__\ , called frameworkcontroller mode. FrameworkController is built to orchestrate all kinds of applications on Kubernetes, so you don't need to install Kubeflow or a framework-specific operator such as tf-operator or pytorch-operator. You can now use FrameworkController as the training service to run NNI experiments.
Prerequisite for on-premises Kubernetes Service
**Run an Experiment on Heterogeneous Mode**
===========================================
Running NNI in heterogeneous mode means that NNI will run trial jobs on multiple kinds of training platforms. For example, NNI could submit trial jobs to remote machines and AML simultaneously.
Setup environment
-----------------
Run an Experiment on Kubeflow
=============================
Now NNI supports running experiments on `Kubeflow <https://github.com/kubeflow/kubeflow>`__\ , called kubeflow mode. Before starting to use NNI kubeflow mode, you should have a Kubernetes cluster, either on-premises or `Azure Kubernetes Service (AKS) <https://azure.microsoft.com/en-us/services/kubernetes-service/>`__\ , and an Ubuntu machine on which `kubeconfig <https://kubernetes.io/docs/concepts/configuration/organize-cluster-access-kubeconfig/>`__ is set up to connect to your Kubernetes cluster. If you are not familiar with Kubernetes, `here <https://kubernetes.io/docs/tutorials/kubernetes-basics/>`__ is a good place to start. In kubeflow mode, your trial program will run as a Kubeflow job in the Kubernetes cluster.
Prerequisite for on-premises Kubernetes Service
**Tutorial: Create and Run an Experiment on local with NNI API**
====================================================================
In this tutorial, we will use the example in [nni/examples/trials/mnist-tfv1] to explain how to create and run an experiment locally with the NNI API.
@@ -17,8 +17,9 @@ You have an implementation for an MNIST classifier using convolutional layers, the P
To enable NNI API, make the following changes:
1.1 Declare NNI API: include ``import nni`` in your trial code to use NNI APIs.
1.2 Get predefined parameters
Use the following code snippet:
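The snippet is collapsed in this hunk; based on the surrounding context, it is the standard call that fetches the tuner's parameters (a reconstruction, not a quote from the diff):

.. code-block:: python

   import nni

   # hyper-parameters assigned by the tuner for this trial
   RECEIVED_PARAMS = nni.get_next_parameter()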
@@ -32,8 +33,9 @@ to get hyper-parameters' values assigned by the tuner. ``RECEIVED_PARAMS`` is an obj
{"conv_size": 2, "hidden_size": 124, "learning_rate": 0.0307, "dropout_rate": 0.2029}
1.3 Report NNI results: use the API ``nni.report_intermediate_result(accuracy)`` to send ``accuracy`` to the assessor, and the API ``nni.report_final_result(accuracy)`` to send ``accuracy`` to the tuner.
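To show how these calls fit together, here is a minimal, self-contained sketch of a trial script; the training step is a random stand-in (an assumption for brevity) that real model code would replace:

.. code-block:: python

   import random

   import nni

   params = nni.get_next_parameter()  # hyper-parameters chosen by the tuner

   accuracy = 0.0
   for epoch in range(10):
       # stand-in for one epoch of training and evaluation
       accuracy = min(1.0, accuracy + random.random() * 0.1)
       nni.report_intermediate_result(accuracy)  # sent to the assessor

   nni.report_final_result(accuracy)  # sent to the tuner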
We have made these changes and saved them in ``mnist.py``.
@@ -91,7 +93,7 @@ To run an experiment in NNI, you only need:
You can download the NNI source code; a set of examples can be found in ``nni/examples``. Run ``ls nni/examples/trials`` to see all the trial examples.
Let's use a simple trial example, e.g. mnist, provided by NNI. You can simply execute the following command to run the NNI mnist example:
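The command itself sits in the collapsed part of this hunk; assuming the example keeps its config at the usual location in the source tree, it would be along the lines of:

.. code-block:: bash

   nnictl create --config nni/examples/trials/mnist-tfv1/config.yml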
@@ -37,7 +37,7 @@ Built-in Training Services
* - `Kubeflow <./KubeflowMode.rst>`__
- NNI supports running experiments on `Kubeflow <https://github.com/kubeflow/kubeflow>`__\ , called kubeflow mode. Before starting to use NNI kubeflow mode, you should have a Kubernetes cluster, either on-premises or `Azure Kubernetes Service (AKS) <https://azure.microsoft.com/en-us/services/kubernetes-service/>`__\ , and an Ubuntu machine on which `kubeconfig <https://kubernetes.io/docs/concepts/configuration/organize-cluster-access-kubeconfig/>`__ is set up to connect to your Kubernetes cluster. If you are not familiar with Kubernetes, `here <https://kubernetes.io/docs/tutorials/kubernetes-basics/>`__ is a good place to start. In kubeflow mode, your trial program will run as a Kubeflow job in the Kubernetes cluster.
* - `AdaptDL <./AdaptDLMode.rst>`__
- NNI supports running experiments on `AdaptDL <https://github.com/petuum/adaptdl>`__\ , called AdaptDL mode. Before starting to use AdaptDL mode, you should have a Kubernetes cluster.
* - `FrameworkController <./FrameworkControllerMode.rst>`__
- NNI supports running experiments with `FrameworkController <https://github.com/Microsoft/frameworkcontroller>`__\ , called frameworkcontroller mode. FrameworkController is built to orchestrate all kinds of applications on Kubernetes, so you don't need to install Kubeflow or a framework-specific operator such as tf-operator or pytorch-operator. You can now use FrameworkController as the training service to run NNI experiments.
* - `DLTS <./DLTSMode.rst>`__
@@ -65,6 +65,6 @@ Step 1. **Validate config and prepare the training platform.** Training service
Step 2. **Submit the first trial.** To initiate a trial, usually (in non-reuse mode), NNI copies a few more files (including parameters, the launch script, etc.) onto the training platform. After that, NNI launches the trial via subprocess, SSH, RESTful API, etc.
.. Warning:: The working directory of the trial command has exactly the same content as ``codeDir``, but can have a different path (even on a different machine). Local mode is the only training service that shares one ``codeDir`` across all trials. Other training services copy ``codeDir`` from the shared copy prepared in step 1, and each trial has an independent working directory. We strongly advise users not to rely on this sharing behavior in local mode, as it will make your experiments difficult to scale to other training services.
Step 3. **Collect metrics.** NNI then monitors the status of the trial, updates the recorded status (e.g., from ``WAITING`` to ``RUNNING``\ , ``RUNNING`` to ``SUCCEEDED``\ ), and collects the metrics. Currently, most training services are implemented in an "active" way, i.e., the training service calls the RESTful API on the NNI manager to update the metrics. Note that this usually requires the machine that runs the NNI manager to be reachable from the worker nodes.
@@ -73,7 +73,7 @@ BOHB advisor requires the `ConfigSpace <https://github.com/automl/ConfigSpace>`_
.. code-block:: bash
   pip install nni[BOHB]
To use BOHB, you should add the following spec in your experiment's YAML config file:
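The spec itself is in the collapsed part of the hunk; a minimal sketch of the advisor section (``optimize_mode`` is shown as a typical classArg, an assumption rather than a quote from this diff):

.. code-block:: yaml

   advisor:
     builtinAdvisorName: BOHB
     classArgs:
       optimize_mode: maximize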
@@ -26,7 +26,7 @@ Currently, we support the following algorithms:
* - `Naïve Evolution <#Evolution>`__
- Naïve Evolution comes from Large-Scale Evolution of Image Classifiers. It randomly initializes a population based on the search space. For each generation, it chooses the better ones and does some mutation (e.g., changing a hyperparameter, adding/removing one layer) on them to get the next generation. Naïve Evolution requires many trials to work, but it's very simple and easy to extend with new features. `Reference paper <https://arxiv.org/pdf/1703.01041.pdf>`__
* - `SMAC <#SMAC>`__
- SMAC is based on Sequential Model-Based Optimization (SMBO). It adapts the most prominent previously used model class (Gaussian stochastic process models) and introduces the model class of random forests to SMBO in order to handle categorical parameters. The SMAC supported by NNI is a wrapper around the SMAC3 GitHub repo. Note that SMAC needs to be installed with the ``pip install nni[SMAC]`` command. `Reference Paper <https://www.cs.ubc.ca/~hutter/papers/10-TR-SMAC.pdf>`__\ , `GitHub Repo <https://github.com/automl/SMAC3>`__
* - `Batch tuner <#Batch>`__
- Batch tuner allows users to simply provide several configurations (i.e., choices of hyper-parameters) for their trial code. After finishing all the configurations, the experiment is done. Batch tuner only supports the ``choice`` type in the search space spec.
* - `Grid Search <#GridSearch>`__
@@ -52,7 +52,7 @@ Usage of Built-in Tuners
Using a built-in tuner provided by the NNI SDK requires one to declare the **builtinTunerName** and **classArgs** in the ``config.yml`` file. In this part, we will introduce each tuner along with information about usage and suggested scenarios, classArg requirements, and an example configuration.
Note: Please follow the format when you write your ``config.yml`` file. Some built-in tuners have dependencies that need to be installed using ``pip install nni[<tuner>]``; for example, SMAC's dependencies can be installed using ``pip install nni[SMAC]``.
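As a concrete illustration of that format, a minimal tuner declaration might look like the following sketch (TPE and ``optimize_mode`` are chosen here only as a plausible example):

.. code-block:: yaml

   tuner:
     builtinTunerName: TPE
     classArgs:
       optimize_mode: maximize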
:raw-html:`<a name="TPE"></a>`
@@ -192,11 +192,11 @@ SMAC
**Installation**
SMAC has dependencies that need to be installed by the following command before first usage. As a reminder, ``swig`` is required for SMAC; on Ubuntu, ``swig`` can be installed with ``apt``.
.. code-block:: bash
   pip install nni[SMAC]
**Suggested scenario**
@@ -417,7 +417,7 @@ BOHB advisor requires `ConfigSpace <https://github.com/automl/ConfigSpace>`__ pa
.. code-block:: bash
   pip install nni[BOHB]
**Suggested scenario**
How to install customized tuner as a builtin tuner
==================================================
You can follow the steps below to install the customized tuner in ``nni/examples/tuners/customized_tuner`` as a builtin tuner.
Prepare installation source and install package
-----------------------------------------------
There are 2 options to install this customized tuner:
Option 1: install from directory
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step 1: From ``nni/examples/tuners/customized_tuner`` directory, run:
``python setup.py develop``
This command will build the ``nni/examples/tuners/customized_tuner`` directory as a pip installation source.
Step 2: Run command:
``nnictl package install ./``
Option 2: install from whl file
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step 1: From ``nni/examples/tuners/customized_tuner`` directory, run:
``python setup.py bdist_wheel``
This command builds a whl file, which is a pip installation source.
Step 2: Run command:
``nnictl package install dist/demo_tuner-0.1-py3-none-any.whl``
Check the installed package
---------------------------
Then run the command ``nnictl package list``\ ; you should be able to see that demotuner is installed:
.. code-block:: bash
   +-----------------+------------+-----------+----------------------+------------------------------------------+
   | Name            | Type       | Installed | Class Name           | Module Name                              |
   +-----------------+------------+-----------+----------------------+------------------------------------------+
   | demotuner       | tuners     | Yes       | DemoTuner            | demo_tuner                               |
   +-----------------+------------+-----------+----------------------+------------------------------------------+
Use the installed tuner in experiment
-------------------------------------
Now you can use demotuner in the experiment configuration file the same way as other builtin tuners:
.. code-block:: yaml
   tuner:
     builtinTunerName: demotuner
     classArgs:
       #choice: maximize, minimize
       optimize_mode: maximize
@@ -103,8 +103,6 @@ Run the following command to register the customized algorithms as builtin algorithm
The ``<path_to_meta_file>`` is the path to the yaml file you created in the above section.
Refer to the `customized tuner example <../Tuner/InstallCustomizedTuner.rst>`_ for a full example.
6. Use the installed builtin algorithms in experiment
-----------------------------------------------------
@@ -182,7 +182,7 @@ If there is a stderr file, please check it. Two possible cases are:
Fail to use BOHB on Windows
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Make sure a C++ 14.0 compiler is installed when trying to run ``pip install nni[BOHB]`` to install the dependencies.
Unsupported tuners on Windows
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -29,7 +29,7 @@ nnictl supports commands:
* `nnictl log <#log>`__
* `nnictl webui <#webui>`__
* `nnictl tensorboard <#tensorboard>`__
* `nnictl algo <#algo>`__
* `nnictl ss_gen <#ss_gen>`__
* `nnictl --version <#version>`__
@@ -1403,82 +1403,79 @@ Manage tensorboard
- ID of the experiment you want to set
:raw-html:`<a name="package"></a>`
:raw-html:`<a name="algo"></a>`
Manage builtin algorithms
^^^^^^^^^^^^^^^^^^^^^^^^^
*
**nnictl algo register**
*
Description
Register customized algorithms as builtin tuner/assessor/advisor.
*
Usage
.. code-block:: bash
   nnictl algo register --meta <path_to_meta_file>
``<path_to_meta_file>`` is the path to the metadata file in yml format, which has the following keys:
*
``algoType``: the type of the algorithm; one of ``tuner``, ``assessor``, ``advisor``
*
``builtinName``: builtin name used in experiment configuration file
*
``className``: tuner class name, including its module name, for example: ``demo_tuner.DemoTuner``
*
``classArgsValidator``: class args validator class name, including its module name, for example: ``demo_tuner.MyClassArgsValidator``
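Putting these keys together with the example names used elsewhere on this page, a meta file for the demo tuner might look like this sketch:

.. code-block:: yaml

   algoType: tuner
   builtinName: demotuner
   className: demo_tuner.DemoTuner
   classArgsValidator: demo_tuner.MyClassArgsValidator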
*
Example
Install a customized tuner in nni examples
.. code-block:: bash
   cd nni/examples/tuners/customized_tuner
   python3 setup.py develop
   nnictl algo register --meta meta_file.yml
*
**nnictl algo show**
*
Description
Show the detailed information of the specified registered algorithm.
*
Usage
.. code-block:: bash
   nnictl algo show <builtinName>
*
Example
.. code-block:: bash
   nnictl algo show SMAC
*
**nnictl algo list**
@@ -1487,78 +1484,46 @@ Manage package
*
Description
List the registered builtin algorithms.
*
Usage
.. code-block:: bash
   nnictl algo list
*
Example
.. code-block:: bash
   nnictl algo list
*
**nnictl algo unregister**
*
Description
Unregister a registered customized builtin algorithm. The NNI-provided builtin algorithms cannot be unregistered.
*
Usage
.. code-block:: bash
   nnictl algo unregister <builtinName>
*
Example
Unregister the demotuner tuner
.. code-block:: bash
   nnictl algo unregister demotuner
:raw-html:`<a name="ss_gen"></a>`
@@ -9,4 +9,3 @@ Advanced Features
Write a New Advisor <Tuner/CustomizeAdvisor>
Write a New Training Service <TrainingService/HowToImplementTrainingService>
Install Customized Algorithms as Builtin Tuners/Assessors/Advisors <Tutorial/InstallCustomizedAlgos>