Commit 69dfbf5e authored by chicm-ms, committed by GitHub

Merge pull request #2020 from microsoft/dev-refactor-doc

merge back to master: update doc index
parents eab0da15 889218bb
# Python API Reference of Auto Tune
```eval_rst
.. contents::
```
## Trial
```eval_rst
.. autofunction:: nni.get_next_parameter
.. autofunction:: nni.get_current_parameter
.. autofunction:: nni.report_intermediate_result
.. autofunction:: nni.report_final_result
.. autofunction:: nni.get_experiment_id
.. autofunction:: nni.get_trial_id
.. autofunction:: nni.get_sequence_id
```
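
For instance, a trial script typically pulls one configuration, reports an intermediate metric per epoch, and reports one final metric at the end. Below is a minimal sketch; `train_one_epoch` and the `learning_rate` search-space key are illustrative placeholders, not part of the NNI API:

```python
import random

import nni


def train_one_epoch(lr):
    # Placeholder for a real training step; returns a fake accuracy.
    return random.random()


def main():
    # Receive a hyperparameter configuration generated by the tuner.
    # (`or {}` keeps the sketch runnable outside an NNI experiment.)
    params = nni.get_next_parameter() or {}
    lr = params.get("learning_rate", 0.01)

    best = 0.0
    for _ in range(10):
        accuracy = train_one_epoch(lr)
        # Intermediate results feed the assessor's early stopping decision.
        nni.report_intermediate_result(accuracy)
        best = max(best, accuracy)

    # The final result is what the tuner uses to rank this configuration.
    nni.report_final_result(best)


if __name__ == "__main__":
    main()
```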
## Tuner
```eval_rst
.. autoclass:: nni.tuner.Tuner
    :members:

.. autoclass:: nni.hyperopt_tuner.hyperopt_tuner.HyperoptTuner
    :members:

.. autoclass:: nni.evolution_tuner.evolution_tuner.EvolutionTuner
    :members:

.. autoclass:: nni.smac_tuner.SMACTuner
    :members:

.. autoclass:: nni.gridsearch_tuner.GridSearchTuner
    :members:

.. autoclass:: nni.networkmorphism_tuner.networkmorphism_tuner.NetworkMorphismTuner
    :members:

.. autoclass:: nni.metis_tuner.metis_tuner.MetisTuner
    :members:

.. autoclass:: nni.ppo_tuner.PPOTuner
    :members:

.. autoclass:: nni.batch_tuner.batch_tuner.BatchTuner
    :members:

.. autoclass:: nni.gp_tuner.gp_tuner.GPTuner
    :members:
```
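
For a rough sense of the `Tuner` interface, the sketch below outlines a custom tuner (not one of the builtin tuners above); the `learning_rate` key and its sampling range are arbitrary assumptions:

```python
import random

from nni.tuner import Tuner


class RandomLRTuner(Tuner):
    """A toy tuner that ignores history and samples at random (illustrative only)."""

    def __init__(self):
        super().__init__()
        self.search_space = {}

    def update_search_space(self, search_space):
        # Called with the search space defined for the experiment.
        self.search_space = search_space

    def generate_parameters(self, parameter_id, **kwargs):
        # Return the configuration for one new trial; a real tuner would
        # sample according to self.search_space and past results.
        return {"learning_rate": random.uniform(1e-4, 1e-1)}

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        # `value` is the final result the trial reported; a real tuner
        # would use it to guide future sampling.
        pass
```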
## Assessor
```eval_rst
.. autoclass:: nni.assessor.Assessor
    :members:

.. autoclass:: nni.assessor.AssessResult
    :members:

.. autoclass:: nni.curvefitting_assessor.CurvefittingAssessor
    :members:

.. autoclass:: nni.medianstop_assessor.MedianstopAssessor
    :members:
```
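
For a rough sense of the `Assessor` interface, the sketch below outlines a custom assessor (not one of the builtin assessors above); the threshold rule is an arbitrary assumption:

```python
from nni.assessor import Assessor, AssessResult


class ThresholdAssessor(Assessor):
    """A toy assessor that kills trials whose latest metric falls below a threshold."""

    def __init__(self, threshold=0.5, min_history=3):
        super().__init__()
        self.threshold = threshold
        self.min_history = min_history

    def assess_trial(self, trial_job_id, trial_history):
        # trial_history holds the intermediate results reported so far.
        if len(trial_history) < self.min_history:
            return AssessResult.Good
        # Returning Bad asks NNI to early stop this trial.
        return AssessResult.Good if trial_history[-1] >= self.threshold else AssessResult.Bad
```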
## Advisor
```eval_rst
.. autoclass:: nni.msg_dispatcher_base.MsgDispatcherBase
    :members:

.. autoclass:: nni.hyperband_advisor.hyperband_advisor.Hyperband
    :members:

.. autoclass:: nni.bohb_advisor.bohb_advisor.BOHB
    :members:
```
Builtin-Assessors
=================
To save computing resources, NNI supports an early stopping policy and delegates this job to the **Assessor**.
The Assessor receives intermediate results from a Trial and decides, by a specific algorithm, whether the Trial should be killed. Once the Trial meets an early stopping condition (which means the Assessor is pessimistic about its final result), the Assessor kills the Trial, and the Trial's status becomes `"EARLY_STOPPED"`.
Here is an experimental result on MNIST using the 'Curvefitting' Assessor in 'maximize' mode. You can see that the Assessor successfully **early stopped** many trials with bad hyperparameters. With an Assessor, you may get better hyperparameters under the same computing resources.
*Implemented code directory: `config_assessor.yml <https://github.com/Microsoft/nni/blob/master/examples/trials/mnist-tfv1/config_assessor.yml>`__*
.. image:: ../img/Assessor.png
.. toctree::
   :maxdepth: 1
Builtin-Tuners
==============
NNI provides an easy way to adopt parameter tuning algorithms, which we call **Tuners**.
A Tuner receives metrics from a `Trial`, evaluates the performance of a specific parameter/architecture configuration, and sends the next hyper-parameter or architecture configuration to the Trial.
.. toctree::
   :maxdepth: 1
###################
Feature Engineering
###################
We are glad to introduce the Feature Engineering toolkit on top of NNI.
It is still in the experiment phase, and may evolve based on usage feedback.
We'd like to invite you to use it, give feedback, and even contribute.
Advanced Features
=================
.. toctree::

   Enable Multi-phase <AdvancedFeature/MultiPhase>
   Write a New Tuner <Tuner/CustomizeTuner>
   Write a New Assessor <Assessor/CustomizeAssessor>
   Write a New Advisor <Tuner/CustomizeAdvisor>
   Write a New Training Service <TrainingService/HowToImplementTrainingService>
#############################
Auto (Hyper-parameter) Tuning
#############################
Auto tuning is one of the key features provided by NNI; a main application scenario is
hyper-parameter tuning. Trial code is what will be tuned; NNI provides many popular
auto tuning algorithms (called Tuners) and some early stopping algorithms (called Assessors).
NNI supports running trials on various training platforms, for example, on a local machine,
on several servers in a distributed manner, or on platforms such as OpenPAI and Kubernetes.
Other key features of NNI, such as model compression and feature engineering, can also be further
enhanced by auto tuning, as described when those features are introduced.
NNI is highly extensible; advanced users can customize their own Tuner, Assessor, and Training Service
according to their needs.
.. toctree::
   :maxdepth: 2

   Write Trial <TrialExample/Trials>
   Tuners <builtin_tuner>
   Assessors <builtin_assessor>
   Training Platform <training_services>
   Examples <examples>
   WebUI <Tutorial/WebUI>
   How to Debug <Tutorial/HowToDebug>
   Advanced <hpo_advanced>
@@ -2,9 +2,6 @@
Neural Network Intelligence
###########################

.. toctree::
   :caption: Table of Contents
@@ -12,11 +9,14 @@ Contents
   :titlesonly:

   Overview
   Installation <installation>
   QuickStart <Tutorial/QuickStart>
   Auto (Hyper-parameter) Tuning <hyperparameter_tune>
   Neural Architecture Search <nas>
   Model Compression <model_compression>
   Feature Engineering <feature_engineering>
   References <reference>
   Community Sharings <CommunitySharings/community_sharings>
   FAQ <Tutorial/FAQ>
   How to Contribute <contribution>
   Changelog <Release>
############
Installation
############
Currently we support installation on Linux, Mac, and Windows, and also allow you to use Docker.
.. toctree::
   :maxdepth: 2

   Linux & Mac <Tutorial/InstallationLinux>
   Windows <Tutorial/InstallationWin>
   Use Docker <Tutorial/HowToUseDocker>
@@ -16,13 +16,6 @@ For details, please refer to the following tutorials:
   :maxdepth: 2

   Overview <Compressor/Overview>
   Pruners <pruners>
   Quantizers <quantizers>
   Automatic Model Compression <Compressor/AutoCompression>
##########################
Neural Architecture Search
##########################
Automatic neural architecture search is taking an increasingly important role in finding better models.
Recent research works have proved the feasibility of automatic NAS and have found models that beat manually tuned ones.
Some representative works are NASNet, ENAS, DARTS, Network Morphism, and Evolution. Moreover, new innovations keep emerging.
However, it takes great effort to implement NAS algorithms, and it is hard to reuse the code base of existing algorithms in a new one.
To facilitate NAS innovations (e.g., designing and implementing new NAS models, comparing different NAS models side-by-side),
an easy-to-use and flexible programming interface is crucial.
Therefore, we provide a unified interface for NAS
to accelerate innovations on NAS and to apply state-of-the-art algorithms to real-world problems faster.
For details, please refer to the following tutorials:

.. toctree::
   :maxdepth: 2

   Overview <NAS/Overview>
   Tutorial <NAS/NasGuide>
   ENAS <NAS/ENAS>
   DARTS <NAS/DARTS>
   P-DARTS <NAS/PDARTS>
   SPOS <NAS/SPOS>
   CDARTS <NAS/CDARTS>
   API Reference <NAS/NasReference>
############################
Supported Pruning Algorithms
############################
.. toctree::
   :maxdepth: 1

   Level Pruner <Compressor/Pruner>
   AGP Pruner <Compressor/Pruner>
   Lottery Ticket Pruner <Compressor/LotteryTicketHypothesis>
   FPGM Pruner <Compressor/Pruner>
   L1Filter Pruner <Compressor/l1filterpruner>
   L2Filter Pruner <Compressor/Pruner>
   ActivationAPoZRankFilterPruner <Compressor/Pruner>
   ActivationMeanRankFilterPruner <Compressor/Pruner>
   Slim Pruner <Compressor/SlimPruner>
#################################
Supported Quantization Algorithms
#################################
.. toctree::
   :maxdepth: 1

   Naive Quantizer <Compressor/Quantizer>
   QAT Quantizer <Compressor/Quantizer>
   DoReFa Quantizer <Compressor/Quantizer>
   BNN Quantizer <Compressor/Quantizer>
@@ -2,12 +2,11 @@ References
==================

.. toctree::
   :maxdepth: 2

   nnictl Commands <Tutorial/Nnictl>
   Experiment Configuration <Tutorial/ExperimentConfig>
   Search Space <Tutorial/SearchSpaceSpec>
   NNI Annotation <Tutorial/AnnotationSpec>
   SDK API References <sdk_reference>
   Supported Framework Library <SupportedFramework_Library>
####################
Python API Reference
####################

.. toctree::
   :maxdepth: 1

   Auto Tune <autotune_ref>
   NAS <NAS/NasReference>
#################
Tuners
#################
NNI provides an easy way to adopt parameter tuning algorithms, which we call **Tuners**.
A Tuner receives metrics from a `Trial`, evaluates the performance of a specific parameter/architecture configuration, and sends the next hyper-parameter or architecture configuration to the Trial.
NNI supports two approaches to set the tuner: the first is to directly use a builtin tuner provided by the NNI SDK, and the second is to customize a tuner by yourself. We also have the Advisor, which combines the functionality of Tuner and Assessor.
For details, please refer to the following tutorials:
.. toctree::
   :maxdepth: 2

   Builtin Tuners <builtin_tuner>
   Customized Tuners <Tuner/CustomizeTuner>
   Customized Advisor <Tuner/CustomizeAdvisor>
######################
Tutorials
######################
.. toctree::
   :maxdepth: 2

   Installation <Tutorial/Installation>
   Write Trial <TrialExample/Trials>
   Tuners <tuners>
   Assessors <assessors>
   NAS (Beta) <nas>
   Model Compression (Beta) <model_compression>
   Feature Engineering (Beta) <feature_engineering>
   WebUI <Tutorial/WebUI>
   Training Platform <training_services>
   How to use docker <Tutorial/HowToUseDocker>
   advanced
   Debug HowTo <Tutorial/HowToDebug>
   NNI on Windows <Tutorial/NniOnWindows>
@@ -13,7 +13,12 @@ logger = logging.getLogger(__name__)
class BaseMutator(nn.Module):
    """
    A mutator is responsible for mutating a graph by obtaining the search space from the network and implementing
    callbacks that are called in ``forward`` in mutables.

    Parameters
    ----------
    model : nn.Module
        PyTorch model to apply mutator on.
    """

    def __init__(self, model):
@@ -52,9 +57,19 @@ class BaseMutator(nn.Module):
    @property
    def mutables(self):
        """
        A generator of all modules inheriting :class:`~nni.nas.pytorch.mutables.Mutable`.
        Modules are yielded in the order that they are defined in ``__init__``.
        For mutables with their keys appearing multiple times, only the first one will appear.
        """
        return self._structured_mutables

    def forward(self, *inputs):
        """
        Warnings
        --------
        Don't call forward of a mutator.
        """
        raise RuntimeError("Forward is undefined for mutators.")

    def __setattr__(self, name, value):
@@ -70,6 +85,7 @@ class BaseMutator(nn.Module):
        Parameters
        ----------
        mutable_scope : MutableScope
            The mutable scope that is entered.
        """
        pass
@@ -80,6 +96,7 @@ class BaseMutator(nn.Module):
        Parameters
        ----------
        mutable_scope : MutableScope
            The mutable scope that is exited.
        """
        pass
@@ -90,12 +107,14 @@ class BaseMutator(nn.Module):
        Parameters
        ----------
        mutable : LayerChoice
            Module whose forward is called.
        inputs : list of torch.Tensor
            The arguments of its forward function.

        Returns
        -------
        tuple of torch.Tensor and torch.Tensor
            Output tensor and mask.
        """
        raise NotImplementedError
@@ -106,12 +125,14 @@ class BaseMutator(nn.Module):
        Parameters
        ----------
        mutable : InputChoice
            Mutable that is called.
        tensor_list : list of torch.Tensor
            The arguments mutable is called with.

        Returns
        -------
        tuple of torch.Tensor and torch.Tensor
            Output tensor and mask.
        """
        raise NotImplementedError
@@ -123,5 +144,6 @@ class BaseMutator(nn.Module):
        Returns
        -------
        dict
            Mappings from mutable keys to decisions.
        """
        raise NotImplementedError
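
For a rough sense of how these callbacks might be implemented, here is a toy mutator sketch. It assumes the hooks are named `on_forward_layer_choice` / `on_forward_input_choice` and that `LayerChoice` exposes its candidate ops as `choices`; treat both as assumptions, not the official API:

```python
import torch

from nni.nas.pytorch.base_mutator import BaseMutator


class FirstChoiceMutator(BaseMutator):
    """A toy mutator that always selects the first candidate (illustrative only)."""

    def on_forward_layer_choice(self, mutable, *inputs):
        # Run only the first candidate op; the mask records which op was chosen.
        out = mutable.choices[0](*inputs)
        mask = torch.zeros(len(mutable.choices))
        mask[0] = 1.0
        return out, mask

    def on_forward_input_choice(self, mutable, tensor_list):
        # Forward only the first input tensor.
        mask = torch.zeros(len(tensor_list))
        mask[0] = 1.0
        return tensor_list[0], mask

    def export(self):
        # Mappings from mutable keys to decisions; trivially empty here.
        return {}
```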
@@ -8,16 +8,33 @@ class BaseTrainer(ABC):
    @abstractmethod
    def train(self):
        """
        Override the method to train.
        """
        raise NotImplementedError

    @abstractmethod
    def validate(self):
        """
        Override the method to validate.
        """
        raise NotImplementedError

    @abstractmethod
    def export(self, file):
        """
        Override the method to export to file.

        Parameters
        ----------
        file : str
            File path to export to.
        """
        raise NotImplementedError

    @abstractmethod
    def checkpoint(self):
        """
        Override to dump a checkpoint.
        """
        raise NotImplementedError
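
A concrete trainer could fill in these methods roughly as follows; the import path and all logic here are illustrative assumptions, not the official implementation:

```python
import json

from nni.nas.pytorch.base_trainer import BaseTrainer


class ToyTrainer(BaseTrainer):
    """A skeleton trainer; training and validation logic are placeholders."""

    def __init__(self, model, mutator):
        self.model = model
        self.mutator = mutator

    def train(self):
        # A real trainer would run the one-shot NAS training loop here,
        # alternating model weight updates and mutator decisions.
        pass

    def validate(self):
        # A real trainer would evaluate the currently sampled architecture.
        pass

    def export(self, file):
        # Persist the mutator's current architecture decisions,
        # assuming they are JSON-serializable.
        with open(file, "w") as f:
            json.dump(self.mutator.export(), f)

    def checkpoint(self):
        # Dumping model weights is left out of this sketch.
        pass
```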
@@ -11,6 +11,9 @@ _logger = logging.getLogger(__name__)
class Callback:
    """
    Callback provides an easy way to react to events like the beginning/end of epochs.
    """

    def __init__(self):
        self.model = None
@@ -18,14 +21,42 @@ class Callback:
        self.trainer = None

    def build(self, model, mutator, trainer):
        """
        A callback needs to be built with the model, mutator, and trainer, to get updates from them.

        Parameters
        ----------
        model : nn.Module
            Model to be trained.
        mutator : nn.Module
            Mutator that mutates the model.
        trainer : BaseTrainer
            Trainer that is to call the callback.
        """
        self.model = model
        self.mutator = mutator
        self.trainer = trainer

    def on_epoch_begin(self, epoch):
        """
        Implement this to do something at the beginning of an epoch.

        Parameters
        ----------
        epoch : int
            Epoch number, starting from 0.
        """
        pass

    def on_epoch_end(self, epoch):
        """
        Implement this to do something at the end of an epoch.

        Parameters
        ----------
        epoch : int
            Epoch number, starting from 0.
        """
        pass

    def on_batch_begin(self, epoch):
@@ -36,6 +67,14 @@
class LRSchedulerCallback(Callback):
    """
    Calls the scheduler at the end of every epoch.

    Parameters
    ----------
    scheduler : LRScheduler
        Scheduler to be called.
    """

    def __init__(self, scheduler, mode="epoch"):
        super().__init__()
        assert mode == "epoch"
@@ -43,28 +82,54 @@ class LRSchedulerCallback(Callback):
        self.mode = mode

    def on_epoch_end(self, epoch):
        """
        Call ``self.scheduler.step()`` on epoch end.
        """
        self.scheduler.step()


class ArchitectureCheckpoint(Callback):
    """
    Calls ``trainer.export()`` at the end of every epoch.

    Parameters
    ----------
    checkpoint_dir : str
        Location to save checkpoints.
    """

    def __init__(self, checkpoint_dir):
        super().__init__()
        self.checkpoint_dir = checkpoint_dir
        os.makedirs(self.checkpoint_dir, exist_ok=True)

    def on_epoch_end(self, epoch):
        """
        Dump the architecture to ``<checkpoint_dir>/epoch_{number}.json`` at the end of every epoch.
        """
        dest_path = os.path.join(self.checkpoint_dir, "epoch_{}.json".format(epoch))
        _logger.info("Saving architecture to %s", dest_path)
        self.trainer.export(dest_path)


class ModelCheckpoint(Callback):
    """
    Saves the model state dict at the end of every epoch.

    Parameters
    ----------
    checkpoint_dir : str
        Location to save checkpoints.
    """

    def __init__(self, checkpoint_dir):
        super().__init__()
        self.checkpoint_dir = checkpoint_dir
        os.makedirs(self.checkpoint_dir, exist_ok=True)

    def on_epoch_end(self, epoch):
        """
        Dump the state dict to ``<checkpoint_dir>/epoch_{number}.pth.tar`` at the end of every epoch.
        ``DataParallel`` objects will have their inner modules exported.
        """
        if isinstance(self.model, nn.DataParallel):
            state_dict = self.model.module.state_dict()
        else:
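
As an illustration of the callback mechanism above, a custom callback only overrides the hooks it needs; `self.model`, `self.mutator`, and `self.trainer` are attached by `build()` before training. A minimal sketch, assuming the import path below:

```python
from nni.nas.pytorch.callbacks import Callback


class EpochLogger(Callback):
    """A toy callback that prints epoch boundaries (illustrative only)."""

    def on_epoch_begin(self, epoch):
        print("Starting epoch {}".format(epoch))

    def on_epoch_end(self, epoch):
        # self.trainer was attached in build(), so the callback could also
        # trigger an export here, much like ArchitectureCheckpoint does.
        print("Finished epoch {}".format(epoch))
```

Such a callback would be passed to a trainer's callback list alongside `LRSchedulerCallback` or `ArchitectureCheckpoint`.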