.. codesnippetcard::
   :icon: ../img/thumbnails/hpo-icon-small.png
   :title: Hyper-parameter Tuning
   :link: autotune_ref

   .. code-block::

      params = nni.get_next_parameter()
      class Net(nn.Module):
          ...
      model = Net()
      optimizer = optim.SGD(model.parameters(),
                            params['lr'],
                            params['momentum'])
      for epoch in range(10):
          train(...)
      accuracy = test(model)
      nni.report_final_result(accuracy)
.. codesnippetcard::
   :icon: ../img/thumbnails/pruning-icon-small.png
   :title: Model Pruning
   :link: tutorials/pruning_quick_start_mnist

   .. code-block::

      # define a config_list
      config = [{
          'sparsity': 0.8,
          'op_types': ['Conv2d']
      }]

      # generate masks for simulated pruning
      wrapped_model, masks = \
          L1NormPruner(model, config). \
          compress()

      # apply the masks for real speedup
      ModelSpeedup(unwrapped_model, input, masks). \
          speedup_model()
.. codesnippetcard::
   :icon: ../img/thumbnails/quantization-icon-small.png
   :title: Quantization
   :link: tutorials/quantization_speed_up

   .. code-block::

      # define a config_list
      config = [{
          'quant_types': ['input', 'weight'],
          'quant_bits': {'input': 8, 'weight': 8},
          'op_types': ['Conv2d']
      }]

      # in case the quantizer needs extra training
      quantizer = QAT_Quantizer(model, config)
      quantizer.compress()
      # Training...

      # export the calibration config and
      # generate a TensorRT engine for real speedup
      calibration_config = quantizer.export_model(
          model_path, calibration_path)
      engine = ModelSpeedupTensorRT(
          model, input_shape, config=calibration_config)
      engine.compress()
.. codesnippetcard::
   :icon: ../img/thumbnails/multi-trial-nas-icon-small.png
   :title: Neural Architecture Search
   :link: tutorials/hello_nas

   .. code-block:: diff

      # define model space
      - self.conv2 = nn.Conv2d(32, 64, 3, 1)
      + self.conv2 = nn.LayerChoice([
      +     nn.Conv2d(32, 64, 3, 1),
      +     DepthwiseSeparableConv(32, 64)
      + ])
      # search strategy + evaluator
      strategy = RegularizedEvolution()
      evaluator = FunctionalEvaluator(
          train_eval_fn)
      # run experiment
      RetiariiExperiment(model_space,
          evaluator, strategy).run()
.. codesnippetcard::
   :icon: ../img/thumbnails/one-shot-nas-icon-small.png
   :title: One-shot NAS
   :link: nas/index

   .. code-block::

      # define model space
      space = AnySearchSpace()
      # get a darts trainer
      trainer = DartsTrainer(space, loss, metrics)
      trainer.fit()
      # get the final searched architecture
      arch = trainer.export()
.. codesnippetcard::
   :icon: ../img/thumbnails/feature-engineering-icon-small.png
   :title: Feature Engineering
   :link: FeatureEngineering/Overview

   .. code-block::

      selector = GBDTSelector()
      selector.fit(
          X_train, y_train,
          lgb_params=lgb_params,
          eval_ratio=eval_ratio,
          early_stopping_rounds=10,
          importance_type='gain',
          num_boost_round=1000)

      # get selected features
      features = selector.get_selected_features()

.. End of code snippet card
The tool manages automated machine learning (AutoML) experiments, dispatching and running
the trial jobs generated by tuning algorithms to search for the best neural architecture
and/or hyper-parameters in different training environments, such as Local Machine,
Remote Servers, OpenPAI, Kubeflow, FrameworkController on K8S (AKS etc.),
DLWorkspace (aka DLTS), AML (Azure Machine Learning), AdaptDL (aka ADL),
other cloud options, and even hybrid mode.
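As a concrete sketch of how such an experiment is launched, the snippet below configures a
hyper-parameter tuning experiment on the local training service using the NNI 2.x Python
experiment API; the trial command, the script name ``train.py``, and the search space values
are illustrative assumptions, not part of any shipped example.

.. code-block:: python

   # A minimal sketch using the NNI 2.x experiment API; the trial script
   # ('train.py') and the search space here are hypothetical placeholders.
   from nni.experiment import Experiment

   search_space = {
       'lr': {'_type': 'loguniform', '_value': [1e-4, 1e-1]},
       'momentum': {'_type': 'uniform', '_value': [0, 1]},
   }

   experiment = Experiment('local')                      # pick a training service
   experiment.config.trial_command = 'python3 train.py'  # command that runs one trial
   experiment.config.trial_code_directory = '.'
   experiment.config.search_space = search_space
   experiment.config.tuner.name = 'TPE'
   experiment.config.tuner.class_args = {'optimize_mode': 'maximize'}
   experiment.config.max_trial_number = 10
   experiment.config.trial_concurrency = 2
   experiment.run(8080)                                  # serve the WebUI on port 8080

In principle, dispatching the same trials to a different environment amounts to swapping the
``'local'`` argument for another training service name plus its service-specific configuration.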
Who should consider using NNI

- Those who want to try different AutoML algorithms in their training code/model.
- Those who want to run AutoML trial jobs in different environments to speed up search.
- Researchers and data scientists who want to easily implement and experiment with new
  AutoML algorithms, be it a hyper-parameter tuning algorithm, a neural architecture
  search algorithm, or a model compression algorithm.
- ML platform owners who want to support AutoML in their platform.
NNI capabilities at a glance

NNI provides a command-line tool as well as a user-friendly WebUI to manage training
experiments. With the extensible API, you can customize your own AutoML algorithms and
training services. To make it easy for new users, NNI also provides a set of built-in
state-of-the-art AutoML algorithms and out-of-the-box support for popular training
platforms. The following table summarizes the current NNI capabilities; we are gradually
adding new capabilities and we'd love to have your contribution.
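To give a flavor of the extensible API, here is a minimal sketch of a custom tuner,
assuming the ``nni.tuner.Tuner`` base class from NNI 2.x; the random-search logic and the
two handled search-space types are deliberate simplifications for illustration.

.. code-block:: python

   # A toy random-search tuner; only 'choice' and 'uniform' parameter types
   # are handled, purely for illustration.
   import random

   from nni.tuner import Tuner

   class ToyRandomTuner(Tuner):

       def update_search_space(self, search_space):
           # NNI calls this with the experiment's search space (a dict)
           self.search_space = search_space

       def generate_parameters(self, parameter_id, **kwargs):
           # produce one hyper-parameter configuration for a new trial
           params = {}
           for name, spec in self.search_space.items():
               if spec['_type'] == 'choice':
                   params[name] = random.choice(spec['_value'])
               elif spec['_type'] == 'uniform':
                   low, high = spec['_value']
                   params[name] = random.uniform(low, high)
           return params

       def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
           # pure random search ignores trial feedback
           pass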
Frameworks & Libraries (built-in)

- Supported Frameworks

  - PyTorch
  - Keras
  - TensorFlow
  - MXNet
  - Caffe2
  - More...

- Supported Libraries

  - Scikit-learn
  - XGBoost
  - LightGBM
  - More...

Algorithms (built-in)

- Hyperparameter Tuning

  - Exhaustive search
  - Heuristic search
  - Bayesian optimization

- Neural Architecture Search (Retiarii)
- Model Compression
- Feature Engineering (Beta)
- Early Stop Algorithms

Training Services (built-in)

- Local Machine, Remote Servers, OpenPAI, Kubeflow, FrameworkController on K8S,
  DLWorkspace, AML, AdaptDL, and hybrid mode (see above).
Installation

Install

NNI supports and is tested on Ubuntu >= 16.04, macOS >= 10.14.1,
and Windows 10 >= 1809. Simply run the following pip install
in an environment that has 64-bit Python >= 3.6.

Linux or macOS

.. code-block:: bash

   python3 -m pip install --upgrade nni

Windows

.. code-block:: bash

   python -m pip install --upgrade nni

If you want to try the latest code, please install NNI from source code.

For detailed system requirements of NNI, please refer to here
for Linux & macOS, and here for Windows.
Note:

- If there is any privilege issue, add --user to install NNI in the user directory.
- Currently NNI on Windows supports local, remote and pai mode. Anaconda or Miniconda is
  highly recommended for installing NNI on Windows.
- If there is any error like Segmentation fault, please refer to FAQ. For FAQ on Windows,
  please refer to NNI on Windows.
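As a quick sanity check that the installation succeeded (assuming the nni package installed
cleanly), you can print the installed package version:

.. code-block:: bash

   python3 -c "import nni; print(nni.__version__)"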
Verify installation

The following example is built on PyTorch. Make sure PyTorch is installed when running it.
1. Download the examples by cloning the source code:

   .. code-block:: bash

      git clone -b v2.6 https://github.com/Microsoft/nni.git

2. Run the MNIST example.

   Linux or macOS:

   .. code-block:: bash

      nnictl create --config nni/examples/trials/mnist-pytorch/config.yml

   Windows:

   .. code-block:: bash

      nnictl create --config nni\examples\trials\mnist-pytorch\config_windows.yml
3. Wait for the message INFO: Successfully started experiment! in the command line.
   This message indicates that your experiment has been successfully started.
   You can explore the experiment using the Web UI url.

   .. code-block:: text

      INFO: Starting restful server...
      INFO: Successfully started Restful server!
      INFO: Setting local config...
      INFO: Successfully set local config!
      INFO: Starting experiment...
      INFO: Successfully started experiment!
      -----------------------------------------------------------------------
      The experiment id is egchD4qy
      The Web UI urls are: http://223.255.255.1:8080   http://127.0.0.1:8080
      -----------------------------------------------------------------------

      You can use these commands to get more information about the experiment
      -----------------------------------------------------------------------
               commands                       description
      1. nnictl experiment show        show the information of experiments
      2. nnictl trial ls               list all of trial jobs
      3. nnictl top                    monitor the status of running experiments
      4. nnictl log stderr             show stderr log content
      5. nnictl log stdout             show stdout log content
      6. nnictl stop                   stop an experiment
      7. nnictl trial kill             kill a trial job by id
      8. nnictl --help                 get help information about nnictl
      -----------------------------------------------------------------------
4. Open the Web UI url in your browser; you can view detailed information about the
   experiment and all the submitted trial jobs as shown below. Here are more Web UI
   pages.
Releases and Contributing

NNI has a monthly release cycle (major releases). Please let us know if you encounter a bug by filing an issue.

We appreciate all contributions. If you are planning to contribute any bug fixes, please do so without further discussion.

If you plan to contribute new features, new tuners, new training services, etc., please first open an issue or reuse an existing issue, and discuss the feature with us. We will discuss with you on the issue in a timely manner, or set up conference calls if needed.

We appreciate all contributions and thank all the contributors!
Feedback

Join IM discussion groups: Gitter or WeChat.
Test status

Essentials

- Fast test
- Full linux
- Full windows

Training services

- Remote - linux to linux
- Remote - linux to windows
- Remote - windows to linux
- OpenPAI
- FrameworkController
- Kubeflow
- Hybrid
- AzureML
Related Projects
Targeting at openness and advancing state-of-art technology,
Microsoft Research (MSR)
had also released few
other open source projects.
We encourage researchers and students leverage these projects to accelerate the AI development and research.