Unverified Commit 21584f11 authored by QuanluZhang's avatar QuanluZhang Committed by GitHub

Merge pull request #4957 from microsoft/v2.8

parents 45af1d6b e8c78bba
......@@ -20,8 +20,8 @@ NNI automates feature engineering, neural architecture search, hyperparameter tu
## What's NEW! &nbsp;<a href="#nni-released-reminder"><img width="48" src="docs/img/release_icon.png"></a>
-* **New release**: [v2.7 is available](https://github.com/microsoft/nni/releases/tag/v2.7) - _released on Apr-18-2022_
-* **New demo available**: [Youtube entry](https://www.youtube.com/channel/UCKcafm6861B2mnYhPbZHavw) | [Bilibili 入口](https://space.bilibili.com/1649051673) - _last updated on Apr-18-2022_
+* **New release**: [v2.8 is available](https://github.com/microsoft/nni/releases/tag/v2.8) - _released on June-22-2022_
+* **New demo available**: [Youtube entry](https://www.youtube.com/channel/UCKcafm6861B2mnYhPbZHavw) | [Bilibili 入口](https://space.bilibili.com/1649051673) - _last updated on June-22-2022_
* **New webinar**: [Introducing Retiarii: A deep learning exploratory-training framework on NNI](https://note.microsoft.com/MSR-Webinar-Retiarii-Registration-Live.html) - _scheduled on June-24-2021_
* **Newly upgraded documentation**: [Doc upgraded](https://nni.readthedocs.io/en/stable)
......
......@@ -31,7 +31,7 @@ author = 'Microsoft'
version = ''
# The full version, including alpha/beta/rc tags
# FIXME: this should be written somewhere globally
-release = 'v2.7'
+release = 'v2.8'
# -- General configuration ---------------------------------------------------
......
......@@ -5,6 +5,66 @@
Change Log
==========
Release 2.8 - 6/22/2022
-----------------------

Neural Architecture Search
^^^^^^^^^^^^^^^^^^^^^^^^^^

* Align the user experience of one-shot NAS with multi-trial NAS, i.e., users can run one-shot NAS by specifying the corresponding strategy (`doc <https://nni.readthedocs.io/en/v2.8/nas/exploration_strategy.html#one-shot-strategy>`__)
* Support multi-GPU training for one-shot NAS
* *Preview* Support loading/retraining the pre-searched models of some search spaces, i.e., 18 models in 4 different search spaces (`doc <https://github.com/microsoft/nni/tree/v2.8/nni/retiarii/hub>`__)
* Support the AutoFormer search space in the search space hub, thanks to our collaborators @nbl97 and @penghouwen
* One-shot NAS supports the NAS APIs ``repeat`` and ``cell``
* Refactor ``RetiariiExperiment`` to share the common implementation with the HPO experiment
* CGO supports pytorch-lightning 1.6

Model Compression
^^^^^^^^^^^^^^^^^

* *Preview* Refactor and improve automatic model compression with a new ``CompressionExperiment``
* Support customizing the module replacement function for unsupported modules in model speedup (`doc <https://nni.readthedocs.io/en/v2.8/reference/compression/pruning_speedup.html#nni.compression.pytorch.speedup.ModelSpeedup>`__)
* Support the module replacement function for some modules requested by users
* Support ``output_padding`` for ``ConvTranspose2d`` in model speedup, thanks to external contributor @haoshuai-orka
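
The ``output_padding`` fix matters because, with ``stride > 1``, the output size of a transposed convolution is ambiguous unless ``output_padding`` is propagated. A minimal sketch of the shape arithmetic involved (formula as given in the PyTorch ``ConvTranspose2d`` documentation; the helper name is ours, not NNI's):

```python
def conv_transpose2d_output_size(size: int, kernel: int, stride: int = 1,
                                 padding: int = 0, output_padding: int = 0,
                                 dilation: int = 1) -> int:
    """Output size of a transposed convolution along one spatial dimension,
    per the PyTorch ConvTranspose2d docs (illustrative helper)."""
    return (size - 1) * stride - 2 * padding + dilation * (kernel - 1) + output_padding + 1

# With stride > 1, output_padding disambiguates the output size:
print(conv_transpose2d_output_size(5, kernel=3, stride=2, padding=1))                    # 9
print(conv_transpose2d_output_size(5, kernel=3, stride=2, padding=1, output_padding=1))  # 10
```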

Hyper-Parameter Optimization
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

* Make ``config.tuner.name`` case-insensitive
* Allow writing advisor configurations in the tuner format, i.e., aligning the configuration of advisors and tuners
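
With the case-insensitivity change, spellings such as ``TPE`` and ``tpe`` select the same tuner. A hedged sketch of the relevant fragment of a v2 experiment configuration (field names follow the NNI v2 config schema; the ``classArgs`` values are illustrative):

```yaml
# Both spellings now select the same tuner (names are matched case-insensitively).
tuner:
  name: TPE          # equivalent to "tpe"
  classArgs:
    optimize_mode: maximize
```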

Experiment
^^^^^^^^^^

* Support launching multiple HPO experiments in one process
* Internal refactors and improvements

  * Refactor the logging mechanism in NNI
  * Refactor NNI manager globals for flexibility and extensibility
  * Migrate dispatcher IPC to WebSocket
  * Decouple locking logic from the experiment manager logic
  * Use the launcher's ``sys.executable`` to detect the Python interpreter
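
The ``sys.executable`` change can be illustrated with a short, self-contained sketch (not NNI code): spawning subprocesses with the launcher's own interpreter path avoids resolving a possibly different ``python`` from ``PATH``.

```python
import subprocess
import sys

# sys.executable is the absolute path of the interpreter running this script,
# so child processes are guaranteed to use the same Python as the launcher,
# regardless of what "python" happens to resolve to on PATH.
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.version_info[0])"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # major version of the launching interpreter
```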

WebUI
^^^^^

* Improve the user experience of trial ordering in the overview page
* Fix the update issue in the trial detail page

Documentation
^^^^^^^^^^^^^

* A new translation framework for the documentation
* Add a new quantization demo (`doc <https://nni.readthedocs.io/en/v2.8/tutorials/quantization_quick_start_mnist.html>`__)

Notable Bugfixes
^^^^^^^^^^^^^^^^

* Fix TPE import issue with old metrics
* Fix the issue in TPE nested search space
* Support ``RecursiveScriptModule`` in speedup
* Fix the issue of failed "implicit type cast" in ``merge_parameter()``
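
``merge_parameter()`` merges tuner-suggested values into a set of default parameters, and the "implicit type cast" converts each suggested value to the type of the corresponding default. A hypothetical sketch of that behavior (not NNI's actual implementation; the function body here is ours):

```python
def merge_parameter(defaults: dict, overrides: dict) -> dict:
    """Merge tuner-suggested values into defaults, casting each value to the
    type of the existing default (hypothetical sketch, not NNI's code)."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if key in defaults and defaults[key] is not None:
            merged[key] = type(defaults[key])(value)  # implicit type cast
        else:
            merged[key] = value
    return merged

print(merge_parameter({"lr": 0.1, "epochs": 10}, {"lr": "0.01", "epochs": 20.0}))
# {'lr': 0.01, 'epochs': 20}
```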

Release 2.7 - 4/18/2022
-----------------------
......
......@@ -47,7 +47,7 @@ export function serveWebSocket(ws: WebSocket): void {
}
 class WebSocketChannelImpl implements WebSocketChannel {
-    private deferredInit: Deferred<void> = new Deferred<void>();
+    private deferredInit: Deferred<void> | null = new Deferred<void>();
     private emitter: EventEmitter = new EventEmitter();
     private heartbeatTimer!: NodeJS.Timer;
     private serving: boolean = false;
......@@ -56,8 +56,13 @@ class WebSocketChannelImpl implements WebSocketChannel {
     public setWebSocket(ws: WebSocket): void {
         if (this.ws !== undefined) {
-            logger.error('A second client is trying to connect');
-            ws.close(4030, 'Already serving a tuner.');
+            logger.error('A second client is trying to connect.');
+            ws.close(4030, 'Already serving a tuner');
             return;
         }
+        if (this.deferredInit === null) {
+            logger.error('Connection timed out.');
+            ws.close(4080, 'Timeout');
+            return;
+        }
......@@ -72,11 +77,26 @@ class WebSocketChannelImpl implements WebSocketChannel {
         this.heartbeatTimer = setInterval(this.heartbeat.bind(this), heartbeatInterval);
         this.deferredInit.resolve();
+        this.deferredInit = null;
     }

     public init(): Promise<void> {
-        logger.debug(this.ws === undefined ? 'Waiting connection...' : 'Initialized.');
-        return this.deferredInit.promise;
+        if (this.ws === undefined) {
+            logger.debug('Waiting connection...');
+            // TODO: This is a quick fix. It should check tuner's process status instead.
+            setTimeout(() => {
+                if (this.deferredInit !== null) {
+                    const msg = 'Tuner did not connect in 10 seconds. Please check tuner (dispatcher) log.';
+                    this.deferredInit.reject(new Error('tuner_command_channel: ' + msg));
+                    this.deferredInit = null;
+                }
+            }, 10000);
+            return this.deferredInit!.promise;
+        } else {
+            logger.debug('Initialized.');
+            return Promise.resolve();
+        }
     }

     public async shutdown(): Promise<void> {
......
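
The ``Deferred`` helper used above is not shown in the diff; it is the usual promise-with-exposed-resolvers pattern, which can be sketched as follows (assumed shape, not the actual NNI source):

```typescript
// Minimal Deferred sketch: a Promise whose resolve/reject are exposed so that
// setWebSocket() can settle the promise returned earlier by init().
class Deferred<T> {
    public readonly promise: Promise<T>;
    public resolve!: (value: T | PromiseLike<T>) => void;
    public reject!: (reason: Error) => void;

    constructor() {
        this.promise = new Promise<T>((res, rej) => {
            this.resolve = res;
            this.reject = rej;
        });
    }
}
```

Setting ``deferredInit`` to ``null`` after settling then serves as a sentinel for "already connected or timed out", which is what both the new guard in ``setWebSocket()`` and the timeout callback in ``init()`` check.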