# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2022, Microsoft
# This file is distributed under the same license as the NNI package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2022.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: NNI \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2022-04-20 05:50+0000\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.9.1\n"

#: ../../source/nas/overview.rst:2
msgid "Overview"
msgstr ""

#: ../../source/nas/overview.rst:4
msgid ""
"NNI's latest NAS support is based on the Retiarii framework. Users who "
"are still on the `early version using NNI NAS v1.0 `__ should migrate "
"their work to Retiarii as soon as possible. We plan to remove the legacy "
"NAS framework in the next few releases."
msgstr ""

#: ../../source/nas/overview.rst:6
msgid ""
"PyTorch is the **only supported framework on Retiarii**. Inquiries about "
"NAS support on TensorFlow are collected in `this discussion `__. If you "
"intend to run NAS with DL frameworks other than PyTorch and TensorFlow, "
"please `open new issues <https://github.com/microsoft/nni/issues>`__ to "
"let us know."
msgstr ""

#: ../../source/nas/overview.rst:9
msgid "Basics"
msgstr ""

#: ../../source/nas/overview.rst:11
msgid ""
"Automatic neural architecture search is playing an increasingly important "
"role in finding better models. Recent research has proven the feasibility "
"of automatic NAS and has led to models that beat many manually designed "
"and tuned models. Representative works include `NASNet "
"<https://arxiv.org/abs/1707.07012>`__, `ENAS "
"<https://arxiv.org/abs/1802.03268>`__, `DARTS "
"<https://arxiv.org/abs/1806.09055>`__, `Network Morphism "
"<https://arxiv.org/abs/1806.10282>`__, and `Evolution "
"<https://arxiv.org/abs/1703.01041>`__. In addition, new innovations "
"continue to emerge."
msgstr ""

#: ../../source/nas/overview.rst:13
msgid ""
"At a high level, solving any particular task with neural architecture "
"search typically requires three components: search space design, search "
"strategy selection, and performance evaluation. The three components work "
"together in the following loop (from the well-known `NAS survey "
"<https://arxiv.org/abs/1808.05377>`__):"
msgstr ""

#: ../../source/nas/overview.rst:19
msgid "In this figure:"
msgstr ""

#: ../../source/nas/overview.rst:21
msgid ""
"*Model search space* means a set of models from which the best model is "
"explored/searched. Sometimes we use *search space* or *model space* for "
"short."
msgstr ""

#: ../../source/nas/overview.rst:22
msgid ""
"*Exploration strategy* is the algorithm that is used to explore a model "
"search space. Sometimes we also call it *search strategy*."
msgstr ""

#: ../../source/nas/overview.rst:23
msgid ""
"*Model evaluator* is responsible for training a model and evaluating its "
"performance."
msgstr ""

#: ../../source/nas/overview.rst:25
msgid ""
"The process is similar to :doc:`Hyperparameter Optimization `, except "
"that the target is the best architecture rather than the best "
"hyperparameters. Concretely, an exploration strategy selects an "
"architecture from a predefined search space. The architecture is passed "
"to a performance evaluation to get a score, which represents how well "
"this architecture performs on a particular task. This process is "
"repeated until the search finds the best architecture."
msgstr ""
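# The loop described in the entry for overview.rst:25 above can be sketched
# in a few lines of Python. This is an illustrative sketch only, not an NNI
# API; `search_space`, `sample`, `evaluate`, and `budget` are hypothetical
# stand-ins for the three components and the trial budget:
#
#     best_score, best_arch = float("-inf"), None
#     for _ in range(budget):             # exploration budget (number of trials)
#         arch = sample(search_space)     # exploration strategy picks a model
#         score = evaluate(arch)          # model evaluator trains and scores it
#         if score > best_score:
#             best_score, best_arch = score, arch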
msgstr "" #: ../../source/nas/overview.rst:28 msgid "Key Features" msgstr "" #: ../../source/nas/overview.rst:30 msgid "" "The current NAS framework in NNI is powered by the research of `Retiarii:" " A Deep Learning Exploratory-Training Framework " "`__, where " "we highlight the following features:" msgstr "" #: ../../source/nas/overview.rst:32 msgid ":doc:`Simple APIs to construct search space easily `" msgstr "" #: ../../source/nas/overview.rst:33 msgid ":doc:`SOTA NAS algorithms to explore search space `" msgstr "" #: ../../source/nas/overview.rst:34 msgid "" ":doc:`Experiment backend support to scale up experiments on large-scale " "AI platforms `" msgstr "" #: ../../source/nas/overview.rst:37 msgid "Why NAS with NNI" msgstr "" #: ../../source/nas/overview.rst:39 msgid "" "We list out the three perspectives where NAS can be particularly " "challegning without NNI. NNI provides solutions to relieve users' " "engineering effort when they want to try NAS techniques in their own " "scenario." msgstr "" #: ../../source/nas/overview.rst:42 msgid "Search Space Design" msgstr "" #: ../../source/nas/overview.rst:44 msgid "" "The search space defines which architectures can be represented in " "principle. Incorporating prior knowledge about typical properties of " "architectures well-suited for a task can reduce the size of the search " "space and simplify the search. However, this also introduces a human " "bias, which may prevent finding novel architectural building blocks that " "go beyond the current human knowledge. Search space design can be very " "challenging for beginners, who might not possess the experience to " "balance the richness and simplicity." msgstr "" #: ../../source/nas/overview.rst:46 msgid "" "In NNI, we provide a wide range of APIs to build the search space. There " "are :doc:`high-level APIs `, that enables the " "possibility to incorporate human knowledge about what makes a good " "architecture or search space. There are also :doc:`low-level APIs " "`, that is a list of primitives to construct a network from " "operation to operation." msgstr "" #: ../../source/nas/overview.rst:49 msgid "Exploration strategy" msgstr "" #: ../../source/nas/overview.rst:51 msgid "" "The exploration strategy details how to explore the search space (which " "is often exponentially large). It encompasses the classical exploration-" "exploitation trade-off since, on the one hand, it is desirable to find " "well-performing architectures quickly, while on the other hand, premature" " convergence to a region of suboptimal architectures should be avoided. " "The \"best\" exploration strategy for a particular scenario is usually " "found via trial-and-error. As many state-of-the-art strategies are " "implemented with their own code-base, it becomes very troublesome to " "switch from one to another." msgstr "" #: ../../source/nas/overview.rst:53 msgid "" "In NNI, we have also provided :doc:`a list of strategies " "`. Some of them are powerful yet time consuming, " "while others might be suboptimal but really efficient. Given that all " "strategies are implemented with a unified interface, users can always " "find one that matches their need." msgstr "" #: ../../source/nas/overview.rst:56 msgid "Performance estimation" msgstr "" #: ../../source/nas/overview.rst:58 msgid "" "The objective of NAS is typically to find architectures that achieve high" " predictive performance on unseen data. Performance estimation refers to " "the process of estimating this performance. 
#: ../../source/nas/overview.rst:60
msgid ""
"In NNI, this process is standardized with the :doc:`evaluator `, which is "
"responsible for estimating a model's performance. NNI has quite a few "
"built-in evaluators, ranging from the simplest option, e.g., performing "
"standard training and validation of the architecture on a dataset, to "
"complex configurations and implementations. Evaluators are run in "
"*trials*, and trials can be spawned onto distributed platforms with our "
"powerful :doc:`training service `."
msgstr ""

#: ../../source/nas/overview.rst:63
msgid "Tutorials"
msgstr ""

#: ../../source/nas/overview.rst:65
msgid ""
"To start using the NNI NAS framework, we recommend at least going through "
"the following tutorials:"
msgstr ""

#: ../../source/nas/overview.rst:67
msgid ":doc:`Quickstart `"
msgstr ""

#: ../../source/nas/overview.rst:68
msgid ":doc:`construct_space`"
msgstr ""

#: ../../source/nas/overview.rst:69
msgid ":doc:`exploration_strategy`"
msgstr ""

#: ../../source/nas/overview.rst:70
msgid ":doc:`evaluator`"
msgstr ""

#: ../../source/nas/overview.rst:73
msgid "Resources"
msgstr ""

#: ../../source/nas/overview.rst:75
msgid ""
"The following articles will help with a better understanding of the "
"current state of the art in NAS:"
msgstr ""

#: ../../source/nas/overview.rst:77
msgid ""
"`Neural Architecture Search: A Survey "
"<https://arxiv.org/abs/1808.05377>`__"
msgstr ""

#: ../../source/nas/overview.rst:78
msgid ""
"`A Comprehensive Survey of Neural Architecture Search: Challenges and "
"Solutions <https://arxiv.org/abs/2006.02903>`__"
msgstr ""
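# A minimal sketch of the evaluator-in-trials workflow described in the
# entry for overview.rst:60, assuming NNI 2.x's Retiarii API; `model_space`,
# `search_strategy`, and the training code inside `evaluate_model` are
# placeholders to be filled in by the user:
#
#     import nni
#     from nni.retiarii.evaluator import FunctionalEvaluator
#     from nni.retiarii.experiment.pytorch import RetiariiExperiment, RetiariiExeConfig
#
#     def evaluate_model(model_cls):
#         model = model_cls()             # instantiate the sampled architecture
#         accuracy = 0.0                  # ... train and validate `model` here ...
#         nni.report_final_result(accuracy)
#
#     evaluator = FunctionalEvaluator(evaluate_model)
#     config = RetiariiExeConfig('local') # trials can also target remote platforms
#     exp = RetiariiExperiment(model_space, evaluator, [], search_strategy)
#     exp.run(config, 8081)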