# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2022, Microsoft
# This file is distributed under the same license as the NNI package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2022.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: NNI \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2022-04-13 03:14+0000\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.9.1\n"

#: ../../source/tutorials/hello_nas.rst:13
msgid ""
"Click :ref:`here <sphx_glr_download_tutorials_hello_nas.py>` to download "
"the full example code"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:22
msgid "Hello, NAS!"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:24
msgid ""
"This is the 101 tutorial of Neural Architecture Search (NAS) on NNI. In "
"this tutorial, we will search for a neural architecture on the MNIST "
"dataset with the help of the NAS framework of NNI, i.e., *Retiarii*. We "
"use multi-trial NAS as an example to show how to construct and explore a "
"model space."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:28
msgid ""
"There are mainly three crucial components for a neural architecture "
"search task, namely,"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:30
msgid "A model search space that defines a set of models to explore."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:31
msgid "A proper strategy as the method to explore this model space."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:32
msgid ""
"A model evaluator that reports the performance of every model in the "
"space."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:34
msgid ""
"Currently, PyTorch is the only framework supported by Retiarii, and we "
"have only tested it with **PyTorch 1.7 to 1.10**. This tutorial assumes a "
"PyTorch context, but it should also apply to other frameworks, which is "
"in our future plan."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:38
msgid "Define your Model Space"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:40
msgid ""
"A model space is defined by users to express a set of models that they "
"want to explore, which contains potentially good-performing models. In "
"this framework, a model space is defined in two parts: a base model and "
"possible mutations on the base model."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:46
msgid "Define Base Model"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:48
msgid ""
"Defining a base model is almost the same as defining a PyTorch (or "
"TensorFlow) model. Usually, you only need to replace the code ``import "
"torch.nn as nn`` with ``import nni.retiarii.nn.pytorch as nn`` to use our"
" wrapped PyTorch modules."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:52
msgid "Below is a very simple example of defining a base model."
msgstr ""
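
# A minimal sketch of such a base model, assuming MNIST-shaped input; the class
# name ``Net`` and the layer sizes are illustrative (the full example lives in
# hello_nas.py):
#
#     import torch
#     import torch.nn.functional as F
#     import nni.retiarii.nn.pytorch as nn
#     from nni.retiarii import model_wrapper
#
#     @model_wrapper
#     class Net(nn.Module):
#         def __init__(self):
#             super().__init__()
#             self.conv = nn.Conv2d(1, 32, 3, 1)     # 28x28 -> 26x26
#             self.fc = nn.Linear(32 * 13 * 13, 10)  # after 2x2 max-pooling
#
#         def forward(self, x):
#             x = F.max_pool2d(F.relu(self.conv(x)), 2)
#             x = torch.flatten(x, 1)
#             return F.log_softmax(self.fc(x), dim=1)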

#: ../../source/tutorials/hello_nas.rst:93
msgid ""
"Always keep in mind that you should use ``import nni.retiarii.nn.pytorch "
"as nn`` and :meth:`nni.retiarii.model_wrapper`. Many mistakes are a "
"result of forgetting one of those. Also, please use ``torch.nn`` for "
"submodules such as ``nn.init``, e.g., ``torch.nn.init`` instead of "
"``nn.init``."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:98
msgid "Define Model Mutations"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:100
msgid ""
"A base model is only one concrete model, not a model space. We provide "
":doc:`API and Primitives </nas/construct_space>` for users to express how "
"the base model can be mutated, that is, to build a model space which "
"includes many models."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:103
msgid "Based on the above base model, we can define a model space as below."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:134
msgid "This results in the following code:"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:189
#: ../../source/tutorials/hello_nas.rst:259
#: ../../source/tutorials/hello_nas.rst:551
#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:247
#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:284
#: ../../source/tutorials/pruning_quick_start_mnist.rst:65
#: ../../source/tutorials/pruning_quick_start_mnist.rst:107
#: ../../source/tutorials/pruning_quick_start_mnist.rst:172
#: ../../source/tutorials/pruning_quick_start_mnist.rst:218
#: ../../source/tutorials/pruning_quick_start_mnist.rst:255
#: ../../source/tutorials/pruning_quick_start_mnist.rst:283
msgid "Out:"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:210
msgid ""
"This example uses two mutation APIs, :class:`nn.LayerChoice "
"<nni.retiarii.nn.pytorch.LayerChoice>` and :class:`nn.ValueChoice "
"<nni.retiarii.nn.pytorch.ValueChoice>`. :class:`nn.LayerChoice "
"<nni.retiarii.nn.pytorch.LayerChoice>` takes a list of candidate modules "
"(two in this example); one of them will be chosen for each sampled model. "
"It can be used like a normal PyTorch module. :class:`nn.ValueChoice "
"<nni.retiarii.nn.pytorch.ValueChoice>` takes a list of candidate values; "
"one of them will be chosen to take effect for each sampled model."
msgstr ""
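
# A minimal sketch of the two APIs; the channel sizes and candidate values here
# are illustrative, not the ones used in hello_nas.py:
#
#     import nni.retiarii.nn.pytorch as nn
#     from nni.retiarii import model_wrapper
#
#     @model_wrapper
#     class SpaceExample(nn.Module):
#         def __init__(self):
#             super().__init__()
#             # one of the two candidate layers is chosen for each sampled model
#             self.conv = nn.LayerChoice([
#                 nn.Conv2d(32, 64, 3, padding=1),
#                 nn.Conv2d(32, 64, 5, padding=2),
#             ])
#             # one of the candidate values is chosen for each sampled model
#             self.dropout = nn.Dropout(nn.ValueChoice([0.25, 0.5, 0.75]))
#
#         def forward(self, x):
#             return self.dropout(self.conv(x))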

#: ../../source/tutorials/hello_nas.rst:219
msgid ""
"More detailed API description and usage can be found :doc:`here "
"</nas/construct_space>`."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:223
msgid ""
"We are actively enriching the mutation APIs to facilitate easy "
"construction of model spaces. If the currently supported mutation APIs "
"cannot express your model space, please refer to :doc:`this doc "
"</nas/mutator>` for customizing mutators."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:228
msgid "Explore the Defined Model Space"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:230
msgid ""
"There are basically two exploration approaches: (1) search by evaluating "
"each sampled model independently, which is the search approach in :ref"
":`multi-trial NAS <multi-trial-nas>`, and (2) one-shot weight-sharing-"
"based search, which is used in one-shot NAS. We demonstrate the first "
"approach in this tutorial. Users can refer to :ref:`here <one-shot-nas>` "
"for the second approach."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:235
msgid ""
"First, users need to pick a proper exploration strategy to explore the "
"defined model space. Second, users need to pick or customize a model "
"evaluator to evaluate the performance of each explored model."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:239
msgid "Pick an exploration strategy"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:241
msgid ""
"Retiarii supports many :doc:`exploration strategies "
"</nas/exploration_strategy>`."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:243
msgid "Simply choose (i.e., instantiate) an exploration strategy as below."
msgstr ""
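
# For instance (``Random`` is only one option; any strategy from the doc linked
# above can be instantiated the same way):
#
#     import nni.retiarii.strategy as strategy
#
#     # dedup=True avoids sampling the same architecture twice
#     search_strategy = strategy.Random(dedup=True)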

#: ../../source/tutorials/hello_nas.rst:273
msgid "Pick or customize a model evaluator"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:275
msgid ""
"In the exploration process, the exploration strategy repeatedly generates"
" new models. A model evaluator is for training and validating each "
"generated model to obtain the model's performance. The performance is "
"sent to the exploration strategy for the strategy to generate better "
"models."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:279
msgid ""
"Retiarii has provided :doc:`built-in model evaluators </nas/evaluator>`, "
"but to start with, it is recommended to use :class:`FunctionalEvaluator "
"<nni.retiarii.evaluator.FunctionalEvaluator>`, that is, to wrap your own "
"training and evaluation code with one single function. This function "
"should receive one single model class and use "
":func:`nni.report_final_result` to report the final score of this model."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:284
msgid ""
"The example here creates a simple evaluator that runs on the MNIST "
"dataset, trains for 2 epochs, and reports its validation accuracy."
msgstr ""
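
# A skeleton of such an evaluator; ``train_epoch`` and ``test_epoch`` stand for
# the user-defined training/validation helpers from the tutorial:
#
#     import nni
#     from nni.retiarii.evaluator import FunctionalEvaluator
#
#     def evaluate_model(model_cls):
#         model = model_cls()               # instantiate the sampled architecture
#         accuracy = 0.0
#         for epoch in range(2):
#             train_epoch(model)            # user-defined training helper
#             accuracy = test_epoch(model)  # user-defined validation helper
#             nni.report_intermediate_result(accuracy)
#         nni.report_final_result(accuracy)
#
#     evaluator = FunctionalEvaluator(evaluate_model)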

#: ../../source/tutorials/hello_nas.rst:367
msgid "Create the evaluator"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:386
msgid ""
"The ``train_epoch`` and ``test_epoch`` here can be any customized "
"function, where users can write their own training recipe."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:389
msgid ""
"It is recommended that the ``evaluate_model`` here accepts no additional "
"arguments other than ``model_cls``. However, in the :doc:`advanced "
"tutorial </nas/evaluator>`, we will show how to use additional arguments "
"in case you actually need those. In the future, we will support mutation "
"on the arguments of evaluators, which is commonly called \"hyper-"
"parameter tuning\"."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:394
msgid "Launch an Experiment"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:396
msgid ""
"After all the above are prepared, it is time to start an experiment to do"
" the model search. An example is shown below."
msgstr ""
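
# A sketch of the launch code, assuming ``model_space``, ``evaluator`` and
# ``search_strategy`` are the objects created in the previous steps:
#
#     from nni.retiarii.experiment.pytorch import RetiariiExperiment, RetiariiExeConfig
#
#     exp = RetiariiExperiment(model_space, evaluator, [], search_strategy)
#     exp_config = RetiariiExeConfig('local')
#     exp_config.experiment_name = 'mnist_search'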

#: ../../source/tutorials/hello_nas.rst:417
msgid ""
"The following configurations are useful to control how many trials to run "
"at most and how many to run at the same time."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:436
msgid ""
"Remember to set the following config if you want to use a GPU. "
"``use_active_gpu`` should be set to true if you wish to use an occupied "
"GPU (possibly one running a GUI)."
msgstr ""
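
# Continuing the ``exp_config`` object from the sketch above; the numbers are
# illustrative, pick values that suit your machine:
#
#     exp_config.max_trial_number = 4    # evaluate at most 4 architectures
#     exp_config.trial_concurrency = 2   # run 2 trials at the same time
#     exp_config.trial_gpu_number = 1
#     exp_config.training_service.use_active_gpu = True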

#: ../../source/tutorials/hello_nas.rst:456
msgid ""
"Launch the experiment. The experiment should take several minutes to "
"finish on a workstation with 2 GPUs."
msgstr ""
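
# Continuing the sketch above, the experiment is launched with its config and a
# web portal port:
#
#     exp.run(exp_config, 8081)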

#: ../../source/tutorials/hello_nas.rst:474
msgid ""
"Users can also run a Retiarii experiment with :doc:`different training "
"services </experiment/training_service/overview>` besides the ``local`` "
"training service."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:478
msgid "Visualize the Experiment"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:480
msgid ""
"Users can visualize their experiment in the same way as visualizing a "
"normal hyper-parameter tuning experiment. For example, open "
"``localhost:8081`` in your browser; 8081 is the port that you set in "
"``exp.run``. Please refer to :doc:`here "
"</experiment/web_portal/web_portal>` for details."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:484
msgid ""
"We support visualizing models with 3rd-party visualization engines (like "
"`Netron <https://netron.app/>`__). This can be used by clicking "
"``Visualization`` in the detail panel of each trial. Note that the "
"current visualization is based on `onnx <https://onnx.ai/>`__, thus "
"visualization is not feasible if the model cannot be exported into onnx."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:489
msgid ""
"Built-in evaluators (e.g., Classification) will automatically export the "
"model into a file. For your own evaluator, you need to save your model "
"into ``$NNI_OUTPUT_DIR/model.onnx`` to make this work. For instance,"
msgstr ""
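
# A sketch of the export inside your own ``evaluate_model``, assuming an
# MNIST-shaped dummy input of (1, 1, 28, 28):
#
#     import os
#     import torch
#
#     onnx_path = os.path.join(os.environ.get('NNI_OUTPUT_DIR', '.'), 'model.onnx')
#     torch.onnx.export(model, torch.randn(1, 1, 28, 28), onnx_path)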

#: ../../source/tutorials/hello_nas.rst:520
msgid "Relaunch the experiment, and a button will be shown on the web portal."
msgstr ""

#: ../../source/tutorials/hello_nas.rst:525
msgid "Export Top Models"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:527
msgid ""
"After the exploration is done, users can export the top models using "
"``export_top_models``."
msgstr ""
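
# For example, continuing the ``exp`` object from the sketches above:
#
#     for model_dict in exp.export_top_models(formatter='dict'):
#         print(model_dict)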

#: ../../source/tutorials/hello_nas.rst:563
msgid "**Total running time of the script:** ( 2 minutes  15.810 seconds)"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:578
msgid ":download:`Download Python source code: hello_nas.py <hello_nas.py>`"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:584
msgid ":download:`Download Jupyter notebook: hello_nas.ipynb <hello_nas.ipynb>`"
msgstr ""

#: ../../source/tutorials/hello_nas.rst:591
#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:338
#: ../../source/tutorials/pruning_quick_start_mnist.rst:357
msgid "`Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:14
msgid ""
"Click :ref:`here "
"<sphx_glr_download_tutorials_hpo_quickstart_pytorch_main.py>` to download"
" the full example code"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:23
msgid "NNI HPO Quickstart with PyTorch"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:24
msgid ""
"This tutorial optimizes the model in the `official PyTorch quickstart`_ "
"with auto-tuning."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:26
msgid ""
"There is also a :doc:`TensorFlow "
"version<../hpo_quickstart_tensorflow/main>` if you prefer it."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:28
msgid "The tutorial consists of 4 steps:"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:30
msgid "Modify the model for auto-tuning."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:31
msgid "Define hyperparameters' search space."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:32
msgid "Configure the experiment."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:33
msgid "Run the experiment."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:40
msgid "Step 1: Prepare the model"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:41
msgid "In the first step, we need to prepare the model to be tuned."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:43
msgid ""
"The model should be put in a separate script. It will be evaluated many "
"times concurrently, and possibly will be trained on distributed "
"platforms."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:47
msgid "In this tutorial, the model is defined in :doc:`model.py <model>`."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:49
msgid "In short, it is a PyTorch model with 3 additional API calls:"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:51
msgid ""
"Use :func:`nni.get_next_parameter` to fetch the hyperparameters to be "
"evaluated."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:52
msgid ""
"Use :func:`nni.report_intermediate_result` to report per-epoch accuracy "
"metrics."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:53
msgid "Use :func:`nni.report_final_result` to report final accuracy."
msgstr ""
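
# A skeleton of how the three calls typically fit together in the trial script;
# the default values and the epoch count are illustrative:
#
#     import nni
#
#     params = {'features': 512, 'lr': 0.001, 'momentum': 0.5}  # defaults
#     params.update(nni.get_next_parameter())  # overridden by the tuner's sample
#
#     accuracy = 0.0
#     for epoch in range(10):
#         # ... train one epoch, then compute validation accuracy ...
#         nni.report_intermediate_result(accuracy)
#     nni.report_final_result(accuracy)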

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:55
msgid "Please understand the model code before continuing to the next step."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:60
msgid "Step 2: Define search space"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:61
msgid ""
"In the model code, we have prepared 3 hyperparameters to be tuned: "
"*features*, *lr*, and *momentum*."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:64
msgid ""
"Here we need to define their *search space* so the tuning algorithm can "
"sample them in the desired ranges."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:66
msgid "Assume we have the following prior knowledge about these hyperparameters:"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:68
msgid "*features* should be one of 128, 256, 512, 1024."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:69
msgid ""
"*lr* should be a float between 0.0001 and 0.1, and it follows an "
"exponential distribution."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:70
msgid "*momentum* should be a float between 0 and 1."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:72
msgid ""
"In NNI, the space of *features* is called ``choice``; the space of *lr* "
"is called ``loguniform``; and the space of *momentum* is called "
"``uniform``. You may have noticed that these names are derived from "
"``numpy.random``."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:77
msgid ""
"For the full specification of search spaces, check :doc:`the reference "
"</hpo/search_space>`."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:79
msgid "Now we can define the search space as follows:"
msgstr ""
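
# A search space consistent with the prior knowledge above:
#
#     search_space = {
#         'features': {'_type': 'choice', '_value': [128, 256, 512, 1024]},
#         'lr': {'_type': 'loguniform', '_value': [0.0001, 0.1]},
#         'momentum': {'_type': 'uniform', '_value': [0, 1]},
#     }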

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:102
msgid "Step 3: Configure the experiment"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:103
msgid ""
"NNI uses an *experiment* to manage the HPO process. The *experiment "
"config* defines how to train the models and how to explore the search "
"space."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:106
msgid ""
"In this tutorial we use a *local* mode experiment, which means the models "
"will be trained on the local machine, without using any special training "
"platform."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:125
msgid "Now we start to configure the experiment."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:128
msgid "Configure trial code"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:129
msgid ""
"In NNI, the evaluation of each hyperparameter set is called a *trial*. So "
"the model script is called the *trial code*."
msgstr ""
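
# A sketch of this step, assuming ``model.py`` sits in the current working
# directory:
#
#     from nni.experiment import Experiment
#
#     experiment = Experiment('local')
#     experiment.config.trial_command = 'python model.py'
#     experiment.config.trial_code_directory = '.'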

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:147
msgid ""
"When ``trial_code_directory`` is a relative path, it is relative to the "
"current working directory. To run ``main.py`` in a different path, you "
"can set the trial code directory to ``Path(__file__).parent``. "
"(`__file__ "
"<https://docs.python.org/3.10/reference/datamodel.html#index-43>`__ is "
"only available in standard Python, not in Jupyter Notebook.)"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:154
msgid ""
"If you are using a Linux system without Conda, you may need to change "
"``\"python model.py\"`` to ``\"python3 model.py\"``."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:160
msgid "Configure search space"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:178
msgid "Configure tuning algorithm"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:179
msgid "Here we use the :doc:`TPE tuner </hpo/tuners>`."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:198
msgid "Configure how many trials to run"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:199
msgid ""
"Here we evaluate 10 sets of hyperparameters in total, evaluating 2 sets "
"at a time."
msgstr ""
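
# Continuing the ``experiment`` object and the ``search_space`` dict from the
# sketches above:
#
#     experiment.config.search_space = search_space
#     experiment.config.tuner.name = 'TPE'
#     experiment.config.tuner.class_args['optimize_mode'] = 'maximize'
#     experiment.config.max_trial_number = 10   # evaluate 10 sets in total
#     experiment.config.trial_concurrency = 2   # evaluate 2 sets at a time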

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:218
msgid ""
"``max_trial_number`` is set to 10 here for a fast example. In real-world "
"use, it should be set to a larger number. With the default config, the "
"TPE tuner requires 20 trials to warm up."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:222
msgid "You may also set ``max_experiment_duration = '1h'`` to limit running time."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:224
msgid ""
"If neither ``max_trial_number`` nor ``max_experiment_duration`` is set, "
"the experiment will run forever until you press Ctrl-C."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:230
msgid "Step 4: Run the experiment"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:231
msgid ""
"Now the experiment is ready. Choose a port and launch it. (Here we use "
"port 8080.)"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:233
msgid ""
"You can use the web portal to view experiment status: "
"http://localhost:8080."
msgstr ""
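
# Continuing the sketch above:
#
#     experiment.run(8080)   # blocks until all trials finish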

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:263
msgid "After the experiment is done"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:264
msgid "Everything is done and it is safe to exit now. The following are optional."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:266
msgid ""
"If you are using standard Python instead of Jupyter Notebook, you can add"
" ``input()`` or ``signal.pause()`` to prevent Python from exiting, "
"allowing you to view the web portal after the experiment is done."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:296
msgid ""
":meth:`nni.experiment.Experiment.stop` is automatically invoked when "
"Python exits, so it can be omitted in your code."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:299
msgid ""
"After the experiment is stopped, you can run "
":meth:`nni.experiment.Experiment.view` to restart the web portal."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:303
msgid ""
"This example uses the :doc:`Python API </reference/experiment>` to create "
"the experiment."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:305
msgid ""
"You can also create and manage experiments with the :doc:`command line "
"tool </reference/nnictl>`."
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:310
msgid "**Total running time of the script:** ( 1 minutes  24.393 seconds)"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:325
msgid ":download:`Download Python source code: main.py <main.py>`"
msgstr ""

#: ../../source/tutorials/hpo_quickstart_pytorch/main.rst:331
msgid ":download:`Download Jupyter notebook: main.ipynb <main.ipynb>`"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:13
msgid ""
"Click :ref:`here "
"<sphx_glr_download_tutorials_pruning_quick_start_mnist.py>` to download "
"the full example code"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:22
msgid "Pruning Quickstart"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:24
msgid ""
"Model pruning is a technique to reduce model size and computation cost by "
"reducing the size of model weights or intermediate states. There are "
"three common practices for pruning a DNN model:"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:27
msgid "Pre-training a model -> Pruning the model -> Fine-tuning the pruned model"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:28
msgid ""
"Pruning a model during training (i.e., pruning-aware training) -> Fine-"
"tuning the pruned model"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:29
msgid "Pruning a model -> Training the pruned model from scratch"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:31
msgid ""
"NNI supports all of the above pruning practices by working on the key "
"pruning stage. Follow this tutorial for a quick look at how to use NNI to "
"prune a model in a common practice."
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:37
msgid "Preparation"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:39
msgid ""
"In this tutorial, we use a simple model pre-trained on the MNIST dataset. "
"If you are familiar with defining a model and training it in PyTorch, you "
"can skip directly to `Pruning Model`_."
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:121
msgid "Pruning Model"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:123
msgid ""
"We use L1NormPruner to prune the model and generate the masks. Usually, a "
"pruner requires the original model and a ``config_list`` as its inputs. "
"For details about how to write a ``config_list``, please refer to the "
":doc:`compression config specification "
"<../compression/compression_config_list>`."
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:127
msgid ""
"The following `config_list` means that all layers whose type is `Linear` "
"or `Conv2d` will be pruned, except the layer named `fc3`, because `fc3` "
"is marked as `exclude`. The final sparsity ratio for each pruned layer is "
"50%, while the layer named `fc3` will not be pruned."
msgstr ""
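
# One way to write such a ``config_list`` (the key names follow the compression
# config specification linked above):
#
#     config_list = [{
#         'sparsity_per_layer': 0.5,
#         'op_types': ['Linear', 'Conv2d'],
#     }, {
#         'exclude': True,
#         'op_names': ['fc3'],
#     }]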

#: ../../source/tutorials/pruning_quick_start_mnist.rst:153
msgid "Pruners usually require `model` and `config_list` as input arguments."
msgstr ""
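
# For example, assuming the pre-trained ``model`` from the preparation step and
# the ``config_list`` above:
#
#     from nni.compression.pytorch.pruning import L1NormPruner
#
#     pruner = L1NormPruner(model, config_list)
#     _, masks = pruner.compress()   # masks maps layer names to weight masks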

#: ../../source/tutorials/pruning_quick_start_mnist.rst:232
msgid ""
"Speed up the original model with the masks; note that `ModelSpeedup` "
"requires an unwrapped model. The model becomes smaller after speedup and "
"reaches a higher sparsity ratio because `ModelSpeedup` will propagate the "
"masks across layers."
msgstr ""
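
# A sketch, assuming ``model``, ``masks`` and ``pruner`` from the previous step
# and an MNIST-shaped dummy input:
#
#     import torch
#     from nni.compression.pytorch.speedup import ModelSpeedup
#
#     pruner._unwrap_model()   # ModelSpeedup needs the unwrapped model
#     ModelSpeedup(model, torch.rand(3, 1, 28, 28), masks).speedup_model()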

#: ../../source/tutorials/pruning_quick_start_mnist.rst:269
msgid "The model will become really smaller after speedup."
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:307
msgid "Fine-tuning Compacted Model"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:308
msgid ""
"Note that if the model has been sped up, you need to re-initialize a new "
"optimizer for fine-tuning, because speedup replaces the masked big layers "
"with dense small ones."
msgstr ""
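
# For instance, a fresh optimizer bound to the sped-up model's parameters (the
# hyperparameters here are illustrative):
#
#     import torch
#
#     optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
#     # ... run the usual training loop for a few epochs to fine-tune ...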

#: ../../source/tutorials/pruning_quick_start_mnist.rst:329
msgid "**Total running time of the script:** ( 0 minutes  58.337 seconds)"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:344
msgid ""
":download:`Download Python source code: pruning_quick_start_mnist.py "
"<pruning_quick_start_mnist.py>`"
msgstr ""

#: ../../source/tutorials/pruning_quick_start_mnist.rst:350
msgid ""
":download:`Download Jupyter notebook: pruning_quick_start_mnist.ipynb "
"<pruning_quick_start_mnist.ipynb>`"
msgstr ""