Unverified commit 3617b7f3 authored by Scarlett Li, committed by GitHub

correct assessor typo (#463)

correct assessor typos in several files.
parent ecc621fc
@@ -104,7 +104,7 @@ You can use these commands to get more information about the experiment
* [Run an experiment on multiple machines?](docs/tutorial_2_RemoteMachineMode.md)
* [Run an experiment on OpenPAI?](docs/PAIMode.md)
* [Run an experiment on Kubeflow?](docs/KubeflowMode.md)
-* [Try different tuners and assessors](docs/tutorial_3_tryTunersAndAccessors.md)
+* [Try different tuners and assessors](docs/tutorial_3_tryTunersAndAssessors.md)
* [Implement a customized tuner](docs/howto_2_CustomizedTuner.md)
* [Implement a customized assessor](examples/assessors/README.md)
* [Use Genetic Algorithm to find good model architectures for Reading Comprehension task](examples/trials/ga_squad/README.md)
......
@@ -86,10 +86,10 @@ Initial release of Neural Network Intelligence (NNI).
* Installation and Deployment
* Support pip install and source codes install
* Support training services on local mode (including Multi-GPU mode) as well as multi-machine mode
-* Tuners, Accessors and Trial
+* Tuners, Assessors and Trial
* Support AutoML algorithms including: hyperopt_tpe, hyperopt_annealing, hyperopt_random, and evolution_tuner
* Support assessor (early stop) algorithms including: medianstop algorithm
-* Provide Python API for user defined tuners and accessors
+* Provide Python API for user defined tuners and assessors
* Provide Python API for users to wrap trial code as NNI deployable code
* Experiments
* Provide a command line toolkit 'nnictl' for experiments management
......
-# Tutorial - Try different Tuners and Accessors
+# Tutorial - Try different Tuners and Assessors
-NNI provides an easy to adopt approach to set up parameter tuning algorithms as well as early stop policies, we call them **Tuners** and **Accessors**.
+NNI provides an easy to adopt approach to set up parameter tuning algorithms as well as early stop policies, we call them **Tuners** and **Assessors**.
**Tuner** specifies the algorithm you use to generate hyperparameter sets for each trial. In NNI, we support two approaches to set the tuner.
1. Directly use tuner provided by nni sdk
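Beyond the builtin tuners, the release notes above call out a Python API for user-defined tuners. As a minimal sketch of what such a customized tuner can look like (the class name and sampling logic here are illustrative toys, and the `Tuner` base-class method names are assumed from the NNI SDK of this release, not quoted from this tutorial):

```python
import random

from nni.tuner import Tuner


class RandomChoiceTuner(Tuner):
    """Toy tuner: samples each hyperparameter independently at random."""

    def __init__(self):
        self.search_space = {}

    def update_search_space(self, search_space):
        # Receives the search space defined for the experiment.
        self.search_space = search_space

    def generate_parameters(self, parameter_id):
        # Called once per trial to produce a hyperparameter set.
        params = {}
        for name, spec in self.search_space.items():
            if spec['_type'] == 'choice':
                params[name] = random.choice(spec['_value'])
            elif spec['_type'] == 'uniform':
                low, high = spec['_value']
                params[name] = random.uniform(low, high)
        return params

    def receive_trial_result(self, parameter_id, parameters, value):
        # Final metric reported by the trial; a smarter tuner would learn from it.
        pass
```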
@@ -18,17 +18,17 @@ NNI provides an easy to adopt approach to set up parameter tuning algorithms as
**Assessor** specifies the algorithm you use to apply early stop policy. In NNI, there are two approaches to set the assessor.
-1. Directly use accessor provided by nni sdk
+1. Directly use assessor provided by nni sdk
required fields: builtinAssessorName and classArgs.
-2. Customize your own tuner file
+2. Customize your own assessor file
required fields: codeDirectory, classFileName, className and classArgs.
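To make the second approach concrete: the assessor file named by classFileName defines a class that the experiment loads via className, passing classArgs to its constructor. A minimal sketch of such a class, assuming the `Assessor`/`AssessResult` API exposed by the NNI SDK (the class name and the stagnation rule below are illustrative toys, not the builtin medianstop algorithm):

```python
from nni.assessor import Assessor, AssessResult


class StagnationAssessor(Assessor):
    """Toy early-stop policy: stop a trial whose intermediate results
    have not improved over the last `patience` reports."""

    def __init__(self, patience=3):
        self.patience = patience

    def assess_trial(self, trial_job_id, trial_history):
        # trial_history is the list of intermediate results the trial has
        # reported so far; return Good to continue, Bad to stop it early.
        if len(trial_history) <= self.patience:
            return AssessResult.Good
        if max(trial_history[-self.patience:]) <= max(trial_history[:-self.patience]):
            return AssessResult.Bad
        return AssessResult.Good
```

With the required fields quoted above, this class would be wired in with className set to StagnationAssessor and classArgs carrying the constructor arguments (for example, patience: 3); these values are hypothetical.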
### **Learn More about assessor**
* For the detailed definition and usage of the required fields, please refer to [Config an experiment](ExperimentConfig.md)
-* Find more about the detailed instruction about [enable accessor](EnableAssessor.md)
+* Find more about the detailed instruction about [enable assessor](EnableAssessor.md)
* [How to implement your own assessor](../examples/assessors/README.md)
## **Learn More**
......