* Wait for the message `INFO: Successfully started experiment!` in the command line. It indicates that your experiment has started successfully. You can explore the experiment using the `Web UI url` printed in the same output.
A CNN MNIST classifier is to deep learning what `hello world` is to programming languages, so we use MNIST as the example to introduce the different features of NNI. The examples are listed below:
- [MNIST with NNI API (TensorFlow v1.x)](#mnist-tfv1)
- [MNIST with NNI API (TensorFlow v2.x)](#mnist-tfv2)
- [MNIST with NNI annotation](#mnist-annotation)
- [MNIST in keras](#mnist-keras)
- [MNIST -- tuning with batch tuner](#mnist-batch)
...
- [distributed MNIST (tensorflow) using kubeflow](#mnist-kubeflow-tf)
- [distributed MNIST (pytorch) using kubeflow](#mnist-kubeflow-pytorch)
<aname="mnist"></a>
**MNIST with NNI API**
<aname="mnist-tfv1"></a>
**MNIST with NNI API (TensorFlow v1.x)**
This is a simple network with two convolutional layers, two pooling layers, and a fully connected layer. We tune hyperparameters such as the dropout rate, convolution size, and hidden size. It can be tuned with most NNI built-in tuners, such as TPE, SMAC, and Random; see the API sketch below. We also provide an example YAML file which enables an assessor.
`code directory: examples/trials/mnist-tfv1/`
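For orientation, here is a minimal sketch of the trial-side NNI API that this example is built around; `train_one_epoch` is a stand-in for the real MNIST training loop, and the hyperparameter names are illustrative rather than the example's exact ones:

```python
import random

import nni


def train_one_epoch(params):
    # Stand-in for real MNIST training; returns a fake accuracy so the
    # sketch runs end to end. Replace with actual training code.
    return random.random()


def main():
    # The tuner supplies the next hyperparameter set to try, e.g.
    # {"dropout_rate": 0.5, "conv_size": 5, "hidden_size": 1024}.
    params = nni.get_next_parameter()
    accuracy = 0.0
    for _ in range(10):
        accuracy = train_one_epoch(params)
        nni.report_intermediate_result(accuracy)  # read by the assessor
    nni.report_final_result(accuracy)             # read by the tuner


if __name__ == '__main__':
    main()
```

The ranges to search over (dropout rate, convolution size, hidden size, etc.) live in a separate search space JSON file that the experiment's YAML configuration points to.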
<aname="mnist-tfv2"></a>
**MNIST with NNI API (TensorFlow v2.x)**
The same network as the example above, but written with the TensorFlow v2.x Keras API.
`code directory: examples/trials/mnist-tfv2/`
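As a rough sketch of how this looks (not the example's exact code), the tuned parameters can be fed directly into a `tf.keras` model; the parameter names below are illustrative:

```python
import nni
import tensorflow as tf


def build_model(params):
    # Two convolution + pooling blocks and one fully connected layer,
    # sized by the tuned hyperparameters.
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, params['conv_size'], activation='relu',
                               input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPool2D(),
        tf.keras.layers.Conv2D(64, params['conv_size'], activation='relu'),
        tf.keras.layers.MaxPool2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(params['hidden_size'], activation='relu'),
        tf.keras.layers.Dropout(params['dropout_rate']),
        tf.keras.layers.Dense(10, activation='softmax'),
    ])


params = nni.get_next_parameter()
model = build_model(params)
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
# model.fit(...) then trains the model; accuracies are reported back to
# NNI with nni.report_intermediate_result / nni.report_final_result.
```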
<aname="mnist-annotation"></a>
**MNIST with NNI annotation**
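Instead of calling the NNI API explicitly, annotation lets you mark up ordinary training code with special comment strings, which NNI interprets when the experiment runs. A minimal sketch, with illustrative variable names and values:

```python
'''@nni.variable(nni.choice(0.5, 0.9), name=dropout_rate)'''
dropout_rate = 0.5

'''@nni.variable(nni.choice(50, 250, 500), name=batch_size)'''
batch_size = 128

# ... ordinary training code using dropout_rate and batch_size ...
test_acc = 0.0  # stand-in for the real evaluation result

'''@nni.report_intermediate_result(test_acc)'''
'''@nni.report_final_result(test_acc)'''
```

Because the annotations are just string literals, the script still runs unchanged as plain Python when NNI is not in use.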
...
<a name="mnist-kubeflow-pytorch"></a>
**distributed MNIST (pytorch) using kubeflow**
Similar to the previous example; the difference is that this example is implemented in PyTorch, so it uses the Kubeflow PyTorch operator.