Unverified Commit 78e871c3 authored by Mark Daoust, committed by GitHub

Merge pull request #4733 from MarkDaoust/autopgraph-guide

Add autograph guide
parents c5241d21 fceeb46d
{
"nbformat": 4,
"nbformat_minor": 0,
"metadata": {
"colab": {
"name": "autograph.ipynb",
"version": "0.3.2",
"views": {},
"default_view": {},
"provenance": [],
"private_outputs": true,
"collapsed_sections": [
"Jxv6goXm7oGF"
],
"toc_visible": true
},
"kernelspec": {
"name": "python3",
"display_name": "Python 3"
}
},
"cells": [
{
"metadata": {
"id": "Jxv6goXm7oGF",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"##### Copyright 2018 The TensorFlow Authors.\n",
"\n",
"Licensed under the Apache License, Version 2.0 (the \"License\");"
]
},
{
"metadata": {
"id": "llMNufAK7nfK",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"#@title Licensed under the Apache License, Version 2.0 (the \"License\"); { display-mode: \"form\" }\n",
"# you may not use this file except in compliance with the License.\n",
"# You may obtain a copy of the License at\n",
"#\n",
"# https://www.apache.org/licenses/LICENSE-2.0\n",
"#\n",
"# Unless required by applicable law or agreed to in writing, software\n",
"# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
"# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
"# See the License for the specific language governing permissions and\n",
"# limitations under the License."
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "8Byow2J6LaPl",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"# AutoGraph: Easy control flow for graphs "
]
},
{
"metadata": {
"id": "kGXS3UWBBNoc",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"<table class=\"tfo-notebook-buttons\" align=\"left\">\n",
" <td>\n",
" <a target=\"_blank\" href=\"https://www.tensorflow.org/guide/autograph\"><img src=\"https://www.tensorflow.org/images/tf_logo_32px.png\" />View on TensorFlow.org</a>\n",
" </td>\n",
" <td>\n",
" <a target=\"_blank\" href=\"https://colab.research.google.com/github/tensorflow/models/blob/master/samples/core/guide/autograph.ipynb\"><img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" />Run in Google Colab</a>\n",
" </td>\n",
" <td>\n",
" <a target=\"_blank\" href=\"https://github.com/tensorflow/models/blob/master/samples/core/guide/autograph.ipynb\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" />View source on GitHub</a>\n",
" </td>\n",
"</table>"
]
},
{
"metadata": {
"id": "CydFK2CL7ZHA",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"[AutoGraph](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/autograph/) helps you write complicated graph code using normal Python. Behind the scenes, AutoGraph automatically transforms your code into the equivalent [TensorFlow graph code](https://www.tensorflow.org/guide/graphs). AutoGraph already supports much of the Python language, and that coverage continues to grow. For a list of supported Python langauge features, see the [Autograph capabilities and limitations](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/autograph/LIMITATIONS.md)."
]
},
{
"metadata": {
"id": "n4EKOpw9mObL",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"## Setup\n",
"\n",
"To use AutoGraph, install the latest version of TensorFlow:"
]
},
{
"metadata": {
"id": "RSez0n7Ptcvb",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"! pip install tf-nightly"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "qLp9VZfit9oR",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"Import TensorFlow, AutoGraph, and any supporting modules:"
]
},
{
"metadata": {
"id": "mT7meGqrZTz9",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"from __future__ import division, print_function, absolute_import\n",
"\n",
"import tensorflow as tf\n",
"from tensorflow.contrib import autograph\n",
"\n",
"import matplotlib.pyplot as plt"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "Hh1PajmUJMNp",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"We'll enable [eager execution](https://www.tensorflow.org/guide/eager) for demonstration purposes, but AutoGraph works in both eager and [graph execution](https://www.tensorflow.org/guide/graphs) environments:"
]
},
{
"metadata": {
"id": "ks_hiqcSJNvg",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"tf.enable_eager_execution()"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "WR4lG3hsuWQT",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"Note: AutoGraph converted code is designed to run during graph execution. When eager exectuon is enabled, use explicit graphs (as this example shows) or `tf.contrib.eager.defun`."
]
},
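{
"metadata": {
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"As a quick illustration (a minimal sketch added to this guide, not part of the original example), the cell below wraps an AutoGraph-converted function with `tf.contrib.eager.defun` so that it still runs as a graph function while eager execution is enabled. The function name `square_if_positive` is made up for this sketch:"
]
},
{
"metadata": {
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"# A minimal sketch (assumption, not from the original guide): combine\n",
"# autograph.to_graph with tf.contrib.eager.defun so the converted code\n",
"# builds and runs as a graph even though eager execution is enabled.\n",
"def square_if_positive(x):\n",
"  if x > 0:\n",
"    x = x * x\n",
"  else:\n",
"    x = 0.0\n",
"  return x\n",
"\n",
"# Convert the Python control flow, then wrap the result with defun.\n",
"tf_square_if_positive = tf.contrib.eager.defun(autograph.to_graph(square_if_positive))\n",
"\n",
"# Called eagerly, this executes the generated graph function.\n",
"print(tf_square_if_positive(tf.constant(3.0)))"
],
"execution_count": 0,
"outputs": []
},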
{
"metadata": {
"id": "ohbSnA79mcJV",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"## Automatically convert Python control flow\n",
"\n",
"AutoGraph will convert much of the Python language into the equivalent TensorFlow graph building code. It converts a function like:"
]
},
{
"metadata": {
"id": "aA3gOodCBkOw",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"def g(x):\n",
" if x > 0:\n",
" x = x * x\n",
" else:\n",
" x = 0.0\n",
" return x"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "LICw4XQFZrhH",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"To a function that uses graph building:"
]
},
{
"metadata": {
"id": "_EMhGUjRZoKQ",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"print(autograph.to_code(g))"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "xpK0m4TCvkJq",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"Code written for eager execution can run in a `tf.Graph` with the same results, but with the benfits of graph execution:"
]
},
{
"metadata": {
"id": "I1RtBvoKBxq5",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"print('Eager results: %2.2f, %2.2f' % (g(tf.constant(9.0)), g(tf.constant(-9.0))))"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "Fpk3MxVVv5gn",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"Generate a graph-version and call it:"
]
},
{
"metadata": {
"id": "SGjSq0WQvwGs",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"tf_g = autograph.to_graph(g)\n",
"\n",
"with tf.Graph().as_default(): \n",
" # The result works like a regular op: takes tensors in, returns tensors.\n",
" # You can inspect the graph using tf.get_default_graph().as_graph_def()\n",
" g_out1 = tf_g(tf.constant( 9.0))\n",
" g_out2 = tf_g(tf.constant(-9.0))\n",
" with tf.Session() as sess:\n",
" print('Graph results: %2.2f, %2.2f\\n' % (sess.run(g_out1), sess.run(g_out2)))"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "m-jWmsCmByyw",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"AutoGraph supports common Python statements like `while`, `for`, `if`, `break`, and `return`, with support for nesting. Compare this function with the complicated graph verson displayed in the following code blocks:"
]
},
{
"metadata": {
"id": "toxKBOXbB1ro",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"# Continue in a loop\n",
"def f(l):\n",
" s = 0\n",
" for c in l:\n",
" if c % 2 > 0:\n",
" continue\n",
" s += c\n",
" return s\n",
"\n",
"print('Eager result: %d' % f(tf.constant([10,12,15,20])))\n",
"\n",
"tf_f = autograph.to_graph(f)\n",
"\n",
"with tf.Graph().as_default(): \n",
" with tf.Session():\n",
" print('Graph result: %d\\n\\n' % tf_f(tf.constant([10,12,15,20])).eval())"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "jlyQgxYsYSXr",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"print(autograph.to_code(f))"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "FUJJ-WTdCGeq",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"## Decorator\n",
"\n",
"If you don't need easy access to the original Python function, use the `convert` decorator:"
]
},
{
"metadata": {
"id": "BKhFNXDic4Mw",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"@autograph.convert()\n",
"def fizzbuzz(num):\n",
" if num % 3 == 0 and num % 5 == 0:\n",
" print('FizzBuzz')\n",
" elif num % 3 == 0:\n",
" print('Fizz')\n",
" elif num % 5 == 0:\n",
" print('Buzz')\n",
" else:\n",
" print(num)\n",
" return num\n",
"\n",
"\n",
"with tf.Graph().as_default():\n",
" # The result works like a regular op: takes tensors in, returns tensors.\n",
" # You can inspect the graph using tf.get_default_graph().as_graph_def()\n",
" num = tf.placeholder(tf.int32)\n",
" result = fizzbuzz(num)\n",
" with tf.Session() as sess:\n",
" for n in range(10,16):\n",
" sess.run(result, feed_dict={num:n})"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "-pkEH6OecW7h",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"## Examples\n",
"\n",
"Let's demonstrate some useful Python language features."
]
},
{
"metadata": {
"id": "axoRAkWi0CQG",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"### Assert\n",
"\n",
"AutoGraph automatically converts the Python `assert` statement into the equivalent `tf.Assert` code:"
]
},
{
"metadata": {
"id": "IAOgh62zCPZ4",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"@autograph.convert()\n",
"def f(x):\n",
" assert x != 0, 'Do not pass zero!'\n",
" return x * x\n",
"\n",
"with tf.Graph().as_default(): \n",
" with tf.Session():\n",
" try:\n",
" print(f(tf.constant(0)).eval())\n",
" except tf.errors.InvalidArgumentError as e:\n",
" print('Got error message:\\n %s' % e.message)"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "KRu8iIPBCQr5",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"### Print\n",
"\n",
"Use the Python `print` function in-graph:"
]
},
{
"metadata": {
"id": "ySTsuxnqCTQi",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"@autograph.convert()\n",
"def f(n):\n",
" if n >= 0:\n",
" while n < 5:\n",
" n += 1\n",
" print(n)\n",
" return n\n",
" \n",
"with tf.Graph().as_default():\n",
" with tf.Session():\n",
" f(tf.constant(0)).eval()"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "NqF0GT-VCVFh",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"### Lists\n",
"\n",
"Append to lists in loops (tensor list ops are automatically created):"
]
},
{
"metadata": {
"id": "ABX070KwCczR",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"@autograph.convert()\n",
"def f(n):\n",
" z = []\n",
" # We ask you to tell us the element dtype of the list\n",
" autograph.utils.set_element_type(z, tf.int32)\n",
" \n",
" for i in range(n):\n",
" z.append(i)\n",
" # when you're done with the list, stack it\n",
" # (this is just like np.stack)\n",
" return autograph.stack(z) \n",
"\n",
"#tf_f = autograph.to_graph(f)\n",
"\n",
"with tf.Graph().as_default(): \n",
" with tf.Session():\n",
" print(f(tf.constant(3)).eval())"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "qj7am2I_xvTJ",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"### Nested if statements"
]
},
{
"metadata": {
"id": "4yyNOf-Twr6s",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"@autograph.convert()\n",
"def nearest_odd_square(x):\n",
" if x > 0:\n",
" x = x * x\n",
" if x % 2 == 0:\n",
" x = x + 1\n",
" return x\n",
"\n",
"with tf.Graph().as_default(): \n",
" with tf.Session() as sess:\n",
" print(sess.run(nearest_odd_square(tf.constant(4))))\n",
" print(sess.run(nearest_odd_square(tf.constant(5))))\n",
" print(sess.run(nearest_odd_square(tf.constant(6))))"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "jXAxjeBr1qWK",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"### While loop"
]
},
{
"metadata": {
"id": "ucmZyQVL03bF",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"@autograph.convert()\n",
"def square_until_stop(x, y):\n",
" while x < y:\n",
" x = x * x\n",
" return x\n",
" \n",
"with tf.Graph().as_default(): \n",
" with tf.Session() as sess:\n",
" print(sess.run(square_until_stop(tf.constant(4), tf.constant(100))))"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "FXB0Zbwl13PY",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"### Break"
]
},
{
"metadata": {
"id": "1sjaFcL717Ig",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"@autograph.convert()\n",
"def argwhere_cumsum(x, threshold):\n",
" current_sum = 0.0\n",
" idx = 0\n",
" for i in range(len(x)):\n",
" idx = i\n",
" if current_sum >= threshold:\n",
" break\n",
" current_sum += x[i]\n",
" return idx\n",
"\n",
"N = 10\n",
"with tf.Graph().as_default(): \n",
" with tf.Session() as sess:\n",
" idx = argwhere_cumsum(tf.ones(N), tf.constant(float(N/2)))\n",
" print(sess.run(idx))"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "4LfnJjm0Bm0B",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"## Advanced example: An in-graph training loop\n",
"\n",
"Since writing control flow in AutoGraph is easy, running a training loop in a TensorFlow graph should also be easy. \n",
"\n",
"<!--TODO(markdaoust) link to examples showing autograph **in** keras models when ready-->\n",
"\n",
"Important: While this example wraps a `tf.keras.Model` using AutoGraph, `tf.contrib.autograph` is compatible with `tf.keras` and can be used in [Keras custom layers and models](https://tensorflow.org/guide/keras#build_advanced_models). The easiest way is to `@autograph.convert()` the `call` method.\n",
"\n",
"This example shows how to train a simple Keras model on MNIST with the entire training process—loading batches, calculating gradients, updating parameters, calculating validation accuracy, and repeating until convergence—is performed in-graph."
]
},
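{
"metadata": {
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"Before walking through the MNIST example, here is a rough sketch (added for illustration, not part of the original guide) of the Keras-layer approach mentioned above: applying `@autograph.convert()` to a custom layer's `call` method. The layer `ClipNegatives` is made up for this sketch; the MNIST example below converts the training function instead."
]
},
{
"metadata": {
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"# A rough sketch (assumption, not from the original guide): convert the\n",
"# call method of a Keras custom layer so its Python control flow becomes\n",
"# graph ops when the layer is used during graph construction.\n",
"class ClipNegatives(tf.keras.layers.Layer):\n",
"\n",
"  @autograph.convert()\n",
"  def call(self, inputs):\n",
"    # A Python `if` on a tensor condition is converted to tf.cond.\n",
"    if tf.reduce_min(inputs) < 0:\n",
"      inputs = tf.maximum(inputs, 0.0)\n",
"    return inputs\n",
"\n",
"\n",
"with tf.Graph().as_default():\n",
"  layer = ClipNegatives()\n",
"  result = layer(tf.constant([-1.0, 2.0]))\n",
"  with tf.Session() as sess:\n",
"    print(sess.run(result))"
],
"execution_count": 0,
"outputs": []
},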
{
"metadata": {
"id": "Em5dzSUOtLRP",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"### Download data"
]
},
{
"metadata": {
"id": "xqoxumv0ssQW",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.mnist.load_data()"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "znmy4l8ntMvW",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"### Define the model"
]
},
{
"metadata": {
"id": "Pe-erWQdBoC5",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"def mlp_model(input_shape):\n",
" model = tf.keras.Sequential((\n",
" tf.keras.layers.Flatten(),\n",
" tf.keras.layers.Dense(100, activation='relu', input_shape=input_shape),\n",
" tf.keras.layers.Dense(100, activation='relu'),\n",
" tf.keras.layers.Dense(10, activation='softmax')))\n",
" model.build()\n",
" return model\n",
"\n",
"\n",
"def predict(m, x, y):\n",
" y_p = m(x)\n",
" losses = tf.keras.losses.categorical_crossentropy(y, y_p)\n",
" l = tf.reduce_mean(losses)\n",
" accuracies = tf.keras.metrics.categorical_accuracy(y, y_p)\n",
" accuracy = tf.reduce_mean(accuracies)\n",
" return l, accuracy\n",
"\n",
"\n",
"def fit(m, x, y, opt):\n",
" l, accuracy = predict(m, x, y)\n",
" opt.minimize(l)\n",
" return l, accuracy\n",
"\n",
"\n",
"def setup_mnist_data(is_training, batch_size):\n",
" if is_training:\n",
" ds = tf.data.Dataset.from_tensor_slices((train_images, train_labels))\n",
" ds = ds.shuffle(batch_size * 10)\n",
" else:\n",
" ds = tf.data.Dataset.from_tensor_slices((test_images, test_labels))\n",
"\n",
" ds = ds.repeat()\n",
" ds = ds.batch(batch_size)\n",
" return ds\n",
"\n",
"\n",
"def get_next_batch(ds):\n",
" itr = ds.make_one_shot_iterator()\n",
" image, label = itr.get_next()\n",
" x = tf.to_float(image)/255.0\n",
" y = tf.one_hot(tf.squeeze(label), 10)\n",
" return x, y "
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "oeYV6mKnJGMr",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"### Define the training loop"
]
},
{
"metadata": {
"id": "3xtg_MMhJETd",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"def train(train_ds, test_ds, hp):\n",
" m = mlp_model((28 * 28,))\n",
" opt = tf.train.MomentumOptimizer(hp.learning_rate, 0.9)\n",
" \n",
" # We'd like to save our losses to a list. In order for AutoGraph\n",
" # to convert these lists into their graph equivalent,\n",
" # we need to specify the element type of the lists.\n",
" train_losses = []\n",
" autograph.utils.set_element_type(train_losses, tf.float32)\n",
" test_losses = []\n",
" autograph.utils.set_element_type(test_losses, tf.float32)\n",
" train_accuracies = []\n",
" autograph.utils.set_element_type(train_accuracies, tf.float32)\n",
" test_accuracies = []\n",
" autograph.utils.set_element_type(test_accuracies, tf.float32)\n",
" \n",
" # This entire training loop will be run in-graph.\n",
" i = tf.constant(0)\n",
" while i < hp.max_steps:\n",
" train_x, train_y = get_next_batch(train_ds)\n",
" test_x, test_y = get_next_batch(test_ds)\n",
"\n",
" step_train_loss, step_train_accuracy = fit(m, train_x, train_y, opt)\n",
" step_test_loss, step_test_accuracy = predict(m, test_x, test_y)\n",
" if i % (hp.max_steps // 10) == 0:\n",
" print('Step', i, 'train loss:', step_train_loss, 'test loss:',\n",
" step_test_loss, 'train accuracy:', step_train_accuracy,\n",
" 'test accuracy:', step_test_accuracy)\n",
" train_losses.append(step_train_loss)\n",
" test_losses.append(step_test_loss)\n",
" train_accuracies.append(step_train_accuracy)\n",
" test_accuracies.append(step_test_accuracy)\n",
" i += 1\n",
" \n",
" # We've recorded our loss values and accuracies \n",
" # to a list in a graph with AutoGraph's help.\n",
" # In order to return the values as a Tensor, \n",
" # we need to stack them before returning them.\n",
" return (autograph.stack(train_losses), autograph.stack(test_losses), \n",
" autograph.stack(train_accuracies), autograph.stack(test_accuracies))"
],
"execution_count": 0,
"outputs": []
},
{
"metadata": {
"id": "IsHLDZniauLV",
"colab_type": "text"
},
"cell_type": "markdown",
"source": [
"Now build the graph and run the training loop:"
]
},
{
"metadata": {
"id": "HYh6MSZyJOag",
"colab_type": "code",
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
}
},
"cell_type": "code",
"source": [
"with tf.Graph().as_default() as g:\n",
" hp = tf.contrib.training.HParams(\n",
" learning_rate=0.05,\n",
" max_steps=500,\n",
" )\n",
" train_ds = setup_mnist_data(True, 50)\n",
" test_ds = setup_mnist_data(False, 1000)\n",
" tf_train = autograph.to_graph(train)\n",
" (train_losses, test_losses, train_accuracies,\n",
" test_accuracies) = tf_train(train_ds, test_ds, hp)\n",
"\n",
" init = tf.global_variables_initializer()\n",
" \n",
"with tf.Session(graph=g) as sess:\n",
" sess.run(init)\n",
" (train_losses, test_losses, train_accuracies,\n",
" test_accuracies) = sess.run([train_losses, test_losses, train_accuracies,\n",
" test_accuracies])\n",
" \n",
"plt.title('MNIST train/test losses')\n",
"plt.plot(train_losses, label='train loss')\n",
"plt.plot(test_losses, label='test loss')\n",
"plt.legend()\n",
"plt.xlabel('Training step')\n",
"plt.ylabel('Loss')\n",
"plt.show()\n",
"plt.title('MNIST train/test accuracies')\n",
"plt.plot(train_accuracies, label='train accuracy')\n",
"plt.plot(test_accuracies, label='test accuracy')\n",
"plt.legend(loc='lower right')\n",
"plt.xlabel('Training step')\n",
"plt.ylabel('Accuracy')\n",
"plt.show()"
],
"execution_count": 0,
"outputs": []
}
]
}