"AutoGraph will convert much of the Python language into the equivalent TensorFlow graph building code. It converts a function like:"
"AutoGraph will convert much of the Python language into the equivalent TensorFlow graph building code. \n",
"\n",
"Note: In real applications batching is essential for performance. The best code to convert to AutoGraph is code where the control flow is decided at the _batch_ level. If making decisions at the individual _example_ level, you must index and batch the examples to maintain performance while applying the control flow logic. \n",
"Let's demonstrate some useful Python language features."
"Let's demonstrate some useful Python language features.\n"
]
},
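For reference, here is a minimal sketch (not part of the original notebook) of the batch-level control flow the note above recommends: the `if` depends on a statistic of the whole batch, so AutoGraph can lower it to a single graph conditional rather than per-example logic. The function name is illustrative, and the sketch assumes the TF 1.x-style `tf.contrib.autograph` import used elsewhere in this notebook.

```python
import tensorflow as tf
from tensorflow.contrib import autograph  # TF 1.x contrib AutoGraph (assumption)

@autograph.convert()
def scale_if_large(x):
  # The branch is decided by a batch-level statistic, so AutoGraph can
  # turn this `if` into a single graph conditional.
  if tf.reduce_mean(x) > 1.0:
    x = x / 10.0
  return x

with tf.Graph().as_default(), tf.Session() as sess:
  print(sess.run(scale_if_large(tf.constant([5.0, 15.0]))))  # -> [0.5 1.5]
```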
{
...
...
@@ -419,12 +422,11 @@
"@autograph.convert()\n",
"def inverse(x):\n",
" assert x != 0.0, 'Do not pass zero!'\n",
" return 1.0/x\n",
" return 1.0 / x\n",
"\n",
"with tf.Graph().as_default(): \n",
" with tf.Session():\n",
"with tf.Graph().as_default(), tf.Session() as sess:\n",
" try:\n",
" print(inverse(tf.constant(0.0)).eval())\n",
" print(sess.run(inverse(tf.constant(0.0))))\n",
" except tf.errors.InvalidArgumentError as e:\n",
" print('Got error message:\\n %s' % e.message)"
],
...
...
@@ -459,9 +461,8 @@
" i += 1\n",
" return n\n",
" \n",
"with tf.Graph().as_default():\n",
" with tf.Session():\n",
" count(tf.constant(5)).eval()"
"with tf.Graph().as_default(), tf.Session() as sess:\n",
" sess.run(count(tf.constant(5)))"
],
"execution_count": 0,
"outputs": []
...
...
@@ -499,9 +500,8 @@
" return autograph.stack(z) \n",
"\n",
"\n",
"with tf.Graph().as_default(): \n",
" with tf.Session():\n",
" print(arange(tf.constant(10)).eval())"
"with tf.Graph().as_default(), tf.Session() as sess:\n",
" sess.run(arange(tf.constant(10)))"
],
"execution_count": 0,
"outputs": []
...
...
@@ -655,14 +655,14 @@
"source": [
"## Interoperation with `tf.Keras`\n",
"\n",
"Now that you've seen the basics, let's build some real model components with autograph.\n",
"Now that you've seen the basics, let's build some model components with autograph.\n",
"\n",
"It's relatively simple to integrate `autograph` with `tf.keras`. But remember that batchng is essential for performance. So the best candidate code for conversion to autograph is code where the control flow is decided at the _batch_ level. If decisions are made at the individual _example_ level you will still need to index and batch your examples to maintain performance while appling the control flow logic. \n",
"It's relatively simple to integrate `autograph` with `tf.keras`. \n",
"\n",
"\n",
"### Stateless functions\n",
"\n",
"For stateless functions like `collatz`, below, the easiest way to include them in a keras model is to wrap them up as a layer uisng `tf.keras.layers.Lambda`."
"For stateless functions, like `collatz` shown below, the easiest way to include them in a keras model is to wrap them up as a layer uisng `tf.keras.layers.Lambda`."
]
},
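To make the wrapping pattern concrete, here is a minimal sketch (not the notebook's `collatz` cell; the function name and data are illustrative): a converted, stateless function becomes an ordinary Keras layer via `tf.keras.layers.Lambda`. It again assumes the TF 1.x `tf.contrib.autograph` import.

```python
import numpy as np
import tensorflow as tf
from tensorflow.contrib import autograph  # TF 1.x contrib AutoGraph (assumption)

@autograph.convert()
def clip_negatives(x):
  # Batch-level decision: only rewrite the tensor when something is negative.
  if tf.reduce_any(x < 0.0):
    x = tf.maximum(x, 0.0)
  return x

# Wrap the converted function as a layer inside a Keras model.
model = tf.keras.Sequential([
    tf.keras.layers.Lambda(clip_negatives, input_shape=(3,)),
])

print(model.predict(np.array([[-1.0, 0.5, 2.0]], dtype=np.float32)))  # -> [[0. 0.5 2.]]
```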
{
...
...
@@ -711,7 +711,7 @@
"\n",
"<!--TODO(markdaoust) link to full examples or these referenced models.-->\n",
"\n",
"The easiest way to use autograph is keras layers and models is to `@autograph.convert()` the `call` method. See the [keras guide](https://tensorflow.org/guide/keras#build_advanced_models) for details on how to build on these classes. \n",
"The easiest way to use AutoGraph with Keras layers and models is to `@autograph.convert()` the `call` method. See the [TensorFlow Keras guide](https://tensorflow.org/guide/keras#build_advanced_models) for details on how to build on these classes. \n",
"\n",
"Here is a simple example of the [stocastic network depth](https://arxiv.org/abs/1603.09382) technique :"
]
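As a small illustration of the decorated-`call` pattern (this is not the notebook's stochastic-depth model; the layer and its logic are invented for the sketch), a custom layer can use plain Python control flow in `call` once the method is converted:

```python
import tensorflow as tf
from tensorflow.contrib import autograph  # TF 1.x contrib AutoGraph (assumption)

class ZeroSmallInputs(tf.keras.layers.Layer):
  """Illustrative layer: zeros out the batch when its norm is tiny."""

  @autograph.convert()
  def call(self, inputs):
    # An ordinary Python `if` on a tensor condition; AutoGraph turns it
    # into a graph conditional when the layer builds its ops.
    if tf.norm(inputs) < 1e-3:
      inputs = tf.zeros_like(inputs)
    return inputs

with tf.Graph().as_default(), tf.Session() as sess:
  layer = ZeroSmallInputs()
  print(sess.run(layer(tf.constant([[1.0, 2.0]]))))  # -> [[1. 2.]]
```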
...
...
@@ -866,9 +866,9 @@
"source": [
"## Advanced example: An in-graph training loop\n",
"\n",
"Since writing control flow in AutoGraph is easy, running a training loop in a TensorFlow graph should also be easy. \n",
"The previous section showed that AutoGraph can be used inside Keras layers and models. Keras models can also be used in AutoGraph code.\n",
"\n",
"Important: While this example wraps a `tf.keras.Model` using AutoGraph, `tf.contrib.autograph` is compatible with `tf.keras` and can be used in [Keras custom layers and models](https://tensorflow.org/guide/keras#build_advanced_models). \n",
"Since writing control flow in AutoGraph is easy, running a training loop in a TensorFlow graph should also be easy. \n",
"\n",
"This example shows how to train a simple Keras model on MNIST with the entire training process—loading batches, calculating gradients, updating parameters, calculating validation accuracy, and repeating until convergence—is performed in-graph."