{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Exporting Frozen Graphs in Tensorflow 2 \n",
    "In order to use a trained model as input to MIGraphX, the model must be first be saved in a frozen graph format. This was accomplished in Tensorflow 1 by launching a graph in a tf.Session and then saving the session. However, Tensorflow has decided to deprecate Sessions in favor of functions and SavedModel format.  "
   ]
  },
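  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For reference only, below is a minimal sketch of the deprecated TF1 Session-based approach described above. The graph and the \"output\" node name are placeholders for illustration, and the cell assumes a TF 1.x environment; it is not part of the TF2 workflow in the rest of this notebook."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative only: the deprecated TF1 pattern that this notebook replaces.\n",
    "# Assumes a TF 1.x graph with an output node named \"output\" (hypothetical).\n",
    "import tensorflow.compat.v1 as tf1\n",
    "\n",
    "with tf1.Session() as sess:\n",
    "    sess.run(tf1.global_variables_initializer())\n",
    "    frozen_graph_def = tf1.graph_util.convert_variables_to_constants(\n",
    "        sess, sess.graph_def, [\"output\"])\n",
    "    tf1.train.write_graph(frozen_graph_def, \"./frozen_models\",\n",
    "                          \"tf1_frozen_graph.pb\", as_text=False)"
   ]
  },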
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "After importing the necessary libraries, the next step is to instantiate a model. For simplicity, in this example we will use a resnet50 architecture with pre-trained imagenet weights. These weights may also be trained or fine-tuned before freezing. "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import tensorflow as tf\n",
    "tf.enable_eager_execution() #May not be required depending on tensorflow version\n",
    "from tensorflow.python.framework.convert_to_constants import convert_variables_to_constants_v2\n",
    "from tensorflow import keras\n",
    "from tensorflow.keras import layers\n",
    "\n",
    "MODEL_NAME = \"resnet50\"\n",
    "model = tf.keras.applications.ResNet50(weights=\"imagenet\")\n",
    "model.summary()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## SavedModel format\n",
    "The simplest way to save a model is through saved\\_model.save()\n",
    "\n",
    "This will create an equivalent tensorflow program which can later be loaded for fine-tuning or inference, although it is not directly compatible with MIGraphX."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "tf.saved_model.save(model, \"./Saved_Models/{}\".format(MODEL_NAME))"
   ]
  },
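  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As an optional check, the SavedModel written above can be loaded back with tf.saved_model.load and its serving signatures inspected. The \"serving_default\" signature name used below is the one normally generated for a Keras model and is assumed here."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional: load the SavedModel back and inspect its serving signatures\n",
    "loaded = tf.saved_model.load(\"./Saved_Models/{}\".format(MODEL_NAME))\n",
    "print(list(loaded.signatures.keys()))  # normally ['serving_default'] for a Keras model\n",
    "\n",
    "infer = loaded.signatures[\"serving_default\"]\n",
    "print(infer.structured_input_signature)\n",
    "print(infer.structured_outputs)"
   ]
  },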
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Convert to ConcreteFunction\n",
    "To begin, we need to get the function equivalent of the model and then concretize the function to avoid retracing."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "full_model = tf.function(lambda x: model(x))\n",
    "full_model = full_model.get_concrete_function(\n",
    "    x=tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))"
   ]
  },
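  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Optionally, the concrete function can be sanity-checked by calling it on a dummy batch before freezing. The 224x224x3 input size below matches ResNet50's default input shape, and the 1000-class output corresponds to the ImageNet head."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional sanity check: run the concrete function on a dummy batch\n",
    "dummy_input = tf.zeros([1, 224, 224, 3], dtype=model.inputs[0].dtype)\n",
    "print(full_model(dummy_input).shape)  # expect (1, 1000) for ImageNet ResNet50"
   ]
  },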
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Freeze ConcreteFunction and Serialize\n",
    "Since we are saving the graph for the purpose of inference, all variables can be made constant (i.e. \"frozen\").\n",
    "\n",
    "Next, we need to obtain a serialized GraphDef representation of the graph. \n",
    "\n",
    "\n",
    "Optionally, the operators can be printed out layer by layer followed by the inputs and outputs."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "frozen_func = convert_variables_to_constants_v2(full_model)\n",
    "frozen_func.graph.as_graph_def()\n",
    "\n",
    "layers = [op.name for op in frozen_func.graph.get_operations()]\n",
    "print(\"-\" * 50)\n",
    "print(\"Frozen model layers: \")\n",
    "for layer in layers:\n",
    "    print(layer)\n",
    "\n",
    "print(\"-\" * 50)\n",
    "print(\"Frozen model inputs: \")\n",
    "print(frozen_func.inputs)\n",
    "print(\"Frozen model outputs: \")\n",
    "print(frozen_func.outputs)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Save Frozen Graph as Protobuf\n",
    "Finally, we can save to hard drive, and now the frozen graph will be stored as `./frozen_models/<MODEL_NAME>_frozen_graph.pb`"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "tf.io.write_graph(graph_or_graph_def=frozen_func.graph,\n",
    "                  logdir=\"./frozen_models\",\n",
    "                  name=\"{}_frozen_graph.pb\".format(MODEL_NAME),\n",
    "                  as_text=False)"
   ]
  },
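  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As an optional round-trip check before moving to MIGraphX, the frozen protobuf can be read back into TensorFlow and executed by wrapping the imported GraphDef in a function. The tensor names \"x:0\" and \"Identity:0\" are assumptions taken from the inputs/outputs printed above; substitute the names from your own printout if they differ."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional: read the frozen .pb back and run it to confirm the export round-trips\n",
    "def wrap_frozen_graph(graph_def, inputs, outputs):\n",
    "    # Import the GraphDef into a wrapped function and prune it to the given feeds/fetches\n",
    "    def _imports_graph_def():\n",
    "        tf.compat.v1.import_graph_def(graph_def, name=\"\")\n",
    "    wrapped = tf.compat.v1.wrap_function(_imports_graph_def, [])\n",
    "    return wrapped.prune(\n",
    "        tf.nest.map_structure(wrapped.graph.as_graph_element, inputs),\n",
    "        tf.nest.map_structure(wrapped.graph.as_graph_element, outputs))\n",
    "\n",
    "with tf.io.gfile.GFile(\"./frozen_models/{}_frozen_graph.pb\".format(MODEL_NAME), \"rb\") as f:\n",
    "    restored_graph_def = tf.compat.v1.GraphDef()\n",
    "    restored_graph_def.ParseFromString(f.read())\n",
    "\n",
    "# \"x:0\" and \"Identity:0\" are assumed tensor names; check the printout above\n",
    "restored_func = wrap_frozen_graph(restored_graph_def, inputs=[\"x:0\"], outputs=[\"Identity:0\"])\n",
    "print(restored_func(tf.zeros([1, 224, 224, 3]))[0].shape)  # expect (1, 1000)"
   ]
  },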
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Assuming MIGraphX has already been built and installed on your system, the driver can be used to verify that the frozen graph has been correctly exported. "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import subprocess\n",
    "driver = \"/opt/rocm/bin/migraphx-driver\"\n",
    "command = \"read\"\n",
    "model_path = \"./frozen_models/{}_frozen_graph.pb\".format(MODEL_NAME)\n",
    "process = subprocess.run([driver, command, model_path], \n",
    "                         stdout=subprocess.PIPE, \n",
    "                         universal_newlines=True)\n",
    "\n",
    "print(process.stdout)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "tensorflow",
   "language": "python",
   "name": "tensorflow"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}