ModelZoo / ResNet50_tensorflow · Commit 63ef92aa
Authored Sep 14, 2020 by Sachin Joglekar; committed by TF Object Detection Team, Sep 14, 2020
Improve post-training quant documentation for TF2 ODAPI. Also added note about TF version.
PiperOrigin-RevId: 331557693
parent 33a4c207
1 changed file with 21 additions and 5 deletions (+21 −5)
research/object_detection/g3doc/running_on_mobile_tf2.md
# Running TF2 Detection API Models on mobile
[TensorFlow 2.2](https://github.com/tensorflow/tensorflow/releases/tag/v2.2.0)
[TensorFlow 2.3](https://github.com/tensorflow/tensorflow/releases/tag/v2.3.0)
[Python 3.6](https://www.python.org/downloads/release/python-360/)
**NOTE:** This support was added *after* TF2.3, so please use the latest
nightly build of the TensorFlow Lite Converter for this to work.
[TensorFlow Lite](https://www.tensorflow.org/mobile/tflite/) (TFLite) is
TensorFlow’s lightweight solution for mobile and embedded devices. It enables
on-device machine learning inference with low latency and a small binary size.
...
```
python object_detection/export_tflite_graph_tf2.py \
    --output_directory path/to/exported_model_directory
```
Use `--help` with the above script to get the full list of supported
parameters. These can fine-tune accuracy and speed for your model.
### Step 2: Convert to TFLite
Use the [TensorFlow Lite Converter](https://www.tensorflow.org/lite/convert) to
convert the `SavedModel` to TFLite. Note that you need to use
`from_saved_model` for TFLite conversion with the Python API.

You can also leverage
[Post-training Quantization](https://www.tensorflow.org/lite/performance/post_training_quantization)
to [optimize performance](https://www.tensorflow.org/lite/performance/model_optimization)
and obtain a smaller model. Note that this is only possible from the *Python
API*. Be sure to use a
[representative dataset](https://www.tensorflow.org/lite/performance/post_training_quantization#full_integer_quantization)
and set the following options on the converter:
```
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8,
                                       tf.lite.OpsSet.TFLITE_BUILTINS]
converter.representative_dataset = <...>
```
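Putting the converter options together, a minimal end-to-end sketch might look
like the following. The SavedModel path, the 320×320 input shape, and the
random-data `representative_dataset_gen` are illustrative placeholders only —
in practice you would yield on the order of 100 real, preprocessed images
matching your model's input signature:

```python
import numpy as np


def representative_dataset_gen(num_samples=100, input_shape=(1, 320, 320, 3)):
    """Yield calibration inputs for full-integer quantization.

    NOTE: random data is a placeholder; replace it with real preprocessed
    images so the quantizer sees representative activation ranges.
    """
    for _ in range(num_samples):
        yield [np.random.rand(*input_shape).astype(np.float32)]


# The conversion itself requires TensorFlow (latest nightly, per the note
# above) and an exported SavedModel; the path below is a placeholder:
#
# import tensorflow as tf
# converter = tf.lite.TFLiteConverter.from_saved_model(
#     'path/to/exported_model_directory/saved_model')
# converter.optimizations = [tf.lite.Optimize.DEFAULT]
# converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8,
#                                        tf.lite.OpsSet.TFLITE_BUILTINS]
# converter.representative_dataset = representative_dataset_gen
# tflite_model = converter.convert()
# with open('model.tflite', 'wb') as f:
#     f.write(tflite_model)
```

Keeping `TFLITE_BUILTINS` alongside `TFLITE_BUILTINS_INT8` lets ops that cannot
be quantized fall back to float kernels instead of failing the conversion.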
## Running our model on Android
...