ModelZoo / ResNet50_tensorflow · Commits · b02ab963

Commit b02ab963, authored Jun 27, 2017 by James Pruegsanusak
Use inline code style for directory names
Parent: 801f892a
Showing 1 changed file with 11 additions and 11 deletions (+11 / -11).
object_detection/g3doc/running_pets.md
````diff
@@ -51,8 +51,8 @@ dataset for Oxford-IIIT Pets lives
 [here](http://www.robots.ox.ac.uk/~vgg/data/pets/). You will need to download
 both the image dataset [`images.tar.gz`](http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz)
 and the groundtruth data [`annotations.tar.gz`](http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz)
-to the tensorflow/models directory. This may take some time. After downloading
-the tarballs, your object_detection directory should appear as follows:
+to the `tensorflow/models` directory. This may take some time. After downloading
+the tarballs, your `object_detection` directory should appear as follows:
 
 ```lang-none
 + object_detection/
````
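For readers following the tutorial alongside this diff, the layout the text refers to can be sketched as follows. This is an illustration only, with fabricated placeholder files (the real `images/` and `annotations/` directories come from the two tarballs, and `object_detection/` is part of the `tensorflow/models` checkout; `models_demo` is a made-up name):

```shell
# Fabricate the directory layout the tutorial expects after extraction.
# Placeholder files stand in for the real dataset contents.
mkdir -p models_demo/object_detection models_demo/images models_demo/annotations
touch models_demo/annotations/trainval.txt   # placeholder, not the real file
ls models_demo
```

Extracting the real tarballs in `tensorflow/models` would produce `annotations/` and `images/` next to `object_detection/` in the same way.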
````diff
@@ -66,7 +66,7 @@ the tarballs, your object_detection directory should appear as follows:
 The Tensorflow Object Detection API expects data to be in the TFRecord format,
 so we'll now run the `create_pet_tf_record` script to convert from the raw
 Oxford-IIIT Pet dataset into TFRecords. Run the following commands from the
-object_detection directory:
+`object_detection` directory:
 
 ```bash
 # From tensorflow/models/
````
````diff
@@ -84,7 +84,7 @@ Note: It is normal to see some warnings when running this script. You may ignore
 them.
 
 Two TFRecord files named `pet_train.record` and `pet_val.record` should be generated
-in the object_detection/ directory.
+in the `object_detection` directory.
 
 Now that the data has been generated, we'll need to upload it to Google Cloud
 Storage so the data can be accessed by ML Engine. Run the following command to
````
````diff
@@ -127,7 +127,7 @@ In the Tensorflow Object Detection API, the model parameters, training
 parameters and eval parameters are all defined by a config file. More details
 can be found [here](configuring_jobs.md). For this tutorial, we will use some
 predefined templates provided with the source code. In the
-object_detection/samples/configs folder, there are skeleton object_detection
+`object_detection/samples/configs` folder, there are skeleton object_detection
 configuration files. We will use `faster_rcnn_resnet101_pets.config` as a
 starting point for configuring the pipeline. Open the file with your favourite
 text editor.
````
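For context on what "configuring the pipeline" involves here: the skeleton templates contain `PATH_TO_BE_CONFIGURED` placeholders that the tutorial has you replace with your bucket paths. A hypothetical excerpt in the Object Detection API's protobuf text format (the exact fields in `faster_rcnn_resnet101_pets.config` may differ; the bucket path in the comment is made up):

```
train_input_reader: {
  tf_record_input_reader {
    # Replace PATH_TO_BE_CONFIGURED with e.g. gs://my-pets-bucket/data
    input_path: "PATH_TO_BE_CONFIGURED/pet_train.record"
  }
  label_map_path: "PATH_TO_BE_CONFIGURED/pet_label_map.pbtxt"
}
```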
````diff
@@ -181,7 +181,7 @@ Before we can start a job on Google Cloud ML Engine, we must:
 2.  Write a cluster configuration for our Google Cloud ML job.
 
 To package the Tensorflow Object Detection code, run the following commands from
-the tensorflow/models/ directory:
+the `tensorflow/models/` directory:
 
 ```bash
 # From tensorflow/models/
````
````diff
@@ -196,7 +196,7 @@ For running the training Cloud ML job, we'll configure the cluster to use 10
 training jobs (1 master + 9 workers) and three parameters servers. The
 configuration file can be found at `object_detection/samples/cloud/cloud.yml`.
 
-To start training, execute the following command from the tensorflow/models/
+To start training, execute the following command from the `tensorflow/models/`
 directory:
 
 ```bash
````
````diff
@@ -274,12 +274,12 @@ Browser](https://console.cloud.google.com/storage/browser). The file should be
 stored under `${YOUR_GCS_BUCKET}/train`. The checkpoint will typically consist of
 three files:
 
-*   model.ckpt-${CHECKPOINT_NUMBER}.data-00000-of-00001,
-*   model.ckpt-${CHECKPOINT_NUMBER}.index
-*   model.ckpt-${CHECKPOINT_NUMBER}.meta
+*   `model.ckpt-${CHECKPOINT_NUMBER}.data-00000-of-00001`
+*   `model.ckpt-${CHECKPOINT_NUMBER}.index`
+*   `model.ckpt-${CHECKPOINT_NUMBER}.meta`
 
 After you've identified a candidate checkpoint to export, run the following
-command from tensorflow/models/object_detection:
+command from `tensorflow/models/object_detection`:
 
 ```bash
 # From tensorflow/models
````
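Since `${CHECKPOINT_NUMBER}` has to be read off the checkpoint filenames, here is a small sketch of picking the highest-numbered checkpoint. It runs against fabricated local files purely for illustration; against a real bucket you would list `${YOUR_GCS_BUCKET}/train` (for example with `gsutil ls`) and apply the same filename parsing:

```shell
# Fabricate a few checkpoint index files for illustration.
mkdir -p /tmp/ckpt_demo && cd /tmp/ckpt_demo
touch model.ckpt-1856.index model.ckpt-2378.index model.ckpt-942.index

# Strip the "model.ckpt-" prefix and ".index" suffix, then take the
# numerically largest step number.
CHECKPOINT_NUMBER=$(ls model.ckpt-*.index \
  | sed 's/^model\.ckpt-//; s/\.index$//' \
  | sort -n | tail -n 1)
echo "latest checkpoint: ${CHECKPOINT_NUMBER}"
```

With the three fabricated files above, this prints `latest checkpoint: 2378`.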