Commit 4f32535f
Authored Sep 22, 2017 by derekjchow
Committed by Neal Wu on Sep 22, 2017
Parent: fa32bb5f

Update docs in object_detection to reflect new path. (#2434)
Showing 8 changed files with 38 additions and 37 deletions (+38, -37)
Changed files:

research/object_detection/g3doc/defining_your_own_model.md  (+1, -1)
research/object_detection/g3doc/exporting_models.md         (+2, -2)
research/object_detection/g3doc/installation.md             (+7, -7)
research/object_detection/g3doc/preparing_inputs.md         (+4, -4)
research/object_detection/g3doc/running_locally.md          (+2, -2)
research/object_detection/g3doc/running_notebook.md         (+2, -2)
research/object_detection/g3doc/running_on_cloud.md         (+2, -2)
research/object_detection/g3doc/running_pets.md             (+18, -17)
research/object_detection/g3doc/defining_your_own_model.md

@@ -94,7 +94,7 @@ definition as one example. Some remarks:
 * We typically initialize the weights of this feature extractor
   using those from the
-  [Slim Resnet-101 classification checkpoint](https://github.com/tensorflow/models/tree/master/slim#pre-trained-models),
+  [Slim Resnet-101 classification checkpoint](https://github.com/tensorflow/models/tree/master/research/slim#pre-trained-models),
   and we know
   that images were preprocessed when training this checkpoint
   by subtracting a channel mean from each input
research/object_detection/g3doc/exporting_models.md

@@ -8,10 +8,10 @@ graph proto. A checkpoint will typically consist of three files:
 * model.ckpt-${CHECKPOINT_NUMBER}.meta

 After you've identified a candidate checkpoint to export, run the following
-command from tensorflow/models/object_detection:
+command from tensorflow/models/research/object_detection:

 ``` bash
-# From tensorflow/models
+# From tensorflow/models/research/
 python object_detection/export_inference_graph.py \
     --input_type image_tensor \
     --pipeline_config_path ${PIPELINE_CONFIG_PATH} \
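The export command above is truncated in this view. As an aside, here is a minimal shell sketch (not part of the diff) for choosing ${CHECKPOINT_NUMBER}; the ${TRAIN_DIR} variable is an assumed placeholder for your local training directory:

``` bash
# Minimal sketch, not part of this diff: pick the newest checkpoint in an
# assumed training directory ${TRAIN_DIR} and derive CHECKPOINT_NUMBER from it.
LATEST_META=$(ls -t "${TRAIN_DIR}"/model.ckpt-*.meta | head -n 1)
CHECKPOINT_NUMBER=$(echo "${LATEST_META}" | sed 's/.*model\.ckpt-\([0-9]*\)\.meta/\1/')
echo "Candidate checkpoint: ${CHECKPOINT_NUMBER}"
```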
research/object_detection/g3doc/installation.md

@@ -7,7 +7,7 @@ Tensorflow Object Detection API depends on the following libraries:
 * Protobuf 2.6
 * Pillow 1.0
 * lxml
-* tf Slim (which is included in the "tensorflow/models" checkout)
+* tf Slim (which is included in the "tensorflow/models/research/" checkout)
 * Jupyter notebook
 * Matplotlib
 * Tensorflow

@@ -45,23 +45,23 @@ sudo pip install matplotlib
 The Tensorflow Object Detection API uses Protobufs to configure model and
 training parameters. Before the framework can be used, the Protobuf libraries
 must be compiled. This should be done by running the following command from
-the tensorflow/models directory:
+the tensorflow/models/research/ directory:

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 protoc object_detection/protos/*.proto --python_out=.
 ```

 ## Add Libraries to PYTHONPATH

-When running locally, the tensorflow/models/ and slim directories should be
-appended to PYTHONPATH. This can be done by running the following from
-tensorflow/models/:
+When running locally, the tensorflow/models/research/ and slim directories
+should be appended to PYTHONPATH. This can be done by running the following from
+tensorflow/models/research/:

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
 ```
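Taken together, the updated installation steps shown in this diff read as one sequence. The consolidated sketch below only repeats commands visible above; the checkout location ~/tensorflow/models is an illustrative assumption:

``` bash
# Consolidated sketch of the updated setup, assuming the repository is checked
# out at ~/tensorflow/models (this path is an assumption for illustration).
cd ~/tensorflow/models/research/

# Compile the Protobuf definitions used for model and training configuration.
protoc object_detection/protos/*.proto --python_out=.

# Make the research/ and slim/ directories importable.
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
```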
research/object_detection/g3doc/preparing_inputs.md

@@ -13,7 +13,7 @@ To download, extract and convert it to TFRecords, run the following commands
 below:

 ``` bash
-# From tensorflow/models
+# From tensorflow/models/research/
 wget http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar
 tar -xvf VOCtrainval_11-May-2012.tar
 python object_detection/create_pascal_tf_record.py \

@@ -27,7 +27,7 @@ python object_detection/create_pascal_tf_record.py \
 ```

 You should end up with two TFRecord files named `pascal_train.record` and
-`pascal_val.record` in the `tensorflow/models` directory.
+`pascal_val.record` in the `tensorflow/models/research/` directory.

 The label map for the PASCAL VOC data set can be found at
 `object_detection/data/pascal_label_map.pbtxt`.

@@ -39,7 +39,7 @@ The Oxford-IIIT Pet data set is located
 convert it to TFRecrods, run the following commands below:

 ``` bash
-# From tensorflow/models
+# From tensorflow/models/research/
 wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz
 wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz
 tar -xvf annotations.tar.gz

@@ -51,7 +51,7 @@ python object_detection/create_pet_tf_record.py \
 ```

 You should end up with two TFRecord files named `pet_train.record` and
-`pet_val.record` in the `tensorflow/models` directory.
+`pet_val.record` in the `tensorflow/models/research/` directory.

 The label map for the Pet dataset can be found at
 `object_detection/data/pet_label_map.pbtxt`.
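The create_pascal_tf_record.py invocation above is cut off in this view. A hedged sketch of a full conversion call follows; the flag names and values are assumptions for illustration and should be checked against the script's --help output:

``` bash
# Sketch only: flag names and values below are assumptions, not part of this diff.
# From tensorflow/models/research/
python object_detection/create_pascal_tf_record.py \
    --label_map_path=object_detection/data/pascal_label_map.pbtxt \
    --data_dir=VOCdevkit --year=VOC2012 --set=train \
    --output_path=pascal_train.record
```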
research/object_detection/g3doc/running_locally.md

@@ -33,7 +33,7 @@ Oxford-IIIT Pet dataset.
 A local training job can be run with the following command:

 ``` bash
-# From the tensorflow/models/ directory
+# From the tensorflow/models/research/ directory
 python object_detection/train.py \
     --logtostderr \
     --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \

@@ -52,7 +52,7 @@ train directory for new checkpoints and evaluate them on a test dataset. The
 job can be run using the following command:

 ``` bash
-# From the tensorflow/models/ directory
+# From the tensorflow/models/research/ directory
 python object_detection/eval.py \
     --logtostderr \
     --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
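Both commands above are truncated in this view. A rough sketch of complete local train and eval invocations is below; the --train_dir, --checkpoint_dir and --eval_dir flags are assumptions for illustration, not taken from this diff:

``` bash
# Sketch only: the directory flags below are assumptions, not part of this diff.
# From the tensorflow/models/research/ directory
python object_detection/train.py \
    --logtostderr \
    --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
    --train_dir=${PATH_TO_TRAIN_DIR}

python object_detection/eval.py \
    --logtostderr \
    --pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
    --checkpoint_dir=${PATH_TO_TRAIN_DIR} \
    --eval_dir=${PATH_TO_EVAL_DIR}
```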
research/object_detection/g3doc/running_notebook.md

@@ -3,10 +3,10 @@
 If you'd like to hit the ground running and run detection on a few example
 images right out of the box, we recommend trying out the Jupyter notebook demo.
 To run the Jupyter notebook, run the following command from
-`tensorflow/models/object_detection`:
+`tensorflow/models/research/object_detection`:

 ```
-# From tensorflow/models/object_detection
+# From tensorflow/models/research/object_detection
 jupyter notebook
 ```
research/object_detection/g3doc/running_on_cloud.md

@@ -27,7 +27,7 @@ packaged (along with it's TF-Slim dependency). The required packages can be
 created with the following command

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 python setup.py sdist
 (cd slim && python setup.py sdist)
 ```

@@ -69,7 +69,7 @@ been written, a user can start a training job on Cloud ML Engine using the
 following command:

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 gcloud ml-engine jobs submit training object_detection_`date +%s` \
     --job-dir=gs://${TRAIN_DIR} \
     --packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \
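The gcloud command above is also truncated. The sketch below shows how the remaining arguments might be filled in; the --module-name, --region and --config values and everything after the `--` separator are assumptions for illustration and should be taken from the full running_on_cloud.md rather than from this diff:

``` bash
# Sketch only: everything after --packages is an assumption, not part of this diff.
# From tensorflow/models/research/
gcloud ml-engine jobs submit training object_detection_`date +%s` \
    --job-dir=gs://${TRAIN_DIR} \
    --packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \
    --module-name object_detection.train \
    --region ${YOUR_GCP_REGION} \
    --config ${PATH_TO_CLOUD_YML} \
    -- \
    --train_dir=gs://${TRAIN_DIR} \
    --pipeline_config_path=gs://${PIPELINE_CONFIG_PATH}
```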
research/object_detection/g3doc/running_pets.md

@@ -51,18 +51,19 @@ dataset for Oxford-IIIT Pets lives
 [here](http://www.robots.ox.ac.uk/~vgg/data/pets/). You will need to download
 both the image dataset [`images.tar.gz`](http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz)
 and the groundtruth data [`annotations.tar.gz`](http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz)
-to the `tensorflow/models` directory and unzip them. This may take some time.
+to the `tensorflow/models/research/` directory and unzip them. This may take
+some time.

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz
 wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz
 tar -xvf images.tar.gz
 tar -xvf annotations.tar.gz
 ```

-After downloading the tarballs, your `tensorflow/models` directory should appear
-as follows:
+After downloading the tarballs, your `tensorflow/models/research/` directory
+should appear as follows:

 ``` lang-none
 - images.tar.gz
@@ -76,10 +77,10 @@ as follows:
 The Tensorflow Object Detection API expects data to be in the TFRecord format,
 so we'll now run the `create_pet_tf_record` script to convert from the raw
 Oxford-IIIT Pet dataset into TFRecords. Run the following commands from the
-`tensorflow/models` directory:
+`tensorflow/models/research/` directory:

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 python object_detection/create_pet_tf_record.py \
     --label_map_path=object_detection/data/pet_label_map.pbtxt \
     --data_dir=`pwd` \
@@ -90,14 +91,14 @@ Note: It is normal to see some warnings when running this script. You may ignore
 them.

 Two TFRecord files named `pet_train.record` and `pet_val.record` should be
-generated in the `tensorflow/models` directory.
+generated in the `tensorflow/models/research/` directory.

 Now that the data has been generated, we'll need to upload it to Google Cloud
 Storage so the data can be accessed by ML Engine. Run the following command to
 copy the files into your GCS bucket (substituting `${YOUR_GCS_BUCKET}`):

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 gsutil cp pet_train.record gs://${YOUR_GCS_BUCKET}/data/pet_train.record
 gsutil cp pet_val.record gs://${YOUR_GCS_BUCKET}/data/pet_val.record
 gsutil cp object_detection/data/pet_label_map.pbtxt gs://${YOUR_GCS_BUCKET}/data/pet_label_map.pbtxt
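After copying, it can be worth confirming that all three files landed in the bucket. A minimal check using the standard gsutil listing command (not part of this diff):

``` bash
# Not part of this diff: list the uploaded data files in the GCS bucket.
gsutil ls gs://${YOUR_GCS_BUCKET}/data/
```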
@@ -145,7 +146,7 @@ upload your edited file onto GCS, making note of the path it was uploaded to
 (we'll need it when starting the training/eval jobs).

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 # Edit the faster_rcnn_resnet101_pets.config template. Please note that there
 # are multiple places where PATH_TO_BE_CONFIGURED needs to be set.

@@ -187,10 +188,10 @@ Before we can start a job on Google Cloud ML Engine, we must:
 2. Write a cluster configuration for our Google Cloud ML job.

 To package the Tensorflow Object Detection code, run the following commands from
-the `tensorflow/models/` directory:
+the `tensorflow/models/research/` directory:

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 python setup.py sdist
 (cd slim && python setup.py sdist)
 ```
@@ -202,11 +203,11 @@ For running the training Cloud ML job, we'll configure the cluster to use 10
 training jobs (1 master + 9 workers) and three parameters servers. The
 configuration file can be found at `object_detection/samples/cloud/cloud.yml`.

-To start training, execute the following command from the `tensorflow/models/`
-directory:
+To start training, execute the following command from the
+`tensorflow/models/research/` directory:

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 gcloud ml-engine jobs submit training `whoami`_object_detection_`date +%s` \
     --job-dir=gs://${YOUR_GCS_BUCKET}/train \
     --packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \

@@ -221,7 +222,7 @@ gcloud ml-engine jobs submit training `whoami`_object_detection_`date +%s` \
 Once training has started, we can run an evaluation concurrently:

 ``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
 gcloud ml-engine jobs submit training `whoami`_object_detection_eval_`date +%s` \
     --job-dir=gs://${YOUR_GCS_BUCKET}/train \
     --packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \
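Both submissions return immediately and run asynchronously on Cloud ML Engine. For orientation, a short sketch of inspecting submitted jobs with standard gcloud commands (not part of this diff; ${YOUR_JOB_ID} is a placeholder for the job name printed at submission time):

``` bash
# Not part of this diff: inspect submitted Cloud ML Engine jobs.
gcloud ml-engine jobs list
gcloud ml-engine jobs describe ${YOUR_JOB_ID}
```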
@@ -288,10 +289,10 @@ three files:
 * `model.ckpt-${CHECKPOINT_NUMBER}.meta`

 After you've identified a candidate checkpoint to export, run the following
-command from `tensorflow/models`:
+command from `tensorflow/models/research/`:

 ``` bash
-# From tensorflow/models
+# From tensorflow/models/research/
 gsutil cp gs://${YOUR_GCS_BUCKET}/train/model.ckpt-${CHECKPOINT_NUMBER}.* .
 python object_detection/export_inference_graph.py \
     --input_type image_tensor \
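The exporter invocation here is truncated as well. A short sketch of listing the checkpoints stored in the bucket, to pick a value for ${CHECKPOINT_NUMBER} before copying it down (standard gsutil usage, not part of this diff):

``` bash
# Not part of this diff: list checkpoints in the training bucket to choose
# a value for ${CHECKPOINT_NUMBER} before copying it locally.
gsutil ls gs://${YOUR_GCS_BUCKET}/train/model.ckpt-*
```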