Commit 4f32535f authored by derekjchow, committed by Neal Wu

Update docs in object_detection to reflect new path. (#2434)

parent fa32bb5f
@@ -94,7 +94,7 @@ definition as one example. Some remarks:
* We typically initialize the weights of this feature extractor
using those from the
-[Slim Resnet-101 classification checkpoint](https://github.com/tensorflow/models/tree/master/slim#pre-trained-models),
+[Slim Resnet-101 classification checkpoint](https://github.com/tensorflow/models/tree/master/research/slim#pre-trained-models),
and we know
that images were preprocessed when training this checkpoint
by subtracting a channel mean from each input
......
@@ -8,10 +8,10 @@ graph proto. A checkpoint will typically consist of three files:
* model.ckpt-${CHECKPOINT_NUMBER}.meta
After you've identified a candidate checkpoint to export, run the following
-command from tensorflow/models/object_detection:
+command from tensorflow/models/research/object_detection:
``` bash
-# From tensorflow/models
+# From tensorflow/models/research/
python object_detection/export_inference_graph.py \
--input_type image_tensor \
--pipeline_config_path ${PIPELINE_CONFIG_PATH} \
......
@@ -7,7 +7,7 @@ Tensorflow Object Detection API depends on the following libraries:
* Protobuf 2.6
* Pillow 1.0
* lxml
-* tf Slim (which is included in the "tensorflow/models" checkout)
+* tf Slim (which is included in the "tensorflow/models/research/" checkout)
* Jupyter notebook
* Matplotlib
* Tensorflow
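
Most of these dependencies are pip-installable. As a hedged convenience (assuming `pip` points at the same Python you will run the API with), the one-liner below mirrors the individual `sudo pip install` commands shown elsewhere in this doc:

``` bash
# Hedged convenience: install the pip-installable dependencies in one go.
# Protobuf and tf Slim are handled separately, as described in this doc.
sudo pip install pillow lxml jupyter matplotlib
```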
@@ -45,23 +45,23 @@ sudo pip install matplotlib
The Tensorflow Object Detection API uses Protobufs to configure model and
training parameters. Before the framework can be used, the Protobuf libraries
must be compiled. This should be done by running the following command from
-the tensorflow/models directory:
+the tensorflow/models/research/ directory:
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
protoc object_detection/protos/*.proto --python_out=.
```
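
If compilation succeeded, each `.proto` file should now have a generated Python module next to it; a quick hedged sanity check, assuming `protoc`'s default `--python_out` naming of `*_pb2.py`:

``` bash
# From tensorflow/models/research/
# Lists one generated _pb2.py module per .proto file if compilation worked.
ls object_detection/protos/*_pb2.py
```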
## Add Libraries to PYTHONPATH
-When running locally, the tensorflow/models/ and slim directories should be
-appended to PYTHONPATH. This can be done by running the following from
-tensorflow/models/:
+When running locally, the tensorflow/models/research/ and slim directories
+should be appended to PYTHONPATH. This can be done by running the following from
+tensorflow/models/research/:
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
```
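
A quick hedged check that the path change took effect (assuming `python` is the interpreter you intend to use) is to import the package; once PYTHONPATH is set, this should succeed from any working directory:

``` bash
# Exits silently if object_detection is importable via PYTHONPATH.
python -c "import object_detection"
```

Note that the `export` above only affects the current shell; re-run it (or add it to your shell rc file) for new terminals.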
......
@@ -13,7 +13,7 @@ To download, extract and convert it to TFRecords, run the following commands
below:
```bash
-# From tensorflow/models
+# From tensorflow/models/research/
wget http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar
tar -xvf VOCtrainval_11-May-2012.tar
python object_detection/create_pascal_tf_record.py \
@@ -27,7 +27,7 @@ python object_detection/create_pascal_tf_record.py \
```
You should end up with two TFRecord files named `pascal_train.record` and
-`pascal_val.record` in the `tensorflow/models` directory.
+`pascal_val.record` in the `tensorflow/models/research/` directory.
The label map for the PASCAL VOC data set can be found at
`object_detection/data/pascal_label_map.pbtxt`.
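
To sanity-check the generated records, you can count the serialized examples; a minimal hedged check, assuming the TF 1.x-era `tf.python_io.tf_record_iterator` API:

``` bash
# From tensorflow/models/research/
# Prints the number of examples in the training TFRecord.
python -c "import tensorflow as tf; print(sum(1 for _ in tf.python_io.tf_record_iterator('pascal_train.record')))"
```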
@@ -39,7 +39,7 @@ The Oxford-IIIT Pet data set is located
convert it to TFRecords, run the following commands below:
```bash
-# From tensorflow/models
+# From tensorflow/models/research/
wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz
wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz
tar -xvf annotations.tar.gz
@@ -51,7 +51,7 @@ python object_detection/create_pet_tf_record.py \
```
You should end up with two TFRecord files named `pet_train.record` and
-`pet_val.record` in the `tensorflow/models` directory.
+`pet_val.record` in the `tensorflow/models/research/` directory.
The label map for the Pet dataset can be found at
`object_detection/data/pet_label_map.pbtxt`.
@@ -33,7 +33,7 @@ Oxford-IIIT Pet dataset.
A local training job can be run with the following command:
```bash
-# From the tensorflow/models/ directory
+# From the tensorflow/models/research/ directory
python object_detection/train.py \
--logtostderr \
--pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
@@ -52,7 +52,7 @@ train directory for new checkpoints and evaluate them on a test dataset. The
job can be run using the following command:
```bash
-# From the tensorflow/models/ directory
+# From the tensorflow/models/research/ directory
python object_detection/eval.py \
--logtostderr \
--pipeline_config_path=${PATH_TO_YOUR_PIPELINE_CONFIG} \
......
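Progress for both local jobs can be inspected with TensorBoard; a hedged invocation, assuming `${PATH_TO_TRAIN_DIR}` is the same directory you passed to the training job:

``` bash
# Point TensorBoard at the training directory to watch loss curves and
# evaluation metrics as checkpoints are produced.
tensorboard --logdir=${PATH_TO_TRAIN_DIR}
```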
@@ -3,10 +3,10 @@
If you'd like to hit the ground running and run detection on a few example
images right out of the box, we recommend trying out the Jupyter notebook demo.
To run the Jupyter notebook, run the following command from
-`tensorflow/models/object_detection`:
+`tensorflow/models/research/object_detection`:
```
-# From tensorflow/models/object_detection
+# From tensorflow/models/research/object_detection
jupyter notebook
```
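
If you are working on a remote machine, a hedged variant is to start the notebook server headless and connect through an SSH tunnel (the flags below are standard Jupyter options):

``` bash
# From tensorflow/models/research/object_detection
# Serve without opening a local browser; tunnel to port 8888 to connect.
jupyter notebook --no-browser --port=8888
```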
......
@@ -27,7 +27,7 @@ packaged (along with its TF-Slim dependency). The required packages can be
created with the following command
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
python setup.py sdist
(cd slim && python setup.py sdist)
```
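
Both `sdist` invocations write their tarballs into `dist/` subdirectories; the exact filenames below are the ones the `gcloud` commands in this doc reference, so a quick hedged check before submitting:

``` bash
# From tensorflow/models/research/
# Both packages must exist before submitting the Cloud ML Engine job.
ls dist/object_detection-0.1.tar.gz slim/dist/slim-0.1.tar.gz
```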
@@ -69,7 +69,7 @@ been written, a user can start a training job on Cloud ML Engine using the
following command:
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
gcloud ml-engine jobs submit training object_detection_`date +%s` \
--job-dir=gs://${TRAIN_DIR} \
--packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \
......
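Once submitted, the job can be monitored from the command line; a hedged sketch, assuming `${JOB_NAME}` stands in for the job name generated by the submit command above:

``` bash
# Inspect job state and stream its logs as it runs.
gcloud ml-engine jobs describe ${JOB_NAME}
gcloud ml-engine jobs stream-logs ${JOB_NAME}
```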
@@ -51,18 +51,19 @@ dataset for Oxford-IIIT Pets lives
[here](http://www.robots.ox.ac.uk/~vgg/data/pets/). You will need to download
both the image dataset [`images.tar.gz`](http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz)
and the groundtruth data [`annotations.tar.gz`](http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz)
-to the `tensorflow/models` directory and unzip them. This may take some time.
+to the `tensorflow/models/research/` directory and unzip them. This may take
+some time.
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/images.tar.gz
wget http://www.robots.ox.ac.uk/~vgg/data/pets/data/annotations.tar.gz
tar -xvf images.tar.gz
tar -xvf annotations.tar.gz
```
-After downloading the tarballs, your `tensorflow/models` directory should appear
-as follows:
+After downloading the tarballs, your `tensorflow/models/research/` directory
+should appear as follows:
```lang-none
- images.tar.gz
@@ -76,10 +77,10 @@ as follows:
The Tensorflow Object Detection API expects data to be in the TFRecord format,
so we'll now run the `create_pet_tf_record` script to convert from the raw
Oxford-IIIT Pet dataset into TFRecords. Run the following commands from the
-`tensorflow/models` directory:
+`tensorflow/models/research/` directory:
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
python object_detection/create_pet_tf_record.py \
--label_map_path=object_detection/data/pet_label_map.pbtxt \
--data_dir=`pwd` \
@@ -90,14 +91,14 @@ Note: It is normal to see some warnings when running this script. You may ignore
them.
Two TFRecord files named `pet_train.record` and `pet_val.record` should be
-generated in the `tensorflow/models` directory.
+generated in the `tensorflow/models/research/` directory.
Now that the data has been generated, we'll need to upload it to Google Cloud
Storage so the data can be accessed by ML Engine. Run the following command to
copy the files into your GCS bucket (substituting `${YOUR_GCS_BUCKET}`):
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
gsutil cp pet_train.record gs://${YOUR_GCS_BUCKET}/data/pet_train.record
gsutil cp pet_val.record gs://${YOUR_GCS_BUCKET}/data/pet_val.record
gsutil cp object_detection/data/pet_label_map.pbtxt gs://${YOUR_GCS_BUCKET}/data/pet_label_map.pbtxt
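# Hedged sanity check: confirm all three files landed in the bucket.
gsutil ls gs://${YOUR_GCS_BUCKET}/data/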
@@ -145,7 +146,7 @@ upload your edited file onto GCS, making note of the path it was uploaded to
(we'll need it when starting the training/eval jobs).
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
# Edit the faster_rcnn_resnet101_pets.config template. Please note that there
# are multiple places where PATH_TO_BE_CONFIGURED needs to be set.
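# A hedged, non-interactive way to do the substitution (assumptions: GNU sed,
# and that the template lives at the sample-configs path below):
sed -i "s|PATH_TO_BE_CONFIGURED|gs://${YOUR_GCS_BUCKET}/data|g" \
  object_detection/samples/configs/faster_rcnn_resnet101_pets.config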
@@ -187,10 +188,10 @@ Before we can start a job on Google Cloud ML Engine, we must:
2. Write a cluster configuration for our Google Cloud ML job.
To package the Tensorflow Object Detection code, run the following commands from
-the `tensorflow/models/` directory:
+the `tensorflow/models/research/` directory:
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
python setup.py sdist
(cd slim && python setup.py sdist)
```
@@ -202,11 +203,11 @@ For running the training Cloud ML job, we'll configure the cluster to use 10
training jobs (1 master + 9 workers) and three parameter servers. The
configuration file can be found at `object_detection/samples/cloud/cloud.yml`.
-To start training, execute the following command from the `tensorflow/models/`
-directory:
+To start training, execute the following command from the
+`tensorflow/models/research/` directory:
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
gcloud ml-engine jobs submit training `whoami`_object_detection_`date +%s` \
--job-dir=gs://${YOUR_GCS_BUCKET}/train \
--packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \
@@ -221,7 +222,7 @@ gcloud ml-engine jobs submit training `whoami`_object_detection_`date +%s` \
Once training has started, we can run an evaluation concurrently:
``` bash
-# From tensorflow/models/
+# From tensorflow/models/research/
gcloud ml-engine jobs submit training `whoami`_object_detection_eval_`date +%s` \
--job-dir=gs://${YOUR_GCS_BUCKET}/train \
--packages dist/object_detection-0.1.tar.gz,slim/dist/slim-0.1.tar.gz \
@@ -288,10 +289,10 @@ three files:
* `model.ckpt-${CHECKPOINT_NUMBER}.meta`
After you've identified a candidate checkpoint to export, run the following
-command from `tensorflow/models`:
+command from `tensorflow/models/research/`:
``` bash
-# From tensorflow/models
+# From tensorflow/models/research/
gsutil cp gs://${YOUR_GCS_BUCKET}/train/model.ckpt-${CHECKPOINT_NUMBER}.* .
python object_detection/export_inference_graph.py \
--input_type image_tensor \
......