ModelZoo / ResNet50_tensorflow · Commits · fdc4ce37

Commit fdc4ce37, authored Sep 12, 2016 by Xin Pan (parent 2a5a5596)

Fix README

Showing 1 changed file with 33 additions and 30 deletions:

lm_1b/README.md (+33, -30)
@@ -79,25 +79,25 @@ Pre-requesite:
 * Install Bazel.
 * Download the data files:
   * Model GraphDef file:
-    [link](download.tensorflow.org/models/LM_LSTM_CNN/graph-2016-09-10.pbtxt)
+    [link](http://download.tensorflow.org/models/LM_LSTM_CNN/graph-2016-09-10.pbtxt)
   * Model Checkpoint sharded file:
-    [1](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-base)
-    [2](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-char-embedding)
-    [3](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-lstm)
-    [4](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax0)
-    [5](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax1)
-    [6](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax2)
-    [7](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax3)
-    [8](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax4)
-    [9](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax5)
-    [10](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax6)
-    [11](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax7)
-    [12](download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax8)
+    [1](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-base)
+    [2](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-char-embedding)
+    [3](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-lstm)
+    [4](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax0)
+    [5](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax1)
+    [6](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax2)
+    [7](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax3)
+    [8](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax4)
+    [9](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax5)
+    [10](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax6)
+    [11](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax7)
+    [12](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax8)
   * Vocabulary file:
-    [link](download.tensorflow.org/models/LM_LSTM_CNN/vocab-2016-09-10.txt)
+    [link](http://download.tensorflow.org/models/LM_LSTM_CNN/vocab-2016-09-10.txt)
   * test dataset:
-    [link](download.tensorflow.org/models/LM_LSTM_CNN/test/news.en.heldout-00000-of-00050)
+    [link](http://download.tensorflow.org/models/LM_LSTM_CNN/test/news.en.heldout-00000-of-00050)
-* It is recommended to run on modern desktop PC instead of laptop.
+* It is recommended to run on modern desktop instead of laptop.
 
 ```shell
 # 1. Clone the code to your workspace.
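The thirteen checkpoint shards listed above share one URL prefix, so the whole download list can be generated in a loop instead of clicking link by link. A convenience sketch: the URLs are exactly those in the README, but `urls.txt`, the `data/` target directory, and the use of `wget` are assumptions, not part of the original instructions.

```shell
# Build the full download list from the URL layout above.
BASE=http://download.tensorflow.org/models/LM_LSTM_CNN
{
  echo "$BASE/graph-2016-09-10.pbtxt"
  echo "$BASE/vocab-2016-09-10.txt"
  echo "$BASE/test/news.en.heldout-00000-of-00050"
  for shard in base char-embedding lstm softmax0 softmax1 softmax2 softmax3 \
               softmax4 softmax5 softmax6 softmax7 softmax8; do
    echo "$BASE/all_shards-2016-09-10/ckpt-$shard"
  done
} > urls.txt
wc -l < urls.txt        # 15 files in total
# Then fetch everything into data/:
#   wget -P data/ -i urls.txt
```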
@@ -110,10 +110,13 @@ ls -R
 data  lm_1b  output  WORKSPACE
 
 ./data:
-ckpt  eval_2_8k_1k_1_1_char.pbtxt  news.en.heldout-00000-of-00050  vocab.txt
+ckpt-base            ckpt-lstm      ckpt-softmax1  ckpt-softmax3  ckpt-softmax5
+ckpt-softmax7        graph-2016-09-10.pbtxt        vocab-2016-09-10.txt
+ckpt-char-embedding  ckpt-softmax0  ckpt-softmax2  ckpt-softmax4  ckpt-softmax6
+ckpt-softmax8        news.en.heldout-00000-of-00050
 
 ./lm_1b:
-BUILD  data_utils.py  data_utils.pyc  lm_1b_eval.py  README.md
+BUILD  data_utils.py  lm_1b_eval.py  README.md
 
 ./output:
@@ -122,9 +125,9 @@ bazel build -c opt lm_1b/...
 # Run sample mode:
 bazel-bin/lm_1b/lm_1b_eval --mode sample \
                            --prefix "I love that I" \
-                           --pbtxt data/eval_2_8k_1k_1_1_char.pbtxt \
-                           --vocab_file data/vocab.txt \
-                           --ckpt data/ckpt
+                           --pbtxt data/graph-2016-09-10.pbtxt \
+                           --vocab_file data/vocab-2016-09-10.txt \
+                           --ckpt 'data/ckpt-*'
 ...(omitted some TensorFlow output)...
 I love
 I love that
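One detail worth noting in the new commands: `--ckpt` now takes the quoted pattern `'data/ckpt-*'` rather than a single path, and the quotes matter. Unquoted, the shell expands the glob into many arguments before the flag parser ever sees it; quoted, the literal pattern reaches the program so the checkpoint loader can match the shards itself. A minimal demonstration (the `demo/` directory and stand-in files are hypothetical):

```shell
# Stand-in files named like the checkpoint shards.
mkdir -p demo && touch demo/ckpt-base demo/ckpt-lstm
# Unquoted: the shell expands the glob into two separate arguments.
printf '%s\n' demo/ckpt-* | wc -l    # 2
# Quoted: one literal argument containing the pattern is passed through.
printf '%s\n' 'demo/ckpt-*' | wc -l  # 1
```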
@@ -136,10 +139,10 @@ I love that I find that amazing
 # Run eval mode:
 bazel-bin/lm_1b/lm_1b_eval --mode eval \
-                           --pbtxt data/eval_2_8k_1k_1_1_char.pbtxt \
-                           --vocab_file data/vocab.txt \
+                           --pbtxt data/graph-2016-09-10.pbtxt \
+                           --vocab_file data/vocab-2016-09-10.txt \
                            --input_data data/news.en.heldout-00000-of-00050 \
-                           --ckpt data/ckpt
+                           --ckpt 'data/ckpt-*'
 ...(omitted some TensorFlow output)...
 Loaded step 14108582.
 # perplexity is high initially because words without context are harder to
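The average perplexity reported by eval mode is the exponential of the running mean negative log-probability per word, which is why it starts high and decays as more context accumulates. A toy computation of that formula (the four log-probabilities are made up for illustration, not model output):

```shell
# Perplexity = exp(-mean(log p)), over per-word log-probabilities.
printf '%s\n' -3.2 -1.1 -0.7 -2.4 |
  awk '{ s += $1; n++ } END { printf "%.6f\n", exp(-s / n) }'   # 6.359820
```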
@@ -164,9 +167,9 @@ Eval Step: 4531, Average Perplexity: 29.285674.
 # Run dump_emb mode:
 bazel-bin/lm_1b/lm_1b_eval --mode dump_emb \
-                           --pbtxt data/eval_2_8k_1k_1_1_char.pbtxt \
-                           --vocab_file data/vocab.txt \
-                           --ckpt data/ckpt \
+                           --pbtxt data/graph-2016-09-10.pbtxt \
+                           --vocab_file data/vocab-2016-09-10.txt \
+                           --ckpt 'data/ckpt-*' \
                            --save_dir output
 ...(omitted some TensorFlow output)...
 Finished softmax weights
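`dump_emb` writes standard NumPy `.npy` files into `--save_dir` (the README names `embeddings_softmax.npy`). Every `.npy` file begins with the magic byte `0x93` followed by the string `NUMPY`, which gives a cheap sanity check after a long dump. A sketch: the `sample.npy` written here is a stand-in containing only the magic bytes, not real model output.

```shell
mkdir -p output
# Stand-in file: just the .npy magic bytes (\223 octal = 0x93).
printf '\223NUMPY' > output/sample.npy
# Check bytes 2-6 of each dumped array against the expected magic string.
for f in output/*.npy; do
  if [ "$(head -c 6 "$f" | tail -c 5)" = NUMPY ]; then
    echo "ok: $f"
  else
    echo "bad: $f"
  fi
done
```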
@@ -179,9 +182,9 @@ embeddings_softmax.npy ...
 # Run dump_lstm_emb mode:
 bazel-bin/lm_1b/lm_1b_eval --mode dump_lstm_emb \
-                           --pbtxt data/eval_2_8k_1k_1_1_char.pbtxt \
-                           --vocab_file data/vocab.txt \
-                           --ckpt data/ckpt \
+                           --pbtxt data/graph-2016-09-10.pbtxt \
+                           --vocab_file data/vocab-2016-09-10.txt \
+                           --ckpt 'data/ckpt-*' \
                            --sentence "I love who I am ." \
                            --save_dir output
 ls output/
 ...