ModelZoo / ResNet50_tensorflow · Commits

Commit 983b7d08
Authored May 11, 2017 by Evan Kepner; committed by Evan Kepner, May 11, 2017
Parent: 692b7526

    correct authorship

Showing 1 changed file with 15 additions and 5 deletions:

tutorials/rnn/ptb/ptb_word_lm.py (+15, -5)
@@ -162,11 +162,21 @@ class PTBModel(object):
         "softmax_w", [size, vocab_size], dtype=data_type())
     softmax_b = tf.get_variable("softmax_b", [vocab_size], dtype=data_type())
     logits = tf.matmul(output, softmax_w) + softmax_b
-    loss = tf.contrib.legacy_seq2seq.sequence_loss_by_example(
-        [logits],
-        [tf.reshape(input_.targets, [-1])],
-        [tf.ones([batch_size * num_steps], dtype=data_type())])
-    self._cost = cost = tf.reduce_sum(loss) / batch_size
+
+    # Reshape logits to be 3-D tensor for sequence loss
+    logits = tf.reshape(logits, [batch_size, num_steps, vocab_size])
+
+    # use the contrib sequence loss and average over the batches
+    loss = tf.contrib.seq2seq.sequence_loss(
+        logits,
+        input_.targets,
+        tf.ones([batch_size, num_steps], dtype=data_type()),
+        average_across_timesteps=False,
+        average_across_batch=True)
+
+    # update the cost variables
+    self._cost = cost = tf.reduce_sum(loss)
+
     self._final_state = state
 
     if not is_training:
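
For context on what the new code path does: tf.contrib.seq2seq.sequence_loss consumes the 3-D [batch_size, num_steps, vocab_size] logits directly, instead of the flattened [batch_size * num_steps] shapes that legacy_seq2seq.sequence_loss_by_example required. With average_across_timesteps=False and average_across_batch=True it returns one batch-averaged loss per timestep, so tf.reduce_sum(loss) plays the same role as the old tf.reduce_sum(loss) / batch_size. Below is a minimal, self-contained sketch of that call pattern; it assumes TensorFlow 1.x with tf.contrib available, and the toy shapes and random data are illustrative, not the tutorial's configuration.

    # Illustrative sketch only: assumes TensorFlow 1.x with tf.contrib available.
    import numpy as np
    import tensorflow as tf

    batch_size, num_steps, vocab_size = 2, 3, 5  # toy sizes, not the tutorial's config

    # Random 3-D logits [batch, time, vocab] and integer targets [batch, time]
    logits = tf.constant(
        np.random.rand(batch_size, num_steps, vocab_size), dtype=tf.float32)
    targets = tf.constant(
        np.random.randint(0, vocab_size, size=(batch_size, num_steps)), dtype=tf.int32)

    # Same call pattern as the new code: keep per-timestep losses,
    # average only across the batch dimension.
    loss = tf.contrib.seq2seq.sequence_loss(
        logits,
        targets,
        tf.ones([batch_size, num_steps], dtype=tf.float32),
        average_across_timesteps=False,
        average_across_batch=True)  # shape: [num_steps]

    cost = tf.reduce_sum(loss)  # scalar, analogous to self._cost in the tutorial

    with tf.Session() as sess:
        per_step_loss, total_cost = sess.run([loss, cost])
        print(per_step_loss, total_cost)

Summing the per-timestep, batch-averaged losses gives the same quantity as summing all per-token losses and dividing by batch_size (with uniform weights), which is why the new cost drops the explicit "/ batch_size".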