ModelZoo / ResNet50_tensorflow · Commits

Commit 596c9e23
Authored Feb 13, 2017 by Neal Wu; committed by GitHub on Feb 13, 2017

Merge pull request #911 from wookayin/cifar10

Fix bugs and API usage on cifar10 and cifar10_multi_gpu_train

Parents: eb62b917, 9d96e9fe
Showing 1 changed file with 20 additions and 19 deletions.

tutorials/image/cifar10/cifar10_multi_gpu_train.py (+20, -19)
```diff
@@ -162,25 +162,26 @@ def train():

     # Calculate the gradients for each model tower.
     tower_grads = []
-    for i in xrange(FLAGS.num_gpus):
-      with tf.device('/gpu:%d' % i):
-        with tf.name_scope('%s_%d' % (cifar10.TOWER_NAME, i)) as scope:
-          # Calculate the loss for one tower of the CIFAR model. This function
-          # constructs the entire CIFAR model but shares the variables across
-          # all towers.
-          loss = tower_loss(scope)
-
-          # Reuse variables for the next tower.
-          tf.get_variable_scope().reuse_variables()
-
-          # Retain the summaries from the final tower.
-          summaries = tf.get_collection(tf.GraphKeys.SUMMARIES, scope)
-
-          # Calculate the gradients for the batch of data on this CIFAR tower.
-          grads = opt.compute_gradients(loss)
-
-          # Keep track of the gradients across all towers.
-          tower_grads.append(grads)
+    with tf.variable_scope(tf.get_variable_scope()):
+      for i in xrange(FLAGS.num_gpus):
+        with tf.device('/gpu:%d' % i):
+          with tf.name_scope('%s_%d' % (cifar10.TOWER_NAME, i)) as scope:
+            # Calculate the loss for one tower of the CIFAR model. This function
+            # constructs the entire CIFAR model but shares the variables across
+            # all towers.
+            loss = tower_loss(scope)
+
+            # Reuse variables for the next tower.
+            tf.get_variable_scope().reuse_variables()
+
+            # Retain the summaries from the final tower.
+            summaries = tf.get_collection(tf.GraphKeys.SUMMARIES, scope)
+
+            # Calculate the gradients for the batch of data on this CIFAR tower.
+            grads = opt.compute_gradients(loss)
+
+            # Keep track of the gradients across all towers.
+            tower_grads.append(grads)

     # We must calculate the mean of each gradient. Note that this is the
     # synchronization point across all towers.
```
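The substance of the change: the tower loop is now wrapped in `tf.variable_scope(tf.get_variable_scope())`, so the `reuse_variables()` call inside the loop only flips the reuse flag for the duration of that block instead of permanently marking the root scope as reusing. Below is a minimal sketch of that pattern in isolation, assuming TensorFlow 1.x; `build_tower` is a hypothetical stand-in for the tutorial's `tower_loss`, and device placement is omitted so the snippet runs on CPU-only machines.

```python
import tensorflow as tf

def build_tower(x):
  # Hypothetical stand-in for tower_loss(): tf.get_variable creates 'w'
  # on the first call and returns the existing 'w' once reuse is active.
  w = tf.get_variable('w', shape=[4, 1])
  return tf.reduce_sum(tf.matmul(x, w))

x = tf.placeholder(tf.float32, shape=[None, 4])
tower_losses = []

# Re-entering the current variable scope confines reuse_variables() to
# this block; the reuse flag reverts when the `with` exits.
with tf.variable_scope(tf.get_variable_scope()):
  for i in range(2):  # two "towers"; the real code loops over FLAGS.num_gpus
    # name_scope only affects op names, not variable names, which is why
    # every tower sees the same variable 'w'.
    with tf.name_scope('tower_%d' % i):
      tower_losses.append(build_tower(x))
      # Every tower after the first must reuse 'w' rather than redefine it.
      tf.get_variable_scope().reuse_variables()

# Back outside, new variables can still be created without a reuse error.
step = tf.get_variable('step', shape=[], trainable=False,
                       initializer=tf.zeros_initializer())
```

Without the wrapping scope, a `tf.get_variable` call made after the loop (optimizer slot variables, a global step, and so on) would presumably hit a reuse error under later TensorFlow releases, which matches the "fix bugs and API usage" framing of the commit message.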