ModelZoo / ResNet50_tensorflow / Commits / f006521b

Commit f006521b authored May 05, 2021 by Le Hou, committed by A. Unique TensorFlower on May 05, 2021.

Internal change

PiperOrigin-RevId: 372242256

parent e7c57743
Changes: 1 changed file with 7 additions and 1 deletion.

official/core/base_trainer.py (+7, -1)
official/core/base_trainer.py @ f006521b

@@ -370,7 +370,13 @@ class Trainer(_AsyncTrainer):
       logs[metric.name] = metric.result()
       metric.reset_states()
     if callable(self.optimizer.learning_rate):
-      logs["learning_rate"] = self.optimizer.learning_rate(self.global_step)
+      # Maybe a self-implemented optimizer does not have `optimizer.iterations`.
+      # So just to be safe here.
+      if hasattr(self.optimizer, "iterations"):
+        logs["learning_rate"] = self.optimizer.learning_rate(
+            self.optimizer.iterations)
+      else:
+        logs["learning_rate"] = self.optimizer.learning_rate(self.global_step)
     else:
       logs["learning_rate"] = self.optimizer.learning_rate
     return logs