ModelZoo / ResNet50_tensorflow

Commit ff52d0b3
Authored Feb 26, 2021 by Le Hou
Committed by A. Unique TensorFlower on Feb 26, 2021
Internal change
PiperOrigin-RevId: 359821383
Parent: 35979d3b
Changes: 1

Showing 1 changed file with 3 additions and 1 deletion.
official/modeling/optimization/lr_schedule.py  (+3, -1)  view file @ ff52d0b3

@@ -177,7 +177,9 @@ class DirectPowerDecay(tf.keras.optimizers.schedules.LearningRateSchedule):
     with tf.name_scope(self._name or "DirectPowerDecay"):
       step = tf.cast(step, tf.float32)
       learning_rate = self._initial_learning_rate
-      learning_rate *= tf.math.pow(step, self._power)
+      # A zero `step` may cause Inf. So make `step` positive.
+      step_non_zero = tf.math.maximum(step, 1.0)
+      learning_rate *= tf.math.pow(step_non_zero, self._power)
       return learning_rate
 
   def get_config(self):
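For context, the change guards DirectPowerDecay against a zero global step: with a negative power, tf.math.pow(0.0, self._power) evaluates to Inf, so the step is clamped to at least 1.0 before exponentiation. Below is a minimal, self-contained sketch of the patched schedule. Only __call__ mirrors the diff; the constructor and get_config shown here are assumptions for illustration, not the file's actual signatures.

import tensorflow as tf


class DirectPowerDecay(tf.keras.optimizers.schedules.LearningRateSchedule):
  """Sketch of the patched schedule: lr = initial_lr * step ** power."""

  def __init__(self, initial_learning_rate, power, name=None):
    # Hypothetical constructor; the real signature is not part of this diff.
    super().__init__()
    self._initial_learning_rate = initial_learning_rate
    self._power = power
    self._name = name

  def __call__(self, step):
    # Mirrors the patched __call__ from the diff above.
    with tf.name_scope(self._name or "DirectPowerDecay"):
      step = tf.cast(step, tf.float32)
      learning_rate = self._initial_learning_rate
      # A zero `step` may cause Inf. So make `step` positive.
      step_non_zero = tf.math.maximum(step, 1.0)
      learning_rate *= tf.math.pow(step_non_zero, self._power)
      return learning_rate

  def get_config(self):
    # Illustrative only; the real get_config is elided in the diff.
    return {
        "initial_learning_rate": self._initial_learning_rate,
        "power": self._power,
        "name": self._name,
    }


schedule = DirectPowerDecay(initial_learning_rate=1.0, power=-0.5)
# Before the patch, step 0 gave 1.0 * 0.0 ** -0.5 == inf; now step 0 is
# clamped to 1.0, so the schedule starts at the initial learning rate.
print(float(schedule(0)))    # 1.0
print(float(schedule(100)))  # 1.0 * 100.0 ** -0.5 == 0.1

A schedule like this can be passed straight to an optimizer, e.g. tf.keras.optimizers.SGD(learning_rate=schedule), which calls it with the current step on every update.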