ModelZoo / ResNet50_tensorflow

Commit ee708859, authored Jun 07, 2022 by A. Unique TensorFlower

Internal change

PiperOrigin-RevId: 453487987

Parent: 889dc12a
Showing 1 changed file with 4 additions and 2 deletions.

official/nlp/optimization.py  (+4, -2)
official/nlp/optimization.py

...
@@ -71,13 +71,15 @@ def create_optimizer(init_lr,
                      num_warmup_steps,
                      end_lr=0.0,
                      optimizer_type='adamw',
-                     beta_1=0.9):
+                     beta_1=0.9,
+                     poly_power=1.0):
   """Creates an optimizer with learning rate schedule."""
   # Implements linear decay of the learning rate.
   lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
       initial_learning_rate=init_lr,
       decay_steps=num_train_steps,
-      end_learning_rate=end_lr)
+      end_learning_rate=end_lr,
+      power=poly_power)
   if num_warmup_steps:
     lr_schedule = WarmUp(
         initial_learning_rate=init_lr,
...
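For context on what the commit changes, here is a minimal standalone sketch (not part of the commit; the learning rate and step counts below are made-up values) of the schedule that the patched create_optimizer builds. tf.keras.optimizers.schedules.PolynomialDecay computes lr(step) = (init_lr - end_lr) * (1 - step / decay_steps)^power + end_lr, so the new poly_power argument is simply forwarded as the exponent, and the default poly_power=1.0 reproduces the previous linear decay.

import tensorflow as tf

# Standalone sketch of the schedule the patched create_optimizer constructs.
# The numeric values are illustrative assumptions, not taken from the commit.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-5,   # init_lr
    decay_steps=10000,            # num_train_steps
    end_learning_rate=0.0,        # end_lr
    power=1.0)                    # poly_power; 1.0 keeps the old linear decay

# Learning rate at a few training steps.
for step in [0, 2500, 5000, 10000]:
  print(step, float(lr_schedule(step)))

Because poly_power defaults to 1.0, existing callers of create_optimizer see no change in behavior; only callers that explicitly pass a different exponent get a non-linear decay curve.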