chenpangpang / transformers · Commits · b48cf712
"git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "0039b965db7e2d077363583ae95e1487a81ee0af"
Unverified commit b48cf712, authored Apr 23, 2021 by Patrick von Platen; committed by GitHub on Apr 23, 2021.

correct typo (#11393)

Parent: 8c9b5fcb
Changes: 1 changed file, with 1 addition and 1 deletion.

examples/flax/language-modeling/run_mlm_flax.py (+1, -1)
examples/flax/language-modeling/run_mlm_flax.py @ b48cf712

@@ -590,7 +590,7 @@ if __name__ == "__main__":
     # Create learning rate scheduler
     # warmup_steps = 0 causes the Flax optimizer to return NaNs; warmup_steps = 1 is functionally equivalent.
     lr_scheduler_fn = create_learning_rate_scheduler(
-        base_learning_rate=training_args.learning_rate, warmup_steps=min(training_args.warmup_steps, 1)
+        base_learning_rate=training_args.learning_rate, warmup_steps=max(training_args.warmup_steps, 1)
     )
     # Create parallel version of the training and evaluation steps
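The one-character change from `min` to `max` matters because `min(training_args.warmup_steps, 1)` clamps every warmup setting *down* to at most 1, and leaves `warmup_steps == 0` untouched, while the comment says a warmup of 0 makes the Flax optimizer produce NaNs. `max(training_args.warmup_steps, 1)` instead guarantees at least one warmup step. The sketch below is a hypothetical linear-warmup factor (not the actual `create_learning_rate_scheduler` from the example script) that mimics how a zero-step warmup can poison training with NaNs under IEEE float semantics, and shows the effect of the `max(..., 1)` guard:

```python
import math

def linear_warmup(step: int, warmup_steps: int) -> float:
    """Hypothetical linear-warmup factor ramping 0 -> 1 over warmup_steps.

    In JAX, a division like step / warmup_steps with warmup_steps == 0
    evaluates to inf or nan instead of raising, which is how NaNs can
    leak into the learning rate. Plain Python would raise
    ZeroDivisionError, so we mimic the 0.0 / 0.0 float result here.
    """
    if warmup_steps == 0:
        return math.nan  # mimics 0.0 / 0.0 under IEEE float arithmetic
    return min(step / warmup_steps, 1.0)

# Unguarded: warmup_steps=0 yields NaN, which would poison every
# subsequent learning-rate computation.
assert math.isnan(linear_warmup(0, 0))

# The commit's guard: clamp to at least one warmup step. One step is
# functionally equivalent to no warmup, but stays NaN-free.
warmup_steps = max(0, 1)  # user passed warmup_steps=0
assert linear_warmup(1, warmup_steps) == 1.0
```

With the pre-fix `min(0, 1)` the result would still be 0 and the NaN problem would remain; `max` is the clamp the comment already promised.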