chenpangpang/transformers · commit d2f9cb83
"git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "9d37c56bab8f7f1f1aa0b65be039516072254e77"
Unverified commit d2f9cb83, authored Aug 31, 2020 by Sylvain Gugger, committed by GitHub on Aug 31, 2020

Fix in Adafactor docstrings (#6845)

parent 2de7ee03
Changes: 1 changed file with 1 addition and 1 deletion (+1, -1)

src/transformers/optimization.py (view file @ d2f9cb83)
@@ -346,7 +346,7 @@ class Adafactor(Optimizer):
             If True, learning rate is scaled by root mean square
         relative_step (:obj:`bool`, `optional`, defaults to :obj:`True`):
             If True, time-dependent learning rate is computed instead of external learning rate
-        warmup_init (:obj:`bool`, `optional`, defaults to False):
+        warmup_init (:obj:`bool`, `optional`, defaults to :obj:`False`):
             Time-dependent learning rate computation depends on whether warm-up initialization is being used

     This implementation handles low-precision (FP16, bfloat) values, but we have not thoroughly tested.
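The corrected docstring describes how Adafactor's learning-rate options interact. Below is a minimal sketch of how the three documented parameters fit together, assuming a transformers version that includes Adafactor (it was in the repository at the time of this commit); the torch.nn.Linear model is only a placeholder, and the keyword values shown are the documented defaults.

# Minimal sketch; the model and data are placeholders for illustration.
import torch
from transformers.optimization import Adafactor

model = torch.nn.Linear(10, 2)  # stand-in model

optimizer = Adafactor(
    model.parameters(),
    lr=None,                # must stay None when relative_step=True
    scale_parameter=True,   # scale the lr by the parameters' root mean square
    relative_step=True,     # compute a time-dependent lr instead of an external lr
    warmup_init=False,      # the default this commit's docstring fix documents
)

# One illustrative optimization step.
loss = model(torch.randn(4, 10)).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()

Note that warmup_init=True is only valid together with relative_step=True, which is why the docstring ties the time-dependent learning-rate computation to warm-up initialization.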