OpenDAS / fairscale · Commits

Unverified commit b5ccedc0, authored Oct 30, 2020 by Min Xu, committed by GitHub on Oct 30, 2020

add warning to adascale before it is validated (#169)
parent 4247f602
Showing 2 changed files with 6 additions and 0 deletions
docs/source/tutorials/adascale.rst   +3 -0
fairscale/optim/adascale.py          +3 -0
docs/source/tutorials/adascale.rst (view file @ b5ccedc0)
AdaScale SGD
============

Note, AdaScale is still experimental. It is being validated. APIs may change in the future. Use at your own risk.

`AdaScale <https://arxiv.org/pdf/2007.05105.pdf>`_ adaptively scales the learning rate when using larger batch sizes for data-parallel training.

Let's suppose that your trainer looks like

.. code-block:: python

...
...
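The tutorial's code samples are collapsed in this view. Purely for orientation, here is a minimal sketch of the wrapping pattern the tutorial describes, assuming only the constructor visible in the diff below (an ``optimizer`` argument plus a ``patch_optimizer`` flag); the model, data, and loop are hypothetical, and a real run would happen inside a ``torch.distributed`` data-parallel job rather than a single process.

.. code-block:: python

    # Illustrative sketch only -- not part of this commit. The AdaScale
    # constructor arguments mirror those shown in the diff below; the
    # model, data, and training loop are made-up placeholders.
    import torch
    import torch.nn.functional as F
    from torch.optim import SGD
    from fairscale.optim import AdaScale

    model = torch.nn.Linear(10, 2)
    base_optimizer = SGD(model.parameters(), lr=0.1)

    # Wrap the base optimizer so AdaScale can rescale the effective
    # learning rate as the data-parallel batch size grows.
    optimizer = AdaScale(base_optimizer, patch_optimizer=True)

    for _ in range(10):
        inputs = torch.randn(8, 10)
        targets = torch.randint(0, 2, (8,))
        base_optimizer.zero_grad()
        loss = F.cross_entropy(model(inputs), targets)
        loss.backward()
        # With patch_optimizer=True the wrapped optimizer's step() is
        # routed through AdaScale (see `self._optimizer_step` in the diff).
        base_optimizer.step()

The upstream tutorial may integrate AdaScale differently (for example via the wrapper's own stepping interface); treat this as a sketch of the API shape, not the documented usage.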
fairscale/optim/adascale.py (view file @ b5ccedc0)
...
...
@@ -32,6 +32,7 @@
 # POSSIBILITY OF SUCH DAMAGE.

 import functools
+import logging
 from typing import Any, Dict, Optional

 import numpy as np
...
...
@@ -79,6 +80,8 @@ class AdaScale(object):
         smoothing: float = 0.999,
         patch_optimizer: bool = False,
     ):
+        logging.warn("AdaScale is experimental. APIs may change. Use at your own risk.")
+
         self._optimizer = optimizer
         self._optimizer_step = optimizer.step
         self._local_grad_sqr: Optional[torch.Tensor] = None
...
...
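One practical consequence of the change: the warning goes through the Python root logger (``logging.warn`` is a long-standing alias of ``logging.warning``), so downstream code that has already acknowledged the experimental status can manage it with standard ``logging`` configuration. A minimal sketch:

.. code-block:: python

    import logging

    # The commit logs the warning via the root logger, so the usual
    # logging controls apply. Raising the root logger's threshold above
    # WARNING (e.g. to ERROR) keeps the message out of the output.
    logging.getLogger().setLevel(logging.ERROR)

    # ... construct AdaScale(...) afterwards; the experimental-API
    # warning is filtered out by the level check above.

Because the call targets the root logger rather than a module-level logger, there is no per-module logger to filter on; the blanket level change above is the blunt but simple option.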