OpenDAS / fairscale · Commits · 8e85ce8c

Commit 8e85ce8c (unverified)
Authored Nov 25, 2020 by Benjamin Lefaudeux; committed via GitHub on Nov 25, 2020.

[fix] Adding a GradScaler import guard for amp with pytorch 1.5 (#210)

Parent: 7a062894
Changes: 1 changed file, 8 additions and 3 deletions

fairscale/optim/__init__.py (+8, -3)
@@ -6,11 +6,16 @@
 """
 :mod:`fairscale.optim` is a package implementing various torch optimization algorithms.
 """
+import logging
+
+from .adascale import AdaScale
+from .oss import OSS
 
 try:
     from .adam import Adam, Precision
 except ImportError:  # pragma: no cover
     pass  # pragma: no cover
-from .adascale import AdaScale
-from .grad_scaler import GradScaler
-from .oss import OSS
+try:
+    from .grad_scaler import GradScaler
+except ImportError:
+    logging.warning("Torch AMP is not available on this platform")
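The pattern in this commit — wrapping an import in `try`/`except ImportError` so a package stays importable when an optional dependency (here, AMP support absent in PyTorch 1.5) is missing — can be sketched generically. The `optional_import` helper below is hypothetical, not part of fairscale; it just demonstrates the same guard without requiring torch to be installed.

```python
import logging


def optional_import(module_name, attr):
    """Return `attr` from `module_name`, or None when the import fails.

    Mirrors the guard added in this commit: catch ImportError at
    package-import time and log a warning instead of crashing.
    """
    try:
        module = __import__(module_name, fromlist=[attr])
        return getattr(module, attr)
    except ImportError:
        logging.warning("%s is not available on this platform", module_name)
        return None


# math.sqrt exists on every platform; the second module does not,
# so the guard falls through to the warning and returns None.
sqrt = optional_import("math", "sqrt")
missing = optional_import("definitely_not_installed_xyz", "GradScaler")
```

Callers can then check the guarded name for `None` (or a `HAS_AMP`-style flag) before using it, which is why the commit logs a warning rather than raising.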