OpenDAS / apex, commit b9e5d37d
Authored Aug 27, 2019 by Michael Carilli

Docstring updates

Parent: 17e8a552
Showing 4 changed files with 4 additions and 4 deletions:

apex/optimizers/fused_adam.py      +1 -1
apex/optimizers/fused_lamb.py      +1 -1
apex/optimizers/fused_novograd.py  +1 -1
apex/optimizers/fused_sgd.py       +1 -1
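The substantive change, repeated across all four files, is the switch from single to double backticks around opt_level. In reStructuredText, which Sphinx uses to render these docstrings, single backticks mark interpreted text (typically rendered as italics), while double backticks mark an inline literal rendered as code, so the update makes opt_level display as a code literal in the generated documentation.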
apex/optimizers/fused_adam.py
@@ -21,7 +21,7 @@ class FusedAdam(torch.optim.Optimizer):
         opt.step()
 
     :class:`apex.optimizers.FusedAdam` may be used with or without Amp. If you wish to use :class:`FusedAdam` with Amp,
-    you may choose any `opt_level`::
+    you may choose any ``opt_level``::
 
         opt = apex.optimizers.FusedAdam(model.parameters(), lr = ....)
         model, opt = amp.initialize(model, opt, opt_level="O0" or "O1 or "O2")
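The docstring's snippet is abbreviated; below is a minimal, self-contained sketch of the full Amp training pattern it alludes to. The model, tensors, and learning rate are illustrative placeholders, not part of the commit, and the sketch assumes apex was installed with its CUDA extensions.

# A minimal sketch of the Amp + FusedAdam usage the docstring describes.
# The model, data, and lr value are placeholders, not part of the commit.
import torch
from apex import amp
from apex.optimizers import FusedAdam

model = torch.nn.Linear(128, 10).cuda()
opt = FusedAdam(model.parameters(), lr=1e-3)

# Per the updated docstring, any opt_level ("O0", "O1", or "O2") works.
model, opt = amp.initialize(model, opt, opt_level="O1")

inp = torch.randn(32, 128, device="cuda")
loss = model(inp).sum()

# Amp scales the loss to keep fp16 gradients in range and unscales
# them before the optimizer step.
with amp.scale_loss(loss, opt) as scaled_loss:
    scaled_loss.backward()
opt.step()
opt.zero_grad()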
apex/optimizers/fused_lamb.py
@@ -20,7 +20,7 @@ class FusedLAMB(torch.optim.Optimizer):
         opt.step()
 
     :class:`apex.optimizers.FusedLAMB` may be used with or without Amp. If you wish to use :class:`FusedLAMB` with Amp,
-    you may choose any `opt_level`::
+    you may choose any ``opt_level``::
 
         opt = apex.optimizers.FusedLAMB(model.parameters(), lr = ....)
         model, opt = amp.initialize(model, opt, opt_level="O0" or "O1 or "O2")
apex/optimizers/fused_novograd.py
@@ -20,7 +20,7 @@ class FusedNovoGrad(torch.optim.Optimizer):
         opt.step()
 
     :class:`apex.optimizers.FusedNovoGrad` may be used with or without Amp. If you wish to use :class:`FusedNovoGrad` with Amp,
-    you may choose any `opt_level`::
+    you may choose any ``opt_level``::
 
         opt = apex.optimizers.FusedNovoGrad(model.parameters(), lr = ....)
         model, opt = amp.initialize(model, opt, opt_level="O0" or "O1 or "O2")
apex/optimizers/fused_sgd.py
@@ -21,7 +21,7 @@ class FusedSGD(Optimizer):
         opt.step()
 
     :class:`apex.optimizers.FusedSGD` may be used with or without Amp. If you wish to use :class:`FusedSGD` with Amp,
-    you may choose any `opt_level`::
+    you may choose any ``opt_level``::
 
         opt = apex.optimizers.FusedSGD(model.parameters(), lr = ....)
         model, opt = amp.initialize(model, opt, opt_level="O0" or "O1 or "O2")
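The same Amp contract applies to all four optimizers touched by this commit: swap FusedAdam for FusedLAMB, FusedNovoGrad, or FusedSGD in the sketch above and the initialize / scale_loss / step pattern is unchanged.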