OpenDAS / apex · Commits

Commit bf4aa847, authored Apr 23, 2019 by Michael Carilli

Moving sgd to optimizers

Parent: 6af5980e
Showing 3 changed files with 2 additions and 1 deletion:

    apex/optim/__init__.py         +0  -0
    apex/optimizers/__init__.py    +1  -0
    apex/optimizers/fused_sgd.py   +1  -1
apex/optim/__init__.py  (deleted, file mode 100644 → 0; the file was empty)
apex/optimizers/__init__.py

     from .fused_adam import FusedAdam
    +from .fused_sgd import FusedSGD
     from .fp16_optimizer import FP16_Optimizer
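With this re-export in place, the new class resolves from the package root alongside the existing optimizers; a one-line sketch of the import this commit enables:

    from apex.optimizers import FusedAdam, FusedSGD  # FusedSGD is newly exported by this commit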
apex/optim/sgd.py → apex/optimizers/fused_sgd.py  (renamed)
    ...
    @@ -3,7 +3,7 @@ from torch.optim.optimizer import Optimizer, required
     from apex.multi_tensor_apply import multi_tensor_applier

    -class SGD(Optimizer):
    +class FusedSGD(Optimizer):
         r"""Implements stochastic gradient descent (optionally with momentum).

         Nesterov momentum is based on the formula from
    ...
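For context, a minimal usage sketch, assuming FusedSGD keeps the constructor arguments of torch.optim.SGD (the docstring above presents it as plain stochastic gradient descent with optional momentum); the model, learning rate, and momentum value are illustrative placeholders:

    import torch
    from apex.optimizers import FusedSGD

    # Placeholder model; the fused multi-tensor kernels operate on CUDA tensors.
    model = torch.nn.Linear(10, 2).cuda()

    # Assumption: the signature mirrors torch.optim.SGD (params, lr, momentum, ...).
    optimizer = FusedSGD(model.parameters(), lr=0.1, momentum=0.9)

    loss = model(torch.randn(4, 10, device="cuda")).sum()
    loss.backward()
    optimizer.step()       # applies the SGD update via multi_tensor_applier
    optimizer.zero_grad()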