OpenDAS / ColossalAI · Commits

Commit c614a99d
[NFC] polish colossalai/auto_parallel/offload/amp_optimizer.py code style (#4255)
Authored Jul 18, 2023 by Yanjia0; committed Jul 26, 2023 by binmakeswell
Parent: 85774f0c
Showing 1 changed file with 6 additions and 5 deletions.

colossalai/auto_parallel/offload/amp_optimizer.py (+6, -5)
```diff
-from typing import Dict, Tuple
 from enum import Enum
+from typing import Dict, Tuple
 import torch
 from torch.optim import Optimizer
+from colossalai.amp.naive_amp.grad_scaler import DynamicGradScaler
 from colossalai.logging import get_dist_logger
 from colossalai.nn.optimizer import ColossalaiOptimizer
-from colossalai.amp.naive_amp.grad_scaler import DynamicGradScaler
 from colossalai.utils import get_current_device
 from .base_offload_module import BaseOffloadModule
-from .region_manager import RegionManager
 from .region import Region
+from .region_manager import RegionManager


 class OptimState(Enum):
     SCALED = 0
     UNSCALED = 1


 class AMPOptimizer(ColossalaiOptimizer):
     """
     A wrapper for Optimizer.
     Code reference: https://github.com/hpcaitech/ColossalAI/blob/main/colossalai/nn/optimizer/zero_optimizer.py
     ...
```
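The import changes above amount to isort-style ordering: standard-library imports first, then third-party (`torch`), then the first-party `colossalai` package, then local relative imports, each group sorted alphabetically. A minimal sketch of that grouping rule (the `STDLIB` set and `sort_imports` helper are illustrative, not part of the commit):

```python
# Hypothetical helper that reproduces the isort-style ordering applied in this
# commit: stdlib, third-party, first-party (colossalai), then relative imports,
# alphabetical within each group.
STDLIB = {"enum", "typing"}  # only the stdlib modules used in this file

def sort_imports(lines):
    def group(line):
        module = line.split()[1]  # module path in "from X import Y" / "import X"
        if module.startswith("."):
            return 3  # local relative imports go last
        root = module.split(".")[0]
        if root in STDLIB:
            return 0  # standard library first
        if root == "colossalai":
            return 2  # first-party package
        return 1  # third-party (torch)

    return sorted(lines, key=lambda l: (group(l), l.split()[1]))

# The pre-commit import order from the diff:
old = [
    "from typing import Dict, Tuple",
    "from enum import Enum",
    "import torch",
    "from torch.optim import Optimizer",
    "from colossalai.logging import get_dist_logger",
    "from colossalai.nn.optimizer import ColossalaiOptimizer",
    "from colossalai.amp.naive_amp.grad_scaler import DynamicGradScaler",
    "from colossalai.utils import get_current_device",
    "from .base_offload_module import BaseOffloadModule",
    "from .region_manager import RegionManager",
    "from .region import Region",
]
for line in sort_imports(old):
    print(line)
```

Running this on the old import block reproduces the post-commit order shown in the diff, which is why the change is tagged NFC (no functional change).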
```diff
@@ -174,4 +175,4 @@ class AMPOptimizer(ColossalaiOptimizer):
         # Leverage state_dict() and load_state_dict() to
         # recast preexisting per-param state tensors
-        self.optim.load_state_dict(self.optim.state_dict())
\ No newline at end of file
+        self.optim.load_state_dict(self.optim.state_dict())
```
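The second hunk only adds a trailing newline, but the line it touches carries a real trick worth noting: `torch.optim.Optimizer.load_state_dict` casts floating-point state tensors to the dtype and device of the current parameters, so a `state_dict()` → `load_state_dict()` round trip "recasts" optimizer state that was created before the parameters were cast. A minimal standalone sketch of that behavior (plain SGD standing in for the wrapped optimizer; not the ColossalAI code itself):

```python
import torch

# Create a parameter and an SGD optimizer with momentum state.
param = torch.nn.Parameter(torch.zeros(4))
optim = torch.optim.SGD([param], lr=0.1, momentum=0.9)

param.grad = torch.ones(4)
optim.step()  # materializes an fp32 momentum_buffer in optim.state

# Cast the parameter to fp16, as an AMP wrapper might after the fact.
param.data = param.data.half()

# The round trip recasts preexisting per-param state to match the param.
optim.load_state_dict(optim.state_dict())

buf = optim.state[param]["momentum_buffer"]
print(buf.dtype)  # torch.float16
```

Without the round trip, the momentum buffer would stay fp32 and mismatch the fp16 parameter on the next `step()`.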