OpenDAS / ColossalAI · Commits

Commit 399f84d8

[NFC] polish colossalai/amp/naive_amp/_fp16_optimizer.py code style (#1819)

Authored Nov 08, 2022 by Fazzie-Maqianli; committed Nov 09, 2022 by binmakeswell
Parent: 9623ec1b

Showing 1 changed file with 7 additions and 5 deletions:
colossalai/amp/naive_amp/_fp16_optimizer.py (+7, -5)
colossalai/amp/naive_amp/_fp16_optimizer.py @ 399f84d8

```diff
@@ -9,14 +9,16 @@ try:
 except:
     print('Colossalai should be built with cuda extension to use the FP16 optimizer')

+from torch.distributed import ProcessGroup
 from torch.optim import Optimizer
-from colossalai.core import global_context as gpc
+
 from colossalai.context import ParallelMode
+from colossalai.core import global_context as gpc
 from colossalai.logging import get_dist_logger
-from colossalai.utils import (copy_tensor_parallel_attributes, clip_grad_norm_fp32, multi_tensor_applier)
-from torch.distributed import ProcessGroup
-from .grad_scaler import BaseGradScaler
+from colossalai.utils import clip_grad_norm_fp32, copy_tensor_parallel_attributes, multi_tensor_applier
+
 from ._utils import has_inf_or_nan, zero_gard_by_list
+from .grad_scaler import BaseGradScaler

 __all__ = ['FP16Optimizer']
```
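The reordering visible in the diff follows the usual PEP 8 / isort convention: third-party imports first, then first-party packages, then relative imports, alphabetized within each group (which is why `colossalai.context` now precedes `colossalai.core`). A minimal sketch of the within-group rule, using module names taken from the diff; whether this repository actually runs isort is an assumption, since the page does not show its tooling:

```python
# Plain lexicographic sorting reproduces the within-group order
# applied by this commit: 'colossalai.context' < 'colossalai.core'.
group = [
    "from colossalai.core import global_context as gpc",
    "from colossalai.context import ParallelMode",
    "from colossalai.logging import get_dist_logger",
]
for line in sorted(group):
    print(line)
```

In practice a formatter such as isort also splits the imports into the blank-line-separated sections seen in the diff, rather than sorting one flat list.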
```diff
@@ -41,7 +43,7 @@ def _multi_tensor_copy_this_to_that(this, that, overflow_buf=None):
 class FP16Optimizer(Optimizer):
     """Float16 optimizer for fp16 and bf16 data types.

     Args:
         optimizer (torch.optim.Optimizer): base optimizer such as Adam or SGD
         grad_scaler (BaseGradScaler): grad scaler for gradient chose in
```