OpenDAS / ColossalAI · Commits

Commit b965585d
Authored Jan 04, 2023 by xyupeng; committed by Frank Lee, Jan 04, 2023

[NFC] polish colossalai/amp/torch_amp/torch_amp.py code style (#2290)

Parent: d1e5bafc
Changes: 1 changed file with 3 additions and 3 deletions.

colossalai/amp/torch_amp/torch_amp.py (+3 / -3)
 #!/usr/bin/env python
 # -*- encoding: utf-8 -*-
-import torch.nn as nn
-import torch.cuda.amp as torch_amp
+import torch.cuda.amp as torch_amp
+import torch.nn as nn
 from torch import Tensor
 from torch.nn.modules.loss import _Loss
 from torch.optim import Optimizer
-from ._grad_scaler import GradScaler
 from colossalai.nn.optimizer import ColossalaiOptimizer
 from colossalai.utils import clip_grad_norm_fp32
+from ._grad_scaler import GradScaler

 class TorchAMPOptimizer(ColossalaiOptimizer):
     """A wrapper class which integrate Pytorch AMP with an optimizer
 ...
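Note: the change is a pure import reordering, and the resulting order (third-party torch imports first, then first-party colossalai imports, then the relative import) is consistent with isort conventions. A minimal sketch reproducing that section order, assuming isort with colossalai registered as first-party; the commit itself does not state which tool, if any, was used:

# Sketch only: reproduces the section ordering seen in the diff.
import isort

original = (
    "import torch.nn as nn\n"
    "import torch.cuda.amp as torch_amp\n"
    "from torch import Tensor\n"
    "from torch.nn.modules.loss import _Loss\n"
    "from torch.optim import Optimizer\n"
    "from ._grad_scaler import GradScaler\n"
    "from colossalai.nn.optimizer import ColossalaiOptimizer\n"
    "from colossalai.utils import clip_grad_norm_fp32\n"
)

# known_first_party makes colossalai sort after the third-party torch
# imports, and the relative import sinks to the local-folder section.
print(isort.code(original, known_first_party=["colossalai"]))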
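For context on the file being touched: TorchAMPOptimizer is documented as a wrapper integrating PyTorch AMP with an optimizer. A minimal sketch of the underlying torch.cuda.amp pattern such a wrapper builds on; the model, optimizer, and data below are hypothetical and are not code from this commit:

import torch
import torch.nn as nn

# Hypothetical model and optimizer, purely for illustration (requires CUDA).
model = nn.Linear(16, 4).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()
scaler = torch.cuda.amp.GradScaler()  # the GradScaler role the wrapper integrates

x = torch.randn(8, 16, device="cuda")
y = torch.randn(8, 4, device="cuda")

optimizer.zero_grad()
with torch.cuda.amp.autocast():  # run the forward pass in mixed precision
    loss = criterion(model(x), y)
scaler.scale(loss).backward()  # scale the loss to avoid fp16 gradient underflow
scaler.step(optimizer)         # unscale gradients, then call optimizer.step()
scaler.update()                # adjust the scale factor for the next iteration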