OpenDAS / ColossalAI · Commits · c5b488ed (Unverified)

polish amp docstring (#616)

Authored Apr 01, 2022 by ver217; committed by GitHub Apr 01, 2022.
Parent: f69507dd
Showing 1 changed file with 1 addition and 2 deletions:

colossalai/amp/naive_amp/__init__.py (+1, -2)
@@ -18,8 +18,7 @@ def convert_to_naive_amp(model: nn.Module, optimizer: Optimizer, amp_config):
         amp_config (:class:`colossalai.context.Config` or dict): configuration for naive mode amp.
-            The ``amp_config`` should contain parameters below:
-            :
+            The ``amp_config`` should contain parameters below::

             verbose (bool, optional): if set to `True`, will print debug info (Default: False).
             clip_grad_norm (float, optional): clip gradients with this global L2 norm (Default 0).
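The docstring being polished describes two ``amp_config`` parameters, ``verbose`` and ``clip_grad_norm``. A minimal sketch of such a config, with illustrative values not taken from the source:

```python
# Hypothetical illustration: an amp_config dict with the parameters named
# in the docstring above. The values here are made up for the example.
amp_config = dict(
    verbose=True,        # print debug info (docstring default: False)
    clip_grad_norm=1.0,  # clip gradients with this global L2 norm (docstring default: 0)
)

# Per the function signature in the diff hunk, this config would be passed
# to convert_to_naive_amp together with a model and optimizer, e.g.:
#   model, optimizer = convert_to_naive_amp(model, optimizer, amp_config)
```

The diff itself only changes ``:`` to ``::`` so that Sphinx renders the parameter list that follows as a reStructuredText literal block.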