OpenDAS / ColossalAI / Commits / df4f020e

Unverified commit df4f020e, authored Feb 13, 2023 by HELSON, committed by GitHub Feb 13, 2023

[zero1&2] only append parameters with gradients (#2681)

parent f0aa191f

Showing 1 changed file with 4 additions and 1 deletion:
colossalai/zero/sharded_optim/low_level_optim.py (+4, -1)
colossalai/zero/sharded_optim/low_level_optim.py (view file @ df4f020e)

...
@@ -131,7 +131,10 @@ class LowLevelZeroOptimizer(ColossalaiOptimizer):
         # partition these param groups for data parallel training
         # and add buffers to parameter store for future access
         for group_id, param_group in enumerate(self.optim.param_groups):
-            group_params = param_group['params']
+            group_params = list()
+            for param in param_group['params']:
+                if param.requires_grad:
+                    group_params.append(param)
             # add the fp16 params to fp16_param_groups for bookkeeping
             self._fp16_param_groups[group_id] = group_params
...
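The diff above changes the optimizer so that each param group's bookkeeping list contains only parameters with requires_grad set, skipping frozen parameters. A minimal standalone sketch of that filtering logic, assuming a hypothetical FakeParam stand-in for torch.nn.Parameter so the example runs without torch installed:

```python
# Sketch of the filtering introduced by this commit: keep only
# parameters that require gradients when building the per-group
# bookkeeping lists. FakeParam is a hypothetical stand-in for
# torch.nn.Parameter (only the requires_grad attribute matters here).
from dataclasses import dataclass


@dataclass
class FakeParam:
    name: str
    requires_grad: bool


def filter_trainable(param_groups):
    """Mimic the commit's loop: one filtered params list per optimizer group."""
    fp16_param_groups = {}
    for group_id, param_group in enumerate(param_groups):
        group_params = list()
        for param in param_group['params']:
            if param.requires_grad:  # frozen params are skipped
                group_params.append(param)
        fp16_param_groups[group_id] = group_params
    return fp16_param_groups


groups = [
    {'params': [FakeParam('w', True), FakeParam('frozen', False)]},
    {'params': [FakeParam('b', True)]},
]
result = filter_trainable(groups)
print([p.name for p in result[0]])  # ['w'] -- the frozen param is dropped
```

With the pre-patch behavior (group_params = param_group['params']), frozen parameters would also be partitioned and stored, even though they never receive gradients; filtering them out avoids that unnecessary bookkeeping.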