OpenDAS / dcnv3 · Commit 0325fe8f

make ZeroRedundancyOptimizer compatible with torch < 1.12

Authored Apr 11, 2023 by Zeqiang Lai; committed by zhe chen, Apr 11, 2023.
Parent: 94586767
Showing 1 changed file with 20 additions and 10 deletions.

classification/optimizer.py @ 0325fe8f (+20 −10):
```diff
@@ -37,22 +37,32 @@ def build_optimizer(config, model):
     if use_zero:
         print(f"\nUse Zero!")
         if opt_lower == 'sgd':
-            optimizer = ZeroRedundancyOptimizer(parameters, optimizer_class=optim.SGD,
-                                                momentum=config.TRAIN.OPTIMIZER.MOMENTUM, nesterov=True,
-                                                lr=config.TRAIN.BASE_LR, weight_decay=config.TRAIN.WEIGHT_DECAY)
+            # an ugly implementation
+            # this problem is fixed after torch 1.12:
+            # https://github.com/pytorch/pytorch/issues/71347
+            # before 1.12 we could only pass a list to the ZeRO optimizer, so we first pass
+            # parameters[0] with its lr and weight decay, then add the other parameter groups.
+            optimizer = ZeroRedundancyOptimizer(parameters[0]['params'], optimizer_class=optim.SGD,
+                                                momentum=config.TRAIN.OPTIMIZER.MOMENTUM, nesterov=True,
+                                                lr=parameters[0]['lr'], weight_decay=parameters[0]['weight_decay'])
+            if len(parameters) > 1:
+                for param_group in parameters[1:]:
+                    optimizer.add_param_group(param_group)
         elif opt_lower == 'adamw':
-            optimizer = ZeroRedundancyOptimizer(parameters, optimizer_class=optim.AdamW,
-                                                eps=config.TRAIN.OPTIMIZER.EPS, betas=config.TRAIN.OPTIMIZER.BETAS,
-                                                lr=config.TRAIN.BASE_LR, weight_decay=config.TRAIN.WEIGHT_DECAY)
+            optimizer = ZeroRedundancyOptimizer(parameters[0]['params'], optimizer_class=optim.AdamW,
+                                                eps=config.TRAIN.OPTIMIZER.EPS, betas=config.TRAIN.OPTIMIZER.BETAS,
+                                                lr=parameters[0]['lr'], weight_decay=parameters[0]['weight_decay'])
+            if len(parameters) > 1:
+                for param_group in parameters[1:]:
+                    optimizer.add_param_group(param_group)
     else:
         if opt_lower == 'sgd':
             optimizer = optim.SGD(parameters,
...
```
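For readers who want this pattern outside the repository, here is a minimal, hedged sketch of the same workaround. The helper `make_zero_optimizer` and its `param_groups` argument are hypothetical names introduced for illustration; `ZeroRedundancyOptimizer` and `add_param_group` are the real `torch.distributed.optim` APIs, and running this requires an initialized distributed process group.

```python
# A sketch of the pre-1.12 workaround in isolation, not the repository's code.
# Assumption: `param_groups` is a list of dicts like
# [{'params': [...], 'lr': 1e-3, 'weight_decay': 0.05}, ...],
# the same shape torch optimizers use for parameter groups.
import torch.optim as optim
from torch.distributed.optim import ZeroRedundancyOptimizer


def make_zero_optimizer(param_groups, optimizer_class, **defaults):
    # Before torch 1.12 the ZeroRedundancyOptimizer constructor accepted only
    # a flat iterable of tensors, not parameter-group dicts. Workaround:
    # build the optimizer from the first group's tensors, promoting that
    # group's options (lr, weight_decay, ...) to optimizer defaults, then
    # attach the remaining groups afterwards.
    first = param_groups[0]
    options = {**defaults, **{k: v for k, v in first.items() if k != 'params'}}
    optimizer = ZeroRedundancyOptimizer(first['params'],
                                        optimizer_class=optimizer_class,
                                        **options)
    # add_param_group() accepts full group dicts, so every remaining group
    # keeps its own lr / weight_decay rather than inheriting the defaults.
    for group in param_groups[1:]:
        optimizer.add_param_group(group)
    return optimizer


# Mirrors the commit's SGD branch (hyperparameter values are illustrative):
# optimizer = make_zero_optimizer(parameters, optim.SGD,
#                                 momentum=0.9, nesterov=True)
```

On torch >= 1.12 none of this is needed: per the pytorch issue linked in the commit, the constructor accepts parameter groups directly, so the groups list can be passed to `ZeroRedundancyOptimizer` as with any other optimizer.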