dcuai / dlexamples

Commit 0a159036, authored Feb 15, 2023 by hepj
Fix in-place operation on the ViT model loss (修改VIT模型loss就地操作)
parent c0f05c10

1 changed file with 4 additions and 2 deletions
PyTorch/NLP/Vision_Transformer/engine_pretrain.py  (+4, -2)
@@ -53,7 +53,9 @@ def train_one_epoch(model: torch.nn.Module,
             print("Loss is {}, stopping training".format(loss_value))
             sys.exit(1)
 
-        loss /= accum_iter
+        loss_new = loss / accum_iter
+        loss = loss_new
+
         loss_scaler(loss, optimizer, parameters=model.parameters(),
                     update_grad=(data_iter_step + 1) % accum_iter == 0)
         if (data_iter_step + 1) % accum_iter == 0:
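The change replaces an in-place division of the loss (`loss /= accum_iter`) with an out-of-place one bound to a fresh name. As a hedged illustration of why autograd objects to in-place arithmetic, the sketch below uses a hypothetical scalar leaf tensor (the simplest case where PyTorch raises the error); the real `loss` in this file comes from the model's forward pass, not from this toy setup:

```python
import torch

# A leaf tensor that requires grad is the simplest way to trigger
# autograd's in-place check.
loss = torch.tensor(3.0, requires_grad=True)
accum_iter = 2

try:
    loss /= accum_iter  # in-place division: autograd rejects this on a leaf
    raised = False
except RuntimeError:
    raised = True

# The commit's pattern: out-of-place division, then rebind the name.
# This creates a new graph node instead of mutating an existing tensor.
loss_new = loss / accum_iter
loss = loss_new
loss.backward()

print(raised)       # True
print(loss.item())  # 1.5
```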
@@ -79,4 +81,4 @@ def train_one_epoch(model: torch.nn.Module,
     # gather the stats from all processes
     metric_logger.synchronize_between_processes()
     print("Averaged stats:", metric_logger)
-    return {k: meter.global_avg for k, meter in metric_logger.meters.items()}
+    return {k: meter.global_avg for k, meter in metric_logger.meters.items()}
\ No newline at end of file
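The surrounding `update_grad=(data_iter_step + 1) % accum_iter == 0` idiom is standard gradient accumulation: each micro-batch loss is divided by `accum_iter`, `backward()` accumulates gradients every step, and the optimizer steps only once per `accum_iter` micro-batches. A minimal sketch of that idiom, with a stand-in linear model in place of the repository's ViT and no `loss_scaler` (assumptions, not this repo's code):

```python
import torch

# Stand-in model and optimizer; the real code trains a ViT with a loss scaler.
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
accum_iter = 4
optimizer_steps = 0

for data_iter_step in range(8):
    samples = torch.randn(2, 4)
    loss = model(samples).pow(2).mean()

    loss = loss / accum_iter  # out-of-place division, as in the commit
    loss.backward()           # gradients accumulate across micro-batches

    if (data_iter_step + 1) % accum_iter == 0:
        optimizer.step()      # apply the accumulated gradient
        optimizer.zero_grad()
        optimizer_steps += 1

print(optimizer_steps)  # 2 optimizer steps for 8 micro-batches
```

Dividing by `accum_iter` keeps the effective gradient an average over the accumulated micro-batches, so the update magnitude matches a single large batch.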