chenpangpang / transformers
Commit ed8fad73
authored Apr 24, 2019 by Mathieu Prouveur
Update example files so that tr_loss is not affected by args.gradient_accumulation_steps
parent c36cca07
Showing 2 changed files with 2 additions and 2 deletions:
examples/run_classifier.py  +1 -1
examples/run_swag.py  +1 -1
examples/run_classifier.py
@@ -845,7 +845,7 @@ def main():
                 else:
                     loss.backward()

-                tr_loss += loss.item()
+                tr_loss += loss.item() * args.gradient_accumulation_steps
                 nb_tr_examples += input_ids.size(0)
                 nb_tr_steps += 1
                 if (step + 1) % args.gradient_accumulation_steps == 0:
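The change itself is small but worth spelling out. When args.gradient_accumulation_steps > 1, these example scripts divide the loss by the accumulation factor before calling backward(), so loss.item() is already scaled down; accumulating that scaled value made tr_loss shrink as the accumulation factor grew. Multiplying by args.gradient_accumulation_steps when updating tr_loss restores the unscaled value for logging. A minimal sketch of the arithmetic, using illustrative values and names (accum_steps, raw_loss_value) rather than anything from the scripts:

# Minimal sketch of the logging arithmetic; accum_steps and raw_loss_value
# are illustrative stand-ins for args.gradient_accumulation_steps and the
# unscaled per-batch loss, not names from the repository.
accum_steps = 4
raw_loss_value = 2.0

scaled = raw_loss_value / accum_steps   # what the scripts pass to backward()

tr_loss_old = 0.0
tr_loss_new = 0.0
tr_loss_old += scaled                   # old code: 0.5, depends on accum_steps
tr_loss_new += scaled * accum_steps     # new code: 2.0, matches the raw loss

print(tr_loss_old, tr_loss_new)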
examples/run_swag.py
@@ -452,7 +452,7 @@ def main():
                     loss = loss * args.loss_scale
                 if args.gradient_accumulation_steps > 1:
                     loss = loss / args.gradient_accumulation_steps
-                tr_loss += loss.item()
+                tr_loss += loss.item() * args.gradient_accumulation_steps
                 nb_tr_examples += input_ids.size(0)
                 nb_tr_steps += 1
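For context, both hunks sit inside the same gradient-accumulation pattern: the loss is divided by args.gradient_accumulation_steps so that the summed gradients match a full effective batch, and the optimizer only steps once every args.gradient_accumulation_steps micro-batches. A simplified, self-contained sketch of that pattern follows; the toy linear model, random batches, and loss function are placeholders, not code from the repository.

import torch
from torch import nn

# Simplified sketch of the gradient-accumulation loop used by the example
# scripts; the toy model, data, and loss below are placeholders.
torch.manual_seed(0)
model = nn.Linear(8, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()
batches = [(torch.randn(4, 8), torch.randn(4, 1)) for _ in range(8)]

accumulation_steps = 4
tr_loss, nb_tr_steps = 0.0, 0

model.train()
for step, (inputs, targets) in enumerate(batches):
    loss = loss_fn(model(inputs), targets)       # unscaled loss for this micro-batch
    if accumulation_steps > 1:
        loss = loss / accumulation_steps         # keep accumulated gradients at full-batch scale
    loss.backward()

    tr_loss += loss.item() * accumulation_steps  # log the unscaled loss, as in this commit
    nb_tr_steps += 1

    if (step + 1) % accumulation_steps == 0:
        optimizer.step()                         # update once per effective batch
        optimizer.zero_grad()

print(tr_loss / nb_tr_steps)                     # mean loss, independent of accumulation_steps

Dividing before backward() and multiplying back only for logging keeps the two concerns separate: gradients stay on the full-batch scale while the reported loss stays comparable across different accumulation settings.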