OpenDAS / ColossalAI / Commits

Commit e4685832 (unverified), authored May 26, 2022 by Frank Lee, committed by GitHub on May 26, 2022
Parent: 32291dd7

[engine] fixed bug in gradient accumulation dataloader to keep the last step (#1030)
Showing 1 changed file with 2 additions and 1 deletion.

colossalai/engine/gradient_accumulation/_gradient_accumulation.py (+2, -1)
@@ -145,6 +145,7 @@ class GradAccumDataloader:

     def __next__(self) -> Union[Tensor, Tuple[Tensor]]:
         if self._cur_step < self.steps_per_epoch:
             self._cur_step += 1
+            data = next(self._dataiter)

             if self._cur_step == self.steps_per_epoch and self.consume_remain_data:
                 # this is to handle non standard pytorch dataloader
@@ -154,7 +155,7 @@ class GradAccumDataloader:
                         _ = next(self._dataiter)
                     except StopIteration:
                         break
-            return next(self._dataiter)
+            return data
         else:
             raise StopIteration
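The bug this commit fixes: the old code drained the remaining batches *before* fetching the current one, so on the final step the underlying iterator was already exhausted and the last batch was lost. The fix fetches `data` first, then drains. A minimal sketch of the idea, using a simplified stand-in class (`GradAccumLoader` is a hypothetical name, not the actual ColossalAI `GradAccumDataloader`):

```python
class GradAccumLoader:
    """Yields at most steps_per_epoch batches, optionally draining leftovers.

    Simplified illustration of the pattern in the diff above; not the
    real ColossalAI implementation.
    """

    def __init__(self, dataloader, steps_per_epoch, consume_remain_data=True):
        self.dataloader = dataloader
        self.steps_per_epoch = steps_per_epoch
        self.consume_remain_data = consume_remain_data

    def __iter__(self):
        self._cur_step = 0
        self._dataiter = iter(self.dataloader)
        return self

    def __next__(self):
        if self._cur_step < self.steps_per_epoch:
            self._cur_step += 1
            # Fetch the batch *before* draining leftovers. Draining first
            # (the pre-fix behavior) exhausts the iterator, so the final
            # step's next() would raise StopIteration and drop that batch.
            data = next(self._dataiter)
            if self._cur_step == self.steps_per_epoch and self.consume_remain_data:
                # Consume whatever the dataloader still holds so the next
                # epoch starts from a fresh iterator.
                while True:
                    try:
                        next(self._dataiter)
                    except StopIteration:
                        break
            return data
        raise StopIteration


batches = list(GradAccumLoader([0, 1, 2, 3, 4], steps_per_epoch=4))
# batches == [0, 1, 2, 3]: the 4th step's batch is kept, leftover batch 4 is drained
```

With the pre-fix ordering, the drain loop would have consumed batches 3 and 4 before the `return next(self._dataiter)` call, raising `StopIteration` on the last step instead of yielding batch 3.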