OpenDAS / ColossalAI · Commits

Commit f7e3f82a (unverified), authored Jan 19, 2024 by flybird11111; committed by GitHub on Jan 19, 2024

fix llama pretrain (#5287)

parent 6a569678

Showing 1 changed file with 1 addition and 2 deletions (+1 −2):

examples/language/llama2/pretrain.py
@@ -273,11 +273,10 @@ def main():
     dataloader.sampler.set_start_index(sampler_start_idx)
     for epoch in range(start_epoch, args.num_epochs):
         dataloader.sampler.set_epoch(epoch)
-        step_nums = num_steps_per_epoch - start_step
         dataloader_iter = iter(dataloader)

         with tqdm(
-            range(step_nums),
+            range(start_step, num_steps_per_epoch),
             desc=f"Epoch {epoch}",
             disable=not print_flag,
             total=num_steps_per_epoch,
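A minimal sketch of what the one-line change does when resuming training mid-epoch. The numeric values below are hypothetical (in pretrain.py they come from the checkpoint and the dataloader); the point is that iterating over the remaining step count restarts the loop variable at 0, while iterating over the remaining step indices resumes it at the checkpointed step, which is what the epoch-wide `total=num_steps_per_epoch` progress bar and any step-based bookkeeping expect.

```python
# Hypothetical values standing in for the checkpoint/dataloader state.
num_steps_per_epoch = 10
start_step = 4  # steps already completed before the checkpoint was saved

# Before the fix: iterate over the *count* of remaining steps,
# so the loop variable restarts at 0 and no longer matches the
# global step index within the epoch.
old_steps = list(range(num_steps_per_epoch - start_step))

# After the fix: iterate over the remaining step *indices*,
# so the loop variable resumes at start_step.
new_steps = list(range(start_step, num_steps_per_epoch))

print(old_steps)  # [0, 1, 2, 3, 4, 5]
print(new_steps)  # [4, 5, 6, 7, 8, 9]
```

Both ranges yield the same number of iterations, so the amount of training done is unchanged; only the step numbering seen inside the loop differs.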