chenpangpang / transformers · Commit cffa2b9c (unverified)

Authored Jul 09, 2024 by kallewoof; committed via GitHub on Jul 09, 2024.

save_pretrained: use tqdm when saving checkpoint shards from offloaded params (#31856)

Parent commit: 350aed70
Showing 1 changed file with 4 additions and 1 deletion (+4, -1).
src/transformers/modeling_utils.py (view file @ cffa2b9c)
@@ -2657,7 +2657,10 @@ class PreTrainedModel(nn.Module, ModuleUtilsMixin, GenerationMixin, PushToHubMix
             ):
                 os.remove(full_filename)
         # Save the model
-        for shard_file, tensors in state_dict_split.filename_to_tensors.items():
+        filename_to_tensors = state_dict_split.filename_to_tensors.items()
+        if module_map:
+            filename_to_tensors = logging.tqdm(filename_to_tensors, desc="Saving checkpoint shards")
+        for shard_file, tensors in filename_to_tensors:
             shard = {tensor: state_dict[tensor] for tensor in tensors}
             # remake shard with onloaded parameters if necessary
             if module_map:
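For context, the tqdm wrapper added here only kicks in when module_map is populated, i.e. when the model being saved has parameters offloaded to CPU or disk so that each shard must be rebuilt from onloaded weights before it is written. Below is a minimal sketch of exercising that path; the checkpoint name, folders, and shard size are illustrative assumptions rather than anything taken from this commit, and the bar only appears if loading actually offloads some parameters.

# Sketch: exercise the save path touched by this commit. Names and sizes below
# are illustrative assumptions; the progress bar only appears when some
# parameters were offloaded (module_map is non-empty).
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "gpt2",                    # illustrative checkpoint; offloading happens only if it does not fit in device memory
    device_map="auto",         # let accelerate place (and possibly offload) parameters
    offload_folder="offload",  # destination for disk-offloaded parameters, if any
)

# When parameters are offloaded, save_pretrained rebuilds each shard from
# onloaded weights; with this change that loop reports "Saving checkpoint shards"
# progress instead of saving silently.
model.save_pretrained("gpt2-resaved", max_shard_size="50MB")

Since the loop goes through logging.tqdm (the wrapper in transformers.utils.logging), the bar also honors transformers.utils.logging.disable_progress_bar().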