chenpangpang/transformers · commit 5f092194 (Unverified)
Authored Apr 18, 2023 by Sylvain Gugger; committed by GitHub on Apr 18, 2023

Fix from_pretrained when model is instantiated on the meta device (#22837)

Parent: 5f9b825c
Showing 1 changed file with 5 additions and 4 deletions.

src/transformers/modeling_utils.py (+5, -4)
@@ -2935,10 +2935,11 @@ class PreTrainedModel(nn.Module, ModuleUtilsMixin, GenerationMixin, PushToHubMix
         unexpected_keys = list(set(loaded_keys) - set(expected_keys))
         # Some tensors maybe have been already filled by another key (tied weights).
-        existing_ptrs = {model_state_dict[k].data_ptr() for k in loaded_keys if k in model_state_dict}
-        missing_keys = [
-            k for k in missing_keys if k in model_state_dict and model_state_dict[k].data_ptr() not in existing_ptrs
-        ]
+        # TODO: Sylvain -> make this work even on meta device.
+        # existing_ptrs = {model_state_dict[k].data_ptr() for k in loaded_keys if k in model_state_dict}
+        # missing_keys = [
+        #     k for k in missing_keys if k in model_state_dict and model_state_dict[k].data_ptr() not in existing_ptrs
+        # ]
         # Some models may have keys that are not in the state by design, removing them before needlessly warning
         # the user.
         if cls._keys_to_ignore_on_load_missing is not None:
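For context, the removed lines deduplicate missing keys through storage pointers: when two parameters are tied, loading one of them also fills the other, so the alias should not be reported as missing. Below is a minimal standalone sketch of that pointer check; the toy module and key names are invented for illustration, this is not the actual transformers loading path.

import torch
from torch import nn

# Two modules sharing one tied parameter, the way an LM head is often tied
# to the input embedding.
emb = nn.Embedding(10, 4)
head = nn.Linear(4, 10, bias=False)
head.weight = emb.weight  # tie the weights: both names now alias one storage

model_state_dict = {"emb.weight": emb.weight, "head.weight": head.weight}
loaded_keys = ["emb.weight"]    # checkpoints store tied weights only once
missing_keys = ["head.weight"]  # a naive set difference flags the alias

# Same data_ptr() trick as the removed lines: a "missing" key whose storage
# was already filled through a tied alias is not really missing.
existing_ptrs = {model_state_dict[k].data_ptr() for k in loaded_keys if k in model_state_dict}
missing_keys = [
    k for k in missing_keys
    if k in model_state_dict and model_state_dict[k].data_ptr() not in existing_ptrs
]
print(missing_keys)  # [] -- head.weight shares storage with the loaded emb.weight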
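The check has to be disabled here because it dereferences tensors that may live on the meta device, where parameters carry only shape and dtype metadata with no backing storage, which is presumably why from_pretrained failed when the model was instantiated there. A short sketch of the failure mode, assuming a PyTorch build of this era (the exact error message may vary by version):

import torch

# A meta tensor has no real storage, so there is no address for
# data_ptr() to return; the call raises instead.
t = torch.empty(2, 2, device="meta")
try:
    t.data_ptr()
except RuntimeError as err:
    print(f"data_ptr() on a meta tensor raises: {err}")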