renzhc / diffusers_dcu · Commits · 509741ae

Commit 509741ae (unverified), authored May 21, 2024 by BootesVoid, committed by GitHub on May 22, 2024
fix: Attribute error in Logger object (logger.warning) (#8183)
parent e1df77ee
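The error being fixed: the previous code called logger.warninging, which is not a method on the logger object, so this conversion path raised an AttributeError at runtime instead of emitting the warning. A minimal sketch of the failure mode, assuming the logger behaves like a standard Python logging.Logger (the stdlib logger here is only a stand-in for the one returned by diffusers' logging helper):

import logging

# Stand-in for the module-level logger used in single_file_utils.py.
logger = logging.getLogger(__name__)

# Correct call: Logger.warning exists.
logger.warning("Checkpoint has both EMA and non-EMA weights.")

# The typo removed by this commit: Logger has no attribute "warninging".
try:
    logger.warninging("Checkpoint has both EMA and non-EMA weights.")
except AttributeError as err:
    print(err)  # 'Logger' object has no attribute 'warninging'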
Showing 1 changed file with 3 additions and 3 deletions

src/diffusers/loaders/single_file_utils.py (+3, -3)
@@ -826,8 +826,8 @@ def convert_ldm_unet_checkpoint(checkpoint, config, extract_ema=False, **kwargs)
     # at least a 100 parameters have to start with `model_ema` in order for the checkpoint to be EMA
     if sum(k.startswith("model_ema") for k in keys) > 100 and extract_ema:
-        logger.warninging("Checkpoint has both EMA and non-EMA weights.")
-        logger.warninging(
+        logger.warning("Checkpoint has both EMA and non-EMA weights.")
+        logger.warning(
             "In this conversion only the EMA weights are extracted. If you want to instead extract the non-EMA"
             " weights (useful to continue fine-tuning), please make sure to remove the `--extract_ema` flag."
         )
@@ -837,7 +837,7 @@ def convert_ldm_unet_checkpoint(checkpoint, config, extract_ema=False, **kwargs)
                 unet_state_dict[key.replace(unet_key, "")] = checkpoint.get(flat_ema_key)
     else:
         if sum(k.startswith("model_ema") for k in keys) > 100:
-            logger.warninging(
+            logger.warning(
                 "In this conversion only the non-EMA weights are extracted. If you want to instead extract the EMA"
                 " weights (usually better for inference), please make sure to add the `--extract_ema` flag."
             )
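For context, the branches touched by this commit decide whether a checkpoint carries EMA weights by counting parameter names. A rough, self-contained illustration of that heuristic, using a hypothetical keys list (the 100-key threshold and the model_ema prefix come from the diff above; everything else is made up for the example):

# Hypothetical checkpoint key list; real checkpoints have far more entries.
keys = [f"model_ema.block_{i}.weight" for i in range(150)] + [
    "model.diffusion_model.input_blocks.0.0.weight",
]

extract_ema = True

# A checkpoint is treated as containing EMA weights when more than 100
# of its parameter names start with "model_ema".
has_ema = sum(k.startswith("model_ema") for k in keys) > 100

if has_ema and extract_ema:
    print("Extracting only the EMA weights.")
elif has_ema:
    print("EMA weights present but not extracted; pass --extract_ema to use them.")
else:
    print("No EMA weights detected; extracting the regular weights.")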