fengzch-das / nunchaku · Commits

Commit 45e055ce
Authored by Muyang Li on Apr 19, 2025; committed by muyangli on Apr 20, 2025.
Merge pull request #86 from mit-han-lab/dev/muyang
[minor] fix the pix2pix-turbo demo
Parent: 48b2dc3c
Showing 1 changed file with 5 additions and 2 deletions (+5 −2).
app/flux.1/sketch/flux_pix2pix_pipeline.py
from typing import Any, Callable

import torch
import torchvision.utils
from diffusers.pipelines.flux.pipeline_flux import FluxPipeline, FluxPipelineOutput, FluxTransformer2DModel
from einops import rearrange
from peft.tuners import lora

...
@@ -9,6 +8,8 @@ from PIL import Image
from torch import nn
from torchvision.transforms import functional as F

from nunchaku.utils import load_state_dict_in_safetensors


class FluxPix2pixTurboPipeline(FluxPipeline):
    def update_alpha(self, alpha: float) -> None:
...
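The body of update_alpha is collapsed in this view. Purely as an illustration of one common way to apply such an alpha to peft LoRA layers (given the peft.tuners.lora import above, and not necessarily what this pipeline actually does), the helper below rescales every LoRA layer's per-adapter scaling factor; the function name is hypothetical:

from torch import nn
from peft.tuners import lora

def rescale_lora_alpha(model: nn.Module, alpha: float) -> None:
    # Illustrative sketch only: walk the module tree and set each peft LoRA
    # layer's per-adapter scaling factor to the requested alpha.
    for module in model.modules():
        if isinstance(module, lora.LoraLayer):
            for adapter_name in module.scaling:
                module.scaling[adapter_name] = alpha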
@@ -55,7 +56,9 @@ class FluxPix2pixTurboPipeline(FluxPipeline):
            self.load_lora_into_transformer(state_dict, {}, transformer=transformer)
        else:
            assert svdq_lora_path is not None
            self.transformer.update_lora_params(svdq_lora_path)
            sd = load_state_dict_in_safetensors(svdq_lora_path)
            sd = {k: v for k, v in sd.items() if not k.startswith("transformer.")}
            self.transformer.update_lora_params(sd)
            self.update_alpha(alpha)
    @torch.no_grad()

...
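Taken together with the new nunchaku.utils import in the earlier hunk, the visible lines indicate that the svdq_lora_path branch no longer hands the checkpoint path straight to update_lora_params: it loads the safetensors file into a state dict, drops the entries whose keys start with "transformer.", and passes the filtered dict before refreshing alpha, presumably so the keys match what the transformer-level update_lora_params expects. A minimal sketch of that loading pattern as a standalone helper (the function name, pipeline argument, and default alpha are illustrative, not part of the commit):

from nunchaku.utils import load_state_dict_in_safetensors

def load_svdq_lora(pipeline, svdq_lora_path: str, alpha: float = 1.0) -> None:
    # Load the LoRA checkpoint from the safetensors file into a plain state dict.
    sd = load_state_dict_in_safetensors(svdq_lora_path)
    # Keep only entries whose keys do not carry the "transformer." prefix,
    # mirroring the filtering in the hunk above.
    sd = {k: v for k, v in sd.items() if not k.startswith("transformer.")}
    # Push the filtered weights into the transformer and refresh the LoRA alpha.
    pipeline.transformer.update_lora_params(sd)
    pipeline.update_alpha(alpha)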