chenpangpang / ComfyUI · Commit 83d969e3

Authored May 21, 2024 by comfyanonymous
Disable xformers when tracing model.
Parent: 1900e511
Showing 1 changed file with 11 additions and 1 deletion.
comfy/ldm/modules/attention.py (+11, -1)

```diff
@@ -313,9 +313,19 @@ except:
 def attention_xformers(q, k, v, heads, mask=None, attn_precision=None):
     b, _, dim_head = q.shape
     dim_head //= heads
+
+    disabled_xformers = False
+
     if BROKEN_XFORMERS:
         if b * heads > 65535:
-            return attention_pytorch(q, k, v, heads, mask)
+            disabled_xformers = True
+
+    if not disabled_xformers:
+        if torch.jit.is_tracing() or torch.jit.is_scripting():
+            disabled_xformers = True
+
+    if disabled_xformers:
+        return attention_pytorch(q, k, v, heads, mask)
 
     q, k, v = map(
         lambda t: t.reshape(b, -1, heads, dim_head),
```
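For context on what the new guard does: `torch.jit.is_tracing()` and `torch.jit.is_scripting()` are true while TorchScript is recording a model, and a trace simply bakes in whichever branch executes, so routing to `attention_pytorch` there keeps the xformers CUDA kernel (which the tracer cannot capture) out of the exported graph. Below is a minimal, dependency-free sketch of that pattern, not ComfyUI's actual code; `attention_xformers_stub` is a hypothetical stand-in for xformers' `memory_efficient_attention`.

```python
import torch
import torch.nn.functional as F

def attention_pytorch_fallback(q, k, v):
    # Traceable pure-PyTorch path, analogous to attention_pytorch in the diff.
    return F.scaled_dot_product_attention(q, k, v)

def attention_xformers_stub(q, k, v):
    # Hypothetical stand-in for xformers.ops.memory_efficient_attention,
    # which torch.jit.trace cannot record into a graph.
    return F.scaled_dot_product_attention(q, k, v)

def attention(q, k, v):
    # Same guard shape as the commit: while tracing or scripting,
    # take the traceable fallback instead of the xformers kernel.
    if torch.jit.is_tracing() or torch.jit.is_scripting():
        return attention_pytorch_fallback(q, k, v)
    return attention_xformers_stub(q, k, v)

class Block(torch.nn.Module):
    def forward(self, q, k, v):
        return attention(q, k, v)

# (batch, heads, tokens, dim_head) layout accepted by scaled_dot_product_attention
q = k = v = torch.randn(2, 8, 64, 40)
traced = torch.jit.trace(Block(), (q, k, v))  # guard is True, so the fallback is traced
print(torch.allclose(traced(q, k, v), Block()(q, k, v)))  # True
```

Note also the flag-then-single-return structure of the new code: rather than returning from inside each condition, both the existing BROKEN_XFORMERS batch-size check (`b * heads > 65535`) and the new tracing check set `disabled_xformers`, funneling every fallback case through one `attention_pytorch` call.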