chenpangpang / ComfyUI · Commits

Commit 798c90e1
Authored Mar 05, 2023 by comfyanonymous

Fix pytorch 2.0 cross attention not working.

Parent: f9d09c26
Showing 1 changed file with 3 additions and 0 deletions

comfy/ldm/modules/attention.py (+3, -0)
...
...
@@ -489,6 +489,8 @@ if XFORMERS_IS_AVAILBLE == False or "--disable-xformers" in sys.argv:
    if "--use-pytorch-cross-attention" in sys.argv:
        print("Using pytorch cross attention")
        torch.backends.cuda.enable_math_sdp(False)
        torch.backends.cuda.enable_flash_sdp(True)
        torch.backends.cuda.enable_mem_efficient_sdp(True)
        CrossAttention = CrossAttentionPytorch
    else:
        print("Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention")
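
Context for the change above: the three torch.backends.cuda.*_sdp() toggles control which kernels PyTorch 2.0's torch.nn.functional.scaled_dot_product_attention is allowed to dispatch to, so disabling the math fallback and enabling the flash and memory-efficient backends steers it onto the fused kernels. The body of CrossAttentionPytorch is not part of this diff; the snippet below is only a minimal sketch, assuming a standard multi-head cross-attention layout, of how such a module could call the PyTorch 2.0 API. The class name SDPACrossAttentionSketch and all dimensions are hypothetical, not taken from the repository.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SDPACrossAttentionSketch(nn.Module):
    # Hypothetical illustration only; not the CrossAttentionPytorch class
    # from comfy/ldm/modules/attention.py.
    def __init__(self, query_dim, context_dim=None, heads=8, dim_head=64):
        super().__init__()
        inner_dim = dim_head * heads
        context_dim = context_dim if context_dim is not None else query_dim
        self.heads = heads
        self.to_q = nn.Linear(query_dim, inner_dim, bias=False)
        self.to_k = nn.Linear(context_dim, inner_dim, bias=False)
        self.to_v = nn.Linear(context_dim, inner_dim, bias=False)
        self.to_out = nn.Linear(inner_dim, query_dim)

    def forward(self, x, context=None):
        context = context if context is not None else x
        b, n, _ = x.shape
        # Project and reshape to (batch, heads, tokens, dim_head), the layout SDPA expects.
        q = self.to_q(x).view(b, n, self.heads, -1).transpose(1, 2)
        k = self.to_k(context).view(b, context.shape[1], self.heads, -1).transpose(1, 2)
        v = self.to_v(context).view(b, context.shape[1], self.heads, -1).transpose(1, 2)
        # PyTorch 2.0 picks whichever SDP backend is enabled via the toggles above.
        out = F.scaled_dot_product_attention(q, k, v)
        out = out.transpose(1, 2).reshape(b, n, -1)
        return self.to_out(out)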
...
...
@@ -497,6 +499,7 @@ else:
    print("Using xformers cross attention")
    CrossAttention = MemoryEfficientCrossAttention


class BasicTransformerBlock(nn.Module):
    def __init__(self, dim, n_heads, d_head, dropout=0., context_dim=None, gated_ff=True, checkpoint=True,
                 disable_self_attn=False):
...
...
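
The second hunk only adds a line next to the (unchanged) BasicTransformerBlock signature shown above for context. As a rough, hypothetical usage sketch of that signature, assuming the class can be imported from this file and that its forward follows the usual latent-diffusion layout forward(x, context=None); all dimensions below are made up for illustration:

import torch
from comfy.ldm.modules.attention import BasicTransformerBlock

# Hypothetical dimensions: 320-dim image tokens, 8 heads of 40 dims each,
# cross-attending to a 768-dim conditioning context.
block = BasicTransformerBlock(dim=320, n_heads=8, d_head=40, context_dim=768)
x = torch.randn(1, 64, 320)         # image tokens
context = torch.randn(1, 77, 768)   # conditioning tokens
out = block(x, context=context)
print(out.shape)                    # expected: torch.Size([1, 64, 320])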