chenpangpang / ComfyUI
"tests/vscode:/vscode.git/clone" did not exist on "351aab60e9028002d3b3a3685694ba17fd2223df"
Commit a373367b, authored Oct 25, 2023 by comfyanonymous

Fix some OOM issues with split and sub quad attention.

Parent: 7fbb217d

Showing 2 changed files with 9 additions and 3 deletions:
comfy/ldm/modules/attention.py  +7 -2
comfy/ldm/modules/sub_quadratic_attention.py  +2 -1
comfy/ldm/modules/attention.py

@@ -222,9 +222,14 @@ def attention_split(q, k, v, heads, mask=None):

     mem_free_total = model_management.get_free_memory(q.device)

+    if _ATTN_PRECISION == "fp32":
+        element_size = 4
+    else:
+        element_size = q.element_size()
+
     gb = 1024 ** 3
-    tensor_size = q.shape[0] * q.shape[1] * k.shape[1] * q.element_size()
-    modifier = 3 if q.element_size() == 2 else 2.5
+    tensor_size = q.shape[0] * q.shape[1] * k.shape[1] * element_size
+    modifier = 3 if element_size == 2 else 2.5
     mem_required = tensor_size * modifier
     steps = 1

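The attention.py hunk makes the split-attention memory estimate respect the attention precision override: when _ATTN_PRECISION is "fp32" the score matrix is computed in 4-byte floats, so sizing the buffer from q.element_size() undercounts for fp16/bf16 inputs and the split count ends up too low, which can OOM. A rough sketch of the arithmetic follows; the standalone helper name, the attn_precision parameter, and the example shapes are illustrative assumptions, not part of the commit (the real code reads a module-level _ATTN_PRECISION).

import torch

def estimate_split_attention_memory(q, k, attn_precision="fp16"):
    # Bytes per score element: the fp32 path upcasts the attention matmul,
    # so assume 4-byte elements even when q itself is fp16/bf16.
    element_size = 4 if attn_precision == "fp32" else q.element_size()

    # Size in bytes of one (batch*heads, q_tokens, k_tokens) score tensor.
    tensor_size = q.shape[0] * q.shape[1] * k.shape[1] * element_size

    # Empirical overhead factor used by the split-attention path.
    modifier = 3 if element_size == 2 else 2.5
    return tensor_size * modifier

# Example: 8 batch*heads, 4096 query/key tokens, fp16 inputs, fp32 attention.
q = torch.empty(8, 4096, 64, dtype=torch.float16)
k = torch.empty(8, 4096, 64, dtype=torch.float16)
print(estimate_split_attention_memory(q, k, attn_precision="fp32") / 1024**3, "GiB")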
comfy/ldm/modules/sub_quadratic_attention.py

@@ -83,7 +83,8 @@ def _summarize_chunk(
     )
     max_score, _ = torch.max(attn_weights, -1, keepdim=True)
     max_score = max_score.detach()
-    torch.exp(attn_weights - max_score, out=attn_weights)
+    attn_weights -= max_score
+    torch.exp(attn_weights, out=attn_weights)
     exp_weights = attn_weights.to(value.dtype)
     exp_values = torch.bmm(exp_weights, value)
     max_score = max_score.squeeze(-1)

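The sub_quadratic_attention.py hunk drops one score-sized temporary per chunk: torch.exp(attn_weights - max_score, out=attn_weights) first materialises the subtraction result before the exponent is written back, whereas subtracting in place and then exponentiating in place reuses the existing buffer. A minimal sketch of the equivalence, with illustrative shapes only:

import torch

torch.manual_seed(0)
attn_weights = torch.randn(2, 128, 128)
max_score, _ = torch.max(attn_weights, -1, keepdim=True)
max_score = max_score.detach()

# Old form: `attn_weights - max_score` allocates a full score-sized
# temporary before the result lands in attn_weights.
expected = torch.exp(attn_weights - max_score)

# New form: subtract in place, then exponentiate in place, so no extra
# score-sized buffer is needed.
attn_weights -= max_score
torch.exp(attn_weights, out=attn_weights)

assert torch.allclose(attn_weights, expected)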