ModelZoo / DiffBIR_pytorch

Commit 904f15a4
Authored Sep 12, 2023 by 0x3f3f3f3fun

    sorry, I forgot to upload code for issue #26

Parent: 049f559f
Showing 1 changed file, with 1 addition and 2 deletions.
ldm/modules/attention.py  (+1, -2)
@@ -173,8 +173,7 @@ class CrossAttention(nn.Module):
         # force cast to fp32 to avoid overflowing
         if _ATTN_PRECISION == "fp32":
             # with torch.autocast(enabled=False, device_type = 'cuda'):
-            with torch.autocast(enabled=False, device_type=x.device):
-            # with torch.autocast(enabled=False, device_type="cuda" if str(x.device).startswith("cuda") else "cpu"):
+            with torch.autocast(enabled=False, device_type="cuda" if str(x.device).startswith("cuda") else "cpu"):
                 q, k = q.float(), k.float()
                 sim = einsum('b i d, b j d -> b i j', q, k) * self.scale
         else:
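For context on the change: torch.autocast expects its device_type argument to be a string such as "cuda" or "cpu", and recent PyTorch versions raise an error when it is given a torch.device object like the x.device passed on the deleted line, so the added line derives the string from the tensor's device instead. The standalone sketch below (not part of the commit; the tensor, its shape, and the scale value are illustrative) reproduces the added expression together with the fp32 attention pattern shown in the hunk:

import torch
from torch import einsum

x = torch.randn(2, 4, 8)   # stand-in for the attention input; only x.device matters for the fix
scale = 8 ** -0.5          # illustrative value playing the role of self.scale

# The expression added in this commit: map the tensor's device to an autocast device string.
device_type = "cuda" if str(x.device).startswith("cuda") else "cpu"

# Disable autocast locally so the q/k similarity is computed in fp32 and cannot overflow in fp16.
with torch.autocast(enabled=False, device_type=device_type):
    q, k = x.float(), x.float()
    sim = einsum('b i d, b j d -> b i j', q, k) * scale

print(sim.shape)  # torch.Size([2, 4, 4])

On a CPU-only machine str(x.device) is "cpu", so the same line keeps working instead of handing torch.autocast a device object it cannot interpret.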