OpenDAS / diffusers · Commits · e29dc972

Commit e29dc972, authored Dec 20, 2022 by Patrick von Platen

make style

Parent: 8e4733b3
Showing 1 changed file with 4 additions and 4 deletions:

src/diffusers/models/attention.py (+4, -4)
src/diffusers/models/attention.py (view file @ e29dc972)
@@ -297,8 +297,8 @@ class AttentionBlock(nn.Module):
             )
         elif not torch.cuda.is_available():
             raise ValueError(
-                "torch.cuda.is_available() should be True but is False. xformers' memory efficient attention is only"
-                " available for GPU "
+                "torch.cuda.is_available() should be True but is False. xformers' memory efficient attention is"
+                " only available for GPU "
             )
         else:
             try:
@@ -461,8 +461,8 @@ class BasicTransformerBlock(nn.Module):
             )
         elif not torch.cuda.is_available():
             raise ValueError(
-                "torch.cuda.is_available() should be True but is False. xformers' memory efficient attention is only"
-                " available for GPU "
+                "torch.cuda.is_available() should be True but is False. xformers' memory efficient attention is"
+                " only available for GPU "
             )
         else:
             try:
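For context, the string literals re-wrapped by this style pass sit inside a guard that only enables xformers memory-efficient attention when the xformers package can be imported and a CUDA device is available. Below is a minimal sketch of that pattern, assuming a hypothetical `enable_xformers_memory_efficient_attention` helper and a `_use_memory_efficient_attention_xformers` flag; these names are chosen for illustration and are not the exact diffusers source.

```python
import torch


def enable_xformers_memory_efficient_attention(module) -> None:
    # Sketch of the guard pattern shown in the diff above; the helper name and
    # the flag attribute are illustrative assumptions, not the diffusers API.
    try:
        import xformers  # noqa: F401

        xformers_available = True
    except ImportError:
        xformers_available = False

    if not xformers_available:
        raise ModuleNotFoundError(
            "Refer to https://github.com/facebookresearch/xformers for installation instructions",
            name="xformers",
        )
    elif not torch.cuda.is_available():
        # This is the error message whose line wrapping changed in this commit.
        raise ValueError(
            "torch.cuda.is_available() should be True but is False. xformers' memory efficient attention is"
            " only available for GPU "
        )
    else:
        # Illustrative flag; the real module records its own state here.
        module._use_memory_efficient_attention_xformers = True
```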