chenpangpang / ComfyUI · Commits
Commit 1bbd65ab, authored Dec 05, 2023 by comfyanonymous
Parent: 9b655d4f

Missed this one.
Showing 1 changed file with 1 addition and 1 deletion.
comfy/ldm/modules/attention.py (+1, -1)
@@ -384,7 +384,7 @@ class BasicTransformerBlock(nn.Module):
         self.is_res = inner_dim == dim

         if self.ff_in:
-            self.norm_in = nn.LayerNorm(dim, dtype=dtype, device=device)
+            self.norm_in = operations.LayerNorm(dim, dtype=dtype, device=device)
             self.ff_in = FeedForward(dim, dim_out=inner_dim, dropout=dropout, glu=gated_ff, dtype=dtype, device=device, operations=operations)

         self.disable_self_attn = disable_self_attn
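The change itself is small: self.norm_in was still being built with the stock nn.LayerNorm, while the FeedForward created on the next line already routes through the operations object passed into the block. The fix makes the norm go through operations as well, so a caller can substitute its own layer classes for the whole block. Below is a minimal, illustrative sketch of that pattern; the SkipInitOps name is invented for this example and is not ComfyUI's actual class (ComfyUI keeps its variants in comfy.ops).

# Minimal sketch of the "operations" pattern, for illustration only.
# SkipInitOps is a hypothetical stand-in, not ComfyUI's real implementation.
import torch.nn as nn

class SkipInitOps:
    # Drop-in replacement for an nn layer that skips random weight init,
    # e.g. because the weights will be overwritten by a checkpoint load.
    class LayerNorm(nn.LayerNorm):
        def reset_parameters(self):
            return None

class TinyBlock(nn.Module):
    def __init__(self, dim, dtype=None, device=None, operations=SkipInitOps):
        super().__init__()
        # Building layers through `operations` instead of nn directly is what
        # the commit fixes for norm_in: one missed nn.LayerNorm bypassed it.
        self.norm_in = operations.LayerNorm(dim, dtype=dtype, device=device)

# Usage: TinyBlock(64) uses the skip-init LayerNorm; passing operations=nn
# restores stock behavior, since nn.LayerNorm accepts the same arguments.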