gaoqiong / flash-attention · Commit 9818f85f (unverified)
Authored Jun 02, 2023 by Tri Dao; committed by GitHub on Jun 02, 2023

Merge pull request #255 from beginlner/main

Fix a bug

Parents: 85b51d61, 8e44c0ee
Changes: 1 changed file with 1 addition and 1 deletion
flash_attn/modules/block.py

@@ -119,7 +119,7 @@ class Block(nn.Module):
             before applying the query projection. Useful for e.g., ViT where we only care
             about the CLS token in the last layer.
         """
-        fused_add_norm_fn = (dropout_add_rms_norm if isinstance(self.norm1, RMSNorm)
+        fused_add_norm_fn = (dropout_add_rms_norm if RMSNorm and isinstance(self.norm1, RMSNorm)
                              else dropout_add_layer_norm)
         if self.prenorm:
             if not self.fused_dropout_add_ln:
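The one-line change above guards the `isinstance` check with `RMSNorm and ...`. In flash-attention, `RMSNorm` comes from an optional fused-kernel import and can be `None` when that extension is not installed; calling `isinstance(x, None)` then raises a `TypeError`. The sketch below reproduces the bug and the fix in isolation (the module name `some_fused_kernels` and the helper `pick_fused_fn` are hypothetical, for illustration only):

```python
# Hypothetical optional import: in flash-attention, RMSNorm is None when the
# fused dropout_layer_norm extension is not built/installed.
try:
    from some_fused_kernels import RMSNorm  # hypothetical module name
except ImportError:
    RMSNorm = None


class PlainLayerNorm:
    """Stand-in for nn.LayerNorm, i.e. self.norm1 when RMSNorm isn't used."""


def pick_fused_fn(norm, rms_fn, ln_fn):
    # Before the fix: `isinstance(norm, RMSNorm)` raises
    # "TypeError: isinstance() arg 2 must be a type ..." when RMSNorm is None.
    # The fix short-circuits on `RMSNorm and ...`, so with RMSNorm = None the
    # isinstance call is never evaluated and the layer-norm path is chosen.
    return rms_fn if RMSNorm and isinstance(norm, RMSNorm) else ln_fn
```

With the guard in place, `pick_fused_fn(PlainLayerNorm(), "rms", "ln")` safely returns the layer-norm branch even when the fused RMSNorm kernel is absent.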