gaoqiong / flash-attention

Commit 96656b93, authored Dec 15, 2022 by Alexander Ploshkin
Parent: 04c4c610

Remove redundant shape asserts in rotary embeddings

Showing 1 changed file with 0 additions and 2 deletions.
flash_attn/layers/rotary.py
...
@@ -43,8 +43,6 @@ class ApplyRotaryEmb(torch.autograd.Function):
         rotary_dim *= 2
         assert rotary_dim <= headdim
         assert seqlen <= rotary_seqlen
-        assert cos.shape == (rotary_seqlen, rotary_dim // 2)
-        assert sin.shape == (rotary_seqlen, rotary_dim // 2)
         x1, x2 = x[..., :rotary_dim].chunk(2, dim=-1)
         out = torch.empty_like(x) if not inplace else x
         o1, o2 = out[..., :rotary_dim].chunk(2, dim=-1) if not inplace else (x1, x2)
...
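For context, the surviving lines split the first `rotary_dim` features of `x` into two halves and rotate them pairwise by the `cos`/`sin` tables; the deleted asserts only re-checked shapes that broadcasting would catch anyway. Below is a minimal NumPy sketch of that split-and-rotate pattern, not the library's CUDA kernel; the function name and 2-D shapes are illustrative assumptions.

```python
# Hedged sketch of GPT-NeoX-style rotary embedding, mirroring the
# x1/x2 = x[..., :rotary_dim].chunk(2) split in the diff above.
# Illustrative only -- the real flash-attn code operates on 4-D
# (batch, seqlen, nheads, headdim) tensors in PyTorch/CUDA.
import numpy as np

def apply_rotary_emb(x, cos, sin, rotary_dim):
    """x: (seqlen, headdim); cos, sin: (seqlen, rotary_dim // 2)."""
    seqlen, headdim = x.shape
    assert rotary_dim <= headdim
    # The two asserts removed by this commit checked cos/sin shapes here;
    # NumPy broadcasting below raises on a mismatch regardless.
    x1, x2 = np.split(x[..., :rotary_dim], 2, axis=-1)
    out = x.copy()  # features past rotary_dim pass through unchanged
    out[..., :rotary_dim // 2] = x1 * cos - x2 * sin
    out[..., rotary_dim // 2:rotary_dim] = x1 * sin + x2 * cos
    return out
```

Because each pair (x1_i, x2_i) is rotated by a plane rotation, the per-position norm of the rotated slice is preserved, which is the property rotary embeddings rely on.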