norm / vllm · Commit 2a4ec908 (Unverified)

Fix for breaking changes in xformers 0.0.21 (#834)

Authored Aug 23, 2023 by Woosuk Kwon; committed via GitHub on Aug 23, 2023
Parent: 85ebcda9
Showing 2 changed files with 4 additions and 3 deletions:

  requirements.txt (+1, -1)
  vllm/model_executor/layers/attention.py (+3, -2)
requirements.txt

@@ -5,7 +5,7 @@ sentencepiece  # Required for LLaMA tokenizer.
 numpy
 torch >= 2.0.0
 transformers >= 4.31.0  # Required for LLaMA-2.
-xformers >= 0.0.19
+xformers >= 0.0.21
 fastapi
 uvicorn
 pydantic < 2  # Required for OpenAI server.
vllm/model_executor/layers/attention.py

@@ -357,11 +357,12 @@ class PagedAttentionWithALiBi(PagedAttention):
             # be sliced from a tensor whose length is a multiple of 8.
             padded_len = (prompt_len + 7) // 8 * 8
             bias = torch.empty(
+                1,  # batch_size
                 self.num_heads,
-                padded_len,
+                prompt_len,
                 padded_len,
                 device=self.alibi_slopes.device,
-            )[:, :prompt_len, :prompt_len].copy_(bias)
+            )[:, :, :, :prompt_len].copy_(bias)
             bias.mul_(self.alibi_slopes[:, None, None])
             attn_bias = LowerTriangularMaskWithTensorBias(bias)
             input_metadata.attn_bias.append(attn_bias)
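To see why the new code allocates a padded buffer and then slices it, here is a small standalone sketch of the same trick; the concrete `prompt_len` and `num_heads` values are illustrative, not from the commit. xformers requires the custom attention bias to be a slice of a tensor whose last dimension is a multiple of 8, so the buffer is created at `padded_len` and sliced back down to `prompt_len` before the real bias values are copied in.

```python
import torch

prompt_len = 13  # toy value for illustration
num_heads = 4    # toy value for illustration

# Round prompt_len up to the next multiple of 8, as in the diff.
padded_len = (prompt_len + 7) // 8 * 8  # 13 -> 16

# Toy per-position bias; the real code builds an ALiBi distance matrix.
bias = torch.zeros(prompt_len, prompt_len)

# Allocate the padded 4-D buffer (note the leading batch dim added by
# this commit), then slice the last dim back to prompt_len and copy.
padded = torch.empty(
    1,          # batch_size
    num_heads,
    prompt_len,
    padded_len,
)[:, :, :, :prompt_len].copy_(bias)

print(padded.shape)       # torch.Size([1, 4, 13, 13])
print(padded.stride(-2))  # 16: rows still stride over the padded buffer
```

The visible shape is `(1, num_heads, prompt_len, prompt_len)`, but the row stride remains `padded_len`, which satisfies the multiple-of-8 layout constraint without materializing padding in the logical bias.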