gaoqiong / flash-attention

Commit a3dd38d9
Authored May 31, 2024 by Woosuk Kwon

Bump up to v2.5.9

Parent: e5da6e4d
Pipeline #2019 failed with stages in 0 seconds
Changes: 1 | Pipelines: 1
Showing 1 changed file with 1 addition and 1 deletion

vllm_flash_attn/__init__.py (+1, -1)
-__version__ = "2.5.8.post3"
+__version__ = "2.5.9"
 from vllm_flash_attn.flash_attn_interface import (
     flash_attn_func,
     ...
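
For context, a minimal usage sketch of the package this commit re-versions. It relies only on what the diff shows: __init__.py sets __version__ and re-exports flash_attn_func from vllm_flash_attn.flash_attn_interface. The tensor shapes, the fp16 dtype, and the causal flag follow the upstream flash-attn convention and are illustrative assumptions, not taken from this commit; a CUDA device is assumed.

# Sketch: verify the bumped version and call the re-exported kernel.
# Assumes a CUDA device; shapes/dtype follow the upstream flash-attn convention.
import torch
from vllm_flash_attn import __version__, flash_attn_func

assert __version__ == "2.5.9"  # the value set by this commit

batch, seqlen, nheads, headdim = 2, 128, 8, 64
# flash-attn expects fp16/bf16 tensors of shape (batch, seqlen, nheads, headdim)
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

out = flash_attn_func(q, k, v, causal=True)  # same shape as q
print(out.shape)  # torch.Size([2, 128, 8, 64])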