gaoqiong / flash-attention · Commits
Commit 3c263a9f36ffdd42ddf091f56befbe0bb486891c
Path: flash-attention / vllm_flash_attn
19 May, 2024 · 1 commit
Upgrade to 2.5.8.post2 · 3c263a9f · authored by Woosuk Kwon, May 19, 2024

07 May, 2024 · 1 commit
Upgrade to v2.5.8.post1 · f638e65c · authored by Woosuk Kwon, May 07, 2024

06 May, 2024 · 1 commit
Version up to 2.5.8 · 422545b4 · authored by Woosuk Kwon, May 06, 2024

28 Mar, 2024 · 3 commits
flash-attn -> vllm-flash-attn · 498cd8c3 · authored by Woosuk Kwon, Mar 28, 2024
Remove unnecessary files · ae856f3a · authored by Woosuk Kwon, Mar 28, 2024
flash_attn -> vllm_flash_attn · 6ac8e63a · authored by Woosuk Kwon, Mar 28, 2024