gaoqiong / flash-attention · Commits
Path: flash-attention / vllm_flash_attn @ f638e65ce18d77254f7fa30b9c16e0ec90e14ae3
07 May, 2024 (1 commit)

Upgrade to v2.5.8.post1 · f638e65c
Woosuk Kwon authored May 07, 2024

06 May, 2024 (1 commit)

Version up to 2.5.8 · 422545b4
Woosuk Kwon authored May 06, 2024

28 Mar, 2024 (3 commits)

flash-attn -> vllm-flash-attn · 498cd8c3
Woosuk Kwon authored Mar 28, 2024

Remove unnecessary files · ae856f3a
Woosuk Kwon authored Mar 28, 2024

flash_attn -> vllm_flash_attn · 6ac8e63a
Woosuk Kwon authored Mar 28, 2024