norm / vllm · Commits at 3e9f991d6acd7efd90f04f1f530b837a40c93442
History of `server.py`
02 Mar, 2023 — 1 commit

Use FlashAttention for `multi_query_kv_attention` (#4) · 3e9f991d
Woosuk Kwon authored Mar 01, 2023
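The commit above replaces the naive attention path in `multi_query_kv_attention` with FlashAttention. As a rough illustration only (not vLLM's actual kernel, whose signature and masking are assumptions here), the operation being accelerated is standard scaled dot-product attention in which every query token of a prompt attends over all key/value tokens of that prompt. A minimal NumPy sketch of the unfused computation:

```python
import numpy as np

def multi_query_kv_attention(q, k, v):
    """Naive scaled dot-product attention over prompt tokens.

    Shapes: (num_tokens, num_heads, head_dim). Illustrative only:
    the real kernel may also apply a causal mask, and FlashAttention
    fuses these steps into one GPU kernel without materializing the
    full (num_tokens x num_tokens) score matrix.
    """
    scale = 1.0 / np.sqrt(q.shape[-1])
    # scores[h, i, j] = <q_i, k_j> for head h
    scores = np.einsum('qhd,khd->hqk', q, k) * scale
    # numerically stable softmax over the key axis
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # weighted sum of values, back to (num_tokens, num_heads, head_dim)
    return np.einsum('hqk,khd->qhd', weights, v)

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 2, 8))
out = multi_query_kv_attention(q, q, q)
print(out.shape)  # (4, 2, 8)
```

FlashAttention avoids the quadratic memory cost of `scores` by tiling the computation, which is why it matters for long prompts.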
24 Feb, 2023 — 2 commits

Clean up the server script · fa16389a
Woosuk Kwon authored Feb 24, 2023
[WIP] Add server script · afdbe5d3
Woosuk Kwon authored Feb 24, 2023