norm / vllm · Commits
csrc/attention.cpp at commit 0f4b32199ec6c5d16bc03767e36fff2d54559ff8
15 Apr, 2023 — 1 commit
  Support various block sizes & Change default block size to 16 (#38)
  Woosuk Kwon authored Apr 15, 2023 · 0f4b3219

05 Apr, 2023 — 1 commit
  Basic attention kernel that supports cached KV + (multi-)prompts (#24)
  Siyuan (Ryans) Zhuang authored Apr 04, 2023 · 21b3671b

01 Mar, 2023 — 1 commit
  Implement `single_query_cached_kv_attention` kernel (#3)
  Woosuk Kwon authored Mar 01, 2023 · 0deacbce
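
The commit titles reference a block-based KV cache (a "block size" of 16, attention over cached KV). As a rough illustration of what a block size means in such a cache, here is a minimal sketch of mapping a logical token position to a physical cache block via a block table. All names and the `locate_token` helper are hypothetical, chosen for illustration; this is not vllm's actual API or kernel logic.

```python
BLOCK_SIZE = 16  # default block size per commit 0f4b3219 (assumption: tokens per cache block)

def locate_token(block_table, token_pos, block_size=BLOCK_SIZE):
    """Hypothetical helper: map a logical token position to
    (physical_block_number, offset_within_block) using a block table."""
    logical_block = token_pos // block_size   # which logical block the token falls in
    offset = token_pos % block_size           # position inside that block
    return block_table[logical_block], offset

# Example: a sequence whose two logical blocks happen to live in
# physical blocks 7 and 3 of the cache.
table = [7, 3]
print(locate_token(table, 5))   # token 5  -> physical block 7, offset 5
print(locate_token(table, 20))  # token 20 -> physical block 3, offset 4
```

With a smaller block size, fewer cache slots are wasted on the last, partially filled block of each sequence; the first commit above makes this size configurable.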