norm / vllm · Commits · csrc/attention.cpp
at commit e41f06702cb6d6787ac4474832264108a6e28780
03 Jul, 2023 (1 commit)
e41f0670 · Add support for BLOOM (#331) · Woosuk Kwon, authored Jul 03, 2023

15 Apr, 2023 (1 commit)
0f4b3219 · Support various block sizes & Change default block size to 16 (#38) · Woosuk Kwon, authored Apr 15, 2023

05 Apr, 2023 (1 commit)
21b3671b · Basic attention kernel that supports cached KV + (multi-)prompts (#24) · Siyuan (Ryans) Zhuang, authored Apr 04, 2023

01 Mar, 2023 (1 commit)
0deacbce · Implement `single_query_cached_kv_attention` kernel (#3) · Woosuk Kwon, authored Mar 01, 2023