norm / vllm · Commit history for csrc/attention_kernels.cu at 21b3671bbc508662561ae95a418a26dbe71db356
05 Apr, 2023 · 1 commit
Basic attention kernel that supports cached KV + (multi-)prompts (#24) · 21b3671b
Siyuan (Ryans) Zhuang authored Apr 04, 2023
02 Apr, 2023 · 1 commit
Optimize data movement (#20) · 897cb2ae
Woosuk Kwon authored Apr 02, 2023
31 Mar, 2023 · 1 commit
Add custom kernel for RMS normalization (#16) · 09e92454
Woosuk Kwon authored Mar 31, 2023
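The RMS normalization that commit #16 implements as a custom kernel follows a well-known formula: each element is divided by the root-mean-square of the vector (plus a small epsilon) and scaled by a learned weight. A minimal single-threaded C++ reference sketch of that math (a hypothetical reference version, not the CUDA kernel from this commit):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Hedged reference sketch of RMS normalization, not the commit's CUDA kernel:
//   y[i] = x[i] / sqrt(mean(x^2) + eps) * weight[i]
void rms_norm(const float* x, const float* weight, float* out,
              std::size_t n, float eps = 1e-6f) {
    // Accumulate the mean of squares over the vector.
    float sum_sq = 0.0f;
    for (std::size_t i = 0; i < n; ++i) sum_sq += x[i] * x[i];
    const float inv_rms = 1.0f / std::sqrt(sum_sq / static_cast<float>(n) + eps);
    // Normalize and apply the per-element learned scale.
    for (std::size_t i = 0; i < n; ++i) out[i] = x[i] * inv_rms * weight[i];
}
```

A GPU kernel would typically compute `sum_sq` with a block-level reduction and then apply the scale elementwise, but the arithmetic is the same.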
01 Mar, 2023 · 1 commit
Implement `single_query_cached_kv_attention` kernel (#3) · 0deacbce
Woosuk Kwon authored Mar 01, 2023
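The `single_query_cached_kv_attention` kernel introduced in commit #3 computes attention for one query token against a sequence of cached keys and values, the core operation of decoding with a KV cache. A plain C++ reference sketch of that computation (an illustrative scalar version under assumed shapes, not the commit's CUDA kernel, which reads from a paged cache layout):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hedged sketch: scaled dot-product attention for a single query over
// cached K/V. q has size d; k_cache and v_cache hold seq_len vectors of
// size d. Returns the attention output of size d.
std::vector<float> single_query_cached_kv_attention(
    const std::vector<float>& q,
    const std::vector<std::vector<float>>& k_cache,
    const std::vector<std::vector<float>>& v_cache) {
    const std::size_t seq_len = k_cache.size();
    const std::size_t d = q.size();
    const float scale = 1.0f / std::sqrt(static_cast<float>(d));

    // Scaled dot-product logits against every cached key.
    std::vector<float> logits(seq_len);
    float max_logit = -1e30f;
    for (std::size_t t = 0; t < seq_len; ++t) {
        float dot = 0.0f;
        for (std::size_t i = 0; i < d; ++i) dot += q[i] * k_cache[t][i];
        logits[t] = dot * scale;
        if (logits[t] > max_logit) max_logit = logits[t];
    }
    // Numerically stable softmax over the logits.
    float denom = 0.0f;
    for (float& l : logits) { l = std::exp(l - max_logit); denom += l; }
    // Output is the softmax-weighted sum of cached values.
    std::vector<float> out(d, 0.0f);
    for (std::size_t t = 0; t < seq_len; ++t)
        for (std::size_t i = 0; i < d; ++i)
            out[i] += (logits[t] / denom) * v_cache[t][i];
    return out;
}
```

With a single cached token the softmax weight is 1, so the output equals that token's value vector, a useful sanity check for any kernel implementing this operation.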