norm / vllm · Commits · a1b3de86cd6f27aeb299d45296a7409b8d2b7c0c
tests/kernels/attention.py
30 Mar, 2023 · 1 commit
Refactor the test code for attention kernels (#13) · a1b3de86
Woosuk Kwon authored Mar 29, 2023
02 Mar, 2023 · 1 commit
Use FlashAttention for `multi_query_kv_attention` (#4) · 3e9f991d
Woosuk Kwon authored Mar 01, 2023
01 Mar, 2023 · 1 commit
Implement `single_query_cached_kv_attention` kernel (#3) · 0deacbce
Woosuk Kwon authored Mar 01, 2023
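The earliest commit above introduces `single_query_cached_kv_attention`, the decode-time kernel in which a single query token attends over previously cached keys and values. As an illustrative reference only (not vLLM's actual CUDA kernel, whose signature and cache layout differ), a naive NumPy sketch of that computation might look like:

```python
import numpy as np

def single_query_cached_kv_attention(q, k_cache, v_cache, scale):
    """Naive reference: one query token attends over the whole KV cache.

    q:        (num_heads, head_size)       - the current token's query
    k_cache:  (seq_len, num_heads, head_size)
    v_cache:  (seq_len, num_heads, head_size)
    scale:    softmax scale, typically 1 / sqrt(head_size)
    Returns:  (num_heads, head_size)       - attention output per head
    """
    # Scaled dot-product logits: one score per head per cached position.
    logits = np.einsum("hd,shd->hs", q, k_cache) * scale
    # Numerically stable softmax over the sequence dimension.
    logits -= logits.max(axis=1, keepdims=True)
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    # Weighted sum of cached values.
    return np.einsum("hs,shd->hd", probs, v_cache)
```

With a cache of length one, the softmax weights are all 1, so the output equals the single cached value vector per head; the real kernel computes the same result in one fused pass over a block-structured cache.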