norm / vllm · Commits · 1f01a18d39b7fc873b79024b5799597cb6fc88bc
Path: vllm / tests / kernels
31 Mar, 2023 · 1 commit

Add custom kernel for RMS normalization (#16) · 09e92454
Woosuk Kwon authored Mar 31, 2023
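Tests for a kernel like this typically compare the custom CUDA implementation against a plain PyTorch reference. The sketch below is one such RMSNorm reference, not the repository's actual test code; the function name `rms_norm_ref` and the tolerances in the commented usage are assumptions.

```python
import torch

def rms_norm_ref(x: torch.Tensor, weight: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Reference RMSNorm: scale x by the reciprocal root-mean-square of its last dim."""
    variance = x.float().pow(2).mean(dim=-1, keepdim=True)
    x_normed = x.float() * torch.rsqrt(variance + eps)
    return (x_normed * weight.float()).to(x.dtype)

# Hypothetical usage: compare a custom kernel's output against the reference.
# out_custom = custom_rms_norm(x, weight, eps)   # custom CUDA kernel (assumed name)
# out_ref = rms_norm_ref(x, weight, eps)
# torch.testing.assert_close(out_custom, out_ref, rtol=1e-3, atol=1e-3)
```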
30 Mar, 2023 · 2 commits

Implement custom kernel for LLaMA rotary embedding (#14) · 88c0268a
Woosuk Kwon authored Mar 30, 2023
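For the rotary embedding kernel, the natural check is against a straightforward PyTorch rotation of the query and key heads. The following is a hedged sketch using the half-split (GPT-NeoX-style) pairing; the helper names, shapes, and pairing convention are assumptions rather than the kernel's actual interface.

```python
import torch

def rotate_half(x: torch.Tensor) -> torch.Tensor:
    # Split the head dimension in half and rotate: (x1, x2) -> (-x2, x1).
    x1, x2 = x.chunk(2, dim=-1)
    return torch.cat((-x2, x1), dim=-1)

def apply_rotary_ref(q, k, positions, head_size: int, base: float = 10000.0):
    """Reference rotary embedding. Assumed shapes:
    q, k [num_tokens, num_heads, head_size], positions [num_tokens]."""
    inv_freq = 1.0 / (base ** (torch.arange(0, head_size, 2, dtype=torch.float32) / head_size))
    freqs = positions.float()[:, None] * inv_freq[None, :]   # [num_tokens, head_size // 2]
    emb = torch.cat((freqs, freqs), dim=-1)                  # [num_tokens, head_size]
    cos = emb.cos()[:, None, :]                              # broadcast over heads
    sin = emb.sin()[:, None, :]
    q_rot = (q * cos + rotate_half(q) * sin).to(q.dtype)
    k_rot = (k * cos + rotate_half(k) * sin).to(k.dtype)
    return q_rot, k_rot
```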
Refactor the test code for attention kernels (#13) · a1b3de86
Woosuk Kwon authored Mar 29, 2023
02 Mar, 2023 · 1 commit

Use FlashAttention for `multi_query_kv_attention` (#4) · 3e9f991d
Woosuk Kwon authored Mar 01, 2023
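This commit routes prompt-time (multi-query) attention through FlashAttention; a test for it would compare that fused path against a naive masked attention in PyTorch. The reference below is an illustrative sketch for a single sequence with assumed [num_tokens, num_heads, head_size] shapes, not the repository's code.

```python
import torch

def masked_attention_ref(query, key, value, scale: float):
    """Naive causal attention used as ground truth for a fused
    (e.g. FlashAttention-based) prompt-attention path."""
    # Attention scores: [num_heads, num_tokens, num_tokens]
    scores = torch.einsum('qhd,khd->hqk', query.float(), key.float()) * scale
    # Causal mask: a token may only attend to itself and earlier tokens.
    num_tokens = query.shape[0]
    causal = torch.triu(torch.ones(num_tokens, num_tokens, dtype=torch.bool,
                                   device=query.device), diagonal=1)
    scores.masked_fill_(causal, float('-inf'))
    probs = torch.softmax(scores, dim=-1)
    out = torch.einsum('hqk,khd->qhd', probs, value.float())
    return out.to(query.dtype)
```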
01 Mar, 2023 · 1 commit

Implement `single_query_cached_kv_attention` kernel (#3) · 0deacbce
Woosuk Kwon authored Mar 01, 2023
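`single_query_cached_kv_attention` is the decode-time path: each sequence contributes one query token that attends over its cached keys and values. A minimal per-sequence reference, ignoring how the cache is actually laid out in blocks, might look like the sketch below; all names and shapes here are assumptions.

```python
import torch

def single_query_attention_ref(query, keys, values, scale: float):
    """Reference for decode-time attention. Assumed shapes:
    query [num_heads, head_size], keys/values [context_len, num_heads, head_size].
    The real kernel additionally gathers keys/values from a block-based cache;
    that gathering step is omitted here."""
    scores = torch.einsum('hd,khd->hk', query.float(), keys.float()) * scale
    probs = torch.softmax(scores, dim=-1)
    out = torch.einsum('hk,khd->hd', probs, values.float())
    return out.to(query.dtype)
```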