norm / vllm
Commits at cbf8779afafdaba2ddc6e2212d67c40f1b6e11fd
cacheflow/models/attention.py
24 Feb, 2023 (2 commits)
762fd1c3 · Refactor and annotate types for attention · Woosuk Kwon · Feb 24, 2023
7f22f90e · Remove xformers · Woosuk Kwon · Feb 24, 2023
23 Feb, 2023 (4 commits)
932844f1 · Fix attention · Woosuk Kwon · Feb 23, 2023
ba84b872 · Fix attention · Woosuk Kwon · Feb 23, 2023
87e0bcd4 · Fix attention · Woosuk Kwon · Feb 23, 2023
d4bc1a4d · Add unoptimized OPT Attention · Woosuk Kwon · Feb 23, 2023
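The "unoptimized" attention referenced in commit d4bc1a4d presumably means a straightforward implementation that materializes the full attention score matrix, rather than the actual code in that commit (which is not shown here). A minimal sketch of such unoptimized scaled dot-product attention, with an illustrative function name and toy shapes:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Unoptimized attention: materializes the full (L, L) score matrix."""
    d = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d)  # (..., L, L)
    if mask is not None:
        # masked-out positions get a large negative score before softmax
        scores = np.where(mask, scores, -1e9)
    # numerically stable softmax over the key dimension
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Causal self-attention over a toy sequence of length 4, head dim 8
L, d = 4, 8
rng = np.random.default_rng(0)
x = rng.standard_normal((L, d))
causal = np.tril(np.ones((L, L), dtype=bool))
out = scaled_dot_product_attention(x, x, x, mask=causal)
print(out.shape)  # (4, 8)
```

This version costs O(L²) memory for the score matrix, which is exactly what optimized kernels (such as those later developed in this repository) avoid.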