Repository: kecinstone / 2024pra-vllm
Commit: 9738b84a08957eb828669e8af27337ee722e8fdc (branch 2024-pra-vllm)
File: vllm/model_executor/layers/attention.py
Force paged attention v2 for long contexts (#1510)
Authored by Antoni Baum, Nov 01, 2023 (9738b84a)
attention.py, 18.1 KB