norm/vllm — vllm/model_executor/layers/quantization/awq.py
Commit beb89f68 ("AWQ: Up to 2.66x higher throughput (#2566)"), authored by Casper, Jan 27, 2024
awq.py · 5.68 KB
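For background on what this file implements: vLLM's AWQ support loads weights quantized with Activation-aware Weight Quantization, which stores weights as 4-bit integers packed eight per 32-bit word, with group-wise scales and zero points, and dequantizes as w = (q - z) * s per group. The sketch below is a simplified NumPy illustration of that dequantization step, not vLLM's actual code: the real GPU kernels use an interleaved nibble order for efficiency, and all function names and shapes here are illustrative assumptions.

```python
import numpy as np

def unpack_int4(packed: np.ndarray) -> np.ndarray:
    """Unpack each 32-bit word into eight 4-bit values, lowest nibble first.

    Simplified: real AWQ kernels use an interleaved nibble order; this
    sketch uses plain low-to-high order for clarity.
    """
    shifts = np.arange(8, dtype=np.uint32) * 4
    # (..., n) words -> (..., n, 8) nibbles -> (..., n * 8) values
    nibbles = (packed[..., None].astype(np.uint32) >> shifts) & 0xF
    return nibbles.reshape(*packed.shape[:-1], -1).astype(np.int32)

def awq_dequantize(qweight: np.ndarray,
                   qzeros: np.ndarray,
                   scales: np.ndarray,
                   group_size: int = 128) -> np.ndarray:
    """Group-wise 4-bit dequantization: w = (q - z) * s.

    qweight: packed 4-bit weights, (in_features, out_features // 8)
    qzeros:  packed 4-bit zero points, one row per weight group
    scales:  float scales, (in_features // group_size, out_features)
    """
    q = unpack_int4(qweight)            # (in_features, out_features)
    z = unpack_int4(qzeros)             # (num_groups, out_features)
    g = np.arange(q.shape[0]) // group_size  # group index of each input row
    return (q - z[g]) * scales[g]
```

For example, the word `0x87654321` unpacks to the values 1 through 8, and a weight nibble of 5 with zero point 1 and scale 2.0 dequantizes to 8.0. The commit above ("Up to 2.66x higher throughput") changes how efficiently this step runs on the GPU, not the arithmetic itself.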